The entropy per coordinate of a random vector is highly constrained under convexity conditions

Sergey G Bobkov, Mokshay Madiman

Research output: Contribution to journal › Article › peer-review

45 Scopus citations


The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
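The stated range can be checked numerically in one dimension. The sketch below (an illustration, not from the paper) computes the gap between entropy per coordinate and log(1/max density) for the three extremal cases named in the abstract; for i.i.d. coordinates the per-coordinate value is the same in any dimension.

```python
import math

# Sketch of a numerical check of the abstract's claim: for a log-concave
# density f, the gap h(X)/n + (1/n)*log(max density) lies in [0, 1],
# with the uniform at 0, the exponential at 1, and the normal at 1/2.

def gap(entropy, max_density):
    """Entropy per coordinate minus log(1 / max density)."""
    return entropy + math.log(max_density)

# Uniform on [0, L]: f = 1/L, h = log L  -> gap 0 (lower end of the range)
L = 3.0
uniform_gap = gap(math.log(L), 1.0 / L)

# Exponential with rate 1: f(x) = e^{-x}, max density 1, h = 1 -> gap 1 (upper end)
exp_gap = gap(1.0, 1.0)

# Normal N(0, s^2): max density 1/sqrt(2*pi*s^2), h = 0.5*log(2*pi*e*s^2)
# -> gap exactly 1/2 (the midpoint), independent of s
s = 2.0
normal_gap = gap(0.5 * math.log(2 * math.pi * math.e * s * s),
                 1.0 / math.sqrt(2 * math.pi * s * s))

print(uniform_gap, exp_gap, normal_gap)  # 0.0, 1.0, 0.5
```

Note that the normal's gap of 1/2 does not depend on the variance `s`, consistent with the abstract's statement that the normal sits exactly in the middle of the unit-length range.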

Original language: English (US)
Article number: 5961831
Pages (from-to): 4940-4954
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Issue number: 8
State: Published - Aug 1 2011


Keywords:
  • Convex measures
  • inequalities
  • log-concave
  • maximum entropy
  • slicing problem

