The entropy per coordinate of a random vector is highly constrained under convexity conditions

Sergey G Bobkov, Mokshay Madiman

Research output: Contribution to journal › Article › peer-review

52 Scopus citations


The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
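As a sketch (not part of the published abstract), the three extremal cases stated above can be checked directly, writing \(\|f\|_\infty\) for the value of the density at the mode and \(h(X)\) for differential entropy in nats:

```latex
\begin{align*}
&\text{For log-concave } X \text{ on } \mathbb{R}^n: \quad
  0 \;\le\; \frac{h(X)}{n} - \log \frac{1}{\|f\|_\infty^{1/n}} \;\le\; 1.\\[4pt]
&\text{Uniform on a convex body } K:\quad
  \|f\|_\infty = \frac{1}{\mathrm{Vol}(K)},\quad
  \frac{h(X)}{n} = \frac{1}{n}\log \mathrm{Vol}(K)
  \;\Rightarrow\; \text{gap} = 0 \text{ (lower end)}.\\[4pt]
&\text{i.i.d. } \mathrm{Exp}(\lambda) \text{ coordinates}:\quad
  \|f\|_\infty = \lambda^n,\quad
  \frac{h(X)}{n} = 1 - \log\lambda
  \;\Rightarrow\; \text{gap} = 1 \text{ (upper end)}.\\[4pt]
&\text{Normal } N(0,\sigma^2 I_n):\quad
  \|f\|_\infty = (2\pi\sigma^2)^{-n/2},\quad
  \frac{h(X)}{n} = \tfrac{1}{2}\log(2\pi e\,\sigma^2)
  \;\Rightarrow\; \text{gap} = \tfrac{1}{2} \text{ (middle)}.
\end{align*}
```

In each case the "gap" is \(h(X)/n - \log(1/\|f\|_\infty^{1/n})\), which is what the abstract's "range of just 1" and "differs from the normal by at most 1/2" refer to.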

Original language: English (US)
Article number: 5961831
Pages (from-to): 4940-4954
Number of pages: 15
Journal: IEEE Transactions on Information Theory
Issue number: 8
State: Published - Aug 2011

Bibliographical note

Funding Information:
Manuscript received October 13, 2009; revised January 12, 2011; accepted March 28, 2011. Date of current version July 29, 2011. S. Bobkov was supported by NSF Grant DMS-0706866. M. Madiman was supported in part by a Junior Faculty Fellowship from Yale University and in part by NSF CAREER Grant DMS-1056996. The material in this paper was presented in part at the 2010 IEEE International Symposium on Information Theory, Austin, TX, June 2010. S. Bobkov is with the School of Mathematics, University of Minnesota, Minneapolis, MN 55455 USA. M. Madiman is with the Department of Statistics, Yale University, New Haven, CT 06511 USA. Communicated by S. Diggavi, Associate Editor for Shannon Theory. Digital Object Identifier: 10.1109/TIT.2011.2158475


Keywords

  • Convex measures
  • inequalities
  • log-concave
  • maximum entropy
  • slicing problem


