Further Investigations of the Maximum Entropy of the Sum of Two Dependent Random Variables

Jiange Li, James Melbourne

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations

Abstract

Cover and Zhang proved a reversal of the Entropy Power Inequality for the sum of (possibly dependent) random variables sharing the same log-concave density, and moreover showed that log-concave densities are the only densities satisfying such an inequality. In this work the authors consider the analogous reversal of recent Rényi Entropy Power Inequalities for random vectors, and show that not only do these reversals hold for s-concave densities, but that s-concave densities are characterized by satisfying them.
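For context, a brief sketch of the inequalities the abstract refers to (standard statements from the literature; the notation here is an assumption, not taken from the paper itself):

```latex
% Classical Entropy Power Inequality (Shannon), for independent
% random vectors X, Y in R^n with differential entropy h:
e^{2h(X+Y)/n} \;\ge\; e^{2h(X)/n} + e^{2h(Y)/n}.

% Cover--Zhang reversal: if X and Y are possibly dependent but share
% the same log-concave density, then
h(X+Y) \;\le\; h(2X) \;=\; h(X) + \log 2,
% and log-concavity of the common density is necessary for this bound.
```

The present work studies analogous reverse inequalities with the Shannon entropy h replaced by Rényi entropies, where the characterizing class broadens from log-concave to s-concave densities.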

Original language: English (US)
Title of host publication: 2018 IEEE International Symposium on Information Theory, ISIT 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1969-1972
Number of pages: 4
ISBN (Print): 9781538647806
DOIs
State: Published - Aug 15 2018
Event: 2018 IEEE International Symposium on Information Theory, ISIT 2018 - Vail, United States
Duration: Jun 17 2018 - Jun 22 2018

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2018-June
ISSN (Print): 2157-8095

Other

Other: 2018 IEEE International Symposium on Information Theory, ISIT 2018
Country: United States
City: Vail
Period: 6/17/18 - 6/22/18

Keywords

  • Convex measures
  • Rényi entropy
  • Reverse entropy power inequality

