Error Bounds on a Mixed Entropy Inequality

James Melbourne, Saurav Talukdar, Shreyas Bhaban, Murti V. Salapaka

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Scopus citations

Abstract

Motivated by entropy computations relevant to evaluating the decrease in entropy in bit-reset operations, we investigate the deficit in an entropic inequality involving two independent random variables, one continuous and the other discrete. In the case where the continuous random variable is Gaussian, we derive strong quantitative bounds on the deficit in the inequality. More explicitly, we show that the deficit decays sub-Gaussianly with respect to the reciprocal of the standard deviation of the Gaussian variable. Moreover, up to rational terms, these results are shown to be sharp.
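The inequality in question can be illustrated numerically. For independent discrete X and continuous Z, the density of X + Z is a mixture, and concavity of differential entropy gives h(X+Z) ≤ H(X) + h(Z); the paper bounds the deficit in this inequality. The sketch below (our own illustration, not code from the paper; all function and parameter names are ours) estimates the deficit for a Bernoulli X and Gaussian Z by numerical integration, showing it shrink as the standard deviation σ decreases, consistent with the sub-Gaussian decay in 1/σ described in the abstract:

```python
import numpy as np

def mixture_deficit(p, sigma, half_width=12.0, n=200001):
    """Estimate the deficit H(X) + h(Z) - h(X+Z), in nats, for
    X ~ Bernoulli(p) on {0, 1} and Z ~ N(0, sigma^2) independent.
    Illustrative sketch only; grid and names are our assumptions."""
    # Grid covering both atoms (0 and 1) plus many standard deviations
    y = np.linspace(-half_width, 1.0 + half_width, n)
    dy = y[1] - y[0]
    # Gaussian density with standard deviation sigma
    g = lambda t: np.exp(-t**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    # Density of X + Z: a two-component Gaussian mixture
    f = (1 - p) * g(y) + p * g(y - 1.0)
    mask = f > 0
    h_sum = -np.sum(f[mask] * np.log(f[mask])) * dy     # h(X+Z)
    H_X = -(p * np.log(p) + (1 - p) * np.log(1 - p))    # H(X)
    h_Z = 0.5 * np.log(2 * np.pi * np.e * sigma**2)     # h(Z)
    return H_X + h_Z - h_sum

# Well-separated atoms (small sigma): deficit is nearly zero.
# Heavily overlapping atoms (sigma = 1): deficit is substantial.
d_small = mixture_deficit(0.5, 0.1)
d_large = mixture_deficit(0.5, 1.0)
```

Here the deficit is trapped between 0 and H(X) = ln 2; the quantitative question addressed in the paper is how fast it vanishes as σ → 0.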

Original language: English (US)
Title of host publication: 2018 IEEE International Symposium on Information Theory, ISIT 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1973-1977
Number of pages: 5
ISBN (Print): 9781538647806
DOIs
State: Published - Aug 15 2018
Event: 2018 IEEE International Symposium on Information Theory, ISIT 2018 - Vail, United States
Duration: Jun 17 2018 - Jun 22 2018

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
Volume: 2018-June
ISSN (Print): 2157-8095

Other

Other: 2018 IEEE International Symposium on Information Theory, ISIT 2018
Country/Territory: United States
City: Vail
Period: 6/17/18 - 6/22/18

Bibliographical note

Publisher Copyright:
© 2018 IEEE.
