Mixture distributions are extensively used as a modeling tool in diverse areas from machine learning to communications engineering to physics, and obtaining bounds on the entropy of mixture distributions is of fundamental importance in many of these applications. This article provides sharp bounds on the entropy concavity deficit, which is the difference between the differential entropy of the mixture and the weighted sum of differential entropies of the constituent components. Toward establishing lower and upper bounds on the concavity deficit, results that are of importance in their own right are obtained. In order to obtain nontrivial upper bounds, properties of the skew divergence are developed and notions of 'skew' $f$-divergences are introduced; a reverse Pinsker inequality and a bound on the Jensen-Shannon divergence are obtained along the way. Complementary lower bounds are derived, with special attention paid to the case that corresponds to independent summation of a continuous and a discrete random variable. Several applications of the bounds are delineated, including applications to the mutual information of additive noise channels, the thermodynamics of computation, and functional inequalities.
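In symbols, the concavity deficit described above can be written as follows (notation here is illustrative and may differ from the article's; $h$ denotes differential entropy, the densities $p_i$ are the mixture components, and the weights $w_i \geq 0$ sum to one):

```latex
\mathrm{Deficit} \;=\; h\!\left(\sum_{i} w_i\, p_i\right) \;-\; \sum_{i} w_i\, h(p_i) \;\geq\; 0,
```

where the nonnegativity reflects the concavity of differential entropy; the article's bounds control how large this gap can be.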
Bibliographical note

Funding Information:
This work was supported by the National Science Foundation under Grant CMMI-1462862, Grant CNS-1544721, and Grant DMS-1248100. The work of Mokshay Madiman was supported by the National Science Foundation under Grant DMS-1409504. The work of Murti V. Salapaka was supported by the National Science Foundation under Grant CMMI-1462862, Grant CNS-1544721, and Grant ECCS-1809194 (Energy Efficiency in Computing Logical Operations: Fundamental Limits With and Without Feedback).
- Mixture distributions
- Differential entropy