
A New Entropy Power Inequality for Integer-Valued Random Variables

Authors :
Saeid Haghighatshoar
Emmanuel Abbe
Emre Telatar
Source :
ISIT
Publication Year :
2014
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2014.

Abstract

The entropy power inequality (EPI) provides lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions, with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function provides a sharp inequality $H(X+X')-H(X)\geq 1/2 - o(1)$ when $X,X'$ are i.i.d. with high entropy. This paper provides the inequality $H(X+X')-H(X) \geq g(H(X))$, where $X,X'$ are arbitrary i.i.d. integer-valued random variables and where $g$ is a universal strictly positive function on $\mathbb{R}_+$ satisfying $g(0)=0$. Extensions to non-identically distributed random variables and to conditional entropies are also obtained.

Comment: 10 pages, 1 figure
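As a minimal numerical illustration (not part of the paper, and using an arbitrarily chosen distribution), the sketch below computes the entropy gap $H(X+X')-H(X)$ for an i.i.d. pair of integer-valued random variables and checks that it is strictly positive, consistent with the stated inequality. The function $g$ itself is not computed here.

```python
# Minimal sketch: check that H(X + X') - H(X) > 0 for an i.i.d. integer pair.
# The pmf below is an illustrative assumption, not taken from the paper.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Arbitrary pmf on {0, 1, 2, 3} (hypothetical example).
p = np.array([0.4, 0.3, 0.2, 0.1])

# Distribution of X + X' for independent copies: convolution of p with itself.
p_sum = np.convolve(p, p)

gap = entropy(p_sum) - entropy(p)
print(f"H(X)        = {entropy(p):.4f} bits")
print(f"H(X + X')   = {entropy(p_sum):.4f} bits")
print(f"entropy gap = {gap:.4f} bits (strictly positive)")
```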

Details

ISSN :
1557-9654 and 0018-9448
Volume :
60
Database :
OpenAIRE
Journal :
IEEE Transactions on Information Theory
Accession number :
edsair.doi.dedup.....032c6339f041ccd05384fd9168ead2d1
Full Text :
https://doi.org/10.1109/tit.2014.2317181