A New Entropy Power Inequality for Integer-Valued Random Variables
- Source : ISIT
- Publication Year : 2014
- Publisher : Institute of Electrical and Electronics Engineers (IEEE), 2014.
Abstract
- The entropy power inequality (EPI) provides lower bounds on the differential entropy of the sum of two independent real-valued random variables in terms of the individual entropies. Versions of the EPI for discrete random variables have been obtained for special families of distributions, with the differential entropy replaced by the discrete entropy, but no universal inequality is known (beyond trivial ones). More recently, the sumset theory for the entropy function has provided a sharp inequality $H(X+X')-H(X)\geq 1/2 - o(1)$ when $X,X'$ are i.i.d. with high entropy. This paper provides the inequality $H(X+X')-H(X) \geq g(H(X))$, where $X,X'$ are arbitrary i.i.d. integer-valued random variables and where $g$ is a universal strictly positive function on $\mathbb{R}_+$ satisfying $g(0)=0$. Extensions to non-identically distributed random variables and to conditional entropies are also obtained.
- Comment: 10 pages, 1 figure
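The discrete entropy gain in the abstract can be checked numerically. The following sketch (not from the paper; distribution and helper names are illustrative) computes $H(X+X')-H(X)$ in bits for i.i.d. integer-valued $X, X'$ by discrete convolution, confirming a strictly positive gain for a simple example:

```python
from math import log2
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a finite distribution given as {value: prob}."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def sum_dist(p):
    """Distribution of X + X' for i.i.d. X, X' ~ p (discrete self-convolution)."""
    s = {}
    for (x, px), (y, py) in product(p.items(), p.items()):
        s[x + y] = s.get(x + y, 0.0) + px * py
    return s

# Illustrative example: X uniform on {0, 1}, so X + X' ~ Binomial(2, 1/2).
p = {0: 0.5, 1: 0.5}
gain = entropy(sum_dist(p)) - entropy(p)
print(gain)  # 0.5 bits: H(X+X') = 1.5, H(X) = 1.0
```

For this example the gain equals exactly $1/2$ bit, matching the high-entropy bound $H(X+X')-H(X)\geq 1/2-o(1)$ quoted in the abstract.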
- Subjects :
- FOS: Computer and information sciences
- Conditional entropy
- Discrete mathematics
- Computer Science - Information Theory
- Information Theory (cs.IT)
- Min entropy
- Library and Information Sciences
- Mrs. Gerber's Lemma
- Doubling constant
- Computer Science Applications
- Binary entropy function
- Combinatorics
- Entropy power inequality
- Differential entropy
- Conditional quantum entropy
- Maximum entropy probability distribution
- Entropy inequalities
- Transfer entropy
- Gibbs' inequality
- Shannon sumset theory
- Joint quantum entropy
- Information Systems
- Mathematics
Details
- ISSN :
- 1557-9654 and 0018-9448
- Volume :
- 60
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Information Theory
- Accession number :
- edsair.doi.dedup.....032c6339f041ccd05384fd9168ead2d1
- Full Text :
- https://doi.org/10.1109/tit.2014.2317181