
Bounds on the Entropy of a Function of a Random Variable and Their Applications

Authors :
Ferdinando Cicalese
Luisa Gargano
Ugo Vaccaro
Source :
IEEE Transactions on Information Theory. 64:2220-2230
Publication Year :
2018
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2018.

Abstract

It is well known that the entropy $H(X)$ of a discrete random variable $X$ is always greater than or equal to the entropy $H(f(X))$ of a function $f$ of $X$, with equality if and only if $f$ is one-to-one. In this paper, we give tight bounds on $H(f(X))$ when the function $f$ is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step towards our main result, we derive a lower bound on the entropy of a probability distribution when only a bound on the ratio between the maximal and minimal probabilities is known. The lower bound improves on previous results in the literature, and it could find applications outside the present scenario.

Comment: Accepted for publication in IEEE Transactions on Information Theory
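To make the abstract's opening claim concrete, here is a minimal numerical sketch (the distribution p_X and the merging map f below are illustrative assumptions, not taken from the paper): a non-injective f collapses distinct outcomes of X, and the entropy of f(X) drops accordingly.

```python
import math
from collections import defaultdict

def entropy(p):
    """Shannon entropy (in bits) of a probability vector p."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Illustrative distribution for X over {0, 1, 2, 3} (assumed for the example).
p_X = [0.4, 0.3, 0.2, 0.1]

# A non-injective f: it merges outcomes 2 and 3, so f(X) takes values
# in {0, 1, 2} and the distribution of f(X) sums the merged masses.
f = {0: 0, 1: 1, 2: 2, 3: 2}
p_fX = defaultdict(float)
for x, px in enumerate(p_X):
    p_fX[f[x]] += px

print(f"H(X)    = {entropy(p_X):.4f} bits")            # 1.8464
print(f"H(f(X)) = {entropy(list(p_fX.values())):.4f}")  # 1.5710
# H(f(X)) <= H(X), with equality only when f is one-to-one.
```

Running this gives H(X) ≈ 1.8464 bits and H(f(X)) ≈ 1.5710 bits; equality would require f to be one-to-one. The paper's contribution is to bound how large the gap can be when f is not one-to-one.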

Details

ISSN :
1557-9654 and 0018-9448
Volume :
64
Database :
OpenAIRE
Journal :
IEEE Transactions on Information Theory
Accession number :
edsair.doi.dedup.....a7074dbd8d47b01429ed97114b41311f