
On Rényi Entropy Power Inequalities.

Authors :
Ram, Eshed
Sason, Igal
Source :
IEEE Transactions on Information Theory. Dec 2016, Vol. 62 Issue 12, p6800-6815. 16p.
Publication Year :
2016

Abstract

This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^{n} X_k$ of $n$ independent continuous random vectors taking values in $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n$, $\alpha$, and $d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors. For $\alpha = 1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and on results on the rank-one modification of a real-valued diagonal matrix. [ABSTRACT FROM PUBLISHER]
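For orientation, the LaTeX sketch below spells out the general shape of an R-EPI as described in the abstract, using the standard definitions of the order-$\alpha$ Rényi entropy and entropy power; the explicit constant $c(n, \alpha, d)$ is the paper's contribution and is not reproduced here.

% Standard definitions (assumed, not quoted from the paper):
% order-alpha Rényi entropy and Rényi entropy power of a random vector X on R^d
\[
  h_\alpha(X) = \frac{1}{1-\alpha} \, \log \int_{\mathbb{R}^d} f_X^{\alpha}(x) \, \mathrm{d}x,
  \qquad
  N_\alpha(X) = \exp\!\Big( \tfrac{2}{d} \, h_\alpha(X) \Big).
\]
% An R-EPI of the type described in the abstract lower-bounds the entropy power
% of the sum by the sum of the individual entropy powers, up to a constant:
\[
  N_\alpha\!\Big( \sum_{k=1}^{n} X_k \Big) \;\ge\; c(n, \alpha, d) \sum_{k=1}^{n} N_\alpha(X_k),
  \qquad X_1, \ldots, X_n \ \text{independent}.
\]
% For alpha = 1 (differential entropy), Shannon's classical EPI holds with c = 1.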

Details

Language :
English
ISSN :
00189448
Volume :
62
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
119616368
Full Text :
https://doi.org/10.1109/TIT.2016.2616135