
Mismatched Estimation and Relative Entropy.

Authors :
Verdú, Sergio
Source :
IEEE Transactions on Information Theory; Aug2010, Vol. 56 Issue 8, p3712-3720, 9p
Publication Year :
2010

Abstract

A random variable with distribution P is observed in Gaussian noise and is estimated by a mismatched minimum mean-square estimator that assumes the distribution is Q instead of P. This paper shows that the integral over all signal-to-noise ratios (SNRs) of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P||Q) (in nats). This representation of relative entropy can be generalized to non-real-valued random variables, and can be particularized to give new general representations of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy which fills a gap in, and is consistent with, the literature on free probability. [ABSTRACT FROM AUTHOR]
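The identity stated in the abstract can be sanity-checked numerically in the scalar Gaussian case, where everything is in closed form. The sketch below (an illustration of the stated result, not code from the paper) assumes the true prior is P = N(0, p), the mismatched prior is Q = N(0, q), and the observation is Y = √snr·X + N with N ~ N(0, 1); under Q the conditional-mean estimator is linear, and 2·D(P||Q) = p/q − 1 − ln(p/q) nats:

```python
import math

def excess_mse(snr, p, q):
    """Excess MSE of the estimator that assumes X ~ N(0, q) when in fact
    X ~ N(0, p), given the observation Y = sqrt(snr)*X + N(0, 1)."""
    a = math.sqrt(snr) * q / (1.0 + snr * q)               # mismatched linear coefficient
    mse_q = (1.0 - a * math.sqrt(snr)) ** 2 * p + a ** 2   # MSE of mismatched estimator under P
    mse_p = p / (1.0 + snr * p)                            # matched (true) MMSE
    return mse_q - mse_p

def integral_excess_mse(p, q, lo=1e-8, hi=1e6, n=100_000):
    """Trapezoidal integration of the excess MSE over a log-spaced SNR grid
    (the integrand vanishes at snr = 0 and decays like 1/snr^2, so the
    truncated range captures essentially all of the integral)."""
    grid = [lo * (hi / lo) ** (k / n) for k in range(n + 1)]
    vals = [excess_mse(s, p, q) for s in grid]
    return sum(0.5 * (v0 + v1) * (s1 - s0)
               for s0, s1, v0, v1 in zip(grid, grid[1:], vals, vals[1:]))

def two_relative_entropy(p, q):
    """2 * D(N(0, p) || N(0, q)) in nats, closed form."""
    return p / q - 1.0 - math.log(p / q)

p, q = 1.0, 2.0
print(integral_excess_mse(p, q))   # numerical integral of the excess MSE
print(two_relative_entropy(p, q))  # matches: 2 D(P||Q)
```

The two printed values agree to several decimal places, consistent with the paper's result that the mismatch penalty, integrated over all SNRs, measures exactly the relative entropy between the true and assumed priors.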

Details

Language :
English
ISSN :
0018-9448
Volume :
56
Issue :
8
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
52654409
Full Text :
https://doi.org/10.1109/TIT.2010.2050800