MMSE Bounds for Additive Noise Channels Under Kullback–Leibler Divergence Constraints on the Input Distribution
- Author
- Dytso, Alex, Faus, Michael, Zoubir, Abdelhak M., and Poor, H. Vincent
- Subjects
- INFORMATION measurement, ESTIMATION theory, INFORMATION theory, COVARIANCE matrices, NOISE, GAUSSIAN distribution
- Abstract
Upper and lower bounds on the minimum mean square error for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback–Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is defined implicitly via a system of non-linear equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound provides an alternative to well-known inequalities in estimation and information theory—such as the Cramér–Rao lower bound, Stam's inequality, or the entropy power inequality—that is potentially tighter and defined for a larger class of input distributions. Several examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice. [ABSTRACT FROM AUTHOR]
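As context for the abstract's claim that the upper bound is attained by a Gaussian input, the following sketch checks the classical closed-form MMSE for a scalar additive Gaussian noise channel with Gaussian input, which is the baseline such bounds are measured against. The variances are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar additive noise channel Y = X + N with Gaussian X and N.
# Variances below are illustrative choices, not taken from the paper.
sigma_x2, sigma_n2 = 2.0, 1.0

# For a Gaussian input, the conditional mean estimator is linear and
# the MMSE has the closed form sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2).
mmse_closed = sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)

# Monte Carlo check: E[(X - E[X|Y])^2] with the linear MMSE estimator
# E[X|Y] = sigma_x2 / (sigma_x2 + sigma_n2) * Y.
n = 1_000_000
x = rng.normal(0.0, np.sqrt(sigma_x2), n)
y = x + rng.normal(0.0, np.sqrt(sigma_n2), n)
x_hat = sigma_x2 / (sigma_x2 + sigma_n2) * y
mmse_mc = np.mean((x - x_hat) ** 2)

print(mmse_closed)  # 2/3 for these variances
print(mmse_mc)      # Monte Carlo estimate, close to 2/3
```

The paper's upper bound generalizes this Gaussian baseline: when the input is only constrained to lie within a KL-divergence ball around a Gaussian reference, the worst-case covariance is characterized implicitly rather than in closed form.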
- Published
- 2019