
Minimum Rates of Approximate Sufficient Statistics.

Authors :
Hayashi, Masahito
Tan, Vincent Y. F.
Source :
IEEE Transactions on Information Theory; Feb 2018, Vol. 64, Issue 2, p875-888, 14p
Publication Year :
2018

Abstract

Given a sufficient statistic for a parametric family of distributions, one can estimate the parameter without access to the data. However, the memory or code size for storing the sufficient statistic may nonetheless still be prohibitive. Indeed, for $n$ independent samples drawn from a $k$-nomial distribution with $d = k-1$ degrees of freedom, the length of the code scales as $d\log n + O(1)$. We consider errors measured according to the relative entropy and variational distance criteria. For the code constructions, we leverage Rissanen's minimum description length principle, which yields a non-vanishing error measured according to the relative entropy. For the converse parts, we use Clarke and Barron's formula for the relative entropy of a parameterized distribution and the corresponding mixture distribution. However, this method only yields a weak converse for the variational distance. We develop new techniques to achieve vanishing errors, and we also prove strong converses. The latter means that even if the code is allowed to have a non-vanishing error, its length must still be at least $(d/2)\log n$.
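As a rough illustration of the $d\log n + O(1)$ scaling (a sketch, not the authors' construction): the sufficient statistic for $n$ i.i.d. samples from a $k$-nomial distribution is the vector of empirical counts, and the number of possible count vectors (types) is $\binom{n+k-1}{k-1} = O(n^d)$, so an enumerative code for the type needs roughly $d\log n$ bits. The function names below are hypothetical.

```python
import math

def num_types(n: int, k: int) -> int:
    # Number of possible empirical count vectors (types) of n samples
    # over a k-symbol alphabet: C(n + k - 1, k - 1), which grows as n^(k-1).
    return math.comb(n + k - 1, k - 1)

def code_length_bits(n: int, k: int) -> int:
    # Bits needed to index a type by plain enumeration.
    return math.ceil(math.log2(num_types(n, k)))

n, k = 10_000, 4   # d = k - 1 = 3 degrees of freedom
d = k - 1
print(code_length_bits(n, k))      # on the order of d * log2(n) ≈ 3 * 13.3 ≈ 40
```

This naive enumeration stores the exact sufficient statistic; the paper's point is that an *approximate* sufficient statistic can get by with about half that rate, $(d/2)\log n$, and no fewer bits suffice even with non-vanishing error.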

Details

Language :
English
ISSN :
0018-9448
Volume :
64
Issue :
2
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
127408993
Full Text :
https://doi.org/10.1109/TIT.2017.2775612