
Analysis of KNN Information Estimators for Smooth Distributions.

Authors:
Zhao, Puning
Lai, Lifeng
Source:
IEEE Transactions on Information Theory. Jun 2020, Vol. 66, Issue 6, p3798-3826. 29p.
Publication Year:
2020

Abstract

The KSG mutual information estimator, which is based on the distance of each sample to its $k$-th nearest neighbor, is widely used to estimate the mutual information between two continuous random variables. Existing work has analyzed the convergence rate of this estimator for random variables whose densities are bounded away from zero on their support. In practice, however, the KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support and densities bounded away from zero, but also those with bounded support and densities approaching zero, and those with unbounded support. In this paper, we analyze the convergence rate of the error of the KSG estimator for smooth distributions, whose density support can be either bounded or unbounded. As the KSG mutual information estimator can be viewed as an adaptive recombination of KL entropy estimators, our analysis also provides a convergence analysis of the KL entropy estimator for a broad class of distributions. [ABSTRACT FROM AUTHOR]
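For orientation, the abstract refers to the standard k-nearest-neighbor constructions: the Kozachenko-Leonenko (KL) entropy estimator and the KSG (Kraskov-Stögbauer-Grassberger) mutual information estimator, which combines KL-type estimates with adaptive neighbor counts in the marginal spaces. The following is a minimal Python sketch of both in their commonly used max-norm form; it is not code from the paper, and the function names, the default k=5, the choice of max-norm, and the small tie-breaking tolerance are assumptions made for illustration.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(x, k=5):
    # Kozachenko-Leonenko differential entropy estimate (in nats) using
    # max-norm k-NN distances; the max-norm unit ball has volume 2**d.
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    eps = cKDTree(x).query(x, k=k + 1, p=np.inf)[0][:, -1]  # k-th NN distance
    return -digamma(k) + digamma(n) + d * np.log(2) + d * np.mean(np.log(eps))

def ksg_mutual_information(x, y, k=5):
    # KSG estimate of I(X;Y): the k-th NN distance is measured in the joint
    # space, then neighbors within that radius are counted in each marginal.
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    x_tree, y_tree = cKDTree(x), cKDTree(y)
    # Count points strictly inside the radius (tolerance excludes boundary ties)
    # and subtract 1 to exclude the sample itself.
    nx = np.array([len(x_tree.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(y_tree.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

As a quick sanity check under these assumptions, kl_entropy on standard Gaussian samples should approach 0.5*log(2*pi*e) (about 1.42 nats), and ksg_mutual_information on independent samples should return values near zero.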

Details

Language:
English
ISSN:
0018-9448
Volume:
66
Issue:
6
Database:
Academic Search Index
Journal:
IEEE Transactions on Information Theory
Publication Type:
Academic Journal
Accession Number:
143457055
Full Text:
https://doi.org/10.1109/TIT.2019.2945041