
Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage

Authors :
Laura Sacerdote
Roberta Sirovich
Maria Teresa Giraudo
Source :
Entropy, Vol. 15, Iss. 12, pp. 5154-5177 (2013)
Publication Year :
2013

Abstract

A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First of all, an equation that links the mutual information to the entropy of a suitable random vector with uniformly distributed components is deduced. When d = 2, this equation reduces to the well-known connection between the mutual information and the entropy of the copula function associated with the original random variables. Hence, the problem of estimating the mutual information of the original random vector is reduced to the estimation of the entropy of a random vector obtained through a multidimensional transformation. The proposed estimator is a two-step method: first estimate the transformation and obtain the transformed sample, then estimate its entropy. The properties of the new estimator are discussed through simulation examples, and its performance is compared with that of the best estimators in the literature. The precision of the estimator converges to values of the same order of magnitude as that of the best estimator tested. However, the new estimator remains unbiased even for larger dimensions and smaller sample sizes, whereas the other tested estimators show a bias in these cases.
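
The two-step idea can be sketched as follows for the d = 2 case, where the abstract's identity reduces to MI = -(entropy of the copula sample). This is an illustrative approximation only, not the authors' exact estimator: the margins are mapped to (0, 1) with an empirical rank transform, and the entropy of the transformed sample is then estimated with a standard Kozachenko-Leonenko k-nearest-neighbour estimator. The function names and the choice k = 3 are our own assumptions.

    # Minimal sketch (assumptions noted above), not the estimator of the paper.
    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma

    def rank_transform(x):
        """Map each column to (0, 1) via its empirical CDF (ranks / (n + 1))."""
        n, d = x.shape
        u = np.empty_like(x, dtype=float)
        for j in range(d):
            u[:, j] = (np.argsort(np.argsort(x[:, j])) + 1) / (n + 1)
        return u

    def kl_entropy(x, k=3):
        """Kozachenko-Leonenko k-NN differential entropy estimate (Chebyshev norm)."""
        n, d = x.shape
        tree = cKDTree(x)
        # distance from each point to its k-th nearest neighbour (self excluded)
        r = tree.query(x, k=k + 1, p=np.inf)[0][:, -1]
        return digamma(n) - digamma(k) + d * np.log(2) + d * np.mean(np.log(r))

    def mutual_information(x, k=3):
        """Two-step estimate: transform to uniform margins, then MI = -entropy."""
        return -kl_entropy(rank_transform(x), k=k)

    # Example: bivariate Gaussian with correlation rho; true MI = -0.5 * log(1 - rho^2)
    rng = np.random.default_rng(0)
    rho = 0.8
    sample = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000)
    print(mutual_information(sample), -0.5 * np.log(1 - rho**2))

For d > 2 the paper works with the linkage (a transformation built from conditional distributions) rather than the marginal CDFs alone, so the rank transform above should be read only as the bivariate special case.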

Details

Language :
English
Database :
OpenAIRE
Journal :
Entropy, Vol. 15, Iss. 12, pp. 5154-5177 (2013)
Accession number :
edsair.doi.dedup.....85220643f402a6c3246e424e5cc61267