
Application of offset estimator of differential entropy and mutual information with multivariate data

Authors :
Iván Marín-Franch
Martín Sanz-Sabater
David H. Foster
Emanuele Frontoni
Source :
Experimental Results, Vol 3 (2022)
Publication Year :
2022
Publisher :
Cambridge University Press

Abstract

Numerical estimators of differential entropy and mutual information can be slow to converge as sample size increases. The offset Kozachenko–Leonenko (KLo) method described here is an offset version of the Kozachenko–Leonenko estimator that can markedly improve convergence. Its use is illustrated in applications to the comparison of trivariate data from successive scene color images and the comparison of univariate data from stereophonic music tracks. Publicly available code for KLo estimation of both differential entropy and mutual information is provided for the R, Python, and MATLAB computing environments at https://github.com/imarinfr/klo.
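For context, the classic (non-offset) Kozachenko–Leonenko estimator that the KLo method builds on uses k-nearest-neighbor distances: Ĥ = ψ(N) − ψ(k) + log V_d + (d/N) Σᵢ log εᵢ, where εᵢ is the distance from sample i to its k-th nearest neighbor and V_d is the volume of the d-dimensional unit ball. The sketch below is an illustration of this standard estimator only, not the authors' offset correction; see the linked repository for the actual KLo implementation. The function names `kl_entropy` and `kl_mi` are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Classic Kozachenko-Leonenko k-NN estimate of differential
    entropy, in nats, for an (N, d) array of samples. This is the
    baseline estimator, NOT the offset (KLo) version."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbor, excluding the point itself
    # (column 0 of the query result is the zero self-distance).
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # Log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1).
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))

def kl_mi(x, y, k=3):
    """Plug-in mutual information estimate I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return (kl_entropy(x, k) + kl_entropy(y, k)
            - kl_entropy(np.hstack([x, y]), k))

rng = np.random.default_rng(0)
a = rng.standard_normal((5000, 1))
b = rng.standard_normal((5000, 1))
h = kl_entropy(a)    # true value for N(0,1): 0.5*log(2*pi*e) ~ 1.419 nats
mi = kl_mi(a, b)     # a and b are independent, so the true MI is 0
```

Slow convergence of this baseline estimator as N grows, particularly for the plug-in mutual-information form, is the motivation for the offset correction the article introduces.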

Details

Language :
English
ISSN :
2516-712X
Volume :
3
Database :
Directory of Open Access Journals
Journal :
Experimental Results
Publication Type :
Academic Journal
Accession number :
edsdoj.13f6ac7bd60348fc97ba24c25203d66a
Document Type :
article
Full Text :
https://doi.org/10.1017/exp.2022.14