
Entropy Estimators for Markovian Sequences: A Comparative Analysis.

Authors :
De Gregorio, Juan
Sánchez, David
Toral, Raúl
Source :
Entropy; Jan2024, Vol. 26 Issue 1, p79, 26p
Publication Year :
2024

Abstract

Entropy estimation is a fundamental problem in information theory with applications in various fields, including physics, biology, and computer science. Estimating the entropy of discrete sequences can be challenging due to limited data and the lack of unbiased estimators. Most existing entropy estimators are designed for sequences of independent events, and their performance varies depending on the system being studied and the available data size. In this work, we compare different entropy estimators and their performance when applied to Markovian sequences. Specifically, we analyze both binary Markovian sequences and Markovian systems in the undersampled regime. We calculate the bias, standard deviation, and mean squared error for some of the most widely employed estimators. We discuss the limitations of entropy estimation as a function of the transition probabilities of the Markov processes and the sample size. Overall, this paper provides a comprehensive comparison of entropy estimators and their performance in estimating entropy for systems with memory, which can be useful for researchers and practitioners in various fields. [ABSTRACT FROM AUTHOR]
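To illustrate the setting the abstract describes, the following is a minimal sketch (not the authors' code) of the simplest approach compared in such studies: the plug-in (maximum-likelihood) estimator of the entropy rate of a binary Markov chain, computed from observed transition counts and compared against the exact entropy rate. The transition probabilities `p01`, `p10` and all function names are illustrative assumptions, not taken from the paper.

```python
import math
import random

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def simulate_markov(n, p01, p10, seed=0):
    """Sample a binary Markov chain with P(0->1)=p01 and P(1->0)=p10."""
    rng = random.Random(seed)
    x, seq = 0, []
    for _ in range(n):
        seq.append(x)
        if x == 0:
            x = 1 if rng.random() < p01 else 0
        else:
            x = 0 if rng.random() < p10 else 1
    return seq

def plugin_entropy_rate(seq):
    """Plug-in (maximum-likelihood) estimate of the entropy rate
    H(X_{t+1} | X_t), built from empirical transition frequencies."""
    counts = {0: [0, 0], 1: [0, 0]}
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    n = len(seq) - 1
    h = 0.0
    for a in (0, 1):
        total = sum(counts[a])
        if total == 0:
            continue  # state never visited
        h += (total / n) * binary_entropy(counts[a][1] / total)
    return h

# Exact entropy rate from the stationary distribution, for comparison:
# pi_0 = p10 / (p01 + p10), h = pi_0*H(p01) + pi_1*H(p10).
p01, p10 = 0.2, 0.4
pi0 = p10 / (p01 + p10)
true_h = pi0 * binary_entropy(p01) + (1 - pi0) * binary_entropy(p10)

est = plugin_entropy_rate(simulate_markov(10_000, p01, p10))
```

For long sequences the estimate converges to the true rate, but, as the abstract notes for the undersampled regime, with short sequences the plug-in estimator is systematically biased downward because rare transitions are under-represented in the counts.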

Details

Language :
English
ISSN :
1099-4300
Volume :
26
Issue :
1
Database :
Complementary Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession Number :
175047929
Full Text :
https://doi.org/10.3390/e26010079