101. Delay independence of mutual-information rate of two symbolic sequences
- Author
Annick Lesne, Jean-Luc Blanc, and Laurent Pezard
- Subjects
Stochastic Processes, Time Factors, Theoretical computer science, Finite-state machine, Transmission delay, Mutual information, Models, Theoretical, Markov Chains, Variable (computer science), Transmission (telecommunications), Algorithm, Independence (probability theory), Block (data storage), Mathematics, Data compression
- Abstract
Introduced by Shannon as a "rate of actual transmission," the mutual-information rate (MIR) extends mutual information to a pair of dynamical processes. We prove a delay-independence theorem, according to which the MIR is not sensitive to a time shift between the two processes. Numerical studies of several benchmark situations confirm that this asymptotic theoretical property remains valid for realistic finite sequences. Estimates based on block entropies and on a causal-state-machine algorithm outperform an estimate based on a Lempel-Ziv compression algorithm, provided that the block length and the maximum history length, respectively, can be chosen larger than the delay. The MIR is thus a relevant index for measuring nonlinear correlations between two experimental or simulated sequences when the transmission delay (in input-output devices) or dephasing (in coupled systems) is variable or unknown.
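The block-entropy approach mentioned in the abstract can be illustrated with a minimal sketch (not the authors' code; the names `block_entropy` and `mir_estimate` are assumptions). Each entropy rate is approximated by the conditional block entropy h(k) = H(k) - H(k-1), the joint process is read as a sequence of symbol pairs, and the MIR estimate is h_x + h_y - h_xy. For a delayed copy of an i.i.d. binary sequence, the estimate is expected to stay close to the undelayed value once the block length exceeds the delay, consistent with the delay-independence property:

```python
from collections import Counter
from math import log2
import random

def block_entropy(seq, k):
    """Shannon entropy (bits) of the overlapping length-k blocks of seq."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    n = len(blocks)
    return -sum(c / n * log2(c / n) for c in Counter(blocks).values())

def mir_estimate(x, y, k):
    """MIR estimate from conditional block entropies.

    Each entropy rate is approximated by h(k) = H(k) - H(k-1); the
    joint process is the sequence of symbol pairs (x_i, y_i), and
    MIR = h_x + h_y - h_xy.
    """
    def h(seq):
        return block_entropy(seq, k) - block_entropy(seq, k - 1)
    return h(x) + h(y) - h(list(zip(x, y)))

if __name__ == "__main__":
    random.seed(1)
    x = [random.randint(0, 1) for _ in range(5000)]  # i.i.d. fair bits
    d = 2
    y = x[-d:] + x[:-d]              # y: circularly delayed copy of x
    print(mir_estimate(x, x, k=3))   # ~1 bit/symbol (identical sequences)
    print(mir_estimate(x, y, k=3))   # ~1 bit/symbol as well, since k > d
```

With k = 3 > d = 2 the delayed and undelayed estimates nearly coincide; shrinking k below the delay would make the finite-size estimate drop, matching the abstract's caveat about choosing the block length larger than the delay.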
- Published
- 2011