Information content of signals using correlation function expansions of the entropy
- Source: Physical Review E 56:4052-4067
- Publication Year: 1997
- Publisher: American Physical Society (APS), 1997
Abstract
- Formally exact series expressions are derived for the entropy (information content) of a time series or signal by making systematic expansions for the higher-order correlation functions using generalized Kirkwood and Markov superpositions. Termination of the series after two or three terms provides tractable and accurate approximations for calculating the entropy. Signals generated by a Gaussian random process are simulated using Lorentzian and Gaussian spectral densities (exponential and Gaussian covariance functions) and the entropy is calculated as a function of the correlation length. The validity of the truncated Kirkwood expansion is restricted to weakly correlated signals, whereas the truncated Markov expansion is uniformly accurate; the leading two terms yield the entropy exactly in the limits of both weak and strong correlations. The concept of entropy for a continuous signal is explored in detail and it is shown that it depends upon the level of digitization and the frequency of sampling. The limiting forms are analyzed for a continuous signal with exponentially decaying covariance, for which explicit results can be obtained. Explicit results are also obtained for the binary discrete case that is isomorphic to the Ising spin lattice model.
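
The abstract itself contains no code, but the Gaussian/exponential-covariance case it describes is easy to illustrate. Below is a minimal Python sketch, not the authors' implementation: it simulates a stationary Gaussian signal with covariance C(t) = exp(-|t|/ξ) (equivalently, a Lorentzian spectral density), digitizes it onto a finite number of levels, and estimates the entropy per sample with a two-term Markov-style truncation, h ≈ H(s_n, s_{n+1}) − H(s_n). All function names, the grid spacing `dt`, the digitization parameters `n_levels` and `x_max`, and the correlation lengths swept are assumptions chosen for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_exponential_gp(n, dt, xi, rng):
    """Unit-variance stationary Gaussian process with covariance
    C(t) = exp(-|t|/xi), sampled at spacing dt.  For this covariance the
    exact discrete-time representation is an AR(1) recursion with
    coefficient a = exp(-dt/xi)."""
    a = np.exp(-dt / xi)
    x = np.empty(n)
    x[0] = rng.standard_normal()                      # stationary start
    noise = np.sqrt(1.0 - a * a) * rng.standard_normal(n - 1)
    for i in range(1, n):
        x[i] = a * x[i - 1] + noise[i - 1]
    return x

def digitize(x, n_levels, x_max=4.0):
    """Quantize onto n_levels equal bins spanning [-x_max, x_max];
    this models the finite digitization level discussed in the abstract."""
    edges = np.linspace(-x_max, x_max, n_levels + 1)
    return np.clip(np.digitize(x, edges) - 1, 0, n_levels - 1)

def entropy_rate_markov(s, n_levels):
    """Two-term (first-order Markov) estimate of entropy per sample:
    h ~ H(pair) - H(single), the conditional entropy H(s_{n+1} | s_n)."""
    def H(counts):
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log(p))                 # entropy in nats
    single = np.bincount(s, minlength=n_levels)
    pair = np.bincount(s[:-1] * n_levels + s[1:], minlength=n_levels**2)
    return H(pair) - H(single)

n, dt, n_levels = 200_000, 1.0, 8
for xi in (0.5, 2.0, 8.0, 32.0):
    s = digitize(simulate_exponential_gp(n, dt, xi, rng), n_levels)
    print(f"xi = {xi:5.1f}   h per sample = {entropy_rate_markov(s, n_levels):.3f}")
```

Running this shows the entropy per sample falling as the correlation length ξ grows, consistent with the abstract's point that strongly correlated signals carry less information per sample and that the entropy depends on the digitization level and sampling rate. The pair-minus-single conditional entropy used here is a discrete analogue of truncating the paper's Markov expansion after its leading two terms.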
Details
- ISSN: 1095-3787 and 1063-651X
- Volume: 56
- Database: OpenAIRE
- Journal: Physical Review E
- Accession number: edsair.doi...........cef04d1e8be9d4c766fd1dbc143e5e4a