21 results for "Entropy rate"
Search Results
2. Why Shape Coding? Asymptotic Analysis of the Entropy Rate for Digital Images
- Author
- Xin, Gangtao, Fan, Pingyi, and Ben Letaief, Khaled
- Abstract
This paper focuses on the ultimate limit theory of image compression. It proves that for an image source, there exists a coding method with shapes that can achieve the entropy rate under a certain condition where the shape-pixel ratio in the encoder/decoder is (Formula presented.). Based on the new finding, an image coding framework with shapes is proposed and proved to be asymptotically optimal for stationary and ergodic processes. Moreover, the condition (Formula presented.) of shape-pixel ratio in the encoder/decoder has been confirmed in the image database MNIST, which illustrates that soft compression with shape coding is a near-optimal scheme for lossless compression of images. © 2022 by the authors.
- Published
- 2023
3. Multiscale Entropy Approaches and Their Applications.
- Author
- Humeau-Heurtier, Anne
- Subjects
History of engineering & technology, Alzheimer disease, CPD, Cramér-Rao Lower Bound, EEG, Fisher ratio, HMSVM, HRV, Holter, ICEEMDAN, Multiscale Permutation Entropy, PD, RCMDE, RR interval, SVM, Voronoi decomposition, Wikipedia, aging, ambient temperature, approximate entropy, asynchrony, bearing fault diagnosis, biometric characterization, brain complexity, cardiac autonomic neuropathy, cardiac risk stratification, center of pressure, clock drawing test, complexity, composite cross-sample entropy, consolidation, copula density, corsi block tapping test, coupling, cross-approximate entropy, cross-conditional entropy, cross-distribution entropy, cross-entropy, cross-fuzzy entropy, cross-sample entropy, default mode network, dependency structures, diabetes, digit span test, dynamic functional connectivity, edge complexity, electrocardiogram, electroencephalogram, embodied media, ensemble empirical mode decomposition, entropy, entropy rate, episodic memory, estimator variance, eye movement events detection, fMRI, falls, fault diagnosis, financial time series, finite-length signals, fluid intelligence, functional near infra-red spectroscopy, fuzzy entropy, gait, heart rate variability, heart rate variability (HRV), heart sound, human behavior, humanoid, local robust principal component analysis, long term monitoring, magnetoencephalogram, medical information, memory effect, mental workload, missing values, motif, multi-component signal, multi-scale, multi-scale dispersion entropy, multi-scale entropy, multi-scale entropy (MSE), multi-scale permutation entropy, multifractal spectrum, multilevel entropy map, multiscale analysis, multiscale cross-entropy, multiscale distribution entropy, multiscale entropy, multiscale indices, multiscale time irreversibility, multivariate data, multivariate multiscale dispersion entropy, multivariate time series, network complexity, node complexity, nonlinear analysis time series analysis, nonlinear dynamics, ordinal patterns, page view, permutation entropy, physiological data, postural control, postural stability index, predictability, prefrontal cortex, preterm neonate, pulse interval, resting state, resting-state functional magnetic resonance imaging, sample entropy, short-term inter-beat interval, signal complexity, sleep staging, stability states, structural health monitoring, systolic arterial pressure (SAP), systolic blood pressure, tele-communication, telemetry, tensor decomposition, thermoregulation, time-scale decomposition, variational mode decomposition, vasopressin, vector autoregressive fractionally integrated (VARFI) models, weak fault
- Abstract
Summary: Multiscale entropy (MSE) measures to evaluate the complexity of time series by taking into account the multiple time scales in physical systems were proposed in the early 2000s. Since then, these approaches have received a great deal of attention and have been used in a wide range of applications. Multivariate approaches have also been developed. The algorithms for an MSE approach are composed of two main steps: (i) a coarse-graining procedure to represent the system's dynamics on different scales and (ii) the entropy computation for the original signal and for the coarse-grained time series to evaluate the irregularity for each scale. Moreover, different entropy measures have been associated with the coarse-graining approach, each one having its advantages and drawbacks. In this Special Issue, we gathered 24 papers focusing on either the theory or applications of MSE approaches. These papers can be divided into two groups: papers that propose new developments in entropy-based measures or improve the understanding of existing ones (9 papers) and papers that propose new applications of existing entropy-based measures (14 papers). Moreover, one paper presents a review of cross-entropy methods and their multiscale approaches.
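As a quick illustration of the two-step MSE procedure summarized above, here is a minimal Python sketch: coarse-grain the series at each scale, then compute sample entropy on each coarse-grained series. The parameter choices (m = 2, tolerance r = 0.2 times the standard deviation of the original series, scales 1-5) are common defaults used for illustration only and are not taken from the papers in the Special Issue.

```python
import numpy as np

def coarse_grain(x, scale):
    """Step (i): average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m, r):
    """Step (ii): sample entropy of a 1-D series (naive O(N^2) version)."""
    def matches(length):
        emb = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (d <= r).sum() - len(emb)           # ordered pairs, self-matches excluded
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, scales, m=2, r=None):
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r          # tolerance fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]

# illustration: white noise, whose MSE curve typically decreases with scale
rng = np.random.default_rng(0)
print(multiscale_entropy(rng.standard_normal(1000), scales=range(1, 6)))
```

Other entropy measures mentioned in the abstract (permutation, fuzzy, dispersion entropy) can be swapped in for sample_entropy without changing the coarse-graining step.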
4. Explicit Renyi Entropy for Hidden Markov Models
- Author
- Breitner, Joachim and Skorski, Maciej
- Published
- 2020
5. Applications of Information Dynamics to the Study of Nanopores
- Author
- Gilpin, Claire and Martens, Craig
- Abstract
Over the previous three decades both experimental and theoretical research into nanopores has been gaining momentum. It has been discovered that nanopores play an important role in controlling important molecular and cellular scale physiological processes. It has also been discovered that both synthetic and biotic nanopores may have groundbreaking potential for both biomedical devices and scientific research instruments. In particular, nanopores are currently being studied for their potentially cost effective application to DNA sequencing and protein, drug, and pathogen sensing. Additionally, research into the time-dependent electrical properties of nanopores may aid in our understanding and ability to model the behavior of physiological nanoscale membrane ion channels. Recent advances in information theory, particularly the development of time-dependent measures of Shannon entropies, have opened the door to studying these nanoscale systems from a new angle. In this work we will share results of the novel application of these techniques, highlighting their ability to track autonomous fluctuations in nanopore currents. We will also discuss a proposed extension of these techniques that may allow short-time scale prediction of current fluctuations in the future. Additionally, we will discuss a process for testing for potentially interesting nonlinear structure in nanopore interevent interval sequences, where the events are current fluctuations. Lastly, we will discuss some potential future research directions in light of what we have learned.
- Published
- 2018
6. Spatially Adaptive Analysis and Segmentation of Polarimetric SAR Data
- Author
- Wang, Wei
- Abstract
In recent years, Polarimetric Synthetic Aperture Radar (PolSAR) has been one of the most important instruments for earth observation, and is increasingly used in various remote sensing applications. Statistical modelling and scattering analysis are two main ways for PolSAR data interpretation, and have been intensively investigated in the past two decades. Moreover, spatial analysis was applied in the analysis of PolSAR data and found to be beneficial to achieve more accurate interpretation results. This thesis focuses on extracting typical spatial information, i.e., edges and regions, by exploring the statistical characteristics of PolSAR data. The existing spatial analysing methods are mainly based on the complex Wishart distribution, which well characterizes the inherent statistical features in homogeneous areas. However, the non-Gaussian models can give better representation of the PolSAR statistics, and therefore have the potential to improve the performance of spatial analysis, especially in heterogeneous areas. In addition, the traditional fixed-shape windows cannot accurately estimate the distribution parameter in some complicated areas, leading to the loss of the refined spatial details. Furthermore, many of the existing methods are not spatially adaptive, so that the obtained results are promising in some areas whereas unsatisfactory in other areas. Therefore, this thesis is dedicated to extracting spatial information by applying the non-Gaussian statistical models and spatially adaptive strategies. The specific objectives of the thesis include: (1) to develop a reliable edge detection method, (2) to develop a spatially adaptive superpixel generation method, and (3) to investigate a new framework of region-based segmentation. Automatic edge detection plays a fundamental role in spatial analysis, whereas the performance of classical PolSAR edge detection methods is limited by the fixed-shape windows. Paper 1 investigates an enhanced edge detection method using th
- Published
- 2017
7. Superpixel Segmentation of Polarimetric SAR Data Based on Integrated Distance Measure and Entropy Rate Method
- Author
- Wang, Wei, Xiang, Deliang, Ban, Yifang, Zhang, Jun, and Wan, Jianwei
- Abstract
This paper proposes to integrate two different distances to measure the dissimilarity between neighboring pixels in PolSAR images, and introduces the entropy rate method into PolSAR image superpixel segmentation. Since the Gaussian model is commonly used for homogeneous scenes and less suitable for heterogeneous scenes, we adopt the spherically invariant random vector (SIRV) model to describe the back-scattering characteristics in heterogeneous areas. Moreover, a directional span-driven adaptive (DSDA) region is proposed such that it contains independent and identically distributed samples only, thus it can obtain accurate estimation of the distribution parameters. Using the DSDA region, the Wishart distance and SIRV distance are calculated, and then combined together through a homogeneity measurement. Therefore, the integrated distance takes advantage of the SIRV model and the Gaussian model, and suits both homogeneous and heterogeneous areas. Finally, based on the integrated distance, the superpixel segments are generated using the entropy rate framework. The experimental results on ESAR and PiSAR L-band datasets show that the proposed method can generate homogeneity-adaptive segments, resulting in smooth representation of the land covers in homogeneous areas, and better preserved details in heterogeneous areas.
- Published
- 2017
- Full Text
- View/download PDF
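The abstract above combines a Wishart distance with a SIRV distance inside an entropy-rate superpixel framework. As a hedged illustration of only the first ingredient, the sketch below implements one commonly used form of the Wishart dissimilarity between polarimetric covariance matrices; it is not the paper's integrated distance, and the symmetrised variant and the toy matrices are illustrative assumptions.

```python
import numpy as np

def wishart_distance(Z, Sigma):
    """One common Wishart dissimilarity between a sample covariance Z and a
    region/class centre Sigma (both Hermitian positive definite, e.g. 3x3 for
    full-pol data): d(Z, Sigma) = ln|Sigma| + tr(Sigma^{-1} Z)."""
    _, logdet = np.linalg.slogdet(Sigma)
    return float(logdet + np.trace(np.linalg.solve(Sigma, Z)).real)

def symmetric_wishart_distance(Z1, Z2):
    """Symmetrised variant, usable when neither matrix is a designated centre."""
    d = 0.5 * (np.trace(np.linalg.solve(Z1, Z2)) + np.trace(np.linalg.solve(Z2, Z1)))
    return float(d.real - Z1.shape[0])

# toy example with random 3x3 Hermitian positive-definite matrices
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
Z1 = A @ A.conj().T + 3 * np.eye(3)
Z2 = B @ B.conj().T + 3 * np.eye(3)
print(wishart_distance(Z1, Z2), symmetric_wishart_distance(Z1, Z2))
```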
8. Leveraging Environmental Correlations: The Thermodynamics of Requisite Variety
- Author
- Boyd, AB, Mandal, D, and Crutchfield, JP
- Abstract
Key to biological success, the requisite variety that confronts an adaptive organism is the set of detectable, accessible, and controllable states in its environment. We analyze its role in the thermodynamic functioning of information ratchets—a form of autonomous Maxwellian Demon capable of exploiting fluctuations in an external information reservoir to harvest useful work from a thermal bath. This establishes a quantitative paradigm for understanding how adaptive agents leverage structured thermal environments for their own thermodynamic benefit. General ratchets behave as memoryful communication channels, interacting with their environment sequentially and storing results to an output. The bulk of thermal ratchets analyzed to date, however, assume memoryless environments that generate input signals without temporal correlations. Employing computational mechanics and a new information-processing Second Law of Thermodynamics (IPSL) we remove these restrictions, analyzing general finite-state ratchets interacting with structured environments that generate correlated input signals. On the one hand, we demonstrate that a ratchet need not have memory to exploit an uncorrelated environment. On the other, and more appropriate to biological adaptation, we show that a ratchet must have memory to most effectively leverage structure and correlation in its environment. The lesson is that to optimally harvest work a ratchet’s memory must reflect the input generator’s memory. Finally, we investigate achieving the IPSL bounds on the amount of work a ratchet can extract from its environment, discovering that finite-state, optimal ratchets are unable to reach these bounds. In contrast, we show that infinite-state ratchets can go well beyond these bounds by utilizing their own infinite “negentropy”. We conclude with an outline of the collective thermodynamics of information-ratchet swarms.
- Published
- 2017
9. An Entropy Estimate of Written Language and Twitter Language: A Comparison between English and Swedish
- Author
- Juhlin, Sanna
- Abstract
The purpose of this study is to estimate and compare the entropy and redundancy of written English and Swedish. We also investigate and compare the entropy and redundancy of Twitter language. This is done by extracting n consecutive characters, called n-grams, and calculating their frequencies. No precise values are obtained because the amount of text is finite, whereas the entropy is defined for text length tending towards infinity. However, we do obtain results for n = 1, ..., 6, which show that written Swedish has higher entropy than written English and that the redundancy is lower for Swedish. When comparing Twitter with the standard written languages, we find that for Twitter the entropy is higher and the redundancy is lower.
- Published
- 2017
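A minimal sketch of the n-gram estimation procedure described in the abstract: block entropies H(n) are computed from n-gram frequencies, and the differences F_n = H(n) - H(n-1) give conditional-entropy estimates that decrease towards the entropy rate, with redundancy measured against log2 of the alphabet size. The corpus file name is a placeholder, and this plug-in estimator shares the finite-text bias the abstract mentions.

```python
from collections import Counter
from math import log2

def block_entropy(text, n):
    """Entropy (bits) of the empirical n-gram distribution of a text."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    total = sum(grams.values())
    return -sum((c / total) * log2(c / total) for c in grams.values())

def conditional_entropies(text, max_n=6):
    """F_n = H(n) - H(n-1): entropy of the next character given the
    previous n-1 characters (F_1 is the single-character entropy)."""
    H = [0.0] + [block_entropy(text, n) for n in range(1, max_n + 1)]
    return [H[n] - H[n - 1] for n in range(1, max_n + 1)]

text = open("corpus.txt", encoding="utf-8").read().lower()   # placeholder corpus file
F = conditional_entropies(text, max_n=6)
alphabet_bits = log2(len(set(text)))
print("F_1..F_6 (bits/char):", [round(f, 3) for f in F])
print("redundancy estimate:", round(1 - F[-1] / alphabet_bits, 3))
```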
10. The entropy of words: learnability and expressivity across more than 1000 languages
- Author
- Universitat Politècnica de Catalunya. Departament de Ciències de la Computació, Universitat Politècnica de Catalunya. LARCA - Laboratori d'Algorísmia Relacional, Complexitat i Aprenentatge, Bentz, Chris, Alikaniotis, Dimitrios, Cysouw, Michael, and Ferrer Cancho, Ramon
- Abstract
The choice associated with words is a fundamental property of natural languages. It lies at the heart of quantitative linguistics, computational linguistics and language sciences more generally. Information theory gives us tools at hand to measure precisely the average amount of choice associated with words: the word entropy. Here, we use three parallel corpora, encompassing ca. 450 million words in 1916 texts and 1259 languages, to tackle some of the major conceptual and practical problems of word entropy estimation: dependence on text size, register, style and estimation method, as well as non-independence of words in co-text. We present two main findings: Firstly, word entropies display relatively narrow, unimodal distributions. There is no language in our sample with a unigram entropy of less than six bits/word. We argue that this is in line with information-theoretic models of communication. Languages are held in a narrow range by two fundamental pressures: word learnability and word expressivity, with a potential bias towards expressivity. Secondly, there is a strong linear relationship between unigram entropies and entropy rates. The entropy difference between words with and without co-textual information is narrowly distributed around ca. three bits/word. In other words, knowing the preceding text reduces the uncertainty of words by roughly the same amount across languages of the world.
- Published
- 2017
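For comparison with the previous entry, a rough word-level counterpart of the quantities discussed above: the unigram word entropy and the reduction obtained from one word of co-text. This is only a plug-in bigram sketch under an assumed tokenization and a placeholder file name; the study itself uses parallel corpora and more careful estimators for text-size dependence.

```python
from collections import Counter
from math import log2

def plug_in_entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def word_entropies(tokens):
    """Unigram word entropy H(W) and the bigram conditional entropy
    H(W_t | W_{t-1}) = H(bigrams) - H(unigrams), a crude entropy-rate proxy."""
    H1 = plug_in_entropy(Counter(tokens))
    H2 = plug_in_entropy(Counter(zip(tokens, tokens[1:])))
    return H1, H2 - H1

tokens = open("parallel_text.txt", encoding="utf-8").read().split()   # placeholder corpus file
H_unigram, H_conditional = word_entropies(tokens)
print(f"unigram entropy:      {H_unigram:.2f} bits/word")
print(f"co-textual reduction: {H_unigram - H_conditional:.2f} bits/word")
```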
11. Identifying functional thermodynamics in autonomous Maxwellian ratchets
- Author
- Boyd, AB, Mandal, D, and Crutchfield, JP
- Abstract
We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, when previous methods either misclassify or simply fail due to approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions and so it applies quite broadly - for Demons with and without memory and input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed, alternative such bounds, while the current bound still holds. As such, it broadly describes the minimal energetic cost of any computation by a thermodynamic system.
- Published
- 2016
12. Entropy rates of physiological aging on microscopy
- Author
- Pham, Tuan D.
- Abstract
This paper presents a method for computing entropy rates of images by modeling a stationary Markov chain constructed from a weighted graph. The proposed method was applied to the quantification of the complex behavior of the growing rates of physiological aging of Caenorhabditis elegans (C. elegans) on microscopic images, which has been considered one of the most challenging problems in the search for metrics that can be used for identifying differences among stages in high-throughput and high-content images of physiological aging.
- Published
- 2016
- Full Text
- View/download PDF
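A minimal sketch of the core computation described in the abstract: given a weighted graph, form the Markov chain whose transition probabilities are the row-normalized weights, solve for the stationary distribution, and evaluate the entropy rate. How the weight matrix is built from microscopy images (pixel or patch similarities) is specific to the paper; the toy matrix below is an arbitrary assumption.

```python
import numpy as np

def entropy_rate_from_weights(W):
    """Entropy rate (bits/step) of the stationary Markov chain defined by a
    non-negative weight matrix W: transition probabilities are the
    row-normalised weights, mu is the stationary distribution (mu P = mu),
    and H = -sum_i mu_i sum_j P_ij log2 P_ij."""
    W = np.asarray(W, dtype=float)
    P = W / W.sum(axis=1, keepdims=True)
    vals, vecs = np.linalg.eig(P.T)               # left eigenvector for eigenvalue 1
    mu = np.real(vecs[:, np.argmax(np.real(vals))])
    mu = mu / mu.sum()
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log2(P), 0.0)   # zero-probability transitions contribute 0
    return float(-(mu[:, None] * P * logP).sum())

# toy 3-node weighted graph (in the paper the weights come from image data)
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 4.0],
              [1.0, 4.0, 0.0]])
print(entropy_rate_from_weights(W))
```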
13. Information anatomy of stochastic equilibria
- Author
- Marzen, S and Crutchfield, JP
- Abstract
A stochastic nonlinear dynamical system generates information, as measured by its entropy rate. Some - the ephemeral information - is dissipated and some - the bound information - is actively stored and so affects future behavior. We derive analytic expressions for the ephemeral and bound information in the limit of infinitesimal time discretization for two classical systems that exhibit dynamical equilibria: first-order Langevin equations (i) where the drift is the gradient of an analytic potential function and the diffusion matrix is invertible and (ii) with a linear drift term (Ornstein-Uhlenbeck), but a noninvertible diffusion matrix. In both cases, the bound information is sensitive to the drift and diffusion, while the ephemeral information is sensitive only to the diffusion matrix and not to the drift. Notably, this information anatomy changes discontinuously as any of the diffusion coefficients vanishes, indicating that it is very sensitive to the noise structure. We then calculate the information anatomy of the stochastic cusp catastrophe and of particles diffusing in a heat bath in the overdamped limit, both examples of stochastic gradient descent on a potential landscape. Finally, we use our methods to calculate and compare approximations for the time-local predictive information for adaptive agents.
- Published
- 2014
14. Chaos forgets and remembers: Measuring information creation, destruction, and storage
- Author
- James, RG, Burke, K, and Crutchfield, JP
- Abstract
The hallmark of deterministic chaos is that it creates information - the rate being given by the Kolmogorov-Sinai metric entropy. Since its introduction half a century ago, the metric entropy has been used as a unitary quantity to measure a system's intrinsic unpredictability. Here, we show that it naturally decomposes into two structurally meaningful components: A portion of the created information - the ephemeral information - is forgotten and a portion - the bound information - is remembered. The bound information is a new kind of intrinsic computation that differs fundamentally from information creation: it measures the rate of active information storage. We show that it can be directly and accurately calculated via symbolic dynamics, revealing a hitherto unknown richness in how dynamical systems compute. © 2014 Elsevier B.V.
- Published
- 2014
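The ephemeral/bound decomposition introduced above is not reproduced here, but the quantity it decomposes, the Kolmogorov-Sinai entropy rate, can be estimated via symbolic dynamics as the abstract notes. Below is a minimal sketch for the logistic map at r = 4, whose binary generating partition yields a fair-coin process with entropy rate 1 bit/symbol; the map, partition, and sequence length are illustrative choices.

```python
from collections import Counter
from math import log2

def logistic_symbols(n, r=4.0, x0=0.3):
    """Binary symbolic dynamics of the logistic map, generating partition at x = 1/2."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append('1' if x >= 0.5 else '0')
    return ''.join(out)

def block_entropy(s, L):
    """Shannon entropy (bits) of the empirical distribution of length-L words."""
    counts = Counter(s[i:i + L] for i in range(len(s) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

s = logistic_symbols(200_000)
for L in range(2, 8):
    # entropy-rate estimate: conditional entropy of the next symbol given L-1 symbols
    print(L, round(block_entropy(s, L) - block_entropy(s, L - 1), 4))
# the estimates approach 1 bit/symbol (= ln 2 nats), the KS entropy of the map at r = 4
```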
15. Infinite excess entropy processes with countable-state generators
- Author
- Travers, NF and Crutchfield, JP
- Abstract
We present two examples of finite-alphabet, infinite excess entropy processes generated by stationary hidden Markov models (HMMs) with countable state sets. The first, simpler example is not ergodic, but the second is. These are the first explicit constructions of processes of this type. © 2014 by the authors.
- Published
- 2014
16. Joint Source-Channel Coding Using Multiple Label Mapping
- Author
- Tervo, Valtteri, Matsumoto, Tad, and Karjalainen, Juha
- Abstract
This paper proposes a technique to compress data with equal-length code words. A novel source coding technique, multiple label mapping (MLM), is introduced. With MLM it is possible to produce a source code which uses equal-length code words. Moreover, it is shown that with the MLM technique, it is possible to achieve near-limit compression without using variable-length coding (VLC). However, this requires that the source probability grouping is performed so that after MLM each code word has almost equal appearance probability, and that full a priori feedback is available. Numerical results demonstrate proper operability of the proposed system.
- Published
- 2010
17. Use of probabilistic and deterministic measures to identify unfavorable earthquake records
- Author
- Moustafa, Abbas and Takewaki, Izuru
- Abstract
This study introduces measures to identify resonant (concentration of energy in a single or a few frequencies) or unfavorable earthquake ground motions. Probabilistic measures based on the entropy rate and the geometric properties of the power spectral density function (PSDF) of the ground acceleration are developed first. Subsequently, deterministic measures for the frequency content of the ground acceleration are also developed. These measures are then used for identifying resonance and criticality in stochastic earthquake models and 110 acceleration records measured at rock, stiff, medium and soft soil sites. The unfavorable earthquake record for a given structure is defined as the record having a narrow frequency content and a dominant frequency close to the structure's fundamental natural frequency. Accordingly, the measures developed in this study may provide a basis for selecting records that are capable of producing the highest structural response. Numerical verifications are provided on damage caused to structures by identified resonant records.
- Published
- 2009
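As a hedged illustration of a probabilistic measure in the spirit of the abstract above (not the paper's exact entropy-rate measure), the sketch below normalizes an estimated PSDF into a probability distribution over frequency and computes its Shannon entropy; narrow-band, resonant-like records score low, broad-band records score high. The signal lengths, sampling rate, and Welch settings are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(acc, fs):
    """Normalised spectral entropy of a record: treat the estimated PSDF as a
    probability distribution over frequency and compute -sum p ln p / ln N.
    Values near 0: energy concentrated in a few frequencies (resonant-like);
    values near 1: broad-band energy content."""
    _, S = welch(acc, fs=fs, nperseg=1024)
    p = S / S.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(S.size))

# toy comparison: narrow-band (resonant-like) vs broad-band acceleration-like signals
fs = 100.0
t = np.arange(0.0, 40.0, 1.0 / fs)
rng = np.random.default_rng(2)
narrow = np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.standard_normal(t.size)
broad = rng.standard_normal(t.size)
print(spectral_entropy(narrow, fs), spectral_entropy(broad, fs))
```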
18. Prediction, Retrodiction, and the Amount of Information Stored in the Present
- Author
- Ellison, Christopher J., Mahoney, John R., and Crutchfield, James P.
- Abstract
We introduce an ambidextrous view of stochastic dynamical systems, comparing their forward-time and reverse-time representations and then integrating them into a single time-symmetric representation. The perspective is useful theoretically, computationally, and conceptually. Mathematically, we prove that the excess entropy—a familiar measure of organization in complex systems—is the mutual information not only between the past and future, but also between the predictive and retrodictive causal states. Practically, we exploit the connection between prediction and retrodiction to directly calculate the excess entropy. Conceptually, these lead one to discover new system measures for stochastic dynamical systems: crypticity (information accessibility) and causal irreversibility. Ultimately, we introduce a time-symmetric representation that unifies all of these quantities, compressing the two directional representations into one. The resulting compression offers a new conception of the amount of information stored in the present.
- Published
- 2009
19. Use of probabilistic and deterministic measures to identify unfavorable earthquake records
- Author
- Moustafa, Abbas and Takewaki, Izuru
- Published
- 2009
20. Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling
- Author
- Strelioff, Christopher C., Crutchfield, James P., and Huebler, Alfred W.
- Abstract
Markov chains are a natural and well understood tool for describing one-dimensional patterns in time or space. We show how to infer kth order Markov chains, for arbitrary k, from finite data by applying Bayesian methods to both parameter estimation and model-order selection. Extending existing results for multinomial models of discrete data, we connect inference to statistical mechanics through information-theoretic (type theory) techniques. We establish a direct relationship between Bayesian evidence and the partition function which allows for straightforward calculation of the expectation and variance of the conditional relative entropy and the source entropy rate. Finally, we introduce a method that uses finite data-size scaling with model-order comparison to infer the structure of out-of-class processes.
- Published
- 2007
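A minimal sketch of the simplest case treated above: Bayesian (Dirichlet-multinomial) estimation of a first-order Markov chain, followed by a plug-in entropy-rate evaluation at the posterior-mean transition matrix. The paper goes much further (arbitrary order k, model-order selection, and exact expectation and variance of the entropy rate), so this is only a starting point; the golden-mean-like toy data and the prior strength alpha = 1 are assumptions.

```python
import numpy as np

def posterior_mean_chain(seq, alphabet, alpha=1.0):
    """Dirichlet-multinomial inference of a first-order Markov chain:
    posterior-mean transition probabilities under a symmetric Dirichlet(alpha) prior."""
    idx = {s: i for i, s in enumerate(alphabet)}
    counts = np.zeros((len(alphabet), len(alphabet)))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    smoothed = counts + alpha                      # prior keeps every probability > 0
    return smoothed / smoothed.sum(axis=1, keepdims=True)

def entropy_rate(P):
    """Plug-in entropy rate (bits/symbol) of the chain with transition matrix P."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()
    return float(-(pi[:, None] * P * np.log2(P)).sum())

# toy data: a golden-mean-like binary process ('1' is never followed by '1')
rng = np.random.default_rng(3)
seq, s = [], '0'
for _ in range(5000):
    s = '1' if (s == '0' and rng.random() < 0.5) else '0'
    seq.append(s)
P = posterior_mean_chain(seq, alphabet=['0', '1'])
print(P)
print(entropy_rate(P))      # close to 2/3 bit/symbol for the golden-mean process
```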
21. Entropy and alternative entropy functionals of fractional Gaussian noise as the functions of Hurst index
- Author
- Malyarenko, Anatoliy, Mishura, Yuliia, Ralchenko, Kostiantyn, and Shklyar, Sergiy
- Abstract
This paper is devoted to the study of the properties of entropy as a function of the Hurst index, which corresponds to the fractional Gaussian noise. Since the entropy of the Gaussian vector depends on the determinant of the covariance matrix, and the behavior of this determinant as a function of the Hurst index is rather difficult to study analytically at high dimensions, we also consider simple alternative entropy functionals, whose behavior, on the one hand, mimics the behavior of entropy and, on the other hand, is not difficult to study. Asymptotic behavior of the normalized entropy (the so-called entropy rate) is also studied for the entropy and for the alternative functionals.
- Full Text
- View/download PDF
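A minimal numerical sketch of the quantities studied above: the differential entropy of an n-dimensional fractional-Gaussian-noise vector, H_n = 0.5 * (n * ln(2*pi*e) + ln det Sigma_H), normalized by n as an entropy-rate proxy and evaluated on the standard fGn autocovariance. The dimension n = 200 and the grid of Hurst indices are illustrative choices.

```python
import numpy as np
from scipy.linalg import toeplitz

def fgn_covariance(n, H):
    """Toeplitz covariance matrix of fractional Gaussian noise with Hurst index H."""
    k = np.arange(n, dtype=float)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H) + np.abs(k - 1) ** (2 * H))
    return toeplitz(gamma)

def fgn_entropy_rate(n, H):
    """Normalised differential entropy (nats per coordinate) of an n-dimensional
    fGn vector: (1/n) * 0.5 * (n * ln(2*pi*e) + ln det Sigma_H)."""
    _, logdet = np.linalg.slogdet(fgn_covariance(n, H))
    return float(0.5 * (np.log(2 * np.pi * np.e) + logdet / n))

for H in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(H, round(fgn_entropy_rate(200, H), 4))
# H = 0.5 gives white noise (identity covariance) and hence the maximum, 0.5*ln(2*pi*e)
```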