32 results for "Entropy rate"
Search Results
2. Symbolic diffusion entropy rate of chaotic time series as a surrogate measure for the largest Lyapunov exponent
- Author
-
Takaya Miyano and Kota Shiozawa
- Subjects
Sequence, Series (mathematics), Chaotic, Lyapunov exponent, Nonlinear Sciences::Chaotic Dynamics, Phase space, Attractor, Piecewise, Statistical physics, Entropy rate, Mathematics - Abstract
Existing methods for estimating the largest Lyapunov exponent from a time series rely on the rate of separation of initially nearby trajectories reconstructed from the time series in phase space. According to Ueda, chaotic dynamical behavior is viewed as a manifestation of random transitions between unstable periodic orbits in a chaotic attractor, which are triggered by perturbations due to experimental observation or the roundoff error characteristic of the computing machine, and consequently consists of a sequence of piecewise deterministic processes instead of an entirely deterministic process. Chaotic trajectories might have no physical reality. Here, we propose a mathematical method for estimating a surrogate measure for the largest Lyapunov exponent on the basis of the random diffusion of the symbols generated from a time series in a chaotic attractor, without resorting to initially nearby trajectories. We apply the proposed method to numerical time series generated by chaotic flow models and verify its validity.
- Published
- 2019
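The symbol-based idea in the abstract above can be illustrated with a toy calculation (this is not the authors' diffusion-entropy method; the map, partition, and block lengths are illustrative choices): binary-symbolize a chaotic logistic-map series and estimate the symbol entropy rate from block entropies. For the fully chaotic logistic map the entropy rate equals the largest Lyapunov exponent, ln 2.

```python
import math

def logistic_series(n, x0=0.3, r=4.0, burn=1000):
    # Iterate the fully chaotic logistic map x -> r x (1 - x).
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

def block_entropy(symbols, k):
    # Plug-in Shannon entropy (nats) of length-k blocks.
    counts = {}
    for i in range(len(symbols) - k + 1):
        blk = tuple(symbols[i:i + k])
        counts[blk] = counts.get(blk, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

series = logistic_series(100_000)
bits = [1 if x > 0.5 else 0 for x in series]  # generating partition at 1/2
k = 6
h = block_entropy(bits, k + 1) - block_entropy(bits, k)  # entropy-rate estimate
print(h)  # close to ln 2 ≈ 0.693 for the r = 4 logistic map
```

The difference of consecutive block entropies is a standard entropy-rate estimator; the paper's surrogate measure is built differently, from the diffusion of the symbols.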
3. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle
- Author
-
Stefan Thurner, Rudolf Hanel, and Bernat Corominas-Murtra
- Subjects
Principle of maximum entropy, Min entropy, Thermodynamics, Information theory, Thermodynamic system, Boltzmann constant, Maximum entropy probability distribution, Ergodic theory, Entropy rate, Mathematics - Abstract
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes' maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate: H(p) = -∑_{i} p_{i} log p_{i}. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy, which we refer to as S_{EXT} for extensive entropy, S_{IT} for the source information rate in information theory, and S_{MEP} for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: Pólya urn processes, which are simple self-reinforcing processes; sample-space-reducing (SSR) processes, which are simple history-dependent processes associated with power-law statistics; and multinomial mixture processes.
- Published
- 2017
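The degenerate functional quoted in the abstract above is easy to state in code. This is only the shared Shannon form H(p) = -∑_i p_i log p_i that the paper contrasts against, not its S_{EXT}/S_{IT}/S_{MEP} computations; a minimal check that it gives log n for a uniform distribution and 0 for a deterministic one:

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i (natural log), with 0 log 0 := 0.
    assert abs(sum(p) - 1.0) < 1e-12, "probabilities must sum to 1"
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

uniform = [0.25] * 4
print(shannon_entropy(uniform))     # log 4 ≈ 1.386: maximal uncertainty
print(shannon_entropy([1.0, 0.0]))  # 0: no uncertainty
```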
4. Extensivity and additivity of the Kolmogorov-Sinai entropy for simple fluids
- Author
-
Jason R. Green, Anthony Costa, and Moupriya Das
- Subjects
Thermodynamic state, Lyapunov exponent, Additive function, Quantum mechanics, Exponent, Statistical physics, van der Waals force, Maxima, Scaling, Entropy rate, Mathematics - Abstract
According to the van der Waals picture, attractive and repulsive forces play distinct roles in the structure of simple fluids. Here, we examine their roles in dynamics; specifically, in the degree of deterministic chaos using the Kolmogorov-Sinai (KS) entropy rate and the spectra of Lyapunov exponents. With computer simulations of three-dimensional Lennard-Jones and Weeks-Chandler-Andersen fluids, we find that repulsive forces dictate these dynamical properties, with attractive forces reducing the KS entropy at a given thermodynamic state. Regardless of interparticle forces, the maximal Lyapunov exponent is intensive for systems ranging from 200 to 2000 particles. Our finite-size scaling analysis also shows that the KS entropy is both extensive (a linear function of system size) and additive. Both temperature and density control the "dynamical chemical potential," the rate of linear growth of the KS entropy with system size. At fixed system size, both the KS entropy and the largest exponent exhibit a maximum as a function of density. We attribute the maxima to the competition between two effects: as particles are forced into closer proximity, there is an enhancement from the sharp curvature of the repulsive potential and a suppression from the diminishing free volume and particle mobility. The extensivity and additivity of the KS entropy and the intensivity of the largest Lyapunov exponent, however, hold over a range of temperatures and densities across the liquid and liquid-vapor coexistence regimes.
- Published
- 2017
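The KS entropy rate discussed above is, by Pesin's identity, the sum of the positive Lyapunov exponents. A one-dimensional sketch, far simpler than the paper's molecular-dynamics setting (map, seed, and iteration counts are my choices): the r = 4 logistic map has a single exponent λ = ⟨ln|f'(x)|⟩ = ln 2, which is therefore also its KS entropy.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n=200_000, burn=1000):
    # lambda = time average of ln|f'(x)| with f(x) = r x (1 - x),
    # so f'(x) = r (1 - 2 x).
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    s = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        d = abs(r * (1.0 - 2.0 * x))
        s += math.log(max(d, 1e-12))  # guard the measure-zero point x = 1/2
    return s / n

lam = lyapunov_logistic()
print(lam)  # ≈ ln 2 ≈ 0.693; with one exponent, h_KS = lambda by Pesin
```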
5. Free energy and entropy production rate for a Brownian particle that walks on an overdamped medium
- Author
-
Mesfin Asfaw Taye
- Subjects
Physics, Statistical Mechanics (cond-mat.stat-mech), Fluctuation theorem, Entropy production, Configuration entropy, Maximum entropy thermodynamics, Classical mechanics, Maximum entropy probability distribution, Statistical physics, Entropy (energy dispersal), Boltzmann's entropy formula, Condensed Matter - Statistical Mechanics, Entropy rate - Abstract
We derive general expressions for the free energy, entropy production, and entropy extraction rates for a Brownian particle that walks in a viscous medium, where the dynamics of its motion is governed by the Langevin equation. It is shown that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Moreover, considering different model systems, we not only investigate how various thermodynamic quantities behave in time but also discuss the fluctuation theorem in detail.
- Published
- 2016
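The setting above, overdamped Langevin dynamics relaxing toward equilibrium, can be sketched with an Euler-Maruyama integration (parameter values and the harmonic potential are illustrative, not the paper's). An ensemble started out of equilibrium relaxes until its variance reaches k_B T / k, the regime where, per the abstract, entropy production and extraction balance.

```python
import math
import random

random.seed(1)

# Overdamped Langevin: gamma dx/dt = -k x + sqrt(2 gamma kB T) xi(t), kB = 1.
gamma, k, T = 1.0, 1.0, 1.0
dt, steps, n_particles = 0.01, 1000, 1000  # total time = 10 relaxation times

xs = [2.0] * n_particles  # ensemble far from equilibrium at t = 0
noise_amp = math.sqrt(2.0 * T * dt / gamma)
for _ in range(steps):
    xs = [x - (k / gamma) * x * dt + noise_amp * random.gauss(0.0, 1.0)
          for x in xs]

mean = sum(xs) / n_particles
var = sum((x - mean) ** 2 for x in xs) / n_particles
print(mean, var)  # mean ≈ 0 and variance ≈ T / k = 1 at equilibrium
```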
6. Critical time scale of coarse-graining entropy production
- Author
-
Jang-il Sohn
- Subjects
Physics, Critical time, Entropy production, Configuration entropy, Random walk, Statistical physics, Granularity, Entropy (arrow of time), Entropy rate - Abstract
We study coarse-grained entropy production in an asymmetric random walk system on a periodic one-dimensional lattice. In coarse-grained systems the original dynamics are unavoidably destroyed, but the coarse-grained entropy production is not hidden below the critical time-scale separation. The hidden entropy production increases rapidly near the critical time-scale separation.
- Published
- 2016
7. Proper encoding for snapshot-entropy scaling in two-dimensional classical spin models
- Author
-
Hiroaki Matsueda and Dai Ozaki
- Subjects
Binary entropy function, Conditional quantum entropy, Quantum mechanics, Maximum entropy probability distribution, Configuration entropy, Quantum entanglement, Statistical physics, Quantum relative entropy, Entropy rate, Joint quantum entropy, Mathematics - Abstract
We reexamine the snapshot entropy of the Ising and three-state Potts models on the L×L square lattice. Focusing on how to encode the spin snapshot, we find that the entropy at T_c scales asymptotically as S ∼ (1/3) ln L, which is strongly reminiscent of the entanglement entropy in one-dimensional quantum critical systems. This finding seems to support the idea that the snapshot entropy after the proper encoding is related to the holographic entanglement entropy. On the other hand, the anomalous scaling S(χ) ∼ χ^η ln χ for the coarse-grained snapshot entropy holds even for the proper encoding. These features originate in the fact that the largest singular value of the snapshot matrix is regulated by the proper encoding.
- Published
- 2015
8. Stochastic entropy production arising from nonstationary thermal transport
- Author
-
Henry J. Charlesworth, Zachary P. L. Laker, and Ian J. Ford
- Subjects
Statistical Mechanics (cond-mat.stat-mech), Entropy production, Principle of maximum entropy, Configuration entropy, Thermodynamics, Maximum entropy spectral estimation, Entropy (classical thermodynamics), Maximum entropy probability distribution, Statistical physics, Condensed Matter - Statistical Mechanics, Entropy rate, Joint quantum entropy, Mathematics - Abstract
We compute statistical properties of the stochastic entropy production associated with the nonstationary transport of heat through a system coupled to a time dependent nonisothermal heat bath. We study the one-dimensional stochastic evolution of a bound particle in such an environment by solving the appropriate Langevin equation numerically, and by using an approximate analytic solution to the Kramers equation to determine the behavior of an ensemble of systems. We express the total stochastic entropy production in terms of a relaxational or nonadiabatic part together with two components of housekeeping entropy production and determine the distributions for each, demonstrating the importance of all three contributions for this system. We compare the results with an approximate analytic model of the mean behavior and we further demonstrate that the total entropy production and the relaxational component approximately satisfy detailed fluctuation relations for certain time intervals. Finally, we comment on the resemblance between the procedure for solving the Kramers equation and a constrained extremization, with respect to the probability density function, of the spatial density of the mean rate of production of stochastic entropy.
- Published
- 2015
9. Exact analytical thermodynamic expressions for a Brownian heat engine
- Author
-
Mesfin Asfaw Taye
- Subjects
H-theorem, Entropy production, Configuration entropy, Maximum entropy probability distribution, Maximum entropy thermodynamics, Thermodynamics, Boltzmann's entropy formula, Entropy rate, Joint quantum entropy, Mathematics - Abstract
The nonequilibrium thermodynamic features of a Brownian motor operating between two different heat baths are explored as a function of time t. Using the Gibbs entropy and the Schnakenberg microscopic stochastic approach, we find exact closed-form expressions for the free energy, the rate of entropy production, and the rate of entropy flow from the system to the outside. We show that when the system is out of equilibrium, it constantly produces entropy and at the same time extracts entropy out of the system. Its entropy production and extraction rates decrease in time and saturate to a constant value. In the long time limit, the rate of entropy production balances the rate of entropy extraction, and at equilibrium both entropy production and extraction rates become zero. Furthermore, many thermodynamic theories can be checked via the present model.
- Published
- 2015
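The Schnakenberg approach mentioned in the abstract above assigns a steady-state entropy production rate σ = Σ_{i&lt;j} (p_i w_ij − p_j w_ji) ln[(p_i w_ij)/(p_j w_ji)] to a Markov jump process with rates w_ij. A minimal illustration on a driven three-state ring (a toy, not the paper's Brownian motor; rates a, b are my choices): detailed balance gives σ = 0, while a uniform bias gives σ = (a − b) ln(a/b).

```python
import math

def steady_state(w, dt=0.001, t_max=50.0):
    # Relax the master equation dp_i/dt = sum_j (p_j w[j][i] - p_i w[i][j])
    # by forward Euler until (numerically) stationary.
    n = len(w)
    p = [1.0] + [0.0] * (n - 1)
    for _ in range(int(t_max / dt)):
        dp = [sum(p[j] * w[j][i] - p[i] * w[i][j] for j in range(n) if j != i)
              for i in range(n)]
        p = [p[i] + dt * dp[i] for i in range(n)]
    return p

def entropy_production_rate(w):
    # Schnakenberg: sum over unordered pairs of (net flux) * (affinity).
    p = steady_state(w)
    n = len(w)
    sigma = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            if w[i][j] > 0 and w[j][i] > 0:
                jf, jb = p[i] * w[i][j], p[j] * w[j][i]
                sigma += (jf - jb) * math.log(jf / jb)
    return sigma

a, b = 2.0, 1.0  # forward / backward hopping rates on a 3-state ring
ring = [[0, a, b], [b, 0, a], [a, b, 0]]
balanced = [[0, 1.0, 1.0], [1.0, 0, 1.0], [1.0, 1.0, 0]]
sigma_driven = entropy_production_rate(ring)
sigma_eq = entropy_production_rate(balanced)
print(sigma_driven)  # (a - b) ln(a / b) = ln 2 ≈ 0.693
print(sigma_eq)      # ≈ 0: detailed balance, no dissipation
```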
10. Analysis of the phase transition in the two-dimensional Ising ferromagnet using a Lempel-Ziv string-parsing scheme and black-box data-compression utilities
- Author
-
Alexander K. Hartmann and Oliver Melchert
- Subjects
Conditional entropy, Binary entropy function, Discrete mathematics, Cross entropy, Conditional quantum entropy, Min entropy, Transfer entropy, Entropy rate, Joint quantum entropy, Mathematics - Abstract
In this work we consider information-theoretic observables to analyze short symbolic sequences, comprising time series that represent the orientation of a single spin in a two-dimensional (2D) Ising ferromagnet on a square lattice of size ${L}^{2}={128}^{2}$ for different system temperatures $T$. The latter were chosen from an interval enclosing the critical point ${T}_{\mathrm{c}}$ of the model. At small temperatures the sequences are thus very regular; at high temperatures they are maximally random. In the vicinity of the critical point, nontrivial, long-range correlations appear. Here we implement estimators for the entropy rate, excess entropy (i.e., "complexity"), and multi-information. First, we implement a Lempel-Ziv string-parsing scheme, providing seemingly elaborate entropy rate and multi-information estimates and an approximate estimator for the excess entropy. Furthermore, we apply easy-to-use black-box data-compression utilities, providing approximate estimators only. For comparison and to yield results for benchmarking purposes, we implement the information-theoretic observables also based on the well-established $M$-block Shannon entropy, which is more tedious to apply compared to the first two "algorithmic" entropy estimation procedures. To test how well one can exploit the potential of such data-compression techniques, we aim at detecting the critical point of the 2D Ising ferromagnet. Among the above observables, the multi-information, which is known to exhibit an isolated peak at the critical point, is very easy to replicate by means of both efficient algorithmic entropy estimation procedures. Finally, we assess how well the various algorithmic entropy estimates compare to the more conventional block entropy estimates and illustrate a simple modification that yields enhanced results.
- Published
- 2015
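The Lempel-Ziv string-parsing idea used above can be sketched with a generic LZ78-style estimator (not necessarily the exact variant of the paper, and convergence is known to be slow): parse the sequence into c distinct phrases and estimate the entropy rate as c log2(c) / n bits per symbol.

```python
import math
import random

def lz78_entropy_rate(s):
    # Incremental LZ78 parsing: count distinct phrases c, then
    # estimate h ≈ c * log2(c) / n  (bits per symbol).
    phrases = set()
    phrase = ""
    for ch in s:
        phrase += ch
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    c = len(phrases) + (1 if phrase else 0)  # count a trailing partial phrase
    return c * math.log2(c) / len(s)

random.seed(0)
n = 100_000
rand_bits = "".join(random.choice("01") for _ in range(n))  # true h = 1 bit
const_bits = "0" * n                                        # true h = 0

h_rand = lz78_entropy_rate(rand_bits)
h_const = lz78_entropy_rate(const_bits)
print(h_rand)   # slowly converges toward 1 as n grows
print(h_const)  # near 0: the sequence is perfectly compressible
```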
11. Inherent randomness of evolving populations
- Author
-
Marc Harper
- Subjects
Computer Science - Information Theory, Population Dynamics, Population, Markov process, Dynamical Systems (math.DS), Statistics, Moran process, Animals, Humans, Quantitative Biology::Populations and Evolution, Computer Simulation, Statistical physics, Mathematics - Dynamical Systems, Quantitative Biology - Populations and Evolution, Entropy rate, Randomness, Mathematics, Stochastic Processes, Models, Statistical, Models, Genetic, Stochastic process, Information Theory (cs.IT), Population size, Genetic Drift, Populations and Evolution (q-bio.PE), Genetics, Population, Bounded function, Genetic Fitness - Abstract
The entropy rates of the Wright-Fisher process, the Moran process, and generalizations are computed and used to compare these processes and their dependence on standard evolutionary parameters. Entropy rates are measures of variation that depend on both short-run and long-run behavior, and allow the relationships between mutation, selection, and population size to be examined. Bounds for the entropy rate are given for the Moran process (independent of population size) and for the Wright-Fisher process (bounded for fixed population size). A generational Moran process is also presented for comparison to the Wright-Fisher process. The results include both analytic expressions and computational extensions.
- Published
- 2014
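For a finite Markov chain such as the Moran or Wright-Fisher process, the entropy rate is h = Σ_i π_i Σ_j −P_ij log P_ij, with π the stationary distribution. A generic sketch (the toy two-state transition matrix is my choice, not an actual Moran chain):

```python
import math

def stationary(P, iters=10_000):
    # Power iteration: pi <- pi P until convergence.
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P):
    # h = sum_i pi_i * H(row i), in bits per step.
    pi = stationary(P)
    h = 0.0
    for i, row in enumerate(P):
        h += pi[i] * -sum(p * math.log2(p) for p in row if p > 0)
    return h

P = [[0.9, 0.1],
     [0.1, 0.9]]  # sticky two-state chain; pi = (1/2, 1/2) by symmetry
h = entropy_rate(P)
print(h)  # = H2(0.1) ≈ 0.469 bits per step
```

A fair-coin chain (all entries 1/2) gives exactly 1 bit per step, the maximum for a binary alphabet.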
12. Fluctuation theorems for entropy production in open systems
- Author
-
Jürgen Vollmer, Lamberto Rondoni, and Tamás Tél
- Subjects
Dynamical systems theory, Entropy production, Fluctuation theorem, H-theorem, Maximum entropy thermodynamics, Statistical physics, Entropy (arrow of time), Joint quantum entropy, Entropy rate, Mathematics - Abstract
We derive a fluctuation theorem to describe entropy fluctuations in steady states of systems with density gradients due to open boundaries. The fluctuations are related to the growth rate of the phase-space density, instead of the phase-space contraction rate. Explicit derivations are presented for a multibaker map, but the arguments are rather general, and should hold for a much wider class of dynamical systems. A comparison with recent results for stochastic systems is also given.
- Published
- 2000
13. Entropy production and phase space volume contraction
- Author
-
David Daems and Grégoire Nicolis
- Subjects
Physics, Classical mechanics, H-theorem, Entropy production, Configuration entropy, Dissipative system, Maximum entropy thermodynamics, Non-equilibrium thermodynamics, Statistical physics, Joint quantum entropy, Entropy rate - Abstract
We inquire whether the connection between entropy production and phase space volume contraction rate reported recently for a class of thermostatted systems is an intrinsic property of a wide class of dynamical systems, or the result of the particular algorithm devised for thermostatting a system of interacting particles obeying, in the presence of nonequilibrium constraints, a time-reversible, dissipative dynamics. A nonequilibrium thermodynamics based on the balance equation for information entropy is developed for dissipative dynamical systems subjected, in addition, to a stochastic forcing. The latter accounts for the thermodynamic fluctuations accompanying the reduced description of the thermostat by a dissipative perturbation, for the interaction between the system and the external reservoirs, or for perturbations of external origin. Entropy flux and entropy-production-like terms depending on the characteristics of the dynamics in phase space, particularly the rate of phase space volume contraction, are identified. Their connections with irreversible thermodynamics are explored. In particular, for thermostatted systems we find, without invoking an ad hoc conservation law between the system and the reservoir, that information entropy production is related to the opposite of the rate of phase space volume contraction to second order in the distance from equilibrium.
- Published
- 1999
14. Calculating topological entropy for transient chaos with an application to communicating with chaos
- Author
-
Edward Ott, Joeri Jacobs, and Brian R. Hunt
- Subjects
Nonlinear Sciences::Chaotic Dynamics, Discrete mathematics, Binary entropy function, Principle of maximum entropy, Attractor, Statistical physics, Topological entropy, Entropy (energy dispersal), Topological entropy in physics, Joint quantum entropy, Entropy rate, Mathematics - Abstract
Recent work on communicating with chaos provides a practical motivation for being able to determine numerically the topological entropy of chaotic invariant sets. In this paper we discuss numerical methods for evaluating topological entropy. To assess the accuracy and convergence of the methods, we test them in situations where the topological entropy is known independently. We also discuss the entropy of invariant chaotic saddles formed by those points in a given attractor that never visit some forbidden "gap" region. Such gaps have been proposed as a means of providing noise immunity in schemes for communication with chaos, and we discuss the dependence of the topological entropy on the size of the gap.
- Published
- 1998
15. Entropy and entropy production in simple stochastic models
- Author
-
Toyonori Munakata, Tadahiko Shiotani, and Akito Igarashi
- Subjects
Physics, Binary entropy function, Entropy production, H-theorem, Configuration entropy, Maximum entropy thermodynamics, Statistical physics, Entropy (energy dispersal), Boltzmann's entropy formula, Entropy rate - Published
- 1998
16. Information content of signals using correlation function expansions of the entropy
- Author
-
Phil Attard, Owen G. Jepps, and Stjepan Marčelja
- Subjects
Differential entropy, Entropy power inequality, Principle of maximum entropy, Maximum entropy probability distribution, Mathematical analysis, Maximum entropy spectral estimation, Joint entropy, Entropy rate, Joint quantum entropy, Mathematics - Abstract
Formally exact series expressions are derived for the entropy (information content) of a time series or signal by making systematic expansions for the higher-order correlation functions using generalized Kirkwood and Markov superpositions. Termination of the series after two or three terms provides tractable and accurate approximations for calculating the entropy. Signals generated by a Gaussian random process are simulated using Lorentzian and Gaussian spectral densities (exponential and Gaussian covariance functions) and the entropy is calculated as a function of the correlation length. The validity of the truncated Kirkwood expansion is restricted to weakly correlated signals, whereas the truncated Markov expansion is uniformly accurate; the leading two terms yield the entropy exactly in the limits of both weak and strong correlations. The concept of entropy for a continuous signal is explored in detail and it is shown that it depends upon the level of digitization and the frequency of sampling. The limiting forms are analyzed for a continuous signal with exponentially decaying covariance, for which explicit results can be obtained. Explicit results are also obtained for the binary discrete case that is isomorphic to the Ising spin lattice model.
- Published
- 1997
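The dependence of entropy on correlation length for a Gaussian signal, a theme of the abstract above, can be checked directly rather than via the paper's correlation-function expansions (a sketch under my own assumptions: unit-variance samples with exponential covariance ρ^{|i−j|}, ρ = e^{−1/ℓ}). That covariance matrix has det = (1 − ρ²)^{n−1}, so the differential entropy h = ½ ln[(2πe)^n det Σ] decreases as the correlation length ℓ grows.

```python
import math

def log_det_cholesky(A):
    # log det A via Cholesky: A = L L^T, so log det = 2 sum_i log L_ii.
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return 2.0 * sum(math.log(L[i][i]) for i in range(n))

def gaussian_entropy(n, corr_len):
    # Differential entropy (nats) of n samples with covariance rho^{|i-j|}.
    rho = math.exp(-1.0 / corr_len)
    cov = [[rho ** abs(i - j) for j in range(n)] for i in range(n)]
    return 0.5 * (n * math.log(2.0 * math.pi * math.e) + log_det_cholesky(cov))

n = 30
for ell in (0.5, 2.0, 8.0):
    print(ell, gaussian_entropy(n, ell))  # entropy drops as correlations grow
```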
17. Entropy production and nonlinear Fokker-Planck equations
- Author
-
Gabriela A. Casas, Fernando D. Nobre, and Evaldo M. F. Curado
- Subjects
Nonlinear system, Entropy (classical thermodynamics), H-theorem, Entropy production, Tsallis entropy, Configuration entropy, Statistical physics, Entropy rate, Joint quantum entropy, Mathematics - Abstract
The entropy time rate of systems described by nonlinear Fokker-Planck equations, which are directly related to generalized entropic forms, is analyzed. Both entropy production, associated with irreversible processes, and entropy flux from the system to its surroundings are studied. Some examples of known generalized entropic forms are considered, and particularly, the flux and production of the Boltzmann-Gibbs entropy, obtained from the linear Fokker-Planck equation, are recovered as particular cases. Since nonlinear Fokker-Planck equations are appropriate for the dynamical behavior of several physical phenomena in nature, like many within the realm of complex systems, the present analysis should be applicable to irreversible processes in a large class of nonlinear systems, such as those described by Tsallis and Kaniadakis entropies.
- Published
- 2012
18. Nonequilibrium fluctuation theorem for systems under discrete and continuous feedback control
- Author
-
Anupam Kundu
- Subjects
Statistical Mechanics (cond-mat.stat-mech), Entropy production, Fluctuation theorem, Principle of maximum entropy, Maximum entropy thermodynamics, Control theory, Maximum entropy probability distribution, Applied mathematics, Entropy (arrow of time), Condensed Matter - Statistical Mechanics, Entropy rate, Joint quantum entropy, Mathematics - Abstract
Without violating causality, we allow measurements to be performed in the time-reversed process of a feedback-manipulated stochastic system. As a result, we come across an entropy production due to the measurement process. This entropy production, in addition to the usual system and medium entropy production, constitutes the total entropy production of the combined system of the reservoir, the system, and the feedback controller. We show that this total entropy production of the "full" system satisfies an integrated fluctuation theorem as well as a detailed fluctuation theorem, as expected. We illustrate and verify this idea through explicit calculation and direct simulation in two examples.
- Published
- 2012
19. Entropy production in multiple scattering of light by a spatially random medium
- Author
-
Dominique J. Bicout and Christian Brosseau
- Subjects
Physics, Entropy production, Quantum mechanics, Principle of maximum entropy, Configuration entropy, Maximum entropy spectral estimation, Residual entropy, Entropy rate, Quantum relative entropy, Joint quantum entropy - Abstract
This study reports on the problem of entropy production due to multiple scattering of light by a spatially random medium composed of uncorrelated and noninteracting spherical dielectric particles. The degree of polarization P of light, in the form of plane waves, is of the nature of an order parameter for the ensemble of realizations of the fluctuating optical field. The radiation entropy takes a form analogous to the entropy of one-dimensional Ising (two-level) spin systems in contact with a heat bath. On the basis of this analysis, the degree of polarization has a different thermodynamic significance. It is argued that within this representation one may define an effective polarization temperature τ; we then show how τ depends on the degree of polarization. Light transmitted through a multiple scattering medium is depolarized by decorrelation of the phases of the electric field components, and its polarization entropy increases. The effects of the size of the spherical particles and of the optical depth on entropy production are studied numerically, using the Mie theory, via the Monte Carlo method. An attempt is made to interpret these results in terms of the minimization procedure (minimum entropy production) that plays a fundamental role in classical irreversible thermodynamics. One of the most remarkable aspects of this problem, where no energy exchange between radiation and scatterer takes place, is that the stationary state corresponds both to the state of minimum production of radiation entropy and to the state of maximum entropy. Thermodynamically, multiple scattering can be viewed as an order-disorder transition using the spin model. It is also emphasized that the system will tend to evolve towards a "higher polarization temperature" state. We briefly comment on the use of our treatment in interpreting the irreversibility in a scattering process.
- Published
- 1994
20. Maximum-likelihood estimation of the entropy of an attractor
- Author
-
JC Jaap Schouten, Floris Takens, and C.M. van den Bleek
- Subjects
Differential entropy, Rényi entropy, Principle of maximum entropy, Maximum entropy probability distribution, Statistics, Applied mathematics, Min entropy, Maximum entropy spectral estimation, Joint entropy, Entropy rate, Mathematics - Abstract
In this paper, a maximum-likelihood estimate of the (Kolmogorov) entropy of an attractor is proposed that can be obtained directly from a time series. Also, the relative standard deviation of the entropy estimate is derived; it is dependent on the entropy and on the number of samples used in the estimation.
- Published
- 1994
21. Entropy rate of nonequilibrium growing networks
- Author
-
Kun Zhao, Simone Severini, Ginestra Bianconi, and Arda Halu
- Subjects
Statistical Mechanics (cond-mat.stat-mech), Principle of maximum entropy, Disordered Systems and Neural Networks (cond-mat.dis-nn), Maximum entropy spectral estimation, Condensed Matter - Disordered Systems and Neural Networks, Complex network, Degree distribution, Binary entropy function, Tree network, Transfer entropy, Statistical physics, Condensed Matter - Statistical Mechanics, Entropy rate - Abstract
New entropy measures have been recently introduced for the quantification of the complexity of networks. Most of these entropy measures apply to static networks or to dynamical processes defined on static complex networks. In this paper we define the entropy rate of growing network models. This entropy rate quantifies how many labeled networks are typically generated by the growing network models. We analytically evaluate the difference between the entropy rate of growing tree network models and the entropy of tree networks that have the same asymptotic degree distribution. We find that, for a large variety of relevant growing network models, the networks generated dynamically with linear preferential attachment are exponentially fewer than the static networks with the same degree distribution. We study the entropy rate for growing network models showing structural phase transitions, including models with nonlinear preferential attachment. Finally, we provide numerical evidence that the entropy rate above and below the structural phase transitions follows a different scaling with the network size.
- Published
- 2011
22. Local kinetic interpretation of entropy production through reversed diffusion
- Author
-
John Mattingly, Edoardo Daly, Amilcare Porporato, Peter Kramer, and Massimo Cassiani
- Subjects
Stochastic differential equation, Girsanov theorem, Diffusion equation, Entropy production, Anomalous diffusion, Diffusion, Local time, Statistical physics, Entropy rate, Mathematics - Abstract
The time reversal of stochastic diffusion processes is revisited with emphasis on the physical meaning of the time-reversed drift and the noise prescription in the case of multiplicative noise. The local kinematics and mechanics of free diffusion are linked to the hydrodynamic description. These properties also provide an interpretation of the Pope-Ching formula for the steady-state probability density function along with a geometric interpretation of the fluctuation-dissipation relation. Finally, the statistics of the local entropy production rate of diffusion are discussed in the light of local diffusion properties, and a stochastic differential equation for entropy production is obtained using the Girsanov theorem for reversed diffusion. The results are illustrated for the Ornstein-Uhlenbeck process.
- Published
- 2011
23. Entropy rate estimates from mutual information
- Author
-
L. C. McKay-Jones, P.-M. Binder, and Brian Wissman
- Subjects
Nonlinear Sciences::Chaotic Dynamics, Conditional entropy, Rényi entropy, Principle of maximum entropy, Statistics, Maximum entropy probability distribution, Applied mathematics, Mutual information, Quantum mutual information, Joint entropy, Entropy rate, Mathematics - Abstract
We show how to estimate the Kolmogorov-Sinai entropy rate for chaotic systems using the mutual information function, easily obtainable from experimental time series. We state the conditions under which the relationship is exact, and explore the usefulness of the approach for both maps and flows. We also explore refinements of the method, and study its convergence properties as a function of time series length.
- Published
- 2011
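A rough version of the mutual-information route above can be sketched with a histogram plug-in estimator on a chaotic map (the map, bin count, and lags are illustrative choices, not the paper's exact procedure): for a chaotic system, I(x_t; x_{t+τ}) decays with lag τ at a rate controlled by the entropy rate.

```python
import math

def logistic(n, x0=0.3, burn=1000):
    # Fully chaotic logistic map time series on [0, 1].
    x = x0
    for _ in range(burn):
        x = 4.0 * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        out.append(x)
    return out

def mutual_information(xs, lag, bins=16):
    # Plug-in estimate of I(x_t; x_{t+lag}) in bits from a 2D histogram.
    pairs = list(zip(xs, xs[lag:])) if lag else list(zip(xs, xs))
    n = len(pairs)
    jx, jy, jxy = {}, {}, {}
    for a, b in pairs:
        i, j = min(int(a * bins), bins - 1), min(int(b * bins), bins - 1)
        jx[i] = jx.get(i, 0) + 1
        jy[j] = jy.get(j, 0) + 1
        jxy[(i, j)] = jxy.get((i, j), 0) + 1
    mi = 0.0
    for (i, j), c in jxy.items():
        p = c / n
        mi += p * math.log2(p * n * n / (jx[i] * jy[j]))
    return mi

xs = logistic(100_000)
for lag in (1, 2, 4, 8):
    print(lag, mutual_information(xs, lag))  # decays with increasing lag
```

The decay rate of these estimates, not their absolute values, is what carries the entropy-rate information; the paper states the precise conditions under which the relationship is exact.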
24. Entropy production in nonequilibrium steady states: A different approach and an exactly solvable canonical model
- Author
-
Daniel ben-Avraham, Michel Pleimling, and Sven Dorosz
- Subjects
Models, Statistical, Time Factors, Statistical Mechanics (cond-mat.stat-mech), H-theorem, Entropy production, Entropy, Physics, Maximum entropy thermodynamics, Models, Theoretical, Entropy in thermodynamics and information theory, Maximum entropy probability distribution, Computer Simulation, Statistical physics, Entropy (arrow of time), Algorithms, Condensed Matter - Statistical Mechanics, Joint quantum entropy, Entropy rate, Mathematics - Abstract
We discuss entropy production in nonequilibrium steady states by focusing on paths obtained by sampling at regular (small) intervals, instead of sampling on each change of the system's state. This allows us, for the first time, to study entropy production directly in systems with microscopic irreversibility. Otherwise, the two sampling methods are equivalent, and the fluctuation theorem holds also for the novel paths. We focus on a fully irreversible three-state loop, as a canonical model of microscopic irreversibility, finding its entropy distribution, rate of entropy production, and large deviation function in closed analytical form, and showing that the widely observed kink in the large deviation function arises solely from microscopic irreversibility.
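As an illustrative aside (not part of the indexed record), the steady state of the canonical three-state loop is easy to reproduce: with forward rates only, the probability flux p_i * k_i must be equal on every link. The rate values below are illustrative; the paper's actual analysis (regularly sampled paths and the large deviation function) is more involved than this sketch.

```python
def loop_stationary(rates):
    """Stationary distribution of a fully irreversible loop 0 -> 1 -> 2 -> 0.

    With forward rates `rates` and no reverse rates, the steady-state flux
    p_i * k_i is the same through every link, so p_i is proportional to 1/k_i.
    """
    weights = [1.0 / k for k in rates]
    z = sum(weights)
    return [w / z for w in weights]

rates = [1.0, 2.0, 4.0]                      # illustrative forward rates
p = loop_stationary(rates)
fluxes = [pi * ki for pi, ki in zip(p, rates)]  # uniform in the steady state
```

Note that the usual per-transition entropy production ln(k_ij / k_ji) diverges when reverse rates vanish, which is exactly why the paper samples paths at regular time intervals instead.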
- Published
- 2011
25. Transformation properties of entropy production
- Author
-
Grégoire Nicolis
- Subjects
Physics ,Entropy production ,H-theorem ,Configuration entropy ,Maximum entropy probability distribution ,Maximum entropy thermodynamics ,Thermodynamics ,Statistical physics ,Entropy (arrow of time) ,Joint quantum entropy ,Entropy rate - Abstract
The transformation properties of entropy production under phase-space partitioning, lumping, or the elimination of intermediate steps and variables in the presence of widely separated time scales are studied. Conditions are derived under which dissipation remains invariant. In systems subjected to external periodic driving the adiabatic, asymptotic, and transient entropy productions are evaluated and the extent to which they can be separately non-negative is determined.
- Published
- 2011
26. Magnetic-field-induced breakdown of equivalence of multidimensional motion
- Author
-
Bidhan Chandra Bag, Monoj Kumar Sen, and Alendu Baura
- Subjects
Physics ,Classical mechanics ,Stochastic process ,Phase space ,Balance equation ,Entropy (information theory) ,Non-equilibrium thermodynamics ,Statistical physics ,Entropy rate ,Brownian motion ,Magnetic field - Abstract
In this paper, we have studied Brownian motion in a multidimensional phase space in the presence of a magnetic field. The nonequilibrium behavior of thermodynamically inspired quantities along the individual components of motion has been studied in detail. Based on the Fokker-Planck description of the stochastic process and the entropy balance equation, we have calculated the information entropy production and entropy flux in the nonequilibrium state. The dependence of these quantities on time, magnetic field, and thermal bath is studied. In this context, we have observed that there exists extremum behavior in the dynamics and that the applied magnetic field breaks the equivalence in motion of the components in the nonequilibrium state.
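A minimal Langevin sketch of the setup in this abstract (not part of the indexed record), assuming unit mass and writing omega = qB/m for the Lorentz coupling between the velocity components; all parameter values are illustrative. The magnetic field couples vx and vy but does no work, so the stationary kinetic temperature of each component is unchanged; the inequivalence the paper reports appears in the transient entropy production, which this sketch does not compute.

```python
import math
import random

def simulate(omega=2.0, gamma=1.0, T=1.0, dt=2e-3, steps=300_000, seed=1):
    """Euler-Maruyama integration of a charged Brownian particle in a
    magnetic field along z: the Lorentz force rotates (vx, vy) while
    friction gamma and thermal noise at temperature T act on each component."""
    rng = random.Random(seed)
    sd = math.sqrt(2.0 * gamma * T * dt)   # noise amplitude per step
    vx = vy = 0.0
    sx2 = sy2 = 0.0
    for _ in range(steps):
        fx = -gamma * vx + omega * vy      # friction + Lorentz coupling
        fy = -gamma * vy - omega * vx
        vx += fx * dt + sd * rng.gauss(0.0, 1.0)
        vy += fy * dt + sd * rng.gauss(0.0, 1.0)
        sx2 += vx * vx
        sy2 += vy * vy
    return sx2 / steps, sy2 / steps

vx2, vy2 = simulate()
# both time-averaged variances should be close to T (equipartition)
```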
- Published
- 2010
26. Publisher's Note: Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling [Phys. Rev. E 76, 011106 (2007)]
- Author
-
James P. Crutchfield, Alfred Hubler, and Christopher C. Strelioff
- Subjects
Markov kernel ,Markov chain ,Maximum-entropy Markov model ,Econometrics ,Markov property ,Statistical physics ,Markov model ,Entropy rate ,Variable-order Bayesian network ,Causal Markov condition ,Mathematics - Published
- 2007
28. Kirchhoff’s loop law and the maximum entropy production principle
- Author
-
Paško Županović, Davor Juretić, and Srećko Botrić
- Subjects
Statistical Mechanics (cond-mat.stat-mech) ,H-theorem ,Configuration entropy ,Maximum entropy thermodynamics ,FOS: Physical sciences ,Maximum entropy spectral estimation ,entropy production ,Kirchhoff's law ,electric network ,Law ,Maximum entropy probability distribution ,Entropy (arrow of time) ,Condensed Matter - Statistical Mechanics ,Joint quantum entropy ,Entropy rate ,Mathematics - Abstract
In contrast to the standard derivation of Kirchhoff's loop law, which invokes the electric potential, we show, for a linear planar electric network in a stationary state at fixed temperature, that the loop law can be derived from the maximum entropy production principle. This means that the currents in the network branches are distributed in such a way as to achieve the state of maximum entropy production.
- Published
- 2004
29. Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies: A basis for q-exponential distributions
- Author
-
Sumiyoshi Abe
- Subjects
Rényi entropy ,Statistical Mechanics (cond-mat.stat-mech) ,Principle of maximum entropy ,Tsallis entropy ,Maximum entropy probability distribution ,Maximum entropy thermodynamics ,FOS: Physical sciences ,Min entropy ,Statistical physics ,Condensed Matter - Statistical Mechanics ,Entropy rate ,Joint quantum entropy ,Mathematics - Abstract
The q-exponential distributions, which are generalizations of the Zipf-Mandelbrot power-law distribution, are frequently encountered in complex systems at their stationary states. From the viewpoint of the principle of maximum entropy, they can apparently be derived from three different generalized entropies: the Renyi entropy, the Tsallis entropy, and the normalized Tsallis entropy. Accordingly, mere fittings of observed data by the q-exponential distributions do not lead to identification of the correct physical entropy. Here, stabilities of these entropies, i.e., their behaviors under arbitrary small deformation of a distribution, are examined. It is shown that, among the three, the Tsallis entropy is stable and can provide an entropic basis for the q-exponential distributions, whereas the others are unstable and cannot represent any experimentally observable quantities.
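The generalized entropies compared in this abstract are simple to write down; the following sketch (not part of the indexed record, with an illustrative distribution) computes the Tsallis and Rényi entropies and checks that both reduce to the Shannon entropy in the q -> 1 limit.

```python
import math

def tsallis_entropy(p, q):
    """S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def renyi_entropy(p, q):
    """R_q = ln(sum p_i^q) / (1 - q); also recovers Shannon entropy as q -> 1."""
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

def shannon_entropy(p):
    """S_1 = -sum p_i ln p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]   # illustrative distribution
q = 1.0001            # near the Shannon limit
```

The stability analysis in the paper (behavior under small deformations of p in the large-system limit) distinguishes these functionals even though they coincide at q = 1; that analysis is beyond this sketch.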
- Published
- 2002
30. Entropy of electromagnetic polarization
- Author
-
Y. Zimmels
- Subjects
Physics ,Classical mechanics ,H-theorem ,Configuration entropy ,Maximum entropy probability distribution ,Statistical physics ,Entropy in thermodynamics and information theory ,Entropy (arrow of time) ,Residual entropy ,Entropy rate ,Joint quantum entropy - Abstract
The entropy of electromagnetic polarization is considered in this paper. It is shown that unless the non-field entropy, and not the total entropy, is used as the independent variable in the expression for the internal energy, the first law is violated and the meaning of heat flow, as given by the second law, is contradicted. The total entropy and its field and non-field components are shown to be state functions. The field entropy comprises contributions from the field generated by the contents of the system and stored within as well as outside its boundaries. The contribution of the field stored outside the system boundaries is derived and demonstrated for the case of a uniformly polarized sphere. Finally, expressions are derived for field entropies and entropy densities, in composite systems, using the concept of interaction entropy. The results are shown to be fundamentally different compared to those used in the current literature.
- Published
- 2002
31. Periodic orbits and topological entropy of delayed maps
- Author
-
E. . Ferretti Manffra, Holger Kantz, and Wolfram Just
- Subjects
Pure mathematics ,Dynamical systems theory ,Attractor ,Metric (mathematics) ,Mathematical analysis ,Topological entropy ,Limit (mathematics) ,Topological entropy in physics ,Joint quantum entropy ,Entropy rate ,Mathematics - Abstract
The periodic orbits of a nonlinear dynamical system provide valuable insight into the topological and metric properties of its chaotic attractors. In this paper we describe general properties of periodic orbits of dynamical systems with feedback delay. In the case of delayed maps, these properties enable us to provide general arguments about the boundedness of the topological entropy in the high delay limit. As a consequence, all the metric entropies can be shown to be bounded in this limit. The general considerations are illustrated in the cases of Bernoulli-like and Hénon-like delayed maps.
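As an illustrative aside (not part of the indexed record): the growth-rate definition underlying this abstract, h_top = lim (1/n) ln #Fix(f^n), is easy to check for the simplest Bernoulli map, the doubling map x -> 2x mod 1, whose n-th iterate has 2^n - 1 fixed points. The delayed-map analysis in the paper concerns how such counts behave in the high-delay limit, which this sketch does not address.

```python
import math

def count_periodic_points(n, symbols=2):
    """Number of distinct points on the circle fixed by the n-th iterate of
    the doubling-type map x -> symbols*x mod 1: symbols**n - 1 (the points
    k / (symbols**n - 1))."""
    return symbols ** n - 1

def topological_entropy_estimate(n, symbols=2):
    """Finite-n approximation of h_top = lim (1/n) ln #Fix(f^n)."""
    return math.log(count_periodic_points(n, symbols)) / n

h_top = topological_entropy_estimate(20)  # converges to ln(2) for the doubling map
```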
- Published
- 2001
32. Exponential decay of relative entropies to the Kolmogorov-Sinai entropy for the standard map
- Author
-
Wojciech Słomczyński and Karol Życzkowski
- Subjects
Nonlinear Sciences::Chaotic Dynamics ,Entropy (classical thermodynamics) ,Exponential growth ,Maximum entropy probability distribution ,Thermodynamics ,Min entropy ,Standard map ,Statistical physics ,Exponential decay ,Entropy rate ,Quantum relative entropy ,Mathematics - Abstract
It is shown how to obtain a fast convergence of the relative dynamical entropies to the limiting Kolmogorov-Sinai entropy for the generating partition of the standard map.
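A minimal sketch of the system in this abstract (not part of the indexed record): the Chirikov standard map, together with a Benettin-style estimate of its largest Lyapunov exponent. For strongly chaotic parameters the exponent is roughly ln(K/2), and by Pesin's identity the KS entropy equals the sum of positive Lyapunov exponents on the chaotic component; the parameter values below are illustrative.

```python
import math

def standard_map_step(theta, p, K):
    """One iteration of the Chirikov standard map on the 2-torus."""
    p = (p + K * math.sin(theta)) % (2.0 * math.pi)
    theta = (theta + p) % (2.0 * math.pi)
    return theta, p

def largest_lyapunov(K, steps=100_000, theta=1.0, p=0.5):
    """Benettin-style estimate: evolve a tangent vector alongside the orbit
    and average the log of its growth, renormalizing at every step."""
    dtheta, dp = 1.0, 0.0
    acc = 0.0
    for _ in range(steps):
        c = K * math.cos(theta)
        dp_new = dp + c * dtheta        # tangent of p' = p + K sin(theta)
        dtheta_new = dtheta + dp_new    # tangent of theta' = theta + p'
        theta, p = standard_map_step(theta, p, K)
        norm = math.hypot(dtheta_new, dp_new)
        acc += math.log(norm)
        dtheta, dp = dtheta_new / norm, dp_new / norm
    return acc / steps

lam = largest_lyapunov(10.0)  # for large K, roughly ln(K/2)
```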
- Published
- 1995