Maximum entropy principle and power-law tailed distributions
- Source: Eur. Phys. J. B 70, 3-13 (2009)
- Publication Year: 2009
Abstract
- In ordinary statistical mechanics the Boltzmann-Shannon entropy is related to the Maxwell-Boltzmann distribution $p_i$ by means of a twofold link. The first link is differential and is provided by the Jaynes Maximum Entropy Principle. The second link is algebraic and imposes that both the entropy and the distribution must be expressed in terms of the same function in direct and inverse form. Indeed, the Maxwell-Boltzmann distribution $p_i$ is expressed in terms of the exponential function, while the Boltzmann-Shannon entropy is defined as the mean value of $-\ln(p_i)$. In generalized statistical mechanics the second link is customarily relaxed. Here we consider the question of whether and how it is possible to select generalized statistical theories in which the above-mentioned twofold link between entropy and the distribution function continues to hold, as in ordinary statistical mechanics. Within this scenario, new pairs of direct-inverse functions emerge, i.e. generalized logarithms $\Lambda(x)$ and generalized exponentials $\Lambda^{-1}(x)$, defining coherent and self-consistent generalized statistical theories. Interestingly, all these theories preserve the main features of ordinary statistical mechanics and predict distribution functions with power-law tails. Furthermore, the obtained generalized entropies are both thermodynamically and Lesche stable.
- Comment: To appear in EPJB. 11 journal pages, 100 references
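The abstract does not pin down a particular generalized logarithm $\Lambda(x)$, so the sketch below uses one well-known pair with the stated properties, the kappa-deformed logarithm $\ln_\kappa(x) = (x^\kappa - x^{-\kappa})/(2\kappa)$ and its exact inverse $\exp_\kappa(x) = (\kappa x + \sqrt{1+\kappa^2 x^2})^{1/\kappa}$, purely as an illustration; the value of KAPPA and the names kappa_log/kappa_exp are choices made here, not notation from the paper. The snippet checks the direct-inverse (algebraic) link numerically and shows that $\exp_\kappa(-\beta E)$ decays as a power law for large $E$, the qualitative behaviour the abstract predicts.

```python
# Illustrative sketch only: the paper characterizes a whole family of admissible
# Lambda(x); the kappa-deformed pair below is just one familiar example, and
# KAPPA, kappa_log, kappa_exp are names chosen here for the illustration.
import numpy as np

KAPPA = 0.3  # deformation parameter; ordinary ln/exp are recovered as KAPPA -> 0


def kappa_log(x, k=KAPPA):
    """Generalized logarithm Lambda(x) = (x**k - x**(-k)) / (2k)."""
    return (x**k - x**(-k)) / (2.0 * k)


def kappa_exp(y, k=KAPPA):
    """Generalized exponential Lambda^{-1}(y), the exact inverse of kappa_log."""
    return (k * y + np.sqrt(1.0 + (k * y) ** 2)) ** (1.0 / k)


# Algebraic link: the entropy is built from Lambda and the distribution from
# Lambda^{-1}, so the two functions must be exact inverses of one another.
x = np.linspace(0.1, 50.0, 500)
assert np.allclose(kappa_exp(kappa_log(x)), x)

# Power-law tail: exp_kappa(-beta*E) ~ (2*KAPPA*beta*E)**(-1/KAPPA) for large E,
# instead of the exponential decay of the Maxwell-Boltzmann factor.
beta = 1.0
E = np.logspace(1, 4, 7)
tail = kappa_exp(-beta * E)
slopes = np.diff(np.log(tail)) / np.diff(np.log(E))
print(slopes)  # approaches -1/KAPPA, about -3.33, confirming the power-law decay
```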
Details
- Database: arXiv
- Journal: Eur. Phys. J. B 70, 3-13 (2009)
- Publication Type: Report
- Accession number: edsarx.0904.4180
- Document Type: Working Paper
- Full Text: https://doi.org/10.1140/epjb/e2009-00161-0