Open Access Journal Article (DOI)

When do generalized entropies apply? How phase space volume determines entropy

TL;DR: In this article, the authors show that generalized entropies can only exist when the dynamically (statistically) relevant fraction of degrees of freedom in the system vanishes in the thermodynamic limit.
Abstract
We show how the dependence of phase space volume Ω(N) on system size N uniquely determines the extensive entropy of a classical system. We give a concise criterion when this entropy is not of Boltzmann-Gibbs type but has to assume a generalized (non-additive) form. We show that generalized entropies can only exist when the dynamically (statistically) relevant fraction of degrees of freedom in the system vanishes in the thermodynamic limit. These are systems where the bulk of the degrees of freedom is frozen and statistically inactive. Systems governed by generalized entropies are therefore systems whose phase space volume effectively collapses to a lower-dimensional "surface". We illustrate these results in three concrete examples: accelerating random walks, a microcanonical spin system on networks and constrained binomial processes. These examples suggest that a wide class of systems with "surface-dominant" statistics might in fact require generalized entropies, including self-organized critical systems such as sandpiles, anomalous diffusion, and systems with topological defects such as vortices, domains, or instantons.
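The abstract states the criterion verbally; as a worked illustration (not taken from the paper, and assuming W(N) = Ω(N) equiprobable microstates, so that S depends on N only through the phase space volume), the extensivity requirement S(N) ∝ N picks out the entropy form from the growth of W(N):

```latex
% Extensivity: S(N) \propto N as N \to \infty, with W(N) equiprobable states.

% Exponential phase space growth: Boltzmann-Gibbs entropy is extensive.
W(N) = \mu^{N}
  \;\Rightarrow\; S_{BG} = \ln W(N) = N \ln \mu \;\propto\; N .

% Sub-exponential (power-law) growth: S_{BG} = \rho \ln N is no longer
% extensive, but a generalized (non-additive) form can be, e.g. the
% q-logarithm with q matched to the growth exponent:
W(N) = N^{\rho}
  \;\Rightarrow\; S_{q} = \ln_{q} W = \frac{W^{1-q} - 1}{1 - q} \;\propto\; N
  \quad \text{for} \quad q = 1 - \frac{1}{\rho} .
```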


Citations
Journal Article (DOI)

Black hole thermodynamical entropy

TL;DR: In this paper, a generalization of the Boltzmann-Gibbs (BG) entropy is introduced under which the Schwarzschild black hole entropy, given by the area law, can be extensive, which can resolve the thermodynamic puzzle.
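For context, the puzzle is that the Bekenstein-Hawking entropy scales with the horizon area rather than the enclosed volume, so the additive BG form cannot be extensive in the bulk degrees of freedom. A brief sketch follows; the non-additive form S_δ below is the Tsallis-Cirto proposal commonly associated with this title and is an assumption here, not a quotation from the paper:

```latex
% Area law: black-hole entropy scales with horizon area, not volume,
S_{BH} = \frac{k A}{4 l_{P}^{2}} \;\propto\; L^{2},
\qquad \text{whereas BG extensivity would require } S \propto V \propto L^{3}.

% Non-additive candidate (Tsallis-Cirto form, assumed):
S_{\delta} = k \sum_{i=1}^{W} p_{i} \left( \ln \frac{1}{p_{i}} \right)^{\delta},
\qquad S_{\delta} = k (\ln W)^{\delta} \;\text{for equal probabilities.}

% With \ln W \propto L^{2}, the choice \delta = 3/2 gives
% S_{\delta} \propto L^{3}, i.e. extensivity in the volume.
```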
Journal Article (DOI)

A Brief Review of Generalized Entropies

TL;DR: This review focuses on the so-called generalized entropies, which from a mathematical point of view are nonnegative functions defined on probability distributions that satisfy the first three Shannon–Khinchin axioms: continuity, maximality and expansibility.
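For reference, the first three Shannon-Khinchin axioms named in the summary are standard; the fourth (additivity) is the one that generalized entropies give up:

```latex
\text{SK1 (continuity):} \quad S(p_1, \dots, p_W) \text{ is continuous in all } p_i .
\text{SK2 (maximality):} \quad S(p_1, \dots, p_W) \le S(\tfrac{1}{W}, \dots, \tfrac{1}{W}) .
\text{SK3 (expansibility):} \quad S(p_1, \dots, p_W, 0) = S(p_1, \dots, p_W) .
\text{SK4 (additivity, dropped by generalized entropies):} \quad S(AB) = S(A) + S(B \mid A) .
```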
Journal Article (DOI)

Understanding scaling through history-dependent processes with collapsing sample space

TL;DR: It is demonstrated that sample-space-reducing (SSR) processes necessarily lead to Zipf's law in the rank distributions of their outcomes, and several applications are discussed that show how SSR processes can be used to understand Zipf's law in word frequencies.
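The SSR mechanism is simple enough to check numerically. A minimal sketch (not the authors' code; the names ssr_run, N, and RUNS are illustrative): a cascade starts at the top state and jumps to a uniformly chosen strictly lower state until it reaches state 1, and the visit frequency of state i comes out as 1/i, i.e. Zipf's law:

```python
import random
from collections import Counter

def ssr_run(n_states):
    """One SSR cascade: start at the top state and jump to a uniformly
    chosen strictly lower state until state 1 is reached."""
    visits = []
    state = n_states
    while state > 1:
        state = random.randint(1, state - 1)  # sample space shrinks each step
        visits.append(state)
    return visits

N, RUNS = 1000, 20000
counts = Counter()
for _ in range(RUNS):
    counts.update(ssr_run(N))

# Zipf check: the visit probability of state i should be close to 1/i.
for i in (1, 2, 5, 10, 100):
    print(i, counts[i] / RUNS)  # expect ~1.0, 0.5, 0.2, 0.1, 0.01
```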
Journal Article (DOI)

Maximum Entropy Principle in Statistical Inference: Case for Non-Shannonian Entropies

TL;DR: It is shown that the Shore-Johnson axioms for the maximum entropy principle in statistical estimation theory account for a considerably wider class of entropic functionals than previously thought.
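For orientation, the Shore-Johnson consistency axioms in question are usually paraphrased as follows (a paraphrase from general knowledge, not the paper's own wording):

```latex
\text{SJ1 (uniqueness):} \quad \text{the inferred distribution is unique.}
\text{SJ2 (invariance):} \quad \text{the inference does not depend on the choice of coordinate system.}
\text{SJ3 (system independence):} \quad \text{independent systems may be treated jointly or separately.}
\text{SJ4 (subset independence):} \quad \text{independent subsets of states may be treated jointly or separately.}
```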
Journal Article (DOI)

Co-Evolutionary Mechanisms of Emotional Bursts in Online Social Dynamics and Networks

TL;DR: It was found that each mechanism, in its own way, leads to a reduced phase space of the emotion components when the collective dynamics takes place, and that a non-additive entropy describes the emotion dynamics.
References
Book

Nonextensive Entropy: Interdisciplinary Applications

TL;DR: A great variety of complex phenomena in many scientific fields exhibit power-law behavior reflecting a hierarchical or fractal structure, and these phenomena seem to be susceptible to description using approaches drawn from thermodynamics or statistical mechanics, particularly approaches involving the maximization of entropy and generalizations of Boltzmann-Gibbs statistical mechanics.
Journal Article (DOI)

Asymptotically scale-invariant occupancy of phase space makes the entropy Sq extensive

TL;DR: It is conjectured that these mechanisms are deeply related to the very frequent emergence, in natural and artificial complex systems, of scale-free structures and to their connections with nonextensive statistical mechanics.
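A quick numerical check of this extensivity statement (a sketch under the equiprobability assumption, with power-law occupancy W(N) = N**rho standing in for the asymptotically scale-invariant case):

```python
from math import log

# Sketch: for power-law phase-space growth W(N) = N**rho (equiprobable
# states), the q-logarithm ln_q W = (W**(1-q) - 1)/(1-q) grows linearly
# in N, i.e. S_q is extensive, exactly when q = 1 - 1/rho; the BG
# entropy ln W does not.
rho = 3.0
q = 1.0 - 1.0 / rho

for N in (10**2, 10**4, 10**6):
    W = float(N) ** rho
    S_BG = log(W)                              # Boltzmann-Gibbs
    S_q = (W ** (1.0 - q) - 1.0) / (1.0 - q)   # Tsallis q-logarithm
    print(N, S_BG / N, S_q / N)  # S_BG/N -> 0, S_q/N -> rho
```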
Journal Article (DOI)

Generalized entropies and the transformation group of superstatistics

TL;DR: In this article, the first three Shannon-Khinchin axioms are assumed to hold, and it is shown that for a given distribution there are two different ways to construct the entropy.
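As background for the summary above: in superstatistics the inverse temperature β fluctuates with some distribution f(β), and the effective Boltzmann factor is its average (the Beck-Cohen construction); the question the paper addresses is which entropy functional is consistent with a stationary distribution of this type:

```latex
B(E) = \int_{0}^{\infty} \! d\beta \; f(\beta) \, e^{-\beta E} .
```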