# The maximum entropy principle: a generalized constraint-based entropy

30 May 2009 - Modern Physics Letters B (World Scientific Publishing Company) - Vol. 23, Iss. 13, pp. 1715-1721

TL;DR: A new generalized constraint-based entropy is derived on the basis of the maximum entropy principle which reduces to the form of Shannon entropy in a limiting case.

Abstract: We have derived a new generalized constraint-based entropy on the basis of the maximum entropy principle. The new entropy, which is very similar to but different from the Havrda-Charvat entropy and the Tsallis entropy, reduces to the form of Shannon entropy in a limiting case. The characteristic properties of this new entropy have been pointed out.

##### References



TL;DR: In this paper, a generalized form of entropy is proposed that recovers Boltzmann-Gibbs statistics in the q→1 limit, and the main properties associated with this entropy are established, particularly those corresponding to the microcanonical and canonical ensembles.

Abstract: With the use of a quantity normally scaled in multifractals, a generalized form is postulated for entropy, namely S_q ≡ k[1 − ∑_{i=1}^{W} p_i^q]/(q − 1), where q ∈ ℝ characterizes the generalization and the p_i are the probabilities associated with the W (microscopic) configurations (W ∈ ℕ). The main properties associated with this entropy are established, particularly those corresponding to the microcanonical and canonical ensembles. The Boltzmann-Gibbs statistics is recovered as the q→1 limit.
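The q→1 limit in this abstract can be checked numerically. The sketch below is illustrative only (the function name and test values are assumptions, not taken from the paper); it evaluates S_q with k = 1 and compares it with the Shannon value as q approaches 1:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """S_q = k * (1 - sum_i p_i**q) / (q - 1), Shannon form in the q -> 1 limit."""
    p = [pi for pi in p if pi > 0]          # 0 * log(0) is taken as 0
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: Boltzmann-Gibbs / Shannon entropy
        return -k * sum(pi * math.log(pi) for pi in p)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
shannon = -sum(pi * math.log(pi) for pi in p)
# S_q approaches the Shannon value as q -> 1
close_to_shannon = abs(tsallis_entropy(p, 1.001) - shannon) < 1e-2
```

For a two-outcome uniform distribution and q = 2, the formula gives S_2 = (1 − 0.5)/(2 − 1) = 0.5, a quick sanity check on the sign and normalization conventions above.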

8,239 citations


Abstract: This paper has been digitized, optimized for electronic delivery and stamped with a digital signature within the project DML-CZ: The Czech Digital Mathematics Library. The Institute of Mathematics of the Academy of Sciences of the Czech Republic provides access to digitized documents strictly for personal use; each copy of any part of this document must contain these Terms of use.

1,009 citations


01 Jan 1992

TL;DR: A state-of-the-art description of the theory and applications of the various entropy optimization principles is given, and the relation between information-theoretic entropy and thermodynamic entropy is recalled in the context of the more general relationship that exists between what are designated as primary and secondary entropies.

Abstract: A state-of-the-art description of the theory and applications of the various entropy optimization principles is given. These principles include Jaynes' maximum entropy principle (MaxEnt), Kullback's minimum cross-entropy principle (MinxEnt), generalized maximum entropy and minimum cross-entropy principles, inverse entropy optimization principles, the minimum interdependence principle, the minimax entropy principle and, finally, the dual entropy optimization principles. The relation between information-theoretic entropy and thermodynamic entropy is specially recalled in the context of the more general relationship that exists between what are designated as primary and secondary entropies.
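As a minimal worked example of the MaxEnt principle named above (the function and numbers are illustrative, not from this reference): maximizing Shannon entropy over a finite set of points subject to a fixed mean yields the exponential (Gibbs) family p_i ∝ exp(−λ x_i), with λ fixed by the mean constraint. A bisection sketch:

```python
import math

def maxent_distribution(x, mean_target, tol=1e-10):
    """MaxEnt with a mean constraint: the entropy-maximizing distribution on
    the points x is p_i proportional to exp(-lam * x_i); solve for lam by
    bisection, using that the constrained mean is decreasing in lam."""
    def mean_for(lam):
        w = [math.exp(-lam * xi) for xi in x]
        z = sum(w)
        return sum(wi * xi for wi, xi in zip(w, x)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_target:
            lo = mid        # mean too large: increase lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * xi) for xi in x]
    z = sum(w)
    return [wi / z for wi in w]

# With the mean constraint equal to the unconstrained average, lam = 0 and
# MaxEnt returns the uniform distribution
p = maxent_distribution([0, 1, 2, 3], 1.5)
```

Tightening the mean below the midpoint (e.g. a target of 1.0) drives λ positive and tilts the distribution toward the smaller points, which is the usual Gibbs-factor behaviour.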

379 citations


TL;DR: It is shown how this generalization, which unifies Renyi and Tsallis entropy in a coherent picture, naturally comes into being when the q-formalism of generalized logarithm and exponential functions is used, and how, together with Sharma-Mittal's measure, another possible extension emerges which, however, does not obey a pseudo-additive law and lacks other properties relevant for a generalized thermostatistics.

Abstract: Tsallis and Renyi entropy measures are two possible different generalizations of the Boltzmann-Gibbs entropy (or Shannon's information) but are not generalizations of each other. It is, however, the Sharma-Mittal measure, which was already defined in 1975 [J. Math. Sci. 10 (1975) 28] and which received attention only recently through applications in statistical mechanics [Physica A 285 (2000) 351, Eur. Phys. J. B 30 (2002) 543], that provides one possible unification. We show how this generalization, which unifies Renyi and Tsallis entropy in a coherent picture, naturally comes into being when the q-formalism of generalized logarithm and exponential functions is used; how, together with Sharma-Mittal's measure, another possible extension emerges which, however, does not obey a pseudo-additive law and lacks other properties relevant for a generalized thermostatistics; and how the relation between all these information measures is best understood when described in terms of a particular logarithmic Kolmogorov-Nagumo average.
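The unification described here can be made concrete. Under one common convention (the symbols q and r and the function name are assumptions, not taken from this abstract), the Sharma-Mittal entropy reduces to the Renyi entropy as r → 1 and to the Tsallis entropy at r = q:

```python
import math

def sharma_mittal(p, q, r, k=1.0):
    """One common convention for the Sharma-Mittal entropy:
       S_{q,r} = k/(1-r) * [ (sum_i p_i**q)**((1-r)/(1-q)) - 1 ].
    r -> 1 gives the Renyi entropy; r = q gives the Tsallis entropy."""
    s = sum(pi ** q for pi in p if pi > 0)
    if abs(r - 1.0) < 1e-12:
        return k * math.log(s) / (1.0 - q)   # Renyi limit
    return k * (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

p = [0.5, 0.5]
# r = q: Tsallis value S_2 = (1 - sum p_i**2)/(2 - 1) = 0.5
# r -> 1: Renyi value S_2 = -ln(sum p_i**2) = ln 2
```

The exponent (1−r)/(1−q) is what couples the two parameters: at r = q it equals 1 and the bracket collapses to the Tsallis form, while the r → 1 limit turns the power into a logarithm, giving Renyi.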

173 citations


TL;DR: By solving a differential-functional equation imposed by the MaxEnt principle, a class of two-parameter deformed logarithms is obtained and the corresponding two-parameter generalized trace-form entropies are constructed.

Abstract: By solving a differential-functional equation imposed by the MaxEnt principle we obtain a class of two-parameter deformed logarithms and construct the corresponding two-parameter generalized trace-form entropies. Generalized distributions follow from these generalized entropies in the same fashion as the Gaussian distribution follows from the Shannon entropy, which is a special limiting case of the family. We determine the region of parameters in which the deformed logarithm preserves the most important properties of the logarithm, and show that important existing generalizations of the entropy are included as special cases in this two-parameter class.
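For illustration only: one well-known two-parameter deformed logarithm, the Kaniadakis-Lissia-Scarfone (κ, r) form (not necessarily the family derived in this reference), shows the typical pattern of recovering the ordinary logarithm in a limiting corner of parameter space:

```python
import math

def deformed_log(x, kappa, r):
    """ln_{kappa,r}(x) = x**r * (x**kappa - x**(-kappa)) / (2*kappa).
    As kappa -> 0 and r -> 0 this reduces to the ordinary logarithm."""
    if kappa == 0.0:
        return x ** r * math.log(x)   # kappa -> 0 limit at fixed r
    return x ** r * (x ** kappa - x ** (-kappa)) / (2.0 * kappa)

# Small kappa with r = 0 is already very close to ln(2)
near_ln2 = abs(deformed_log(2.0, 1e-6, 0.0) - math.log(2.0)) < 1e-9
```

Replacing log with such a deformed logarithm inside a trace-form entropy ∑_i p_i ln_{κ,r}(1/p_i) is the general recipe by which generalized entropies of this kind extend the Shannon case.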

83 citations