Topic
Maximum entropy thermodynamics
About: Maximum entropy thermodynamics is a research topic. Over its lifetime, 2,694 publications have been published on this topic, receiving 95,455 citations.
[Chart: papers published on a yearly basis]
Papers
TL;DR: In this article, the authors consider statistical mechanics as a form of statistical inference rather than as a physical theory, and show that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle.
Abstract: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.
12,099 citations
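As a minimal numerical sketch of the prescription described in this abstract: maximizing the Shannon entropy subject to a constraint on an expected value yields a Gibbs distribution p_i proportional to exp(-lambda*E_i), with the multiplier lambda fixed through the partition function Z(lambda) = sum_i exp(-lambda*E_i). The energy levels and the target mean energy below are hypothetical values chosen only for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical discrete energy levels and a target mean energy
# (illustrative values; any target consistent with the levels would do).
E = np.array([0.0, 1.0, 2.0, 3.0])
E_mean_target = 1.2

def mean_energy(lam):
    """Mean energy under the Gibbs distribution p_i ∝ exp(-lam * E_i)."""
    w = np.exp(-lam * E)
    Z = w.sum()                      # partition function
    return (w * E).sum() / Z

# Solve <E>(lambda) = E_mean_target for the Lagrange multiplier.
lam = brentq(lambda l: mean_energy(l) - E_mean_target, -50.0, 50.0)

p = np.exp(-lam * E)
p /= p.sum()
entropy = -(p * np.log(p)).sum()
print(f"lambda = {lam:.4f}, p = {np.round(p, 4)}, S = {entropy:.4f}")
```

Solving for lambda with a scalar root-finder rather than a general constrained optimizer works because the exponential-family form already builds in normalization and non-negativity; only the single constraint on the mean remains to be satisfied.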
TL;DR: In this paper, the concept of black-hole entropy was introduced as a measure of the information about a black-hole interior that is inaccessible to an exterior observer, and it was argued that this entropy is equal to the ratio of the black-hole area to the square of the Planck length, times a dimensionless constant of order unity.
Abstract: There are a number of similarities between black-hole physics and thermodynamics. Most striking is the similarity in the behaviors of black-hole area and of entropy: Both quantities tend to increase irreversibly. In this paper we make this similarity the basis of a thermodynamic approach to black-hole physics. After a brief review of the elements of the theory of information, we discuss black-hole physics from the point of view of information theory. We show that it is natural to introduce the concept of black-hole entropy as the measure of information about a black-hole interior which is inaccessible to an exterior observer. Considerations of simplicity and consistency, and dimensional arguments indicate that the black-hole entropy is equal to the ratio of the black-hole area to the square of the Planck length times a dimensionless constant of order unity. A different approach making use of the specific properties of Kerr black holes and of concepts from information theory leads to the same conclusion, and suggests a definite value for the constant. The physical content of the concept of black-hole entropy derives from the following generalized version of the second law: When common entropy goes down a black hole, the common entropy in the black-hole exterior plus the black-hole entropy never decreases. The validity of this version of the second law is supported by an argument from information theory as well as by several examples.
6,591 citations
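As a rough illustration of the scale this sets, the sketch below evaluates S = k_B * A / (4 * l_P^2) for a Schwarzschild black hole of one solar mass. The 1/4 coefficient is the value later fixed by Hawking's calculation; this paper argues only that the constant is of order unity, so treat the prefactor as an assumption of the example.

```python
import math

# Physical constants (SI units).
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8        # speed of light, m s^-1
hbar  = 1.055e-34      # reduced Planck constant, J s
k_B   = 1.381e-23      # Boltzmann constant, J K^-1
M_sun = 1.989e30       # solar mass, kg

def bh_entropy(M):
    """Bekenstein-Hawking entropy S = k_B * A / (4 * l_P^2) for a
    Schwarzschild black hole of mass M (1/4 coefficient assumed)."""
    r_s = 2 * G * M / c**2            # Schwarzschild radius
    A = 4 * math.pi * r_s**2          # horizon area
    l_P2 = hbar * G / c**3            # Planck length squared
    return k_B * A / (4 * l_P2)

print(f"S(1 solar mass) ~ {bh_entropy(M_sun):.3e} J/K")
```

The result is of order 10^54 J/K, vastly larger than the ordinary thermodynamic entropy of a comparable amount of ordinary matter, which is what makes the generalized second law stated in the abstract a nontrivial constraint.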
TL;DR: In this paper, a renormalized entropy is defined as the entropy relative to the ground state, made quantitative for states of a conformal quantum field theory excited by a moving mirror, and it is shown that the microscopic entropy can diverge for sharply localized states.
Abstract: In statistical physics, useful notions of entropy are defined with respect to some coarse-graining procedure over a microscopic model. Here we consider some special problems that arise when the microscopic model is taken to be relativistic quantum field theory. These problems are associated with the existence of an infinite number of degrees of freedom per unit volume. Because of these the microscopic entropy can, and typically does, diverge for sharply localized states. However, the difference in the entropy between two such states is better behaved, and for most purposes it is the useful quantity to consider. In particular, a renormalized entropy can be defined as the entropy relative to the ground state. We make these remarks quantitative and precise in a simple model situation: the states of a conformal quantum field theory excited by a moving mirror. From this work, we attempt to draw some lessons concerning the “information problem” in black hole physics.
1,798 citations
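The abstract's key qualitative point is that individual entropies diverge as the short-distance cutoff is removed while their difference stays finite; a toy calculation can mimic this. The logarithmic form S(eps) ~ (c/3) * ln(L/eps) and the numerical values below are assumptions made only to exhibit the cutoff dependence, not the formulas derived in the paper.

```python
import numpy as np

# Illustrative 1+1d CFT-style scaling S(eps) ~ (c/3) * ln(L / eps); a
# stand-in used only to show the cutoff dependence, not the paper's
# actual moving-mirror results.
c_central = 1.0                   # central charge (assumed value)
L_ground, L_excited = 1.0, 2.5    # hypothetical effective lengths

def entropy(L, eps):
    return (c_central / 3.0) * np.log(L / eps)

for eps in [1e-2, 1e-4, 1e-8]:
    S0, S1 = entropy(L_ground, eps), entropy(L_excited, eps)
    print(f"eps={eps:.0e}:  S0={S0:8.3f}  S1={S1:8.3f}  "
          f"renormalized S1-S0={S1 - S0:.3f}")
```

Both entropies grow without bound as eps goes to zero, while the "renormalized" difference is pinned at (c/3) * ln(L_excited / L_ground) regardless of the cutoff, which is the sense in which entropy differences are better behaved than the entropies themselves.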
TL;DR: Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to be uniquely correct methods for inductive inference when new information is given in the form of expected values.
Abstract: Jaynes's principle of maximum entropy and Kullback's principle of minimum cross-entropy (minimum directed divergence) are shown to be uniquely correct methods for inductive inference when new information is given in the form of expected values. Previous justifications use intuitive arguments and rely on the properties of entropy and cross-entropy as information measures. The approach here assumes that reasonable methods of inductive inference should lead to consistent results when there are different ways of taking the same information into account (for example, in different coordinate systems). This requirement is formalized as four consistency axioms. These are stated in terms of an abstract information operator and make no reference to information measures. It is proved that the principle of maximum entropy is correct in the following sense: maximizing any function but entropy will lead to inconsistency unless that function and entropy have identical maxima. In other words, given information in the form of constraints on expected values, there is only one distribution satisfying the constraints that can be chosen by a procedure that satisfies the consistency axioms; this unique distribution can be obtained by maximizing entropy. This result is established both directly and as a special case (uniform priors) of an analogous result for the principle of minimum cross-entropy. Results are obtained both for continuous probability densities and for discrete distributions.
1,774 citations
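The setting the theorem addresses can be sketched concretely: given a prior distribution and a constraint on an expected value, the minimum cross-entropy posterior has the exponential-family form q_i proportional to p_i * exp(-lambda*f_i), and with a uniform prior this reduces to the maximum-entropy distribution. The prior, feature values, and target expectation below are hypothetical, chosen only for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical setup: a prior over four outcomes, a feature f, and a
# constraint on its expected value (all values chosen for illustration).
prior = np.array([0.4, 0.3, 0.2, 0.1])
f = np.array([0.0, 1.0, 2.0, 3.0])
f_target = 1.5

def posterior(lam):
    """Minimum cross-entropy posterior q_i ∝ prior_i * exp(-lam * f_i)."""
    q = prior * np.exp(-lam * f)
    return q / q.sum()

# Fix the Lagrange multiplier so the constraint <f>_q = f_target holds.
lam = brentq(lambda l: posterior(l) @ f - f_target, -50.0, 50.0)
q = posterior(lam)

kl = np.sum(q * np.log(q / prior))   # directed divergence being minimized
print(f"lambda = {lam:.4f}, q = {np.round(q, 4)}, KL(q||prior) = {kl:.4f}")
```

Replacing prior with a uniform vector reproduces the ordinary maximum-entropy solution, which is the uniform-prior special case mentioned in the abstract.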