
Showing papers in "Entropy in 2001"


Journal ArticleDOI
21 Aug 2001-Entropy
TL;DR: In this article, the authors provide background for a better understanding of the concepts of energy, entropy and exergy and their differences, with coverage spanning various classes of life support systems.
Abstract: Energy, entropy and exergy concepts come from thermodynamics and are applicable to all fields of science and engineering. Therefore, this article intends to provide background for better understanding of these concepts and their differences among various classes of life support systems with a diverse coverage. It also covers the basic principles, general definitions and practical applications and implications. Some illustrative examples are presented to highlight the importance of the aspects of energy, entropy and exergy and their roles in thermal engineering.
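The distinction between energy and exergy can be made concrete with the standard specific flow-exergy relation Ex = (h − h0) − T0(s − s0), where the subscript 0 denotes the dead (reference) state. A minimal sketch in Python; the steam-table numbers below are illustrative assumptions, not values from the article:

```python
def specific_flow_exergy(h, s, h0, s0, T0):
    """Specific flow exergy (kJ/kg): the maximum useful work obtainable
    from a stream, relative to a dead state at temperature T0 (K)."""
    return (h - h0) - T0 * (s - s0)

# Superheated steam versus a 298.15 K dead state (illustrative numbers):
# much of the stream's 2695 kJ/kg of enthalpy is unavailable as work.
ex = specific_flow_exergy(h=2800.0, s=6.8, h0=104.9, s0=0.367, T0=298.15)
```

At the dead state itself the exergy is zero by construction, which is the sense in which exergy, unlike energy, measures work potential rather than a conserved quantity.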

435 citations


Journal ArticleDOI
30 Sep 2001-Entropy
TL;DR: The present paper offers a self-contained and comprehensive treatment of fundamentals of both principles, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss.
Abstract: In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature with hundreds of applications pertaining to several different fields and will also here serve as important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss.
These results have tempted us to speculate over the development of natural languages. In fact, we are able to relate our theoretical findings to the empirically found Zipf's law which involves statistical aspects of words in a language. The apparent irregularity inherent in models with entropy loss turns out to imply desirable stability properties of languages.
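Zipf's law, mentioned above, says the frequency of the r-th most common word falls off roughly as 1/r, so log-frequency against log-rank is linear with slope near −1. A small self-contained check of that slope (an illustration on idealized counts, not the authors' method):

```python
import math

def zipf_slope(frequencies):
    """Least-squares slope of log-frequency vs. log-rank.
    Zipf's law predicts a slope near -1."""
    freqs = sorted(frequencies, reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Idealized Zipfian counts f_r = 1000 / r for ranks 1..50.
ideal = [1000.0 / r for r in range(1, 51)]
```

Real corpora deviate from the ideal slope, and it is exactly such deviations that the entropy-loss discussion concerns.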

149 citations


Journal ArticleDOI
30 Sep 2001-Entropy
TL;DR: Basic concepts and results of that part of Information Theory which is often referred to as "Shannon Theory" are discussed with focus mainly on the discrete case.
Abstract: Basic concepts and results of that part of Information Theory which is often referred to as "Shannon Theory" are discussed with focus mainly on the discrete case. The paper is expository with some new proofs and extensions of results and concepts.
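One of the basic identities of Shannon theory in the discrete case is the chain rule H(X,Y) = H(X) + H(Y|X). A quick numerical check on an illustrative joint distribution of my own choosing:

```python
import math

def H(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A joint distribution on {0,1} x {0,1}, row-major: (0,0),(0,1),(1,0),(1,1).
joint = [0.5, 0.25, 0.125, 0.125]
px = [0.75, 0.25]  # marginal of X
# Conditionals: P(Y|X=0) = (2/3, 1/3), P(Y|X=1) = (1/2, 1/2).
h_joint = H(joint)
h_chain = H(px) + 0.75 * H([2 / 3, 1 / 3]) + 0.25 * H([0.5, 0.5])
```

Here H(X,Y) = 1.75 bits, and the chain-rule decomposition reproduces it exactly.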

63 citations


Journal ArticleDOI
Jarle Breivik1
20 Nov 2001-Entropy
TL;DR: A system of ferromagnetic objects that self-organize into template-replicating polymers due to environmental fluctuations in temperature is presented, demonstrating the fundamental link between thermodynamics, information theory, and life science in an unprecedented manner.
Abstract: Living systems imply self-reproducing constructs capable of Darwinian evolution. How such dynamics can arise from undirected interactions between simple monomeric objects remains an open question. Here we circumvent difficulties related to the manipulation of chemical interactions, and present a system of ferromagnetic objects that self-organize into template-replicating polymers due to environmental fluctuations in temperature. Initially random sequences of monomers direct the formation of complementary sequences, and structural information is inherited from one structure to another. Selective replication of sequences occurs in dynamic interaction with the environment, and the system demonstrates the fundamental link between thermodynamics, information theory, and life science in an unprecedented manner.

53 citations


Journal ArticleDOI
01 Feb 2001-Entropy
TL;DR: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed, and practical relativistic considerations suggest a possible proper reference density.
Abstract: Certain aspects of the history, derivation, and physical application of the information-theoretic entropy concept are discussed. Pre-dating Shannon, the concept is traced back to Pauli. A derivation from first principles is given, without use of approximations. The concept depends on the underlying degree of randomness. In physical applications, this translates to dependence on the experimental apparatus available. An example illustrates how this dependence affects Prigogine's proposal for the use of the Second Law of Thermodynamics as a selection principle for the breaking of time symmetry. The dependence also serves to yield a resolution of the so-called "Gibbs Paradox." Extension of the concept from the discrete to the continuous case is discussed. The usual extension is shown to be dimensionally incorrect. Correction introduces a reference density, leading to the concept of Kullback entropy. Practical relativistic considerations suggest a possible proper reference density.
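The dimensional point can be checked numerically: differential entropy shifts by log c under a change of units x → cx, while a Kullback (relative) entropy against a reference density is unit-invariant. A sketch for Gaussians using the standard closed forms (my example, not the paper's):

```python
import math

def diff_entropy_gauss(s):
    """Differential entropy of N(mu, s^2) in nats: depends on the unit of x."""
    return 0.5 * math.log(2 * math.pi * math.e * s * s)

def kl_gauss(m1, s1, m2, s2):
    """Kullback (relative) entropy KL(N(m1,s1^2) || N(m2,s2^2)) in nats."""
    return math.log(s2 / s1) + (s1 * s1 + (m1 - m2) ** 2) / (2 * s2 * s2) - 0.5

c = 1000.0  # change of units, e.g. metres -> millimetres
shift = diff_entropy_gauss(c * 2.0) - diff_entropy_gauss(2.0)  # = log(c), unit-dependent
kl_diff = kl_gauss(c * 1.0, c * 2.0, c * 0.5, c * 3.0) - kl_gauss(1.0, 2.0, 0.5, 3.0)  # = 0
```

The relative entropy is unchanged because numerator and denominator densities rescale together, which is precisely why a reference density repairs the dimensional defect.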

30 citations


Journal ArticleDOI
22 Jun 2001-Entropy
TL;DR: It is shown that writing the two-particle interactions for unlike particles allows an examination in two limiting cases, large and small separations, whose limiting motions are either all motion ABOUT the center of mass or all motion OF the center of mass.
Abstract: It is shown that the classical laws of thermodynamics require that mechanical systems must exhibit energy that becomes unavailable to do useful work. In thermodynamics, this type of energy is called entropy. It is further shown that these laws require two metrical manifolds, equations of motion, field equations, and Weyl's quantum principles. Weyl's quantum principle requires quantization of the electrostatic potential of a particle and that this potential be non-singular. The interactions of particles through these non-singular electrostatic potentials are analyzed in the low-velocity limit and in the relativistic limit. It is shown that writing the two-particle interactions for unlike particles allows an examination in two limiting cases: large and small separations. These limits are shown to have the limiting motions of: all motions are ABOUT the center of mass, or all motion is OF the center of mass. The first limit leads to the standard Dirac equation. The second limit is shown to have equations of which the electroweak theory is a subset. An extension of the gauge principle into a five-dimensional manifold, then restricting the generality of the five-dimensional manifold by using the conservation principle, shows that the four-dimensional hypersurface that is embedded within the 5-D manifold is required to obey Einstein's field equations. The 5-D gravitational quantum equations of the solar system are presented.

26 citations


Journal ArticleDOI
21 Nov 2001-Entropy
TL;DR: As a generalization of Shannon's information theory, a formalism based on a non-logarithmic information content parametrized by a real number q is reviewed, which exhibits nonadditivity of the associated uncertainty; establishing the concept of mutual information is shown to be important for the generalization.
Abstract: Takuya Yamano, Institut für Theoretische Physik, Universität zu Köln, Zülpicher Str. 77, D-50937 Köln, Euroland; Department of Applied Physics, Faculty of Science, Tokyo Institute of Technology, Oh-okayama, Meguro-ku, Tokyo 152-8551, Japan. E-mail: tyamano@mikan.ap.titech.ac.jp
Received: 7 August 2001 / Accepted: 31 October 2001 / Published: 21 November 2001
As a possible generalization of Shannon's information theory, we review the formalism based on the non-logarithmic information content parametrized by a real number q, which exhibits nonadditivity of the associated uncertainty. Moreover, it is shown that the establishment of the concept of the mutual information is of importance upon the generalization.
Keywords: Information theory; Tsallis entropy; Nonadditivity; Source coding theorem; Mutual information.
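The nonadditivity in question is the Tsallis pseudo-additivity: for independent systems A and B, S_q(A,B) = S_q(A) + S_q(B) + (1 − q) S_q(A) S_q(B). A minimal numerical check (the example distributions are my own):

```python
def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon as q -> 1."""
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

# Two independent systems and their product (joint) distribution.
pa = [0.7, 0.3]
pb = [0.5, 0.25, 0.25]
q = 2.0
joint = [a * b for a in pa for b in pb]

sa, sb = tsallis(pa, q), tsallis(pb, q)
pseudo_additive = sa + sb + (1 - q) * sa * sb  # equals tsallis(joint, q)
```

For q ≠ 1 the cross term (1 − q) S_q(A) S_q(B) is nonzero, which is exactly the sense in which the associated uncertainty fails to be additive.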

24 citations


Journal ArticleDOI
08 Apr 2001-Entropy
TL;DR: There are several mystifications and a couple of mysteries pertinent to MaxEnt; after discussing the mystifications, a new formulation of Jaynes' die problem is proposed, and an answer to the recurring question 'Just what are we accomplishing when we maximize entropy?' is recalled.
Abstract: There are several mystifications and a couple of mysteries pertinent to MaxEnt. The mystifications, pitfalls and traps are set up mainly by an unfortunate formulation of Jaynes' die problem, the cause celebre of MaxEnt. After discussing the mystifications a new formulation of the problem is proposed. Then we turn to the mysteries. An answer to the recurring question 'Just what are we accomplishing when we maximize entropy?' [8], based on MaxProb rationale of MaxEnt [6], is recalled. A brief view on the other mystery: 'What is the relation between MaxEnt and the Bayesian method?' [9], in light of the MaxProb rationale of MaxEnt suggests that there is not and cannot be a conflict between MaxEnt and Bayes Theorem.
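Jaynes' die problem asks for the distribution on faces 1–6 that maximizes entropy subject to a prescribed mean (classically 4.5 rather than the fair 3.5); the maximizer is the exponential family p_i ∝ exp(−λi) with λ fixed by the constraint. A minimal solver by bisection, offered as a generic sketch rather than the paper's reformulation:

```python
import math

def maxent_die(target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy distribution on faces 1..6 with a given mean:
    p_i proportional to exp(-lam * i), lam found by bisection."""
    faces = range(1, 7)

    def mean(lam):
        w = [math.exp(-lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    # mean(lam) is strictly decreasing, so bisect on lam.
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if mean(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: the average observed roll is 4.5 instead of 3.5.
p = maxent_die(4.5)
```

With a mean above 3.5 the solution tilts toward the high faces, the familiar geometric staircase of MaxEnt solutions.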

18 citations


Journal ArticleDOI
10 Oct 2001-Entropy
TL;DR: This paper uses mathematical models to analyze the major transitions in language evolution: the evolution of coordinated associations between signals and objects in a population, the population dynamics of words, and the adaptive emergence of syntax.
Abstract: Language is the most important evolutionary invention of the last few million years. How human language evolved from animal communication is a challenging question for evolutionary biology. In this paper we use mathematical models to analyze the major transitions in language evolution. We begin by discussing the evolution of coordinated associations between signals and objects in a population. We then analyze word-formation and its relationship to Shannon's noisy coding theorem. Finally, we model the population dynamics of words and the adaptive emergence of syntax.
Keywords: Language evolution; evolutionary game theory; Shannon's noisy coding theorem; phoneme.
* The authors gratefully acknowledge support from the Alfred P. Sloan Foundation, The Ambrose Monell Foundation, The Florence Gould Foundation, and the J. Seward Johnson Trust. J.B.P. also acknowledges support from the National Science Foundation and the Burroughs Wellcome Fund.
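The role Shannon's noisy coding theorem plays in such models is easiest to see on the simplest channel: a binary symmetric channel with crossover probability p has capacity C = 1 − H2(p), the supremum of reliably achievable rates. A quick sketch (a textbook illustration, not the authors' word-formation model):

```python
import math

def h2(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)
```

A noiseless channel (p = 0) carries one bit per use, a coin-flip channel (p = 0.5) carries none, and around p ≈ 0.11 the capacity has already dropped to half a bit.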

18 citations


Journal ArticleDOI
30 Sep 2001-Entropy
TL;DR: An extension of the concept of Hellinger process applicable to entropy distance and f-divergence distances is proposed, leading to a new approach to Merton's optimal portfolio problem and its dual in general Lévy markets.
Abstract: This paper illustrates the natural role that Hellinger processes can play in solving problems from finance. We propose an extension of the concept of Hellinger process applicable to entropy distance and f-divergence distances, where f is a convex logarithmic function or a convex power function with general order q, 0 ≠ q < 1. These concepts lead to a new approach to Merton's optimal portfolio problem and its dual in general Lévy markets.
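An f-divergence between discrete laws P and Q is D_f(P‖Q) = Σ q_i f(p_i/q_i) for convex f with f(1) = 0; relative entropy and squared Hellinger distance are the cases f(t) = t log t and f(t) = (√t − 1)². A small discrete sketch (my own example, far simpler than the Hellinger processes of the paper):

```python
import math

def f_divergence(p, q, f):
    """Discrete f-divergence D_f(P||Q) = sum q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

p = [0.5, 0.3, 0.2]
q = [0.25, 0.25, 0.5]
# Squared Hellinger distance: f(t) = (sqrt(t) - 1)^2, equal to sum (sqrt(p)-sqrt(q))^2.
hell2 = f_divergence(p, q, lambda t: (math.sqrt(t) - 1) ** 2)
# Relative entropy (in nats): f(t) = t * log(t).
kl = f_divergence(p, q, lambda t: t * math.log(t))
```

Expanding q(√(p/q) − 1)² = p − 2√(pq) + q shows why the Hellinger case collapses to the familiar sum of squared root differences.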

11 citations


Journal ArticleDOI
20 Nov 2001-Entropy
TL;DR: It is found that data defining the acknowledged major changes in the evolution of earth, the life on it, and cultural and technological growth, conform to a robust, phenomenological model for the growth of a system parameter.
Abstract: Based on a robust, phenomenological model for the growth of a system parameter, a relation is derived to test the evolution of such a parameter through several distinct stages. It is found that data defining the acknowledged major changes in the evolution of earth, the life on it, and cultural and technological growth, conform to this model. The nature of these altering events indicates that information is the parameter involved, suggesting an unrecognized behavior in the Second Law of Thermodynamics.

Journal ArticleDOI
20 Jun 2001-Entropy
TL;DR: It is argued that the kinematical holographic principle can be derived from the dynamical holographic principle, which states that the dynamics of a system in a region should be described by a system living on the boundary of that region; the latter can be valid in general relativity because the ADM Hamiltonian reduces to a surface term.
Abstract: A simple derivation of the bound on entropy is given and the holographic principle is discussed. We estimate the number of quantum states inside a space region on the basis of the uncertainty relation. The result is compared with the Bekenstein formula for the entropy bound, which was initially derived from the generalized second law of thermodynamics for black holes. The holographic principle states that the entropy inside a region is bounded by the area of the boundary of that region. This principle can be called the kinematical holographic principle. We argue that it can be derived from the dynamical holographic principle, which states that the dynamics of a system in a region should be described by a system which lives on the boundary of the region. This last principle can be valid in general relativity because the ADM Hamiltonian reduces to a surface term.
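For reference, the two bounds being compared can be written side by side in their standard textbook forms (quoted from the general literature, not reproduced from this paper): the Bekenstein bound for a system of energy E confined to a region of radius R, and the holographic area bound on the boundary area A:

```latex
S \;\le\; \frac{2\pi k_B E R}{\hbar c} \quad \text{(Bekenstein)},
\qquad
S \;\le\; \frac{k_B A}{4\,\ell_P^{2}}, \quad \ell_P^{2} = \frac{G\hbar}{c^{3}} \quad \text{(holographic)}.
```

The first is kinematical in the paper's terminology; the second is the area form whose derivation from boundary dynamics the abstract argues for.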

Journal ArticleDOI
26 Mar 2001-Entropy
TL;DR: An elementary derivation of the Black Hole Entropy area relation in any dimension is provided, based on the New Extended Scale Relativity Principle and Shannon's Information Entropy.
Abstract: Carlos Castro, Center for Theoretical Studies of Physical Systems, Clark Atlanta University, Atlanta, GA 30314. E-mail: castro@ctsps.cau.edu
Received: 30 April 2000 / Accepted: 22 November 2000 / Published: 26 March 2001
An elementary derivation of the Black Hole Entropy area relation in any dimension is provided, based on the New Extended Scale Relativity Principle and Shannon's Information Entropy. The well-known entropy-area linear Bekenstein-Hawking relation is derived. We discuss briefly how to derive the most recently obtained logarithmic and higher-order corrections to the linear entropy-area law, in full agreement with the standard results in the literature.
Keywords: Entropy; New Relativity; Clifford Oscillator; p-branes; Black Holes.

Journal ArticleDOI
01 Dec 2001-Entropy
TL;DR: This paper juxtaposes the elements of Newell's conceptual basis with those of an alternative conceptual framework for the study of complex intelligent systems.
Abstract: The cardinality of the class, C, of complex intelligent systems, i.e., systems of intelligent systems and their resources, is steadily increasing. Such an increase, whether designed or not, sometimes changes significantly and fundamentally the structure of C. Recently, the study of members of C and its structure comes under a variety of multidisciplinary headings, the most prominent of which include General Systems Theory, Complexity Science, Artificial Life, and Cybernetics. Their common characteristic is the quest for a unified theory of a certain class of systems, like a living system or an organisation. So far, the only candidate for a general theory of intelligent systems is Newell's Soar. To my knowledge there is presently no candidate theory of C except Newell's claimed extensibility of Soar. This paper juxtaposes the elements of Newell's conceptual basis with those of an alternative conceptual framework based on the thesis that communication [...]
* This is a revised version of a paper of the same title delivered at the 17

Journal ArticleDOI
11 Apr 2001-Entropy
TL;DR: Constraints at the level of ingredients are characterized as a counterpart to the emergence of collective behavior (percolation) in very simple CA simulations, and the expected, strong direct influences of the initial conditions on the configuration and movement of occupied cells are quantified.
Abstract: In a recent study by Kier, Cheng and Testa, simulations were carried out to monitor and quantify the emergence of a collective phenomenon, namely percolation, in a many-particle system modeled by cellular automata (CA). In the present study, the same setup was used to monitor the counterpart to collective behavior, namely the behavior of individual particles, as modeled by occupied cells in the CA simulations. As in the previous study, the input variables were the concentration of occupied cells and their joining and breaking probabilities. The first monitored attribute was the valence configuration (state) of the occupied cells, namely the percent of occupied cells in configuration Fi (%Fi), where i = number of occupied cells joined to that cell. The second monitored attribute was a functional one, namely the probability (in %) of an occupied cell in configuration Fi to move during one iteration (%Mi). First, this study succeeded in quantifying the expected, strong direct influences of the initial conditions on the configuration and movement of occupied cells. Statistical analyses unveiled correlations between initial conditions and cell configurations and movements. In particular, the distribution of configurations (%Fi) varied with concentration with a kinematic-like regularity amenable to mathematical modeling. However, another result also emerged from the work: the joining, breaking and concentration factors not only influenced the movement of occupied cells, they also modified each other's influence (Figure 1). These indirect influences have been demonstrated quite clearly, and some partial statistical descriptions were established. Thus, constraints at the level of ingredients (dissolvence) have been characterized as a counterpart to the emergence of a collective behavior (percolation) in very simple CA simulations.
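The percolation phenomenon monitored in such grids can be sketched with a generic site-percolation check: does a top-to-bottom path of occupied cells exist? This is a simplified stand-in for the Kier-Cheng-Testa CA setup (which has joining/breaking dynamics), not a reproduction of it:

```python
import random
from collections import deque

def percolates(grid):
    """True if occupied cells connect the top row to the bottom row
    through 4-neighbour adjacency (breadth-first search)."""
    n = len(grid)
    dq = deque((0, j) for j in range(n) if grid[0][j])
    seen = set(dq)
    while dq:
        i, j = dq.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and grid[a][b] and (a, b) not in seen:
                seen.add((a, b))
                dq.append((a, b))
    return False

# A random grid at occupation probability 0.7, above the ~0.593 site threshold,
# will usually (not always) percolate.
random.seed(0)
n, conc = 20, 0.7
grid = [[random.random() < conc for _ in range(n)] for _ in range(n)]
```

Sweeping the concentration and recording how often `percolates` returns True reproduces the sharp emergence of the collective behavior the abstract refers to.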

Journal ArticleDOI
20 Dec 2001-Entropy
TL;DR: Property of the Iα-divergence between the law of the solution Xt and the corresponding drift-less measure is studied and will be applied to some context in statistical information theory as well as to arbitrage theory and contingent claim valuation.
Abstract: We consider asset price processes Xt which are weak solutions of one-dimensional stochastic differential equations of the form (equation (2)) Such price models can be interpreted as non-lognormally-distributed generalizations of the geometric Brownian motion. We study properties of the Iα-divergence between the law of the solution Xt and the corresponding drift-less measure (the special case α=1 is the relative entropy). This will be applied to some context in statistical information theory as well as to arbitrage theory and contingent claim valuation. For instance, the seminal option pricing theorems of Black-Scholes and Merton appear as a special case.
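The special case α = 1 (relative entropy) has a well-known closed form for constant drift via Girsanov's theorem: for an arithmetic Brownian motion against its driftless counterpart, D(P‖Q) = μ²T/(2σ²). A Monte Carlo sketch of that identity (a deliberate simplification of the paper's general diffusions; the parameter values are mine):

```python
import math
import random

def llr_sample(mu, sigma, T, z):
    """Log dP/dQ for X_T = mu*T + sigma*sqrt(T)*z under the drifted measure P,
    against the driftless measure Q (Girsanov), with X_0 = 0."""
    x_T = mu * T + sigma * math.sqrt(T) * z
    return (mu / sigma ** 2) * x_T - mu ** 2 * T / (2 * sigma ** 2)

mu, sigma, T = 0.1, 0.2, 1.0
closed_form = mu ** 2 * T / (2 * sigma ** 2)  # relative entropy D(P || Q)

# Sample average of the log-likelihood ratio under P estimates D(P || Q).
random.seed(1)
n = 200_000
mc = sum(llr_sample(mu, sigma, T, random.gauss(0.0, 1.0)) for _ in range(n)) / n
```

The Monte Carlo average converges to the closed form, illustrating how drift removal is "priced" in relative-entropy terms.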

Journal ArticleDOI
20 Dec 2001-Entropy
TL;DR: The author recounts lectures entitled "Ugly Symmetry" given at several conferences and many institutes during the past six years.
Abstract: I gave lectures entitled "Ugly Symmetry" at several conferences and many institutes during the past six years.[...]