
Showing papers in "Entropy in 2009"


Journal ArticleDOI
16 Nov 2009-Entropy
TL;DR: Further advances are needed to better define model thresholds, to test model significance, and to address model selection to strengthen the utility of Maxent for wildlife research and management.
Abstract: Maximum entropy (Maxent) modeling has great potential for identifying distributions and habitat selection of wildlife given its reliance on only presence locations. Recent studies indicate Maxent is relatively insensitive to spatial errors associated with location data, requires few locations to construct useful models, and performs better than other presence-only modeling approaches. Further advances are needed to better define model thresholds, to test model significance, and to address model selection. Additionally, development of modeling approaches is needed when using repeated sampling of known individuals to assess habitat selection. These advancements would strengthen the utility of Maxent for wildlife research and management.

595 citations


Journal ArticleDOI
27 Nov 2009-Entropy
TL;DR: It is tentatively suggested that MEP is equivalent to Jaynes’ Maximum Entropy (MaxEnt) inference algorithm that passively translates physical assumptions into macroscopic predictions, as applied to non-equilibrium systems.
Abstract: Is Maximum Entropy Production (MEP) a physical principle? In this paper I tentatively suggest it is not, on the basis that MEP is equivalent to Jaynes’ Maximum Entropy (MaxEnt) inference algorithm that passively translates physical assumptions into macroscopic predictions, as applied to non-equilibrium systems. MaxEnt itself has no physical content; disagreement between MaxEnt predictions and experiment falsifies the physical assumptions, not MaxEnt. While it remains to be shown rigorously that MEP is indeed equivalent to MaxEnt for systems arbitrarily far from equilibrium, work in progress tentatively supports this conclusion. In terms of its role within non-equilibrium statistical mechanics, MEP might then be better understood as Messenger of Essential Physics.

112 citations


Journal ArticleDOI
06 Nov 2009-Entropy
TL;DR: Exergy is concluded to have a significant role in assessing and improving the efficiencies of electrical power technologies and systems, and provides a useful tool for engineers and scientists as well as decision and policy makers.
Abstract: The benefits are demonstrated of using exergy to understand the efficiencies of electrical power technologies and to assist improvements. Although exergy applications in power systems and electrical technology are uncommon, exergy nevertheless clearly identifies potential reductions in thermodynamic losses and efficiency improvements. Various devices are considered, ranging from simple electrical devices, to generation systems for electrical power and for multiple products including electricity, and on to electrically driven devices. The insights provided by exergy are shown to be more useful than those provided by energy, which are sometimes misleading. Exergy is concluded to have a significant role in assessing and improving the efficiencies of electrical power technologies and systems, and provides a useful tool for engineers and scientists as well as decision and policy makers.

100 citations


Journal ArticleDOI
21 Oct 2009-Entropy
TL;DR: Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics, and the law of diminishing returns follows from the diminishing free energy while the relation between supply and demand displays a quest for a balance among interdependent energy densities.
Abstract: Economic activity can be regarded as an evolutionary process governed by the 2nd law of thermodynamics. The universal law, when formulated locally as an equation of motion, reveals that a growing economy develops functional machinery and organizes hierarchically in such a way as to tend to equalize energy density differences within the economy and with respect to the surroundings it is open to. Diverse economic activities result in flows of energy that will preferentially channel along the most steeply descending paths, leveling a non-Euclidean free energy landscape. This principle of “maximal energy dispersal”, equivalent to the maximal rate of entropy production, gives rise to economic laws and regularities. The law of diminishing returns follows from the diminishing free energy, while the relation between supply and demand displays a quest for a balance among interdependent energy densities. Economic evolution is dissipative motion where the driving forces and energy flows are inseparable from each other. When there are multiple degrees of freedom, economic growth and decline are inherently impossible to forecast in detail. Namely, trajectories of an evolving economy are non-integrable.

81 citations


Journal ArticleDOI
14 Dec 2009-Entropy
TL;DR: A survey of quantum decision theory can be found in this article, where the authors present a self-consistent procedure of decision making, in the frame of the Quantum Decision Theory, taking into account both the available objective information as well as subjective contextual effects.
Abstract: A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the frame of the quantum decision theory, takes into account both the available objective information as well as subjective contextual effects. This quantum approach avoids any paradox typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms.

81 citations


Journal ArticleDOI
28 Sep 2009-Entropy
TL;DR: The result of this study is a new branch of thermodynamics: Finite Dimensions Optimal Thermodynamics (FDOT), which describes the main situations of constrained (or unconstrained) optimization.
Abstract: This paper reviews how ideas have evolved in this field from the pioneering work of CARNOT right up to the present. The coupling of thermostatics with thermokinetics (heat and mass transfers) and entropy or exergy analysis is illustrated through study of thermomechanical engines such as the Carnot heat engine, and internal combustion engines. The benefits and importance of stagnation temperature and irreversibility parameters are underlined. The main situations of constrained (or unconstrained) optimization are defined, discussed and illustrated. The result of this study is a new branch of thermodynamics: Finite Dimensions Optimal Thermodynamics (FDOT).

60 citations


Journal ArticleDOI
Yan-Fang Sang, Dong Wang1, Jichun Wu, Qingping Zhu, Ling Wang1 
22 Dec 2009-Entropy
TL;DR: Because it uses information entropy theories to describe the obviously different characteristics of noises and the main series in the series data is observed first and then de-noised, the analysis process has a more reliable physical basis, and the results of the new proposed method are more reasonable and are the global optimum.
Abstract: The existence of noise has great influence on the real features of observed time series, thus noise reduction in time series data is a necessary and significant task in many practical applications. When using traditional de-noising methods, the results often cannot meet the practical needs due to their inherent shortcomings. In the present paper, first a set of key but difficult wavelet de-noising problems are discussed, and then, by applying information entropy theories to the wavelet de-noising process, i.e., using the principle of maximum entropy (POME) to describe the random character of the noise and using wavelet energy entropy to describe the degree of complexity of the main series in the original series data, a new entropy-based wavelet de-noising method is proposed. Analysis results for several different synthetic series and for typical observed time series data have verified the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the new proposed method is more effective and universal. Furthermore, because it uses information entropy theories to describe the clearly different characteristics of the noise and of the main series in the series data, which is observed first and then de-noised, the analysis process has a more reliable physical basis, and the results of the new proposed method are more reasonable and are the global optimum. Besides, the analysis process of the new proposed method is simple and easy to implement.

59 citations
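The wavelet energy entropy mentioned above can be illustrated with a minimal, self-contained sketch. This is a hypothetical Haar-based decomposition, not the authors' implementation: it computes the Shannon entropy of the relative energies of the detail levels, which is zero for a flat signal and positive when energy is spread across levels.

```python
import math
import random

def haar_details(x, levels):
    """Detail coefficients of an orthonormal 1-D Haar decomposition."""
    details, approx = [], list(x)
    for _ in range(levels):
        a, d = [], []
        for i in range(0, len(approx) - 1, 2):
            a.append((approx[i] + approx[i + 1]) / math.sqrt(2))
            d.append((approx[i] - approx[i + 1]) / math.sqrt(2))
        details.append(d)
        approx = a
    return details

def wavelet_energy_entropy(x, levels=3):
    """Shannon entropy (bits) of the relative energies of the detail levels."""
    energies = [sum(c * c for c in d) for d in haar_details(x, levels)]
    total = sum(energies)
    if total == 0.0:
        return 0.0  # a flat signal has no detail energy at all
    probs = [e / total for e in energies if e > 0.0]
    return -sum(p * math.log2(p) for p in probs)

random.seed(0)
noisy = [random.gauss(0.0, 1.0) for _ in range(256)]
print(wavelet_energy_entropy([1.0] * 256))  # 0.0: no complexity
print(wavelet_energy_entropy(noisy))        # positive: energy spread over levels
```

In a full de-noising scheme this entropy would serve as the complexity descriptor of the main series, alongside a POME model of the noise.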


Journal ArticleDOI
30 Oct 2009-Entropy
TL;DR: It can be shown that the behavioral context in which a whistle tends to occur or not occur is shared by different individuals, which is consistent with the hypothesis that dolphins are communicating through whistles.
Abstract: We show that dolphin whistle types tend to be used in specific behavioral contexts, which is consistent with the hypothesis that dolphin whistles have some sort of “meaning”. Besides, in some cases, it can be shown that the behavioral context in which a whistle tends to occur or not occur is shared by different individuals, which is consistent with the hypothesis that dolphins are communicating through whistles. Furthermore, we show that the number of behavioral contexts significantly associated with a certain whistle type tends to grow with the frequency of the whistle type, a pattern that is reminiscent of a law of word meanings stating, as a tendency, that the higher the frequency of a word, the higher its number of meanings. Our findings indicate that the presence of Zipf's law in dolphin whistle types cannot be explained with enough detail by a simplistic die rolling experiment.

45 citations


Journal ArticleDOI
02 Nov 2009-Entropy
TL;DR: A review of models based on the Maximum Entropy Formalism emphasizing their similarities and differences are presented, and expectations of the use of this formalism to model spray drop-size distribution are discussed.
Abstract: The efficiency of any application involving a liquid spray is known to be highly dependent on the spray characteristics, and mainly, on the drop-diameter distribution. There is therefore a crucial need of models allowing the prediction of this distribution. However, atomization processes are partially known and so far a universal model is not available. For almost thirty years, models based on the Maximum Entropy Formalism have been proposed to fulfill this task. This paper presents a review of these models emphasizing their similarities and differences, and discusses expectations of the use of this formalism to model spray drop-size distribution.

40 citations


Journal ArticleDOI
29 Dec 2009-Entropy
TL;DR: How basic theoretical ideas from data compression, such as the notions of entropy, mutual information, and complexity have been used for analyzing biological sequences in order to discover hidden patterns, infer phylogenetic relationships between organisms and study viral populations is reviewed.
Abstract: Data compression at its base is concerned with how information is organized in data. Understanding this organization can lead to efficient ways of representing the information and hence data compression. In this paper we review the ways in which ideas and approaches fundamental to the theory and practice of data compression have been used in the area of bioinformatics. We look at how basic theoretical ideas from data compression, such as the notions of entropy, mutual information, and complexity have been used for analyzing biological sequences in order to discover hidden patterns, infer phylogenetic relationships between organisms and study viral populations. Finally, we look at how inferred grammars for biological sequences have been used to uncover structure in biological sequences.

39 citations
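As a toy illustration of the entropy notion this review builds on, the per-symbol Shannon entropy of a sequence bounds how compactly a memoryless compressor can represent it. The sketch below is illustrative, not code from the paper:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy in bits per symbol of the empirical symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A maximally mixed DNA stretch needs ~2 bits/base; a biased one needs less.
print(shannon_entropy("ACGT" * 8))   # 2.0
print(shannon_entropy("AAAAAAAT"))   # ≈ 0.54
```

Low-entropy (repetitive) regions compress well; this asymmetry is what makes compression-based measures useful for finding hidden patterns in biological sequences.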


Journal ArticleDOI
10 Nov 2009-Entropy
TL;DR: In this paper, the authors integrate results of a long-term experimental study on ant language and intelligence which was fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon's equation connecting the length of a message (l) and its frequency (p).
Abstract: In this review we integrate results of a long-term experimental study on ant “language” and intelligence which was fully based on fundamental ideas of Information Theory, such as the Shannon entropy, the Kolmogorov complexity, and Shannon’s equation connecting the length of a message (l) and its frequency (p), i.e., l = –log p for rational communication systems. This approach enabled us to obtain the following important results on ants’ communication and intelligence: (i) to reveal “distant homing” in ants, that is, their ability to transfer information about remote events; (ii) to estimate the rate of information transmission; (iii) to reveal that ants are able to grasp regularities and to use them for “compression” of information; (iv) to reveal that ants are able to transfer to each other information about the number of objects; (v) to discover that ants can add and subtract small numbers. The obtained results show that information theory is not only an excellent mathematical theory, but that many of its results may be considered as laws of Nature.
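The relation l = –log p can be made concrete with a small sketch. The message repertoire below is purely hypothetical (not measured ant data); the point is only that, in an efficient code, frequent messages earn short descriptions and the frequency-weighted average length equals the Shannon entropy:

```python
import math

def optimal_length_bits(p):
    """Ideal code length for a message of probability p: l = -log2 p."""
    return -math.log2(p)

# Hypothetical message frequencies (illustrative only).
repertoire = {"turn-left": 0.5, "turn-right": 0.25,
              "food-near": 0.125, "food-far": 0.125}
lengths = {m: optimal_length_bits(p) for m, p in repertoire.items()}
avg = sum(p * lengths[m] for m, p in repertoire.items())

print(lengths)  # frequent messages get shorter codes
print(avg)      # 1.75 bits: the entropy of the repertoire
```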

Journal ArticleDOI
Ximing Wu1
26 Nov 2009-Entropy
TL;DR: In this paper, the authors proposed a weighted generalized maximum entropy (W-GME) estimator, where different weights are assigned to the two entropies in the objective function.
Abstract: The method of Generalized Maximum Entropy (GME), proposed in Golan, Judge and Miller (1996), is an information-theoretic approach that is robust to the multicollinearity problem. It uses an objective function that is the sum of the entropies of the coefficient distributions and the disturbance distributions. This method can be generalized to the weighted GME (W-GME), where different weights are assigned to the two entropies in the objective function. We propose a data-driven method to select the weights in the entropy objective function, using least squares cross validation to derive the optimal weights. Monte Carlo simulations demonstrate that the proposed W-GME estimator is comparable to, and often outperforms, the conventional GME estimator, which places equal weights on the entropies of the coefficient and disturbance distributions.

Journal ArticleDOI
17 Aug 2009-Entropy
TL;DR: It is shown that the simplest approximation of the path integral formula for the fundamental solution of the FPKfe can be applied to solve nonlinear continuous-discrete filtering problems quite accurately.
Abstract: A summary of the relationship between the Langevin equation, Fokker-Planck-Kolmogorov forward equation (FPKfe) and the Feynman path integral descriptions of stochastic processes relevant for the solution of the continuous-discrete filtering problem is provided in this paper. The practical utility of the path integral formula is demonstrated via some nontrivial examples. Specifically, it is shown that the simplest approximation of the path integral formula for the fundamental solution of the FPKfe can be applied to solve nonlinear continuous-discrete filtering problems quite accurately. The Dirac-Feynman path integral filtering algorithm is quite simple, and is suitable for real-time implementation.

Journal ArticleDOI
30 Nov 2009-Entropy
TL;DR: A tale is told of an instance in which a spin-off from consideration of an MEP-constrained climate model at least led to re-consideration of the very practical issue of water-vapour feedback in climate change.
Abstract: The principle of maximum entropy production (MEP) is the subject of considerable academic study, but has yet to become remarkable for its practical applications. A tale is told of an instance in which a spin-off from consideration of an MEP-constrained climate model at least led to re-consideration of the very practical issue of water-vapour feedback in climate change. Further, and on a more-or-less unrelated matter, a recommendation is made for further research on whether there might exist a general "rule" whereby, for certain classes of complex non-linear systems, a state of maximum entropy production is equivalent to a state of minimum entropy.

Journal ArticleDOI
30 Oct 2009-Entropy
TL;DR: The effective temperature is defined to calculate the real power loss of the system with the Gouy-Stodola law, and to apply it to turbine examples to show that the correct power loss can be defined if the effectiveTemperature is used instead of the real environmental temperature.
Abstract: All real processes generate entropy, and the power/exergy loss is usually determined by means of the Gouy-Stodola law. If the system only exchanges heat at the environmental temperature, the Gouy-Stodola law gives the correct loss of power. However, most industrial processes exchange heat at temperatures higher or lower than the actual environmental temperature. When calculating the real loss of power in these cases, the Gouy-Stodola law does not give the correct loss if the actual environmental temperature is used. The first aim of this paper is to show through simple steam turbine examples that the previous statement is true. The second aim of the paper is to define the effective temperature for calculating the real power loss of the system with the Gouy-Stodola law, and to apply it to turbine examples. Example calculations also show that the correct power loss can be obtained if the effective temperature is used instead of the real environmental temperature.
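The Gouy-Stodola law itself is a one-line computation: lost power equals a reference temperature times the entropy generation rate. The sketch below uses illustrative numbers (not the paper's turbine data) for heat conduction across a finite temperature difference:

```python
def entropy_generation_rate(q, t_hot, t_cold):
    """Entropy generation rate (W/K) for heat rate q (W) from t_hot to t_cold (K)."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

def gouy_stodola_loss(t_ref, s_gen):
    """Gouy-Stodola law: lost power (W) = reference temperature (K) * S_gen (W/K)."""
    return t_ref * s_gen

s_gen = entropy_generation_rate(q=1000.0, t_hot=600.0, t_cold=300.0)
print(s_gen)                             # ≈ 1.667 W/K
print(gouy_stodola_loss(298.15, s_gen))  # ≈ 497 W, referenced to 298.15 K
```

As the abstract argues, the number obtained depends entirely on which reference temperature is inserted; substituting an effective temperature for the ambient one changes the computed loss accordingly.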

Journal ArticleDOI
11 Aug 2009-Entropy
TL;DR: It is shown that the causal state distribution is the unique stationary representation on prediction space that may have finite entropy, and lower semi-continuity, concavity, and a formula for the ergodic decomposition of statistical complexity are proved.
Abstract: Statistical complexity is a measure of complexity of discrete-time stationary stochastic processes, which has many applications. We investigate its more abstract properties as a non-linear function of the space of processes and show its close relation to Knight’s prediction process. We prove lower semi-continuity, concavity, and a formula for the ergodic decomposition of statistical complexity. On the way, we show that the discrete version of the prediction process has a continuous Markov transition. We also prove that, given the past output of a partially deterministic hidden Markov model (HMM), the uncertainty of the internal state is constant over time and knowledge of the internal state gives no additional information on the future output. Using this fact, we show that the causal state distribution is the unique stationary representation on prediction space that may have finite entropy.

Journal ArticleDOI
12 Oct 2009-Entropy
TL;DR: The devised cost function is inspired by the definition of entropy, although the method in itself does not exploit the stochastic meaning of entropy in its usual sense.
Abstract: This paper describes the basic ideas behind a novel prediction error parameter identification algorithm exhibiting high robustness with respect to outlying data. Given the low sensitivity to outliers, these can be more easily identified by analysing the residuals of the fit. The devised cost function is inspired by the definition of entropy, although the method in itself does not exploit the stochastic meaning of entropy in its usual sense. After describing the most common alternative approaches for robust identification, the novel method is presented together with numerical examples for validation.

Journal ArticleDOI
23 Oct 2009-Entropy
TL;DR: An investigation of how morphology, i.e., the shape of components, affects a self-assembly process shows that the assembly processes were affected by the aggregation sequence in their early stages, where shape induces different behaviors and thus results in variations in aggregation speeds.
Abstract: Self-assembly is a key phenomenon whereby vast numbers of individual components passively interact and form organized structures, as can be seen, for example, in the morphogenesis of a virus. Generally speaking, the process can be viewed as a spatial placement of attractive and repulsive components. In this paper, we report on an investigation of how morphology, i.e., the shape of components, affects a self-assembly process. The experiments were conducted with 3 differently shaped floating tiles equipped with magnets in an agitated water tank. We propose a novel measure involving clustering coefficients, which quantifies the degree of parallelism of the assembly process. The results showed that the assembly processes were affected by the aggregation sequence in their early stages, where shape induces different behaviors and thus results in variations in aggregation speeds.

Journal ArticleDOI
04 Mar 2009-Entropy
TL;DR: This review shows how the algorithmic approach can provide insights into real world systems, by outlining recent work on how replicating structures that generate order can evolve to maintain a system far from equilibrium.
Abstract: The algorithmic entropy of a system, the length of the shortest algorithm that specifies the system’s exact state, adds some missing pieces to the entropy jigsaw. Because the approach embodies the traditional entropies as a special case, problematic issues such as the coarse-graining framework of the Gibbs entropy manifest themselves in a different and more manageable form, appearing as the description of the system and the choice of the universal computing machine. The provisional algorithmic entropy combines the best information about the state of the system together with any underlying uncertainty; the latter represents the Shannon entropy. The algorithmic approach also specifies structure that the traditional entropies take as given. Furthermore, algorithmic entropy provides insights into how a system can maintain itself off equilibrium, leading to Ashby’s law of requisite variety. This review shows how the algorithmic approach can provide insights into real world systems, by outlining recent work on how replicating structures that generate order can evolve to maintain a system far from equilibrium.

Journal ArticleDOI
29 Jun 2009-Entropy
TL;DR: In this paper, the thermodynamics of a system of distinguishable particles is discussed, and a straightforward way to obtain the corrected Boltzmann counting factor, justified within classical statistical mechanics, is presented.
Abstract: The issue of the thermodynamics of a system of distinguishable particles is discussed in this paper. In constructing the statistical mechanics of distinguishable particles from the definition of Boltzmann entropy, it is found that the entropy is not extensive. The inextensivity leads to the so-called Gibbs paradox, in which the mixing entropy of two identical classical gases increases. A large body of literature, approaching the problem from different points of view, has been created to resolve the paradox. In this paper, starting from the Boltzmann entropy, we present the thermodynamics of the system of distinguishable particles. A straightforward way to get the corrected Boltzmann counting is shown. The corrected Boltzmann counting factor can be justified in classical statistical mechanics.
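The effect of the corrected Boltzmann counting can be checked numerically. This sketch is illustrative (it keeps only the configurational ln V term and uses Stirling's approximation ln N! ≈ N ln N − N): the mixing entropy of two identical gases is positive without the 1/N! factor (the Gibbs paradox) and zero with it.

```python
import math

def s_over_k(n, v, corrected):
    """Configurational part of S/k for n ideal-gas particles in volume v;
    'corrected' subtracts ln n! (Stirling) for indistinguishable particles."""
    s = n * math.log(v)
    if corrected:
        s -= n * math.log(n) - n
    return s

def mixing_entropy(n, v, corrected):
    """S(after) - S(before) when two identical (n, v) gases merge into (2n, 2v)."""
    return s_over_k(2 * n, 2 * v, corrected) - 2 * s_over_k(n, v, corrected)

n, v = 1000.0, 1.0
print(mixing_entropy(n, v, corrected=False))  # 2n ln 2 > 0: the Gibbs paradox
print(mixing_entropy(n, v, corrected=True))   # 0: paradox removed
```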

Journal ArticleDOI
25 Aug 2009-Entropy
TL;DR: The thermoeconomic optimization of an endoreversible solar-driven heat engine has been carried out by using finite-time/finite-size thermodynamic theory, and the optimum performance and two design parameters have been investigated under three objective functions.
Abstract: In the present paper, the thermoeconomic optimization of an endoreversible solar-driven heat engine has been carried out by using finite-time/finite-size thermodynamic theory. In the considered heat engine model, the heat transfer from the hot reservoir to the working fluid is assumed to be of the radiation type and the heat transfer to the cold reservoir is assumed to be of the conduction type. In this work, the optimum performance and two design parameters have been investigated under three objective functions: the power output per unit total cost, the efficient power per unit total cost, and the ecological function per unit total cost. The effects of the technical and economical parameters on the thermoeconomic performance have also been discussed under the aforementioned three criteria of performance.

Journal ArticleDOI
22 Oct 2009-Entropy
TL;DR: It is shown that the maximin average redundancy in pattern coding is eventually larger than 1.84(n/log n)^(1/3) for messages of length n, which improves recent results on pattern redundancy, although it does not fill the gap between known lower- and upper-bounds.
Abstract: We show that the maximin average redundancy in pattern coding is eventually larger than 1.84(n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not fill the gap between known lower- and upper-bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands.
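The pattern construction described above is easy to state in code (a sketch of the standard definition, not taken from the paper):

```python
def pattern(s):
    """Replace each symbol by the index of its first occurrence."""
    first_seen, out = {}, []
    for ch in s:
        if ch not in first_seen:
            first_seen[ch] = len(first_seen) + 1
        out.append(first_seen[ch])
    return out

print(pattern("abracadabra"))  # [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```

Strings over entirely different alphabets can share the same pattern, which is why pattern coding sidesteps the impossibility of universal message coding on infinite alphabets.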

Journal ArticleDOI
03 Dec 2009-Entropy
TL;DR: Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.
Abstract: We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.

Journal ArticleDOI
21 Aug 2009-Entropy
TL;DR: Calculations of information entropy are applied to a simple chemical communication from the cotton plant to the wasp to demonstrate possible broader applications of information theory to the quantification of non-human communication systems.
Abstract: In order to demonstrate possible broader applications of information theory to the quantification of non-human communication systems, we apply calculations of information entropy to a simple chemical communication from the cotton plant (Gossypium hirsutum) to the wasp (Cardiochiles nigriceps) studied by DeMoraes et al. The purpose of this chemical communication from cotton plants to wasps is presumed to be to allow the predatory wasp to more easily obtain the location of its preferred prey—one of two types of parasitic herbivores feeding on the cotton plants. Specification of the plant-eating herbivore feeding on it by the cotton plants allows preferential attraction of the wasps to those individual plants. We interpret the emission of nine chemicals by the plants as individual signal differences (depending on the herbivore type) to be detected by the wasps as constituting a nine-signal one-way communication system across kingdoms (from the kingdom Plantae to the kingdom Animalia). We use fractional differences in the chemical abundances (emitted as a result of the two herbivore types) to calculate the Shannon information entropic measures (marginal, joint, and mutual entropies, as well as the ambiguity, etc., of the transmitted message). We then compare these results with the subsequent behavior of the wasps (calculating the equivocation in the message reception) for possible insights into the history and actual working of this one-way communication system.
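The entropic measures listed above reduce to textbook formulas over a joint distribution. The joint probabilities below are hypothetical stand-ins (the study used nine chemicals and two herbivore types); the sketch computes marginal and joint entropies and the mutual information I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a collection of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

def mutual_information(joint):
    """I(X;Y) for a joint distribution given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return (entropy_bits(px.values()) + entropy_bits(py.values())
            - entropy_bits(joint.values()))

# Hypothetical 2-signal / 2-herbivore channel (not the measured abundances).
joint = {("signal_A", "herbivore_1"): 0.4, ("signal_A", "herbivore_2"): 0.1,
         ("signal_B", "herbivore_1"): 0.1, ("signal_B", "herbivore_2"): 0.4}
print(mutual_information(joint))  # ≈ 0.278 bits carried about the herbivore
```

Equivocation, the measure compared against wasp behavior, is the complementary quantity H(X|Y) = H(X,Y) − H(Y).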

Journal ArticleDOI
Thomas Christen1
08 Dec 2009-Entropy
TL;DR: It is argued that MEPP, although not being an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly global entropy production.
Abstract: Under which circumstances are variational principles based on entropy production rate useful tools for modeling steady states of electric (gas) discharge systems far from equilibrium? It is first shown how various different approaches, such as Steenbeck’s minimum voltage principle and Prigogine’s minimum entropy production rate principle, are related to the maximum entropy production rate principle (MEPP). Secondly, three typical examples are discussed, which provide a certain insight into the structure of the models that are candidates for MEPP application. It is then thirdly argued that MEPP, although not being an exact physical law, may provide reasonable model parameter estimates, provided the constraints contain the relevant (nonlinear) physical effects and the parameters to be determined are related to disregarded weak constraints that affect mainly global entropy production. Finally, it is additionally conjectured that a further reason for the success of MEPP in certain far from equilibrium systems might be based on a hidden linearity of the underlying kinetic equation(s).

Journal ArticleDOI
04 Nov 2009-Entropy
TL;DR: Evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections, which suggests that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function.
Abstract: In this paper evidence is provided that individual neurons possess language, and that the basic unit for communication consists of two neurons and their entire field of interacting dendritic and synaptic connections. While information processing in the brain is highly complex, each neuron uses a simple mechanism for transmitting information. This is in the form of temporal electrophysiological action potentials or spikes (S) operating on a millisecond timescale that, along with pauses (P) between spikes constitute a two letter “alphabet” that generates meaningful frequency-encoded signals or neuronal S/P “words” in a primary language. However, when a word from an afferent neuron enters the dendritic-synaptic-dendritic field between two neurons, it is translated into a new frequency-encoded word with the same meaning, but in a different spike-pause language, that is delivered to and understood by the efferent neuron. It is suggested that this unidirectional inter-neuronal language-based word translation step is of utmost importance to brain function in that it allows for variations in meaning to occur. Thus, structural or biochemical changes in dendrites or synapses can produce novel words in the second language that have changed meanings, allowing for a specific signaling experience, either external or internal, to modify the meaning of an original word (learning), and store the learned information of that experience (memory) in the form of an altered dendritic-synaptic-dendritic field.

Journal ArticleDOI
03 Nov 2009-Entropy
TL;DR: This article uses the principle of maximum entropy to show that the maximally fair pay distribution is lognormal under ideal conditions, in agreement with observed data for the bottom 90%–95% of the working population.
Abstract: The high pay packages of U.S. CEOs have raised serious concerns about what would constitute a fair pay. Since present economic models do not adequately address this fundamental question, we propose a new theory based on statistical mechanics and information theory. We use the principle of maximum entropy to show that the maximally fair pay distribution is lognormal under ideal conditions. This prediction is in agreement with observed data for the bottom 90%–95% of the working population. The theory estimates that the top 35 U.S. CEOs were overpaid by about 129 times their ideal salaries in 2008. We also offer insight into entropy as a measure of fairness, which is maximized at equilibrium in an economic system.
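The lognormal result follows from the maximum-entropy principle: among positive-support distributions with fixed mean and variance of log-income, the lognormal maximizes the differential entropy H = μ + ½·ln(2πeσ²). A minimal sketch (Python; the parameters μ and σ are hypothetical illustrations, not values from the paper) checks this closed form against a Monte Carlo estimate H = −E[ln p(X)]:

```python
import math
import random

mu, sigma = 10.0, 0.8  # hypothetical log-income location and spread

# Closed-form differential entropy of LogNormal(mu, sigma):
# H = mu + 0.5 * ln(2 * pi * e * sigma^2)
h_exact = mu + 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def lognormal_pdf(x, mu, sigma):
    """Density of LogNormal(mu, sigma) at x > 0."""
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma**2)) / (
        x * sigma * math.sqrt(2 * math.pi))

# Monte Carlo estimate: H = -E[ln p(X)] over samples X ~ LogNormal(mu, sigma)
random.seed(42)
n = 200_000
h_mc = -sum(math.log(lognormal_pdf(random.lognormvariate(mu, sigma), mu, sigma))
            for _ in range(n)) / n

print(f"exact H = {h_exact:.4f}, Monte Carlo H = {h_mc:.4f}")
```

The two values agree to a few thousandths, illustrating the entropy measure that the paper maximizes subject to income constraints.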

Journal ArticleDOI
29 Apr 2009-Entropy
TL;DR: According to the proposed method all linear dynamical systems evolve at constant zero entropy, while higher asymptotic values characterise nonlinear systems, in which case it has common features with other classic approaches.
Abstract: This paper provides a new approach for the analysis and, eventually, the classification of dynamical systems. The objective is pursued by extending the concept of the entropy of plane curves, first introduced within the theory of the thermodynamics of plane curves, to curves in R^n. Such a generalised entropy of a curve is used to evaluate curves obtained by connecting several points in the phase space. As the points change their coordinates according to the equations of a dynamical system, the entropy of the curve connecting them is used to infer the behaviour of the underlying dynamics. According to the proposed method, all linear dynamical systems evolve at constant zero entropy, while higher asymptotic values characterise nonlinear systems. The approach proves particularly efficient when applied to chaotic systems, in which case it shares common features with other classic approaches. The performance of the proposed method is tested on several benchmark problems.
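For plane curves, one common formulation (due to Mendès France, and presumably the starting point the paper generalizes) defines the entropy of a curve of length L with convex-hull perimeter C as S = ln(2L/C). This is consistent with the abstract’s claim about linear systems: a straight segment has C = 2L and hence S = 0. A sketch in Python, assuming that definition (the polyline examples are hypothetical trajectories, not the paper’s benchmarks):

```python
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0]) * (b[1]-o[1]) - (a[1]-o[1]) * (b[0]-o[0])
    hull = []
    for seq in (pts, pts[::-1]):          # lower hull, then upper hull
        part = []
        for p in seq:
            while len(part) >= 2 and cross(part[-2], part[-1], p) <= 0:
                part.pop()
            part.append(p)
        hull += part[:-1]
    return hull

def perimeter(poly):
    return sum(math.dist(poly[i], poly[(i + 1) % len(poly)])
               for i in range(len(poly)))

def curve_entropy(polyline):
    """Plane-curve entropy S = ln(2L / C): L = curve length, C = hull perimeter."""
    L = sum(math.dist(polyline[i], polyline[i + 1])
            for i in range(len(polyline) - 1))
    C = perimeter(convex_hull(polyline))
    return math.log(2 * L / C)

segment = [(i, 2.0 * i) for i in range(11)]     # straight phase-space trajectory
circle = [(math.cos(2 * math.pi * i / 200), math.sin(2 * math.pi * i / 200))
          for i in range(201)]                  # closed periodic orbit
print(curve_entropy(segment), curve_entropy(circle))
```

The segment yields S ≈ 0, while the closed orbit yields S ≈ ln 2, since a closed convex curve traverses its own hull once (2L/C = 2).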

Journal ArticleDOI
28 Dec 2009-Entropy
TL;DR: This paper proposes a new method for estimating seismic wavelets that represents higher-order statistics (HOS) as a polynomial function of second-order statistics to improve noise robustness and accuracy, and that works well for short time series.
Abstract: This paper proposes a new method for estimating seismic wavelets. Suppose a seismic wavelet can be modeled by a formula with three free parameters (scale, frequency and phase). We can then transform the estimation of the wavelet into determining these three parameters. The phase of the wavelet is estimated by constant-phase rotation of the seismic signal, while the other two parameters are obtained by a Higher-order Statistics (HOS) (fourth-order cumulant) matching method. To derive the HOS estimator, the multivariate scale mixture of Gaussians (MSMG) model is applied to formulate the multivariate joint probability density function (PDF) of the seismic signal. In this way, HOS can be represented as a polynomial function of second-order statistics, improving noise robustness and accuracy. In addition, the proposed method works well for short time series.
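The fourth-order cumulant behind the matching step is, for univariate data, c4 = m4 − 3·m2² (central moments). It vanishes for Gaussian data, which is the usual reason HOS-based methods resist additive Gaussian noise. A small illustrative sketch (Python; the noise models are hypothetical stand-ins, not the paper’s seismic data):

```python
import random

def cumulant4(x):
    """Univariate fourth-order cumulant: c4 = m4 - 3 * m2**2."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    m2 = sum(v**2 for v in xc) / n
    m4 = sum(v**4 for v in xc) / n
    return m4 - 3 * m2**2

random.seed(1)
n = 200_000
gauss = [random.gauss(0.0, 1.0) for _ in range(n)]
# Laplace(0, 1) as the difference of two unit exponentials: a non-Gaussian case
laplace = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(n)]

print(f"c4(Gaussian) = {cumulant4(gauss):.3f}")   # ~0: Gaussian noise is suppressed
print(f"c4(Laplace)  = {cumulant4(laplace):.3f}") # ~12: non-Gaussian structure survives
```

For Laplace(0, 1), m2 = 2 and m4 = 24, so c4 = 24 − 3·4 = 12, while any Gaussian contributes c4 = 0; cumulant-domain matching therefore sees the signal through the noise.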

Journal ArticleDOI
18 Sep 2009-Entropy
TL;DR: The origin of life has previously been modeled by biological heat engines driven by thermal cycling, caused by suspension in convecting water, but more complex heat engines are invoked to explain the origin of animals in the thermal gradient above a submarine hydrothermal vent.
Abstract: The origin of life has previously been modeled by biological heat engines driven by thermal cycling, caused by suspension in convecting water. Here, more complex heat engines are invoked to explain the origin of animals in the thermal gradient above a submarine hydrothermal vent. Thermal cycling by a filamentous protein ‘thermotether’ was the result of a temperature-gradient-induced relaxation oscillation not impeded by the low Reynolds number at small scale. During evolution a ‘flagellar proton pump’ emerged that resembled Feynman’s ratchet and that turned into today’s bacterial flagellar motor. A ‘flagellar computer’ functioning as a Turing machine then emerged and implemented chemotaxis.