
Showing papers in "Entropy in 2011"


Journal ArticleDOI
15 Aug 2011-Entropy
TL;DR: Using the relative current density and the thermoelectric potential, it is shown that minimum entropy production is obtained when the thermoelectric potential takes a specific, optimal value; based on a historical overview, the interrelation between optimal performance and local entropy production is reconsidered using the compatibility approach together with thermodynamic arguments.
Abstract: Fifty years ago, the optimization of thermoelectric devices was analyzed by considering the relation between optimal performance and local entropy production. Entropy is produced by the irreversible processes in thermoelectric devices. If these processes could be eliminated, entropy production would be reduced to zero, and the limiting Carnot efficiency or coefficient of performance would be obtained. In the present review, we start with some fundamental thermodynamic considerations relevant for thermoelectrics. Based on a historical overview, we reconsider the interrelation between optimal performance and local entropy production by using the compatibility approach together with thermodynamic arguments. Using the relative current density and the thermoelectric potential, we show that minimum entropy production is obtained when the thermoelectric potential takes a specific, optimal value.

272 citations


Journal ArticleDOI
30 Sep 2011-Entropy
TL;DR: The correct application of Second Law efficiency shows which systems operate closest to the reversible limit and helps to indicate which systems have the greatest potential for improvement.
Abstract: Increasing global demand for fresh water is driving the development and implementation of a wide variety of seawater desalination technologies. Entropy generation analysis, and specifically, Second Law efficiency, is an important tool for illustrating the influence of irreversibilities within a system on the required energy input. When defining Second Law efficiency, the useful exergy output of the system must be properly defined. For desalination systems, this is the minimum least work of separation required to extract a unit of water from a feed stream of a given salinity. In order to evaluate the Second Law efficiency, entropy generation mechanisms present in a wide range of desalination processes are analyzed. In particular, entropy generated in the run down to equilibrium of discharge streams must be considered. Physical models are applied to estimate the magnitude of entropy generation by component and individual processes. These formulations are applied to calculate the total entropy generation in several desalination systems including multiple effect distillation, multistage flash, membrane distillation, mechanical vapor compression, reverse osmosis, and humidification-dehumidification. Within each technology, the relative importance of each source of entropy generation is discussed in order to determine which should be the target of entropy generation minimization. As given here, the correct application of Second Law efficiency shows which systems operate closest to the reversible limit and helps to indicate which systems have the greatest potential for improvement.

253 citations
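
As a rough illustration of the Second Law efficiency defined in this abstract, the sketch below combines it with the Gouy-Stodola relation linking entropy generation to extra work input. All numeric values, units, and names below are hypothetical placeholders, not results from the paper.

```python
# Minimal sketch, assuming specific work in kWh/m^3 of product water and
# specific entropy generation in kWh/(m^3*K); all values are hypothetical.

T0 = 298.0  # dead-state (environment) temperature, K

def actual_work(w_least: float, s_gen: float) -> float:
    """Gouy-Stodola: each unit of entropy generated costs T0 * s_gen of work."""
    return w_least + T0 * s_gen

def second_law_efficiency(w_least: float, w_actual: float) -> float:
    """eta_II = minimum least work of separation / actual work input."""
    return w_least / w_actual

w_least = 1.0                                  # hypothetical least work
w_actual = actual_work(w_least, s_gen=0.008)   # hypothetical irreversibility
print(f"eta_II = {second_law_efficiency(w_least, w_actual):.1%}")
```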


Journal ArticleDOI
13 Apr 2011-Entropy
TL;DR: A global multi-level thresholding method for image segmentation using the Tsallis entropy as a general information theory entropy formalism and the artificial bee colony approach, which is more rapid than either genetic algorithm or particle swarm optimization.
Abstract: This paper proposes a global multi-level thresholding method for image segmentation. As a criterion, the traditional method uses the Shannon entropy, which originates in information theory and treats the gray-level image histogram as a probability distribution, whereas we apply the Tsallis entropy as a general information-theoretic entropy formalism. For the algorithm, we use the artificial bee colony approach, since execution of an exhaustive algorithm would be too time-consuming. The experiments demonstrate that: 1) the Tsallis entropy is superior to traditional maximum entropy thresholding, maximum between-class variance thresholding, and minimum cross entropy thresholding; 2) the artificial bee colony is more rapid than either the genetic algorithm or particle swarm optimization. Therefore, our approach is effective and rapid.

241 citations
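
A minimal single-threshold sketch of the Tsallis criterion described above, using an exhaustive search rather than the paper's artificial bee colony (which becomes necessary in the multi-level case). The entropic index q = 0.8 is an arbitrary illustrative choice, not a value from the paper.

```python
import numpy as np

def tsallis_threshold(hist, q=0.8):
    """Return the gray level maximizing the Tsallis thresholding criterion.
    hist: 1-D array of gray-level counts."""
    p = np.asarray(hist, float)
    p = p / p.sum()
    best_t, best_val = None, -np.inf
    for t in range(1, len(p) - 1):
        pA, pB = p[:t], p[t:]
        wA, wB = pA.sum(), pB.sum()
        if wA == 0 or wB == 0:
            continue
        SA = (1 - ((pA / wA) ** q).sum()) / (q - 1)   # Tsallis entropy, class A
        SB = (1 - ((pB / wB) ** q).sum()) / (q - 1)   # Tsallis entropy, class B
        # pseudo-additive combination of the two class entropies
        val = SA + SB + (1 - q) * SA * SB
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```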


Journal ArticleDOI
14 Jan 2011-Entropy
TL;DR: Owing to more degrees of freedom in tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers.
Abstract: We propose a class of multiplicative algorithms for Nonnegative Matrix Factorization (NMF) which are robust with respect to noise and outliers. To achieve this, we formulate a new family of generalized divergences referred to as the Alpha-Beta-divergences (AB-divergences), which are parameterized by two tuning parameters, alpha and beta, and smoothly connect the fundamental Alpha-, Beta- and Gamma-divergences. By adjusting these tuning parameters, we show that a wide range of standard and new divergences can be obtained. The corresponding learning algorithms for NMF are shown to integrate and generalize many existing ones, including the Lee-Seung, ISRA (Image Space Reconstruction Algorithm), EMML (Expectation Maximization Maximum Likelihood), Alpha-NMF, and Beta-NMF algorithms. Owing to the additional degrees of freedom in tuning the parameters, the proposed family of AB-multiplicative NMF algorithms is shown to improve robustness with respect to noise and outliers. The analysis illuminates the links between the AB-divergence and other divergences, especially the Gamma- and Itakura-Saito divergences.

229 citations
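
A sketch of the AB-divergence in the form commonly stated for nonzero alpha, beta, and alpha + beta (the remaining special cases follow by continuity); nonnegative arrays are treated elementwise, as in the multiplicative NMF updates.

```python
import numpy as np

def ab_divergence(P, Q, alpha, beta):
    """AB-divergence between nonnegative arrays, assuming alpha, beta,
    and alpha + beta are all nonzero; limit cases are defined by continuity."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    s = alpha + beta
    return -np.sum(P**alpha * Q**beta
                   - alpha / s * P**s
                   - beta / s * Q**s) / (alpha * beta)

# Sanity check: alpha = beta = 1 gives half the squared Euclidean distance.
P, Q = np.array([1.0, 2.0]), np.array([2.0, 2.0])
assert np.isclose(ab_divergence(P, Q, 1, 1), 0.5)
```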


Journal ArticleDOI
21 Feb 2011-Entropy
TL;DR: A quantum-chemical study of hydrogen-bonded complexes of binary sulfuric acid-water clusters with methyl-, dimethyl- and trimethylamines representing common atmospheric organic species, vegetation products and laboratory impurities has been carried out.
Abstract: The impact of organic species which are present in the Earth’s atmosphere on the burst of new particles is critically important for the understanding of the molecular nature of atmospheric nucleation phenomena. Amines have recently been proposed as possible stabilizers of binary pre-nucleation clusters. In order to advance the understanding of atmospheric nucleation phenomena, a quantum-chemical study of hydrogen-bonded complexes of binary sulfuric acid-water clusters with methyl-, dimethyl- and trimethylamines representing common atmospheric organic species, vegetation products and laboratory impurities has been carried out. The thermochemical stability of the sulfuric acid-amines-water complexes was found to be higher than that of the sulfuric acid-ammonia-water complexes, in qualitative agreement with the previous studies. However, the enhancement in stability due to amines appears to not be large enough to overcome the difference in typical atmospheric concentrations of ammonia and amines. Further research is needed in order to address the existing uncertainties and to reach a final conclusion about the importance of amines for the atmospheric nucleation.

155 citations


Journal ArticleDOI
28 Sep 2011-Entropy
TL;DR: The nonadditive entropy Sq was introduced in 1988 as the basis of a generalization of Boltzmann–Gibbs (BG) statistical mechanics; some relevant aspects of this entropy are reviewed and commented on.
Abstract: The nonadditive entropy Sq was introduced in 1988 as the basis of a generalization of Boltzmann–Gibbs (BG) statistical mechanics. The aim was to cover a (possibly wide) class of systems among the very many which violate hypotheses, such as ergodicity, under which the BG theory is expected to be valid. It is now known that Sq has a large applicability; more specifically speaking, it applies even outside Hamiltonian systems and their thermodynamical approach. In the present paper we review and comment on some relevant aspects of this entropy, namely (i) additivity versus extensivity; (ii) probability distributions that constitute attractors in the sense of central limit theorems; (iii) the analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) the analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems. Finally, we exhibit recent as well as typical predictions, verifications and applications of these concepts in natural, artificial, and social systems, as shown through theoretical, experimental, observational and computational results.

151 citations
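
For concreteness, a minimal implementation of Sq (with Boltzmann constant k = 1); the nonadditivity referred to in the abstract appears in the composition rule noted in the final comment.

```python
import numpy as np

def S_q(p, q):
    """Nonadditive entropy S_q; recovers the BG (Shannon) entropy as q -> 1."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Boltzmann-Gibbs limit
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# For independent subsystems A and B (k = 1):
#   S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B),
# which is additive only at q = 1.
```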


Journal ArticleDOI
19 Jul 2011-Entropy
TL;DR: It is argued that the corresponding states in the dual string theory living on AdS2 × K are described by the twisted version of the Hartle–Hawking states, the twists being generated by a large unitary group of symmetries that this string theory must possess.
Abstract: Since Euclidean global AdS2 space represented as a strip has two boundaries, the state-operator correspondence in the dual CFT1 reduces to the standard map from the operators acting on a single copy of the Hilbert space to states in the tensor product of two copies of the Hilbert space. Using this picture we argue that the corresponding states in the dual string theory living on AdS2 × K are described by the twisted version of the Hartle–Hawking states, the twists being generated by a large unitary group of symmetries that this string theory must possess. This formalism makes natural the dual interpretation of the black hole entropy—as the logarithm of the degeneracy of ground states of the quantum mechanics describing the low energy dynamics of the black hole, and also as an entanglement entropy between the two copies of the same quantum theory living on the two boundaries of global AdS2 separated by the event horizon.

114 citations


Journal ArticleDOI
21 Jan 2011-Entropy
TL;DR: The key concepts in information theory are reviewed, how the principles of information theory can be useful for visualization are discussed, and specific examples to draw connections between data communication and data visualization in terms of how information can be measured quantitatively are provided.
Abstract: In recent years, an emerging research direction has leveraged information theory to solve many challenging problems in scientific data analysis and visualization. In this article, we review the key concepts in information theory, discuss how the principles of information theory can be useful for visualization, and provide specific examples that draw connections between data communication and data visualization in terms of how information can be measured quantitatively. As the amount of digital data available to us increases at an astounding speed, the goal of this article is to introduce interested readers to this new direction of data analysis research, and to inspire them to identify new applications and seek solutions using information theory.

111 citations


Journal ArticleDOI
14 Jun 2011-Entropy
TL;DR: This work gives the q-exponential family a new mathematical structure, different from those previously given: a dually flat geometrical structure derived from the Legendre transformation, which conformal geometry helps to understand.
Abstract: The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying power laws rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution, or the q-exponential family, by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give the q-exponential family a new mathematical structure different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator.

85 citations
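
A small sketch of the q-deformed functions underlying the q-exponential family, using the standard definitions with the usual cutoff where the base becomes non-positive:

```python
import numpy as np

def exp_q(x, q):
    """q-exponential: reduces to exp(x) as q -> 1.
    Where the base 1 + (1-q)x is non-positive, the standard cutoff
    exp_q(x) = 0 is applied."""
    x = np.asarray(x, float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    result = np.zeros_like(base)
    pos = base > 0
    result[pos] = base[pos] ** (1.0 / (1.0 - q))
    return result

def log_q(x, q):
    """q-logarithm, the inverse of exp_q on its positive range (x > 0)."""
    x = np.asarray(x, float)
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)
```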


Journal ArticleDOI
05 Aug 2011-Entropy
TL;DR: It is observed that the peak of the entropy generation rate is attained within the boundary layer region and that the plate surface acts as a strong source of entropy generation and heat transfer irreversibility.
Abstract: The present paper is concerned with the analysis of inherent irreversibility in hydromagnetic boundary layer flow of a variable viscosity fluid over a semi-infinite flat plate under the influence of thermal radiation and Newtonian heating. Using a local similarity solution technique and shooting quadrature, the velocity and temperature profiles are obtained numerically and utilized to compute the entropy generation number. The effects of the magnetic field parameter, Brinkmann number, Prandtl number, variable viscosity parameter, radiation parameter and local Biot number on the fluid velocity profiles, temperature profiles, local skin friction and local Nusselt number are presented. The influences of the same parameters and the dimensionless group parameter on the entropy generation rate in the flow regime and the Bejan number are calculated, depicted graphically and discussed quantitatively. It is observed that the peak of the entropy generation rate is attained within the boundary layer region and that the plate surface acts as a strong source of entropy generation and heat transfer irreversibility.

84 citations
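
The Bejan number mentioned above has a simple definition: the fraction of the total entropy generation due to heat transfer. A minimal sketch, with the Joule (magnetic) contribution kept as a separate term since this is a hydromagnetic flow; the example values are hypothetical.

```python
def bejan_number(s_heat: float, s_friction: float, s_magnetic: float = 0.0) -> float:
    """Be -> 1: heat-transfer irreversibility dominates;
    Be -> 0: fluid friction and Joule dissipation dominate."""
    return s_heat / (s_heat + s_friction + s_magnetic)

# Hypothetical local entropy-generation contributions:
print(bejan_number(s_heat=0.8, s_friction=0.15, s_magnetic=0.05))  # 0.8
```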


Journal ArticleDOI
24 Nov 2011-Entropy
TL;DR: In this article, it was shown that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous, and this characterization naturally generalizes to Tsallis entropy.
Abstract: There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
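
A concrete illustration of "information loss": for a deterministic map f, it equals H(X) − H(f(X)), the Shannon entropy of X conditioned on f(X). A minimal sketch over a finite probability vector:

```python
import numpy as np
from collections import defaultdict

def shannon(p):
    p = np.asarray(p, float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def information_loss(p, f):
    """H(X) - H(f(X)) >= 0 for a deterministic f: the information
    destroyed by applying f to X."""
    pushforward = defaultdict(float)
    for x, px in enumerate(p):
        pushforward[f(x)] += px          # distribution of f(X)
    return shannon(p) - shannon(list(pushforward.values()))

# Example: a parity map on a uniform 4-state variable loses 1 bit.
print(information_loss([0.25] * 4, lambda x: x % 2))  # -> 1.0
```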

Journal ArticleDOI
12 Jan 2011-Entropy
TL;DR: This study presents a multi-objective approach based on a mean-variance-skewness-entropy portfolio selection model (MVSEM) that performs well out-of-sample relative to traditional portfolio selection models.
Abstract: In this study, we present a multi-objective approach based on a mean-variance-skewness-entropy portfolio selection model (MVSEM). In this approach, an entropy measure is added to the mean-variance-skewness model (MVSM) to generate a well-diversified portfolio. Through a variety of empirical data sets, we evaluate the performance of the MVSEM in terms of several portfolio performance measures. The obtained results show that the MVSEM performs well out-of-sample relative to traditional portfolio selection models.
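
A hedged sketch of a scalarized MVSEM-style objective. The paper treats this as a genuine multi-objective problem; the trade-off coefficients l_* below are hypothetical tuning parameters, and the Shannon entropy of the weights is the diversification term the abstract refers to.

```python
import numpy as np

def mvse_objective(w, returns, l_var=1.0, l_skew=1.0, l_ent=1.0):
    """Scalarized sketch: maximize mean and skewness, penalize variance,
    and reward the Shannon entropy of the weights (diversification).
    w: nonnegative weights summing to 1 (numpy array);
    returns: (T, n_assets) array of historical returns."""
    port = returns @ w                     # portfolio return series
    mean, var = port.mean(), port.var()
    skew = ((port - mean) ** 3).mean() / var ** 1.5
    ent = -np.sum(w[w > 0] * np.log(w[w > 0]))
    return mean - l_var * var + l_skew * skew + l_ent * ent
```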

Journal ArticleDOI
03 Jun 2011-Entropy
TL;DR: In this paper, the authors argue the case for Solomonoff Induction, a formal inductive framework which combines algorithmic information theory with the Bayesian framework, and examine how Solomonoff induction addresses many issues that have plagued other inductive systems, such as the black ravens paradox and the confirmation problem.
Abstract: Understanding inductive reasoning is a problem that has engaged mankind for thousands of years. This problem is relevant to a wide range of fields and is integral to the philosophy of science. It has been tackled by many great minds ranging from philosophers to scientists to mathematicians, and more recently computer scientists. In this article we argue the case for Solomonoff Induction, a formal inductive framework which combines algorithmic information theory with the Bayesian framework. Although it achieves excellent theoretical results and is based on solid philosophical foundations, the requisite technical knowledge necessary for understanding this framework has caused it to remain largely unknown and unappreciated in the wider scientific community. The main contribution of this article is to convey Solomonoff induction and its related concepts in a generally accessible form with the aim of bridging this current technical gap. In the process we examine the major historical contributions that have led to the formulation of Solomonoff Induction as well as criticisms of Solomonoff and induction in general. In particular we examine how Solomonoff induction addresses many issues that have plagued other inductive systems, such as the black ravens paradox and the confirmation problem, and compare this approach with other recent approaches.

Journal ArticleDOI
30 Dec 2011-Entropy
TL;DR: The potential of employing the concept of thermodynamic entropy generation to assess degradation in processes involving metal fatigue is described, and it is shown that empirical fatigue models such as Miner’s rule, the Coffin-Manson equation, and the Paris law can be deduced from thermodynamic considerations.
Abstract: In this paper we describe the potential of employing the concept of thermodynamic entropy generation to assess degradation in processes involving metal fatigue. It is shown that empirical fatigue models such as Miner’s rule, the Coffin-Manson equation, and the Paris law can be deduced from thermodynamic considerations.
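
For reference, the Palmgren-Miner rule named in the abstract, in code; the paper's contribution is to derive such empirical rules from entropy generation. The loading history below is hypothetical.

```python
def miner_damage(cycles_applied, cycles_to_failure):
    """Palmgren-Miner linear damage accumulation D = sum(n_i / N_i);
    failure is predicted when D >= 1."""
    return sum(n / N for n, N in zip(cycles_applied, cycles_to_failure))

# Hypothetical two-block loading history:
print(miner_damage(cycles_applied=[2e4, 5e5],
                   cycles_to_failure=[1e5, 2e6]))  # D = 0.45
```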

Journal ArticleDOI
07 Mar 2011-Entropy
TL;DR: A conceptually general view of the problem as well as a way of probing its non-linearity is provided, and connections with other theoretical areas such as statistical mechanics are emphasized.
Abstract: Predicting the future state of a turbulent dynamical system such as the atmosphere has been recognized for several decades to be an essentially statistical undertaking. Uncertainties from a variety of sources are magnified by dynamical mechanisms and given sufficient time, compromise any prediction. In the last decade or so this process of uncertainty evolution has been studied using a variety of tools from information theory. These provide both a conceptually general view of the problem as well as a way of probing its non-linearity. Here we review these advances from both a theoretical and practical perspective. Connections with other theoretical areas such as statistical mechanics are emphasized. The importance of obtaining practical results for prediction also guides the development presented.
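
One widely used tool from this literature is the relative entropy between the forecast distribution and climatology as a predictability measure; for one-dimensional Gaussians it has a closed form. A minimal sketch (the example numbers are hypothetical):

```python
import numpy as np

def relative_entropy_gaussian(mu_f, var_f, mu_c, var_c):
    """KL( forecast || climatology ) for 1-D Gaussians, in nats.
    It decays toward zero as the forecast ensemble relaxes to climatology."""
    return 0.5 * (np.log(var_c / var_f)
                  + (var_f + (mu_f - mu_c) ** 2) / var_c
                  - 1.0)

# Early forecast: sharp and shifted -> much predictability remains.
print(relative_entropy_gaussian(1.0, 0.2, 0.0, 1.0))
# Late forecast: nearly climatological -> little predictability left.
print(relative_entropy_gaussian(0.05, 0.9, 0.0, 1.0))
```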

Journal ArticleDOI
11 Jul 2011-Entropy
TL;DR: The nonextensive analysis of the southern California earthquake catalog was performed and the results show that the nonextensivity parameter q lies in the same range as obtained for other different seismic areas, thus suggesting a sort of universal character in theNonextensive interpretation of seismicity.
Abstract: Nonextensive statistics has been becoming a very useful tool to describe the complexity of dynamic systems. Recently, analysis of the magnitude distribution of earthquakes has been increasingly used in the context of nonextensivity. In the present paper, the nonextensive analysis of the southern California earthquake catalog was performed. The results show that the nonextensivity parameter q lies in the same range as obtained for other different seismic areas, thus suggesting a sort of universal character in the nonextensive interpretation of seismicity.

Journal ArticleDOI
20 May 2011-Entropy
TL;DR: The generalized mass action law is proved together with the basic relations between kinetic factors; these relations are sufficient for the positivity of the entropy production but hold even without microreversibility, when detailed balance is not applicable.
Abstract: We study chemical reactions with complex mechanisms under two assumptions: (i) intermediates are present in small amounts (this is the quasi-steady-state hypothesis or QSS) and (ii) they are in equilibrium relations with substrates (this is the quasiequilibrium hypothesis or QE). Under these assumptions, we prove the generalized mass action law together with the basic relations between kinetic factors, which are sufficient for the positivity of the entropy production but hold even without microreversibility, when the detailed balance is not applicable. Even though QE and QSS produce useful approximations by themselves, only the combination of these assumptions makes it possible to go beyond the “rarefied gas” limit or the “molecular chaos” hypotheses. We do not use any a priori form of the kinetic law for the chemical reactions and describe their equilibria by thermodynamic relations. The transformations of the intermediate compounds can be described by Markov kinetics because of their low density (low density of elementary events). This combination of assumptions was introduced by Michaelis and Menten in 1913. In 1952, Stueckelberg used the same assumptions for gas kinetics and produced the remarkable semi-detailed balance relations between collision rates in the Boltzmann equation, which are weaker than the detailed balance conditions but are still sufficient for the Boltzmann H-theorem to be valid. Our results are obtained within the Michaelis-Menten-Stueckelberg conceptual framework.
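
The Michaelis-Menten construction cited above is the textbook instance of eliminating a low-density intermediate: applying QSS to E + S <-> ES -> E + P yields the familiar hyperbolic rate law. A minimal sketch, with arbitrary illustrative rate constants:

```python
import numpy as np

def michaelis_menten_rate(S, k1, k_1, k2, E_total):
    """QSS rate for E + S <-> ES -> E + P: eliminating the intermediate ES
    gives v = Vmax * S / (Km + S) with Km = (k_1 + k2) / k1."""
    Km = (k_1 + k2) / k1
    Vmax = k2 * E_total
    return Vmax * S / (Km + S)

S = np.linspace(0.0, 10.0, 5)   # substrate concentrations (arbitrary units)
print(michaelis_menten_rate(S, k1=1.0, k_1=0.5, k2=0.5, E_total=1.0))
```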

Journal ArticleDOI
28 Nov 2011-Entropy
TL;DR: It is found that in a certain range of the parameters M and q there exists a global temperature for an observer in the R-region between the black hole horizon rb and the cosmological horizon rc; a detailed analysis of the thermodynamics of the horizons is given using the Padmanabhan approach.
Abstract: We address the question of the thermodynamics of regular cosmological spherically symmetric black holes with a de Sitter center. Space-time is asymptotically de Sitter as r → 0 and as r → ∞. A source term in the Einstein equations connects smoothly two de Sitter vacua with different values of the cosmological constant: 8πGTμν = Λδμν as r → 0, 8πGTμν = λδμν as r → ∞ with λ < Λ. It represents an anisotropic vacuum dark fluid defined by the symmetry of its stress-energy tensor, which is invariant under radial boosts. In the range of the mass parameter Mcr1 ≤ M ≤ Mcr2 it describes a regular cosmological black hole. Space-time in this case has three horizons: a cosmological horizon rc, a black hole horizon rb < rc, and an internal horizon ra < rb, which is the cosmological horizon for an observer in the internal R-region, asymptotically de Sitter as r → 0. We present the basic features of the space-time geometry and a detailed analysis of the thermodynamics of the horizons using the Padmanabhan approach relevant for a multi-horizon space-time with non-zero pressure. We find that in a certain range of the parameters M and q = √(Λ/λ) there exists a global temperature for an observer in the R-region between the black hole horizon rb and the cosmological horizon rc. We show that a second-order phase transition occurs in the course of evaporation, at which the specific heat is discontinuous and the temperature achieves its maximal value. The thermodynamically preferred final point of evaporation is a stable double-horizon (ra = rb) remnant with positive specific heat and zero temperature.

Journal ArticleDOI
20 Jul 2011-Entropy
TL;DR: In this paper, it is shown that the universality of these thermodynamic properties arises from an approximate conformal symmetry, which permits an effective "conformal dual" description that is largely independent of the microscopic details.
Abstract: It is no longer considered surprising that black holes have temperatures and entropies. What remains surprising, though, is the universality of these thermodynamic properties: their exceptionally simple and general form, and the fact that they can be derived from many very different descriptions of the underlying microscopic degrees of freedom. I review the proposal that this universality arises from an approximate conformal symmetry, which permits an effective “conformal dual” description that is largely independent of the microscopic details.

Journal ArticleDOI
25 Mar 2011-Entropy
TL;DR: In this paper, static generic isolated horizons are studied in a manifestly SU(2)-invariant formulation, and it is shown that the usual classical description requires revision in the non-static case due to the breaking of diffeomorphism invariance at the horizon, which leads to the non-conservation of the usual pre-symplectic structure.
Abstract: We study the classical field theoretical formulation of static generic isolated horizons in a manifestly SU(2)-invariant formulation. We show that the usual classical description requires revision in the non-static case due to the breaking of diffeomorphism invariance at the horizon, leading to the non-conservation of the usual pre-symplectic structure. We argue how this difficulty could be avoided by a simple enlargement of the field content at the horizon that restores diffeomorphism invariance. Restricting our attention to static isolated horizons, we study the effective theories describing the boundary degrees of freedom. A quantization of the horizon degrees of freedom is proposed. By defining a statistical mechanical ensemble where only the area aH of the horizon is fixed macroscopically (states with fluctuations away from spherical symmetry are allowed), we show that it is possible to obtain agreement with the Hawking area law (S = aH/(4 lp^2)) without fixing the Immirzi parameter to any particular value: consistency with the area law only imposes a relationship between the Immirzi parameter and the level of the Chern-Simons theory involved in the effective description of the horizon degrees of freedom.

Journal ArticleDOI
Ilya Nemenman
19 Dec 2011-Entropy
TL;DR: An asymptotic analysis of the NSB estimator of entropy of a discrete random variable shows that the estimator has a well defined limit for a large cardinality of the studied variable, which allows estimation of entropy with no a priori assumptions about the cardinality.
Abstract: We perform an asymptotic analysis of the NSB estimator of entropy of a discrete random variable. The analysis illuminates the dependence of the estimates on the number of coincidences in the sample and shows that the estimator has a well defined limit for a large cardinality of the studied variable. This allows estimation of entropy with no a priori assumptions about the cardinality. Software implementation of the algorithm is available.

Journal ArticleDOI
03 Jun 2011-Entropy
TL;DR: This review addresses important topics underlying the SCM structure, viz., a good choice of probability metric space and how to assess the best distance-choice, which in this context is called a “disequilibrium” and is denoted by the letter Q.
Abstract: Statistical complexity measures (SCM) are the composition of two ingredients: (i) entropies and (ii) distances in probability-space. In consequence, SCMs provide a simultaneous quantification of the randomness and the correlational structures present in the system under study. We address in this review important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance-choice, which in this context is called a “disequilibrium” and is denoted by the letter Q. Q, indeed the crucial SCM ingredient, is cast in terms of an associated distance D. Since our input data consist of time series, we also discuss the best way of extracting a probability distribution P from the time series. As an illustration, we show just how these issues affect the description of the classical limit of quantum mechanics.
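
A minimal sketch of one common SCM instantiation: normalized Shannon entropy times a Jensen-Shannon disequilibrium Q measured against the uniform distribution. The review's point is precisely that other choices of Q (and of its normalization) are possible and matter, so this is one illustrative choice rather than the paper's definitive recipe.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def statistical_complexity(p):
    """C = H_norm * Q_JS: normalized Shannon entropy times the
    Jensen-Shannon divergence between P and the uniform distribution."""
    p = np.asarray(p, float)
    n = len(p)
    u = np.full(n, 1.0 / n)
    h_norm = shannon(p) / np.log2(n)
    q_js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
    return h_norm * q_js
```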

Journal ArticleDOI
20 Jan 2011-Entropy
TL;DR: A novel framework to determine the number of resolution levels in the application of a wavelet transformation to a rainfall time series using multi-scale entropy (MSE) analysis and the Mann-Kendall (MK) rank correlation test of MSE curves of residuals at various resolution levels is presented.
Abstract: This paper presents a novel framework to determine the number of resolution levels in the application of a wavelet transformation to a rainfall time series. The rainfall time series are decomposed using the à trous wavelet transform. Then, multi-scale entropy (MSE) analysis, which helps to elucidate some hidden characteristics of the original rainfall time series, is applied to the decomposed rainfall time series. The analysis shows that the Mann-Kendall (MK) rank correlation test of the MSE curves of residuals at various resolution levels can determine the number of resolution levels in the wavelet decomposition. The complexity of rainfall time series at four stations is compared on multiple scales. The results reveal that the suggested number of resolution levels can be obtained using MSE analysis and the MK test. The complexity of rainfall time series at various locations can also be analyzed to provide a reference for water resource planning and application.
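
A compact O(n^2) sketch of the MSE computation applied at each resolution level: coarse-grain by non-overlapping averaging, then compute sample entropy. The parameter choices (m = 2, r = 0.2 sigma) are conventional defaults, not necessarily those of the paper.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points also match for m + 1 points."""
    x = np.asarray(x, float)
    r = r_factor * x.std()

    def pairs_within_r(mm):
        temp = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        count = 0
        for i in range(len(temp) - 1):
            d = np.abs(temp[i + 1:] - temp[i]).max(axis=1)  # Chebyshev distance
            count += int(np.sum(d <= r))
        return count

    B, A = pairs_within_r(m), pairs_within_r(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

def multiscale_entropy(x, scales=range(1, 6)):
    """Coarse-grain by non-overlapping averages, then compute SampEn."""
    x = np.asarray(x, float)
    return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1))
            for s in scales]
```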

Journal ArticleDOI
25 Feb 2011-Entropy
TL;DR: In evaluating the top-down selection pressures that are exerted on a microbiome the authors find cause to warrant reconsideration of the much-maligned theory of multi-level selection and reason that complexity must be underscored by modularity.
Abstract: Second-generation sequencing technologies have granted us greater access to the diversity and genetics of microbial communities that naturally reside endo- and ecto-symbiotically with animal hosts. Substantial research has emerged describing the diversity and broader trends that exist within and between host species and their associated microbial ecosystems, yet the application of these data to our evolutionary understanding of microbiomes appears fragmented. For the most part biological perspectives are based on limited observations of oversimplified communities, while mathematical and/or computational modeling of these concepts often lack biological precedence. In recognition of this disconnect, both fields have attempted to incorporate ecological theories, although their applicability is currently a subject of debate because most ecological theories were developed based on observations of macro-organisms and their ecosystems. For the purposes of this review, we attempt to transcend the biological, ecological and computational realms, drawing on extensive literature, to forge a useful framework that can, at a minimum be built upon, but ideally will shape the hypotheses of each field as they move forward. In evaluating the top-down selection pressures that are exerted on a microbiome we find cause to warrant reconsideration of the much-maligned theory of multi-level selection and reason that complexity must be underscored by modularity.

Journal ArticleDOI
19 Jan 2011-Entropy
TL;DR: In contrast to the principle of MEP, the analysis of NEF is able to provide a new insight into the mechanism responsible for the evolution of a weather system as well as a new approach to predicting its track and intensity trend.
Abstract: The concept of entropy and its relevant principles, mainly the principle of maximum entropy production (MEP), the effect of negative entropy flow (NEF) on the organization of atmospheric systems and the principle of the Second Law of thermodynamics, as well as their applications to atmospheric sciences, are reviewed. Some formulations of sub-grid processes such as diffusion parameterization schemes in computational geophysical fluid dynamics that can be improved based on full irreversibility are also discussed, although they have not yet been systematically subjected to scrutiny from the perspective of entropy budgets. A comparative investigation shows that the principle of MEP applies to the entropy production of macroscopic fluxes and determines the most probable state; that is, a system may choose a meta-stable development trajectory with a smaller entropy production, since entropy production behavior involves many specific dynamical and thermodynamic processes in the atmosphere and the extremal principles only provide a general insight into the overall configuration of the atmosphere. In contrast to the principle of MEP, the analysis of NEF is able to provide a new insight into the mechanism responsible for the evolution of a weather system as well as a new approach to predicting its track and intensity trend.

Journal ArticleDOI
07 Jan 2011-Entropy
TL;DR: The ideas behind the entropy production concept are reviewed and some insights about its relevance are given.
Abstract: It is unquestionable that the concept of entropy has played an essential role in both the physical and biological sciences. However, the entropy production, crucial to the second law, also has other features that are not clearly understood. The main difficulty is concerned with its quantification in non-equilibrium processes, and consequently its value in some specific cases is limited. In this work we review the ideas behind the entropy production concept and give some insights about its relevance.

Journal ArticleDOI
03 Mar 2011-Entropy
TL;DR: It is proved that, for the universal time-bounded distribution mt(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1.
Abstract: Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for Rényi and Tsallis entropies of order α, showing that it only holds for α = 1. Regarding a time-bounded analogue of this relationship, we show that for some distributions we have a similar result. We prove that, for the universal time-bounded distribution mt(x), the Tsallis and Rényi entropies converge if and only if α is greater than 1. We also establish the uniform continuity of these entropies.
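
For reference, the two entropies of order α compared in the abstract, with their common Shannon limit at α = 1:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in nats."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))     # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, alpha):
    """Tsallis entropy of order alpha (k = 1)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))     # Shannon limit
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)
```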

Journal ArticleDOI
08 Jul 2011-Entropy
TL;DR: This paper investigates the possibility of using various probability density function divergence measures for the purpose of representative data sampling and shows that in many cases it is not possible unless samples consisting of thousands of instances are used.
Abstract: Generalisation error estimation is an important issue in machine learning. Cross-validation traditionally used for this purpose requires building multiple models and repeating the whole procedure many times in order to produce reliable error estimates. It is however possible to accurately estimate the error using only a single model, if the training and test data are chosen appropriately. This paper investigates the possibility of using various probability density function divergence measures for the purpose of representative data sampling. As it turned out, the first difficulty one needs to deal with is estimation of the divergence itself. In contrast to other publications on this subject, the experimental results provided in this study show that in many cases it is not possible unless samples consisting of thousands of instances are used. Exhaustive experiments on the divergence guided representative data sampling have been performed using 26 publicly available benchmark datasets and 70 PDF divergence estimators, and their results have been analysed and discussed.
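
A crude plug-in KL-divergence estimate between two samples via shared-bin histograms illustrates the estimation difficulty the abstract highlights: such estimates can be unreliable unless samples contain thousands of instances. The bin count and smoothing constant below are arbitrary illustrative choices, not the paper's estimators.

```python
import numpy as np

def kl_from_histograms(sample_p, sample_q, bins=20):
    """Plug-in KL(P || Q) estimate from two 1-D samples over shared bins."""
    lo = min(sample_p.min(), sample_q.min())
    hi = max(sample_p.max(), sample_q.max())
    p, _ = np.histogram(sample_p, bins=bins, range=(lo, hi))
    q, _ = np.histogram(sample_q, bins=bins, range=(lo, hi))
    p = p.astype(float) + 1e-10   # smooth empty bins
    q = q.astype(float) + 1e-10
    p /= p.sum()
    q /= q.sum()
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(0)
train = rng.normal(size=5000)
test = rng.normal(size=500)
print(kl_from_histograms(train, test))   # near 0 for a representative split
```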

Journal ArticleDOI
01 Nov 2011-Entropy
TL;DR: It is shown that due to this relation, classes of nonlinear N-dimensional Fokker-Planck equations are connected to a single entropic form, and the H-theorem is proved.
Abstract: Several previous results valid for one-dimensional nonlinear Fokker-Planck equations are generalized to N dimensions. A general nonlinear N-dimensional Fokker-Planck equation is derived directly from a master equation by considering nonlinearities in the transition rates. Using nonlinear Fokker-Planck equations, the H-theorem is proved; for that, an important relation involving these equations and general entropic forms is introduced. It is shown that, due to this relation, classes of nonlinear N-dimensional Fokker-Planck equations are connected to a single entropic form. Particular emphasis is given to the class of equations associated with the Tsallis entropy, for both the standard and generalized definitions of the internal energy.
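
A well-known member of the class connected to the Tsallis entropic form is the one-dimensional porous-medium-type equation ∂P/∂t = D ∂²(P^ν)/∂x² (the exponent ν is often related to the entropic index as ν = 2 − q). A minimal explicit finite-difference sketch, with arbitrary grid, time step and parameters, and periodic boundaries for simplicity:

```python
import numpy as np

D, nu = 1.0, 2.0                          # illustrative diffusion constant, exponent
dx, dt, steps = 0.1, 0.001, 2000          # dt kept small for explicit-scheme stability
x = np.arange(-5.0, 5.0, dx)
P = np.exp(-x**2)
P /= P.sum() * dx                         # normalized initial condition

for _ in range(steps):
    u = P**nu
    # second difference with periodic boundaries (np.roll wraps around)
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    P = P + dt * D * lap

print("mass conserved:", np.isclose(P.sum() * dx, 1.0, atol=1e-6))
```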

Journal ArticleDOI
14 Sep 2011-Entropy
TL;DR: This paper investigates the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents using the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence.
Abstract: Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.
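
A sketch of the simplest nonadditive variant along the "entropy minus conditional entropy" route, computed from a joint histogram as I_q = S_q(X) + S_q(Y) − S_q(X,Y). The paper's three Tsallis generalizations are genuinely inequivalent, so this is one illustrative combination rather than the paper's exact definitions.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy (q != 1 assumed) of a probability vector."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_mutual_information(joint, q):
    """I_q = S_q(X) + S_q(Y) - S_q(X,Y) from a joint histogram
    (e.g., of gray levels in two registered document images)."""
    joint = np.asarray(joint, float)
    joint = joint / joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)   # marginals
    return (tsallis_entropy(px, q) + tsallis_entropy(py, q)
            - tsallis_entropy(joint.ravel(), q))
```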