
Showing papers in "Entropy in 2012"


Journal ArticleDOI
23 Aug 2012-Entropy
TL;DR: The theoretical foundations of the permutation entropy are analyzed, as well as the main recent applications to the analysis of economical markets and to the understanding of biomedical systems.
Abstract: Entropy is a powerful tool for the analysis of time series, as it allows describing the probability distributions of the possible states of a system, and therefore the information encoded in them. Nevertheless, important information may also be codified in the temporal dynamics, an aspect which is not usually taken into account. The idea of calculating entropy based on permutation patterns (that is, permutations defined by the order relations among values of a time series) has received a lot of attention in recent years, especially for the understanding of complex and chaotic systems. Permutation entropy directly accounts for the temporal information contained in the time series; furthermore, it has the qualities of simplicity, robustness and very low computational cost. To celebrate the tenth anniversary of the original work, here we analyze the theoretical foundations of permutation entropy, as well as the main recent applications to the analysis of economic markets and to the understanding of biomedical systems.
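The ordinal-pattern computation the abstract describes is simple enough to sketch in a few lines. This is a minimal illustration only; the function name and defaults are ours, not the paper's:

```python
import math

def permutation_entropy(series, order=3, normalize=True):
    """Shannon entropy (bits) of the ordinal patterns of length `order`."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # The ordinal pattern is the argsort of the window, i.e., the
        # permutation induced by the order relations among its values.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    if normalize:
        h /= math.log2(math.factorial(order))  # maximum possible entropy is log2(order!)
    return h
```

A strictly monotonic series produces a single pattern and hence zero entropy, while an unpredictable series approaches the normalized maximum of 1.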

537 citations


Journal ArticleDOI
18 Sep 2012-Entropy
TL;DR: In this paper, the authors focus on the properties of modified gravity theories, in particular on black-hole solutions and their comparison with those of General Relativity, and on Friedmann-Lemaitre-Robertson-Walker metrics.
Abstract: Throughout this review, we focus on the study of several properties of modified gravity theories, in particular on black-hole solutions and their comparison with those of General Relativity, and on Friedmann-Lemaitre-Robertson-Walker metrics. The thermodynamical properties of fourth-order gravity theories are also a subject of this investigation, with special attention on local and global stability of paradigmatic f(R) models. In addition, we review some attempts to extend the Cardy-Verlinde formula, including modified gravity, where a relation between entropy bounds is obtained. Moreover, a deep study of cosmological singularities, which appear as a real possibility for some kinds of modified gravity theories, is performed, and the validity of the entropy bounds is studied.

303 citations


Journal ArticleDOI
31 Oct 2012-Entropy
TL;DR: A free energy principle is described that tries to explain the ability of biological systems to resist a natural tendency to disorder using a principle of least action based on variational free energy (from statistical physics) and the conditions under which it is formally equivalent to the information bottleneck method.
Abstract: This paper describes a free energy principle that tries to explain the ability of biological systems to resist a natural tendency to disorder. It appeals to circular causality of the sort found in synergetic formulations of self-organization (e.g., the slaving principle) and models of coupled dynamical systems, using nonlinear Fokker-Planck equations. Here, circular causality is induced by separating the states of a random dynamical system into external and internal states, where external states are subject to random fluctuations and internal states are not. This reduces the problem to finding some (deterministic) dynamics of the internal states that ensure the system visits a limited number of external states; in other words, the measure of its (random) attracting set, or the Shannon entropy of the external states, is small. We motivate a solution using a principle of least action based on variational free energy (from statistical physics) and establish the conditions under which it is formally equivalent to the information bottleneck method. This approach has proved useful in understanding the functional architecture of the brain. The generality of variational free energy minimisation and corresponding information theoretic formulations may speak to interesting applications beyond the neurosciences; e.g., in molecular or evolutionary biology.

241 citations


Journal ArticleDOI
04 Jul 2012-Entropy
TL;DR: The possibility of distinguishing the brain states of Alzheimer’s disease patients and Mild Cognitive Impairment subjects from those of normal healthy elderly subjects is checked on a real, although quite limited, experimental database.
Abstract: An original multivariate multi-scale methodology for assessing the complexity of physiological signals is proposed. The technique is able to incorporate the simultaneous analysis of multi-channel data as a unique block within a multi-scale framework. The basic complexity measure uses Permutation Entropy, a methodology for time series processing based on ordinal analysis. Permutation Entropy is conceptually simple, structurally robust to noise and artifacts, and computationally very fast, which is relevant for designing portable diagnostics. Since time series derived from biological systems show structures on multiple spatial-temporal scales, the proposed technique can be useful for other types of biomedical signal analysis. In this work, the possibility of distinguishing the brain states of Alzheimer’s disease patients and Mild Cognitive Impairment subjects from those of normal healthy elderly subjects is checked on a real, although quite limited, experimental database.

230 citations


Journal ArticleDOI
27 Jul 2012-Entropy
TL;DR: Simulation results demonstrated that the proposed method is a very powerful algorithm for bearing fault diagnosis and has much better performance than the methods based on single scale permutation entropy (PE) and multiscale entropy (MSE).
Abstract: Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, multiscale permutation entropy (MPE) was introduced for feature extraction from faulty bearing vibration signals. After extracting feature vectors by MPE, the support vector machine (SVM) was applied to automate the fault diagnosis procedure. Simulation results demonstrated that the proposed method is a very powerful algorithm for bearing fault diagnosis and has much better performance than the methods based on single scale permutation entropy (PE) and multiscale entropy (MSE).
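The feature-extraction step described above can be sketched as a two-stage computation: coarse-grain the vibration signal at each scale, then compute permutation entropy of each coarse-grained series. The averaging coarse-graining and the names here are assumptions for illustration, not the authors' code:

```python
import math

def _perm_entropy(x, order):
    """Unnormalized permutation entropy (bits) of a sequence."""
    counts = {}
    for i in range(len(x) - order + 1):
        w = x[i:i + order]
        p = tuple(sorted(range(order), key=lambda k: w[k]))
        counts[p] = counts.get(p, 0) + 1
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def multiscale_permutation_entropy(signal, order=3, max_scale=5):
    """One permutation-entropy value per scale, forming an MPE feature vector."""
    result = []
    for s in range(1, max_scale + 1):
        # Coarse-graining: average non-overlapping windows of length s.
        coarse = [sum(signal[i:i + s]) / s
                  for i in range(0, len(signal) - s + 1, s)]
        result.append(_perm_entropy(coarse, order))
    return result
```

The resulting per-scale entropy vector would then be fed to a classifier such as an SVM to automate the diagnosis.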

229 citations


Journal ArticleDOI
10 Aug 2012-Entropy
TL;DR: Some innovative concepts of CAES are presented, such as adiabatic CAES, isothermal CAES, micro-CAES combined with air-cycle heating and cooling, and constant-pressure CAES combined with pumped hydro storage, that can address such problems and widen the scope of CAES applications, by energy and exergy analyses.
Abstract: Energy storage systems are increasingly gaining importance with regard to their role in achieving load levelling, especially for matching intermittent sources of renewable energy with customer demand, as well as for storing excess nuclear or thermal power during the daily cycle. Compressed air energy storage (CAES), with its high reliability, economic feasibility, and low environmental impact, is a promising method for large-scale energy storage. Although there are only two large-scale CAES plants in existence, recently, a number of CAES projects have been initiated around the world, and some innovative concepts of CAES have been proposed. Existing CAES plants have some disadvantages such as energy loss due to dissipation of heat of compression, use of fossil fuels, and dependence on geological formations. This paper reviews the main drawbacks of the existing CAES systems and presents some innovative concepts of CAES, such as adiabatic CAES, isothermal CAES, micro-CAES combined with air-cycle heating and cooling, and constant-pressure CAES combined with pumped hydro storage that can address such problems and widen the scope of CAES applications, by energy and exergy analyses. These analyses greatly help us to understand the characteristics of each CAES system and compare different CAES systems.

155 citations


Journal ArticleDOI
14 Mar 2012-Entropy
TL;DR: This paper presents a taxonomy and overview of approaches to the measurement of graph and network complexity and distinguishes between deterministic and probabilistic approaches with a view to placing entropy-based Probabilistic measurement in context.
Abstract: This paper presents a taxonomy and overview of approaches to the measurement of graph and network complexity. The taxonomy distinguishes between deterministic (e.g., Kolmogorov complexity) and probabilistic approaches with a view to placing entropy-based probabilistic measurement in context. Entropy-based measurement is the main focus of the paper. Relationships between the different entropy functions used to measure complexity are examined, and intrinsic (e.g., classical measures) and extrinsic (e.g., Körner entropy) variants of entropy-based models are discussed in some detail.
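As a concrete instance of the classical (intrinsic) entropy measures the paper surveys, the Shannon entropy of a graph's degree distribution is one of the simplest; this sketch is illustrative only and does not reproduce the paper's taxonomy:

```python
import math
from collections import Counter

def degree_distribution_entropy(edges):
    """Shannon entropy (bits) of the degree distribution of an undirected graph."""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    n = len(degree)  # number of vertices with at least one edge
    # dist[k] = number of vertices of degree k; p(k) = dist[k] / n.
    dist = Counter(degree.values())
    return -sum(c / n * math.log2(c / n) for c in dist.values())
```

A regular graph (all degrees equal) has zero entropy; heterogeneous degree structure yields positive entropy.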

154 citations


Journal ArticleDOI
28 Dec 2012-Entropy
TL;DR: The links are developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks as hypothesis testing, and showing that the useful decomposition is blurred by instantaneous coupling.
Abstract: This report reviews the conceptual and theoretical links between Granger causality and directed information theory. We begin with a short historical tour of Granger causality, concentrating on its closeness to information theory. The definitions of Granger causality based on prediction are recalled, and the importance of the observation set is discussed. We present the definitions based on conditional independence. The notion of instantaneous coupling is included in the definitions. The concept of Granger causality graphs is discussed. We present directed information theory from the perspective of studies of causal influences between stochastic processes. Causal conditioning appears to be the cornerstone for the relation between information theory and Granger causality. In the bivariate case, the fundamental measure is the directed information, which decomposes as the sum of the transfer entropies and a term quantifying instantaneous coupling. We show the decomposition of the mutual information into the sums of the transfer entropies and the instantaneous coupling measure, a relation known for the linear Gaussian case. We study the multivariate case, showing that the useful decomposition is blurred by instantaneous coupling. The links are further developed by studying how measures based on directed information theory naturally emerge from Granger causality inference frameworks as hypothesis testing.

133 citations


Journal ArticleDOI
04 Sep 2012-Entropy
TL;DR: The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions, and uses the Jeffreys divergence measure to compare the multivariate normal distribution with the skew-multivariatenormal distribution.
Abstract: The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the multivariate skew-normal distribution, showing that this is equivalent to comparing univariate versions of these distributions. Finally, we apply our results to a seismological catalogue data set related to the 2010 Maule earthquake. Specifically, we compare the distributions of the local magnitudes of the regions formed by the aftershocks.
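The abstract notes that comparing the multivariate distributions reduces to comparing univariate versions. For the purely normal baseline case, the Kullback–Leibler and Jeffreys divergences have well-known closed forms, sketched here in pure Python (function names are ours; the skew-normal expressions derived in the paper are more involved):

```python
import math

def kl_normal(mu0, s0, mu1, s1):
    """KL(N(mu0, s0^2) || N(mu1, s1^2)) in nats, closed form."""
    return math.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

def jeffreys_normal(mu0, s0, mu1, s1):
    """Jeffreys divergence: the KL divergence symmetrised over both directions."""
    return kl_normal(mu0, s0, mu1, s1) + kl_normal(mu1, s1, mu0, s0)
```

Unlike KL, the Jeffreys divergence is symmetric in its two arguments, which is why it is suited to comparing two candidate distributions on an equal footing.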

76 citations


Journal ArticleDOI
24 Sep 2012-Entropy
TL;DR: This study attempts to combine the copula theory with the entropy theory for bivariate rainfall and runoff analysis, and results in the detection of the nonlinear dependence between the correlated random variables, rainfall and runoff.
Abstract: Multivariate hydrologic frequency analysis has been widely studied using: (1) commonly known joint distributions or copula functions with the assumption of univariate variables being independently identically distributed (I.I.D.) random variables; or (2) directly applying the entropy theory-based framework. However, for the I.I.D. univariate random variable assumption, the univariate variable may be considered as independently distributed, but it may not be identically distributed; and secondly, the commonly applied Pearson’s coefficient of correlation (g) is not able to capture the nonlinear dependence structure that usually exists. Thus, this study attempts to combine the copula theory with the entropy theory for bivariate rainfall and runoff analysis. The entropy theory is applied to derive the univariate rainfall and runoff distributions. It permits the incorporation of given or known information, codified in the form of constraints, and results in a universal solution of univariate probability distributions. The copula theory is applied to determine the joint rainfall-runoff distribution. Application of the copula theory results in: (i) the detection of the nonlinear dependence between the correlated random variables (rainfall and runoff), and (ii) capturing the tail dependence for risk analysis through the joint return period and conditional return period of rainfall and runoff. The methodology is validated using annual daily maximum rainfall and the corresponding daily runoff (discharge) data collected from watersheds near Riesel, Texas (small agricultural experimental watersheds) and the Cuyahoga River watershed, Ohio.

65 citations


Journal ArticleDOI
27 Dec 2012-Entropy
TL;DR: Recent developments in SMI approximation based on direct density-ratio estimation are reviewed, along with SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
Abstract: Mutual information (MI) is useful for detecting statistical independence between random variables, and it has been successfully applied to solving various machine learning problems. Recently, an alternative to MI called squared-loss MI (SMI) was introduced. While ordinary MI is the Kullback–Leibler divergence from the joint distribution to the product of the marginal distributions, SMI is its Pearson divergence variant. Because both divergences belong to the f-divergence family, they share similar theoretical properties. However, a notable advantage of SMI is that it can be approximated from data in a computationally more efficient and numerically more stable way than ordinary MI. In this article, we review recent developments in SMI approximation based on direct density-ratio estimation, and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal inference.
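For discrete variables with a known joint table, both definitions can be evaluated directly. This plug-in sketch only illustrates the two divergences side by side; the paper's contribution concerns estimating SMI from samples via direct density-ratio estimation, which this does not attempt:

```python
import math

def mi_and_smi(joint):
    """Ordinary MI (KL form) and squared-loss MI (Pearson form), in nats.

    `joint` is a dict {(x, y): p(x, y)} whose values sum to 1.
    """
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    # MI: KL divergence from p(x, y) to p(x) * p(y).
    mi = sum(p * math.log(p / (px[x] * py[y]))
             for (x, y), p in joint.items() if p > 0)
    # SMI: Pearson divergence, 0.5 * E_{p(x)p(y)}[(density ratio - 1)^2].
    smi = 0.5 * sum(px[x] * py[y]
                    * (joint.get((x, y), 0.0) / (px[x] * py[y]) - 1) ** 2
                    for x in px for y in py)
    return mi, smi
```

Both quantities vanish exactly when the variables are independent, and both grow with the strength of the dependence.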

Journal ArticleDOI
30 Jan 2012-Entropy
TL;DR: The Rate-Controlled Constrained Equilibrium (RCCE) method as discussed by the authors is a general, effective, physically based method for model order reduction that was originally developed in the framework of thermodynamics and chemical kinetics.
Abstract: The Rate-Controlled Constrained-Equilibrium (RCCE) method for the description of the time-dependent behavior of dynamical systems in non-equilibrium states is a general, effective, physically based method for model order reduction that was originally developed in the framework of thermodynamics and chemical kinetics. A generalized mathematical formulation is presented here that allows including nonlinear constraints in non-local equilibrium systems characterized by the existence of a non-increasing Lyapunov functional under the system’s internal dynamics. The generalized formulation of RCCE makes it possible to clarify the essentials of the method and the built-in general feature of thermodynamic consistency in the chemical kinetics context. In this paper, we work out the details of the method in a generalized mathematical-physics framework, but for definiteness we detail its well-known implementation in the traditional chemical kinetics framework. We detail proofs and spell out explicit functional dependences so as to bring out and clarify each underlying assumption of the method. In the standard context of chemical kinetics of ideal gas mixtures, we discuss the relations between the validity of the detailed balance condition off-equilibrium and the thermodynamic consistency of the method. We also discuss two examples of RCCE gas-phase combustion calculations to emphasize the constraint-dependent performance of the RCCE method.

Journal ArticleDOI
12 Jun 2012-Entropy
TL;DR: The combined effects of buoyancy force and Navier slip on the entropy generation rate in a vertical porous channel with wall suction/injection are investigated numerically using the Runge–Kutta–Fehlberg method with a shooting technique.
Abstract: In this paper, we investigate the combined effects of buoyancy force and Navier slip on the entropy generation rate in a vertical porous channel with wall suction/injection. The nonlinear model problem is tackled numerically using the Runge–Kutta–Fehlberg method with a shooting technique. Both the velocity and temperature profiles are obtained and utilized to compute the entropy generation number. The effects of the slip parameter, Brinkman number, Peclet number and suction/injection Reynolds number on the fluid velocity, temperature profile, Nusselt number, entropy generation rate and Bejan number are depicted graphically and discussed quantitatively.

Journal ArticleDOI
13 Feb 2012-Entropy
TL;DR: This article reviews computational research in loop modeling, highlighting progress and challenges and important insight is obtained on potential directions for future research.
Abstract: Unlike the secondary structure elements that connect in protein structures, loop fragments in protein chains are often highly mobile even in generally stable proteins. The structural variability of loops is often at the center of a protein’s stability, folding, and even biological function. Loops are found to mediate important biological processes, such as signaling, protein-ligand binding, and protein-protein interactions. Modeling conformations of a loop under physiological conditions remains an open problem in computational biology. This article reviews computational research in loop modeling, highlighting progress and challenges. Important insight is obtained on potential directions for future research. Keywords: loop modeling; conformational ensemble; equilibrium fluctuations; native state; structural analysis of proteins; structural bioinformatics

Journal ArticleDOI
25 May 2012-Entropy
TL;DR: The distribution of the MSE of EEG during the whole surgery based on the adaptive resampling process is able to show the detailed variation of sample entropy (SE) at small scales and the complexity of the EEG, which could help anesthesiologists evaluate the status of patients.
Abstract: Entropy as an estimate of complexity of the electroencephalogram is an effective parameter for monitoring the depth of anesthesia (DOA) during surgery. Multiscale entropy (MSE) is useful to evaluate the complexity of signals over different time scales. However, the required length of the processed signal is a limitation when observing the variation of sample entropy (SE) on different scales. In this study, an adaptive resampling procedure is employed to replace the process of coarse-graining in MSE. According to the analysis of various signals and practical EEG signals, it is feasible to calculate the SE from the adaptively resampled signals, and it yields highly similar results to the original MSE at small scales. The distribution of the MSE of EEG during the whole surgery based on the adaptive resampling process is able to show the detailed variation of SE at small scales and the complexity of the EEG, which could help anesthesiologists evaluate the status of patients.
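For reference, classic MSE, the baseline that the adaptive-resampling procedure modifies, coarse-grains the signal and computes sample entropy at each scale. A compact, illustrative sketch (O(n²) template matching; names and defaults are ours, not the authors'):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn: -ln of the ratio of (m+1)-length to m-length template matches."""
    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                # Two templates match if their Chebyshev distance is within r.
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    a, b = count_matches(m + 1), count_matches(m)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, max_scale=4, m=2, r=0.2):
    """SampEn of coarse-grained versions of the signal, one value per scale."""
    out = []
    for s in range(1, max_scale + 1):
        # Coarse-graining: average non-overlapping windows of length s.
        coarse = [sum(x[i:i + s]) / s for i in range(0, len(x) - s + 1, s)]
        out.append(sample_entropy(coarse, m, r))
    return out
```

The length limitation the abstract mentions is visible here: coarse-graining divides the series length by the scale factor, so at large scales too few templates remain for a reliable SE estimate.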

Journal ArticleDOI
04 Sep 2012-Entropy
TL;DR: A thorough reconstruction analysis is performed on the so-called F(T) models, where F(T) is some general function of the torsion term, and the required conditions are deduced for the equivalence between F(T) models and pure kinetic k-essence models.
Abstract: This is a brief review of F(T) gravity and its relation with k-essence. Modified teleparallel gravity theory with the torsion scalar has recently gained a lot of attention as a possible explanation of dark energy. We perform a thorough reconstruction analysis on the so-called F(T) models, where F(T) is some general function of the torsion term, and deduce the required conditions for the equivalence between F(T) models and pure kinetic k-essence models. We present a new class of models of F(T)-gravity and k-essence.

Journal ArticleDOI
08 Oct 2012-Entropy
TL;DR: In this paper, a derivation of Quantum Theory from information-theoretic principles is presented, and the broad picture emerging from the principles is that Quantum Theory is the only standard theory of information that is compatible with the purity and reversibility of physical processes.
Abstract: After more than a century since its birth, Quantum Theory still eludes our understanding. If asked to describe it, we have to resort to abstract and ad hoc principles about complex Hilbert spaces. How is it possible that a fundamental physical theory cannot be described using the ordinary language of Physics? Here we offer a contribution to the problem from the angle of Quantum Information, providing a short non-technical presentation of a recent derivation of Quantum Theory from information-theoretic principles. The broad picture emerging from the principles is that Quantum Theory is the only standard theory of information that is compatible with the purity and reversibility of physical processes.

Journal ArticleDOI
07 Nov 2012-Entropy
TL;DR: The results provide strong evidence supporting a link between autism and the aluminum in vaccines and propose that children with the autism diagnosis are especially vulnerable to toxic metals such as aluminum and mercury due to insufficient serum sulfate and glutathione.
Abstract: Autism is a condition characterized by impaired cognitive and social skills, associated with compromised immune function. The incidence is alarmingly on the rise, and environmental factors are increasingly suspected to play a role. This paper investigates word frequency patterns in the U.S. CDC Vaccine Adverse Events Reporting System (VAERS) database. Our results provide strong evidence supporting a link between autism and the aluminum in vaccines. A literature review showing toxicity of aluminum in human physiology offers further support. Mentions of autism in VAERS increased steadily at the end of the last century, during a period when mercury was being phased out, while aluminum adjuvant burden was being increased. Using standard log-likelihood ratio techniques, we identify several signs and symptoms that are significantly more prevalent in vaccine reports after 2000, including cellulitis, seizure, depression, fatigue, pain and death, which are also significantly associated with aluminum-containing vaccines. We propose that children with the autism diagnosis are especially vulnerable to toxic metals such as aluminum and mercury due to insufficient serum sulfate and glutathione. A strong correlation between autism and the MMR (Measles, Mumps, Rubella) vaccine is also observed, which may be partially explained via an increased sensitivity to acetaminophen administered to control fever.

Journal ArticleDOI
10 Apr 2012-Entropy
TL;DR: The Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics used here to analyze the properties of anomalous diffusion processes, induces a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
Abstract: In this paper we utilize the Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics, to analyze the properties of anomalous diffusion processes. Anomalous (super-) diffusive behavior can be described by fractional diffusion equations, where the second order space derivative is extended to fractional order α ∈ (1, 2). They represent a bridging regime, where for α = 2 one obtains the diffusion equation and for α = 1 the (half) wave equation is given. These fractional diffusion equations are solved by so-called stable distributions, which exhibit heavy tails and skewness. In contrast to the Shannon or Tsallis entropy of these distributions, the Kullback and Tsallis relative entropy, relative to the pure diffusion case, induce a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
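The Tsallis relative entropy has a simple discrete form, with the Kullback–Leibler divergence recovered in the limit q → 1. A small illustrative sketch (assuming discrete distributions, whereas the paper works with continuous stable densities):

```python
import math

def tsallis_relative_entropy(p, r, q):
    """D_q(p || r) = (1/(q-1)) * sum_i p_i * ((p_i/r_i)^(q-1) - 1)."""
    if abs(q - 1.0) < 1e-12:
        # The q -> 1 limit is the ordinary Kullback–Leibler divergence.
        return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
    return sum(pi * ((pi / ri) ** (q - 1) - 1)
               for pi, ri in zip(p, r) if pi > 0) / (q - 1)
```

Like KL, it vanishes when the two distributions coincide, and it varies smoothly with the entropic index q, so values of q near 1 closely approximate the KL divergence.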

Journal ArticleDOI
21 Feb 2012-Entropy
TL;DR: A numerical model of subcritical and trans-critical power cycles using a fixed-flowrate low-temperature heat source has been validated and shows that R141b is the better working fluid for the conditions under study.
Abstract: A numerical model of subcritical and trans-critical power cycles using a fixed-flowrate low-temperature heat source has been validated and used to calculate the combinations of the maximum cycle pressure (Pev) and the difference between the source temperature and the maximum working fluid temperature (DT) which maximize the thermal efficiency (ηth) or minimize the non-dimensional exergy losses (β), the total thermal conductance of the heat exchangers (UAt) and the turbine size (SP). Optimum combinations of Pev and DT were calculated for each one of these four objective functions for two working fluids (R134a, R141b), three source temperatures and three values of the non-dimensional power output. The ratio of UAt over the net power output (which is a first approximation of the initial cost per kW) shows that R141b is the better working fluid for the conditions under study.

Journal ArticleDOI
28 Sep 2012-Entropy
TL;DR: The aim of this paper is to provide a review of new coding techniques as they apply to the case of time-varying Gaussian networks with multiple unicast connections, covering interference alignment and ergodic interference alignment for multi-source single-hop networks, and interference neutralization and ergodic interference neutralization for multi-source multi-hop networks.
Abstract: In recent years, there has been rapid progress on understanding Gaussian networks with multiple unicast connections, and new coding techniques have emerged. The essence of multi-source networks is how to efficiently manage interference that arises from the transmission of other sessions. Classically, interference is removed by orthogonalization (in time or frequency). This means that the rate per session drops inversely proportional to the number of sessions, suggesting that interference is a strong limiting factor in such networks. However, recently discovered interference management techniques have led to a paradigm shift that interference might not be quite as detrimental after all. The aim of this paper is to provide a review of these new coding techniques as they apply to the case of time-varying Gaussian networks with multiple unicast connections. Specifically, we review interference alignment and ergodic interference alignment for multi-source single-hop networks and interference neutralization and ergodic interference neutralization for multi-source multi-hop networks. We mainly focus on the “degrees of freedom” perspective and also discuss an approximate capacity characterization.

Journal ArticleDOI
Chao Liu, Chao He, Hong Gao, Xiaoxiao Xu, Jinliang Xu 
06 Mar 2012-Entropy
TL;DR: The results show that the OET will appear for the temperature ranges investigated when the critical temperatures of working fluids are lower than the waste heat temperatures by 18 ± 5 K under the pinch temperature difference in the evaporator.
Abstract: The subcritical Organic Rankine Cycle (ORC) with 28 working fluids for waste heat recovery is discussed in this paper. The effects of the temperature of the waste heat, the critical temperature of working fluids and the pinch temperature difference in the evaporator on the optimal evaporation temperature (OET) of the ORC have been investigated. The second law efficiency of the system is regarded as the objective function and the evaporation temperature is optimized by using the quadratic approximations method. The results show that the OET will appear for the temperature ranges investigated when the critical temperatures of working fluids are lower than the waste heat temperatures by 18 ± 5 K under the pinch temperature difference of 5 K in the evaporator. Additionally, the ORC always exhibits the OET when the pinch temperature difference in the evaporator is raised under the fixed waste heat temperature. The maximum second law efficiency will decrease with the increase of pinch temperature difference in the evaporator.

Journal ArticleDOI
02 Nov 2012-Entropy
TL;DR: MEMD-enhanced MMSE is able to distinguish the smaller differences before and after the use of the vibration shoes in both directions, and is more powerful than the empirical mode decomposition (EMD)-enhanced MSE in each individual direction.
Abstract: Falls are unpredictable accidents and the resulting injuries can be serious for the elderly. A preventative solution can be the use of a white-noise vibration stimulus to improve the sense of balance. In this work, a pair of vibration shoes was developed and controlled by a touch-type switch which can generate mechanical vibration noise to stimulate the patient's feet while wearing the shoes. In order to evaluate the balance stability and treatment effect of the vibrating insoles in these shoes, the multivariate multiscale entropy (MMSE) algorithm is applied to calculate the relative complexity index of center of pressure (COP) signals in the antero-posterior and medio-lateral directions, reconstructed by multivariate empirical mode decomposition (MEMD). The results show that the balance stability of 61.5% of the elderly subjects is improved after wearing the developed shoes.

Journal ArticleDOI
19 Apr 2012-Entropy
TL;DR: A detailed study of the variations in trends and lengths of 1554 named streets and 6004 street segments, forming a part of the evolving street network of the city of Dundee in East Scotland, shows strong linear correlations with the scaling exponents.
Abstract: Many natural and man-made lineaments form networks that can be analysed through entropy and energy considerations. Here we report the results of a detailed study of the variations in trends and lengths of 1554 named streets and 6004 street segments, forming a part of the evolving street network of the city of Dundee in East Scotland. Based on changes in the scaling exponents (ranging from 0.24 to 3.89), the streets can be divided into 21 populations. For comparison, we analysed 221 active crustal fractures in Iceland that (a) are of similar lengths as the streets of Dundee; (b) are composed of segments; and (c) form evolving networks. The streets and fractures follow power-law size distributions (validated through various statistical tests) that can be partly explained in terms of the energies needed for their formation. The entropies of the 21 street populations and 9 fracture populations show strong linear correlations with (1) the scaling exponents (R² = 0.845–0.947 for streets, R² = 0.859 for fractures) and with (2) the length ranges, that is, the differences between the longest and shortest streets/fractures (R² = 0.845–0.906 for streets, R² = 0.927 for fractures).

Journal ArticleDOI
05 Nov 2012-Entropy
TL;DR: A novel approach to behavioral evolutionary questions is used, using tools drawn from information theory, algorithmic complexity and the thermodynamics of computation to support an intuitive assumption about the near optimal structure of a physical environment that would prove conducive to the evolution and survival of organisms.
Abstract: In evolutionary biology, attention to the relationship between stochastic organisms and their stochastic environments has leaned towards the adaptability and learning capabilities of the organisms rather than toward the properties of the environment. This article is devoted to the algorithmic aspects of the environment and its interaction with living organisms. We ask whether one may use the fact of the existence of life to establish how far nature is removed from algorithmic randomness. The paper uses a novel approach to behavioral evolutionary questions, using tools drawn from information theory, algorithmic complexity and the thermodynamics of computation to support an intuitive assumption about the near optimal structure of a physical environment that would prove conducive to the evolution and survival of organisms, and sketches the potential of these tools, at present alien to biology, that could be used in the future to address different and deeper questions. We contribute to the discussion of the algorithmic structure of natural environments and provide statistical and computational arguments for the intuitive claim that living systems would not be able to survive in completely unpredictable environments, even if adaptable and equipped with storage and learning capabilities by natural selection (brain memory or DNA).
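One standard computable proxy for the algorithmic randomness this abstract invokes is compressibility: an incompressible sequence is close to algorithmically random, while a structured (predictable) environment compresses well. A minimal sketch using a general-purpose compressor (the two example "environments" are illustrative, not data from the paper):

```python
import zlib
import random

def compression_ratio(s: bytes) -> float:
    """Compressed size / original size: near 1.0 means close to
    incompressible (algorithmically random); small means structured."""
    return len(zlib.compress(s, 9)) / len(s)

random.seed(0)
structured = b"sunny rainy " * 500                         # highly regular environment
noisy = bytes(random.getrandbits(8) for _ in range(6000))  # unpredictable environment

print(compression_ratio(structured), compression_ratio(noisy))
```

The gap between the two ratios illustrates the paper's intuition: an organism's learning machinery can only pay off in environments that sit well below the incompressible extreme.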

Journal ArticleDOI
05 Nov 2012-Entropy
TL;DR: The paper is concerned with Shannon sampling reconstruction formulae of derivatives of bandlimited signals as well as of derivative of their Hilbert transform, and their application to Boas-type formulAE for higher order derivatives, and the essential aim is to extend results to non-bandlimited signals.
Abstract: The paper is concerned with Shannon sampling reconstruction formulae of derivatives of bandlimited signals, as well as of derivatives of their Hilbert transform, and their application to Boas-type formulae for higher order derivatives. The essential aim is to extend these results to non-bandlimited signals. A basic fact is that, in these extensions, aliasing error terms must now be added to the bandlimited reconstruction formulae. These errors will be estimated in terms of the distance functional recently introduced by the authors for extending basic relations valid for bandlimited functions to larger function spaces. This approach can be regarded as a mathematical foundation of the aliasing error analysis arising in many applications.
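The starting point of such formulae is the classical Whittaker-Shannon interpolation, which is exact for bandlimited signals; for non-bandlimited signals the discrepancy is precisely the aliasing error the paper estimates. A numerical sketch of the bandlimited case (signal, sampling rate, and evaluation window are illustrative assumptions):

```python
import numpy as np

def shannon_reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation: f(t) = sum_k f(kT) sinc((t - kT)/T).

    Exact for signals bandlimited to |omega| < pi/T; for non-bandlimited
    signals the difference is the aliasing error discussed in the paper.
    """
    k = np.arange(len(samples))
    # np.sinc(x) = sin(pi x) / (pi x), matching the normalized sinc above
    return np.sum(samples[None, :] * np.sinc((t[:, None] - k * T) / T), axis=1)

# Bandlimited test signal: f(t) = sin(2*pi*f0*t), f0 = 1 Hz, sampled at 10 Hz
f0, T = 1.0, 0.1
k = np.arange(200)
samples = np.sin(2 * np.pi * f0 * k * T)

t = np.linspace(5.0, 15.0, 101)   # stay away from the edges of the sample window
f_true = np.sin(2 * np.pi * f0 * t)
f_rec = shannon_reconstruct(samples, T, t)
max_err = np.max(np.abs(f_rec - f_true))
print(max_err)
```

The residual error here comes only from truncating the infinite sum; replacing the sine with a non-bandlimited signal would add the aliasing term that the paper's distance functional bounds.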

Journal ArticleDOI
02 Mar 2012-Entropy
TL;DR: It is shown that interval entropy can uniquely determine the distribution function and a measure of discrepancy between two lifetime distributions at the interval of time is proposed in base of Kullback-Leibler discrimination information.
Abstract: The Shannon interval entropy function, a useful dynamic measure of uncertainty for two-sided truncated random variables, has been proposed in the reliability literature. In this paper, we show that interval entropy can uniquely determine the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with residual and past measures of discrepancy and interval entropy, and we obtain its upper and lower bounds.
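A discrepancy measure of this kind compares the two densities after conditioning both on the lifetime falling inside the interval (t1, t2). A numerical sketch for two exponential lifetimes (the rates, interval, and midpoint-rule integration are illustrative assumptions, not the paper's construction):

```python
import numpy as np

def interval_kl(f1, F1, f2, F2, t1, t2, n=100_000):
    """Kullback-Leibler discrimination between two lifetime distributions,
    both conditioned on the lifetime lying in (t1, t2).

    f*, F* are the density and CDF; the integral uses a midpoint rule.
    """
    x = np.linspace(t1, t2, n + 1)
    xm = 0.5 * (x[:-1] + x[1:])           # midpoints of the subintervals
    dx = (t2 - t1) / n
    p = f1(xm) / (F1(t2) - F1(t1))        # density of X | t1 < X < t2
    q = f2(xm) / (F2(t2) - F2(t1))
    return float(np.sum(p * np.log(p / q)) * dx)

# Example: exponential lifetimes with rates 1.0 and 2.0 on (0.5, 3.0)
f_exp = lambda lam: (lambda x: lam * np.exp(-lam * x))
F_exp = lambda lam: (lambda x: 1 - np.exp(-lam * x))

kl = interval_kl(f_exp(1.0), F_exp(1.0), f_exp(2.0), F_exp(2.0), 0.5, 3.0)
print(kl)
```

As expected of a Kullback-Leibler-type quantity, the measure is non-negative and vanishes when the two lifetime distributions coincide on the interval.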

Journal ArticleDOI
04 Jan 2012-Entropy
TL;DR: Generalizations of the computationally tractable quasi-diagonal direct interaction approximation for inhomogeneous barotropic turbulent flows over topography are developed and statistical closures are formulated for large eddy simulations including subgrid models that ensure the same large scale statistical behavior as higher resolution closures.
Abstract: Statistical dynamical closures for inhomogeneous turbulence described by multi-field equations are derived based on renormalized perturbation theory. Generalizations of the computationally tractable quasi-diagonal direct interaction approximation for inhomogeneous barotropic turbulent flows over topography are developed. Statistical closures are also formulated for large eddy simulations including subgrid models that ensure the same large scale statistical behavior as higher resolution closures. The focus is on baroclinic quasigeostrophic and three-dimensional inhomogeneous turbulence although the framework is generally applicable to classical field theories with quadratic nonlinearity.

Journal ArticleDOI
16 May 2012-Entropy
TL;DR: The robustness of the MSE results was tested to confirm that the MSE analysis is consistent and yields the same results when 25% of the data are removed, making this approach suitable for the complexity analysis of rainfall, runoff, and RC time series.
Abstract: This paper presents a novel framework for the complexity analysis of rainfall, runoff, and runoff coefficient (RC) time series using multiscale entropy (MSE). The MSE analysis of RC time series was used to investigate changes in the complexity of rainfall-runoff processes due to human activities. Firstly, a coarse graining process was applied to a time series. The sample entropy was then computed for each coarse-grained time series, and plotted as a function of the scale factor. The proposed method was tested in a case study of daily rainfall and runoff data for the upstream Wu–Tu watershed. Results show that the entropy measures of rainfall time series are higher than those of runoff time series at all scale factors. The entropy measures of the RC time series are between the entropy measures of the rainfall and runoff time series at various scale factors. Results also show that the entropy values of rainfall, runoff, and RC time series increase as scale factors increase. The changes in the complexity of RC time series indicate the changes of rainfall-runoff relations due to human activities, and provide a reference for model selection: rainfall-runoff models capable of dealing with great complexity and accounting for obvious self-similarity can be suggested for the modeling of rainfall-runoff processes. Moreover, the robustness of the MSE results was tested to confirm that the MSE analysis is consistent and yields the same results when 25% of the data are removed, making this approach suitable for the complexity analysis of rainfall, runoff, and RC time series.
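The coarse graining step named in this abstract is simply an average over consecutive, non-overlapping windows of length equal to the scale factor; sample entropy is then computed on each coarse-grained series. A minimal sketch of that first step:

```python
import numpy as np

def coarse_grain(x, scale):
    """Coarse-graining step of multiscale entropy: average consecutive,
    non-overlapping windows of length `scale`, dropping any remainder."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale
    return x[:n].reshape(-1, scale).mean(axis=1)

x = np.arange(10, dtype=float)     # 0, 1, ..., 9
print(coarse_grain(x, 2))          # [0.5 2.5 4.5 6.5 8.5]
print(coarse_grain(x, 3))          # [1. 4. 7.]
```

Plotting sample entropy of `coarse_grain(x, s)` against the scale factor `s` yields the MSE curves the study compares across rainfall, runoff, and RC series.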

Journal ArticleDOI
23 Jan 2012-Entropy
TL;DR: Simulated data show that the permutation entropy, topological entropy and the modified permutation entropy are more appropriate to quantify the uncertainty associated with a time series than those based on the standard deviation or other measures of dispersion.
Abstract: Several measures of volatility have been developed in order to quantify the degree of uncertainty of an energy price series, which include historical volatility and price velocities, among others. This paper suggests using the permutation entropy, topological entropy and the modified permutation entropy as alternatives to measure volatility in energy markets. Simulated data show that these measures are more appropriate to quantify the uncertainty associated with a time series than those based on the standard deviation or other measures of dispersion. Finally, the proposed method is applied to some typical electricity markets: Nord Pool, Ontario, Omel and four Australian markets.
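Permutation entropy, the measure common to this abstract and the survey that opens this issue, is the Shannon entropy of the ordinal patterns (order relations) appearing in windows of a fixed length. A minimal sketch (order 3 and the example series are illustrative choices, not the paper's market data):

```python
import math
import random

def permutation_entropy(series, order=3, normalize=True):
    """Permutation entropy: Shannon entropy of the distribution of
    ordinal patterns over sliding windows of length `order`."""
    counts = {}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # Ordinal pattern: argsort of the values in the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    if normalize:                      # scale to [0, 1] by log(order!)
        h /= math.log(math.factorial(order))
    return h

rising = list(range(100))              # fully predictable "price" series
print(permutation_entropy(rising))     # 0.0: only one ordinal pattern occurs

random.seed(0)
noisy = [random.random() for _ in range(10_000)]
print(permutation_entropy(noisy))      # close to 1.0 for white noise
```

A flat, predictable price series scores near 0 while an erratic one scores near 1, which is why these ordinal measures can serve as volatility proxies for electricity prices.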