
Showing papers on "Entropy (information theory) published in 1976"


Journal ArticleDOI
01 Jun 1976-Nature
TL;DR: This article shows that enthalpy and entropy estimates from kinetic and equilibrium data are highly correlated, varying linearly with one another (the enthalpy–entropy compensation effect), an effect that can in most cases be explained simply as a statistical or data-handling artefact.
Abstract: TYPICALLY, enthalpy and entropy estimates from kinetic and equilibrium data are highly correlated, varying in a linear fashion with one another (enthalpy–entropy compensation effect). This effect can be readily explained in most cases simply as a statistical or data handling artefact. The statistical analysis presented here reveals three novel insights. First, the enthalpy and entropy parameter estimates are highly correlated, such that estimated correlation coefficients > 0.95, say, do not imply chemical causation. Second, enthalpy and entropy estimates are distributed by experimental and measurement errors in elliptical probability regions that are very elongated and appear as lines. The slope of such lines is the harmonic mean of the experimental temperatures. Third, estimates of enthalpy and free energy at the harmonic mean of the experimental temperatures are not statistically correlated, so any observed structured variation between these parameter estimates arises from the chemical effect alone. Note that, since the thermodynamic potentials are interrelated by the Maxwell relationships, a correlation between any two potentials can be transformed to give the corresponding correlation between any other two. We now discuss these results to resolve a number of issues concerning a much disputed data set.
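To make the artefact concrete, here is a minimal simulation sketch (all temperatures, thermodynamic values, and noise levels are hypothetical, not taken from the paper): van 't Hoff fits to noisy equilibrium data yield ΔH and ΔS estimates that scatter along a line whose slope is close to the harmonic mean of the experimental temperatures.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 8.314                                            # gas constant, J/(mol K)
T = np.array([280.0, 290.0, 300.0, 310.0, 320.0])    # assumed temperatures, K
dH_true, dS_true = -40e3, -80.0                      # assumed "true" values

dH_est, dS_est = [], []
for _ in range(2000):
    # van 't Hoff: ln K = -dH/(R T) + dS/R, plus measurement noise
    lnK = -dH_true / (R * T) + dS_true / R + rng.normal(0.0, 0.05, T.size)
    slope, intercept = np.polyfit(1.0 / T, lnK, 1)
    dH_est.append(-slope * R)                        # dH estimate from slope
    dS_est.append(intercept * R)                     # dS estimate from intercept

scatter_slope = np.polyfit(dS_est, dH_est, 1)[0]     # slope of dH-vs-dS cloud, K
T_hm = T.size / np.sum(1.0 / T)                      # harmonic mean temperature, K
print(f"scatter slope = {scatter_slope:.0f} K, harmonic mean T = {T_hm:.0f} K")
```

The two printed values nearly coincide even though the simulated ΔH and ΔS are fixed, illustrating why a tight ΔH–ΔS line alone does not imply chemical causation.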

191 citations


Journal ArticleDOI
TL;DR: This paper considers the problem of efficient transmission of vector sources over a digital noiseless channel; it gives the optimal decorrelating scheme for a source whose components are dependent and treats the problem of selecting the optimum characteristic of the encoding scheme such that the overall mean-squared error is minimized.
Abstract: This paper considers the problem of efficient transmission of vector sources over a digital noiseless channel. It treats the problem of optimal allocation of the total number of available bits to the components of a memoryless stationary vector source with independent components. This allocation is applied to various encoding schemes, such as minimum mean-square error, sample-by-sample quantization, or entropy quantization. We also give the optimally decorrelating scheme for a source whose components are dependent and treat the problem of selecting the optimum characteristic of the encoding scheme such that the overall mean-squared error is minimized. Several examples of encoding schemes, including the ideal encoder that achieves the rate-distortion bound, and of sources related to a practical problem are discussed.
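The high-rate bit-allocation rule at the heart of such schemes can be sketched in a few lines (a standard textbook form, not necessarily the paper's exact algorithm): each component receives the average rate plus half the base-2 log of the ratio of its variance to the geometric-mean variance.

```python
import numpy as np

def allocate_bits(variances, total_bits):
    """High-rate allocation: average rate plus half the log-variance excess."""
    v = np.asarray(variances, dtype=float)
    geo_mean = np.exp(np.mean(np.log(v)))            # geometric mean variance
    return total_bits / v.size + 0.5 * np.log2(v / geo_mean)

b = allocate_bits([4.0, 1.0, 0.25], total_bits=9)
print(b, b.sum())   # [4. 3. 2.] 9.0 -- fractional or negative allocations
                    # would need rounding and clipping in a real coder
```

By construction the log-ratio terms sum to zero, so the allocations always exhaust exactly the total bit budget.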

184 citations


ReportDOI
26 Mar 1976
TL;DR: For the univariate case, the maximum entropy (Burg) technique is as good as any of the methods considered, and it is particularly advantageous for short data segments.
Abstract: A comparison of several methods for spectral estimation of a univariate process with equi-spaced samples, including maximum entropy, linear predictive, and autoregressive techniques, is made. The comparison is conducted via simulation for situations both with and without bad (or missing) data points. The case of bad data points required extensions of existing techniques in the literature and is documented fully here in the form of processing equations and FORTRAN programs. It is concluded that the maximum entropy (Burg) technique is as good as any of the methods considered, for the univariate case. The methods considered are particularly advantageous for short data segments. This report also reviews several available techniques for spectral analysis under different states of knowledge and presents the interrelationships of the various approaches in a consistent notation. Hopefully, this non-rigorous presentation will clarify this method of spectral analysis for readers who are nonexpert in the field.
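Since the report's FORTRAN programs are not reproduced here, the following is a compact Python sketch of the Burg recursion (the test signal and parameters are illustrative): it estimates reflection coefficients by minimizing the combined forward and backward prediction error, then evaluates the implied all-pole spectrum.

```python
import numpy as np

def burg(x, order):
    """Burg's maximum-entropy AR fit: returns AR coefficients and error power."""
    x = np.asarray(x, dtype=float)
    f, b = x.copy(), x.copy()      # forward / backward prediction errors
    a = np.array([1.0])            # AR polynomial, a[0] = 1
    e = np.mean(x**2)              # prediction error power
    for m in range(order):
        fm, bm = f[m + 1:], b[m:-1]
        k = -2.0 * np.dot(fm, bm) / (np.dot(fm, fm) + np.dot(bm, bm))
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([a, [0.0]])[::-1]
        f[m + 1:], b[m + 1:] = fm + k * bm, bm + k * fm
        e *= (1.0 - k * k)
    return a, e

def me_spectrum(a, e, freqs):
    """Maximum-entropy PSD: P(f) = e / |A(exp(j 2 pi f))|^2, f in cycles/sample."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(a.size)))
    return e / np.abs(z @ a) ** 2

rng = np.random.default_rng(1)
n = np.arange(128)                                   # deliberately short segment
x = np.sin(2 * np.pi * 0.2 * n) + 0.5 * rng.standard_normal(n.size)
a, e = burg(x, order=8)
freqs = np.linspace(0.0, 0.5, 256)
print(freqs[np.argmax(me_spectrum(a, e, freqs))])    # peak near 0.2
```

Even with only 128 samples the sinusoid's frequency is located sharply, which is the short-segment advantage the report highlights.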

110 citations


Journal ArticleDOI
C. H. Chen
TL;DR: This paper provides a fairly complete list of information and distance measures, including a new average conditional cubic entropy proposed by the author; major problem areas, such as computation with these measures, are examined, and methods of approach for the unresolved problems are suggested.

80 citations


Journal ArticleDOI
TL;DR: This paper analyses the problem of determining a structure for an automaton, optimal in some sense, from observations of its behaviour which are themselves uncertain, and shows that appropriate measures lead to the poorness-of-fit of admissible models of a probabilistic source being an entropy for that source.
Abstract: This paper analyses the problem of determining a structure for an automaton, optimal in some sense, from observations of its behaviour which are themselves uncertain. It is shown that extension of deterministic modelling techniques based on the Nerode equivalence to probabilistic sources gives meaningless results. The problem of approximate modelling with nondeterministic structures is rigorously formulated, leading to the concept of a space of admissible models. The special case where the observed behaviour may be represented as a symbol string is then analysed in terms of measures of string approximation. It is shown that appropriate measures lead to the poorness-of-fit of admissible models of a probabilistic source being an entropy for that source. The formulation is consistent with a computational complexity basis for probability theory and leads to natural expressions for the surprise at each observation and the uncertainty as to the next observation. An implemented algorithm for this modelling process is then described with examples of its application to: probabilistic sources; sampled deterministic sources; grammatical inference; human behaviour; and program derivation from traces.
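The "surprise" and "uncertainty" expressions have the usual information-theoretic forms; a toy sketch (the predictive distribution below is invented, not the paper's algorithm): surprise at a symbol s is −log₂ p(s) under the current model, and the uncertainty about the next symbol is the entropy of the model's predictive distribution.

```python
import math

predictive = {"a": 0.7, "b": 0.2, "c": 0.1}   # hypothetical next-symbol model
surprise = {s: -math.log2(p) for s, p in predictive.items()}
uncertainty = -sum(p * math.log2(p) for p in predictive.values())
print(surprise)                                # bits of surprise per symbol
print(f"uncertainty about next symbol = {uncertainty:.3f} bits")
```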

68 citations


Book ChapterDOI
01 Jan 1976
TL;DR: The economic process is a partial process that, like all partial processes, is circumscribed by a boundary across which matter and energy are exchanged with the rest of the material universe.
Abstract: This chapter presents the entropy law and the economic problem. The economic process is a partial process that, like all partial processes, is circumscribed by a boundary across which matter and energy are exchanged with the rest of the material universe. The answer to the question of what this material process does is simple: it neither produces nor consumes matter-energy; it only absorbs matter-energy and throws it out continuously. From the viewpoint of thermodynamics, matter-energy enters the economic process in a state of low entropy and comes out of it in a state of high entropy. The entropy law states that the entropy—that is, the amount of bound energy—of a closed system continuously increases, or that the order of such a system steadily turns into disorder. Practically all organisms live on low entropy in the form found immediately in the environment.

66 citations


Journal ArticleDOI
TL;DR: The concept of Markov chains, applied to stratigraphic sections, is reliable in analyzing cyclic patterns in lithologic successions, and entropy for the whole system of sedimentation is introduced to discuss the variability of conditions in the depositional processes.
Abstract: The concept of Markov chains, applied to stratigraphic sections, is reliable in analyzing cyclic patterns in lithologic successions. Randomness in the occurrence of lithologies repeating in a succession is generally evaluated in terms of entropies which can be calculated for the Markov matrix equated with the succession. Two types of entropies pertain to every state: one is relevant to the Markov matrix expressing the upward transitions, and the other to the matrix expressing the downward transitions. For a given state, these two entropies form an entropy set, corresponding to the degree of randomness in its linking with the states that occur as its precursor and its successor, respectively. The entropy sets calculated for all state variables therefore serve as a reliable criterion for discriminating the cyclic pattern of a succession. Based on the entropy sets, we are able to classify the various patterns into asymmetric, symmetric, and random cycles, which are also exhibited in actual lithologic successions. The entropy sets are calculated for Markov matrices which have been reported from a number of areas in the world, and compared with the cyclic patterns supposed there. Entropy for the whole system of sedimentation is also introduced to discuss the variability of conditions in the depositional processes.
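A small illustration of how such an entropy set can be computed (the coded succession below is invented): tabulate upward transitions, take the transposed counts for downward transitions, and compute each state's row entropy in both matrices.

```python
import numpy as np

succession = list("ABCABCABACBACB")        # invented lithology codes
states = sorted(set(succession))
idx = {s: i for i, s in enumerate(states)}

up = np.zeros((len(states), len(states)))  # upward (successor) transition counts
for a, b in zip(succession, succession[1:]):
    up[idx[a], idx[b]] += 1

def row_entropies(counts):
    """Entropy of each state's transition distribution, with 0 log 0 = 0."""
    p = counts / counts.sum(axis=1, keepdims=True)
    return np.where(p > 0, -p * np.log2(np.where(p > 0, p, 1.0)), 0.0).sum(axis=1)

h_up, h_down = row_entropies(up), row_entropies(up.T)  # transpose gives precursors
for s in states:
    print(s, round(h_up[idx[s]], 3), round(h_down[idx[s]], 3))
```

Low entropy in both directions marks a state locked into a fixed cycle; high entropy in both marks random stacking, which is the basis of the asymmetric/symmetric/random classification.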

62 citations


Journal ArticleDOI
A. Wehrl
TL;DR: In this article, it was shown that weak convergence of a sequence of density matrices towards a density matrix already implies convergence with respect to the trace norm topology, and that the set of all density matrices with finite entropy is of the first category.
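For context, the entropy in question is the von Neumann entropy S(ρ) = −Tr(ρ log ρ); here is a small numeric check on two standard density matrices (a sketch for orientation only, unrelated to the paper's proofs).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 log 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

rho_pure = np.array([[1.0, 0.0], [0.0, 0.0]])      # pure state: S = 0
rho_mixed = np.eye(2) / 2.0                        # maximally mixed: S = log 2
print(von_neumann_entropy(rho_pure), von_neumann_entropy(rho_mixed))
```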

45 citations


Journal ArticleDOI
TL;DR: As the transmission rate R gets large, differential pulse-code modulation (PCM) followed by entropy coding forms a source encoding system which performs within 1.53 dB of Shannon's rate distortion function, which bounds the performance of any encoding system under a minimum mean-square error (mmse) fidelity criterion.
Abstract: As the transmission rate R gets large, differential pulse-code modulation (PCM), when followed by entropy coding, forms a source encoding system which performs within 1.53 dB of Shannon's rate distortion function, which bounds the performance of any encoding system with a minimum mean-square error (mmse) fidelity criterion. This is true for any ergodic signal source. Furthermore, this source encoder introduces the same amount of uncertainty as the mmse encoder. The 1.53 dB difference between this encoder and the mmse encoder is perceptually so small that it would probably not be noticed by a human user of a high quality (signal-to-noise ratio (S/N) ≥ 30 dB) speech or television source encoding system.
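The 1.53 dB figure is the classical high-rate gap between an entropy-coded uniform quantizer and the rate-distortion bound, 10·log₁₀(2πe/12) dB, equivalently about a quarter bit per sample; a one-line check:

```python
import math

gap_db = 10 * math.log10(2 * math.pi * math.e / 12)      # distortion gap in dB
gap_bits = 0.5 * math.log2(2 * math.pi * math.e / 12)    # same gap in bits/sample
print(f"{gap_db:.3f} dB = {gap_bits:.3f} bit/sample")    # 1.533 dB, 0.255 bit
```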

39 citations


Journal ArticleDOI
01 May 1976
TL;DR: The article presents the complex form of the maximum entropy method for estimating the power spectral density of a sequence of complex-valued samples.
Abstract: The article presents the complex form of the maximum entropy method for estimating the power spectral density of a sequence of complex-valued samples.

35 citations


Journal ArticleDOI
Gregory J. Chaitin
TL;DR: A theory of the entropy of recursively enumerable sets of objects is proposed which includes the previous theory as the special case of sets having a single element.
Abstract: In a previous paper a theory of program size formally identical to information theory was developed. The entropy of an individual finite object was defined to be the size in bits of the smallest program for calculating it. It was shown that this is −log₂ of the probability that the object is obtained by means of a program whose successive bits are chosen by flipping an unbiased coin. Here a theory of the entropy of recursively enumerable sets of objects is proposed which includes the previous theory as the special case of sets having a single element. The primary concept in the generalized theory is the probability that a computing machine enumerates a given set when its program is manufactured by coin flipping. The entropy of a set is defined to be −log₂ of this probability.

Journal ArticleDOI
TL;DR: The entropy measure H = −Σpᵢ log pᵢ is used with increasing frequency in the analysis of business and economic data, as discussed by the authors; however, it is simply another measure of dispersion which can be related to the moments of the probability function.
Abstract: The entropy measure H = −Σpᵢ log pᵢ is being used with increasing frequency in the analysis of business and economic data. It is, however, simply another measure of dispersion which can be related to the moments of the probability function. Its virtues stem from its decomposition and interpretative properties. This paper surveys the uses to which the measure has been put in the literature, and discusses whether its use has been appropriate and innovative.
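The decomposition property the survey emphasizes is the grouping identity H(total) = H(between groups) + Σ p_g · H(within group g); a minimal numeric check with made-up shares:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits, ignoring zero cells."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

groups = [np.array([0.30, 0.20]), np.array([0.25, 0.15, 0.10])]  # shares sum to 1
p_g = np.array([g.sum() for g in groups])          # group totals
within = sum(pg * H(g / pg) for pg, g in zip(p_g, groups))
print(H(np.concatenate(groups)))                   # total entropy
print(H(p_g) + within)                             # between + weighted within: equal
```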

Book ChapterDOI
J. Rissanen
TL;DR: This chapter introduces a criterion based on entropy, which is aimed at supplying the missing structure-dependent term, and discusses the numerical computations required for performing the minimization in the optimal estimation with respect to the parameter vectors θ.
Abstract: This chapter introduces a criterion based on entropy, which is aimed at supplying the missing structure-dependent term. The chapter introduces the problem, or at any rate what is thought to be a problem, and suggests a remedy in the form of a criterion that is based on a very broad and intuitively attractive principle. In all the studies known, the structure is estimated separately from the other parameters, in contrast with the approach taken in the chapter. The chapter discusses the numerical computations required for performing the minimization in the optimal estimation with respect to the parameter vectors θ. These calculations are the same as those required in the maximum-likelihood method.


Journal ArticleDOI
TL;DR: The Kalman-Bucy filter is derived for both the discrete-time and the continuous-time systems by an application of information theory, and the information structures of the optimal filter for a continuous-time nonlinear system are made clear.


Journal ArticleDOI
TL;DR: In this article, an information theoretic procedure for the prediction of collision probability matrices subject to a dynamic constraint is derived, discussed and illustrated by an example, where the symmetry of time-reversal invariance is imposed as a rigorous kinematic constraint in the process of minimizing the entropy deficiency.
Abstract: An information theoretic procedure for the prediction of collision probability matrices subject to a dynamic constraint is derived, discussed and illustrated by an example. The novel point in the derivation is the imposition of the symmetry of time-reversal invariance as a rigorous kinematic constraint in the process of minimizing the entropy deficiency. The resulting probability matrix is thereby guaranteed to be strictly symmetric. The structure of the probability matrix, as derived here, relates the energy requirements and energy disposal of the same reaction.

Journal ArticleDOI
TL;DR: Another expression for the entropy is introduced by considering the variation in the ages at which offspring will be produced by newborn individuals, and the relation between these two measures of entropy and their biological significance are discussed.

Journal ArticleDOI
TL;DR: It is argued that any reasonable measure D(ν|m) of the dispersion of an m-continuous probability measure ν relative to a reference measure m should satisfy the following natural conditions: it should be formally reasonable, restrictedly continuous, invariant under density-preserving maps, quasi-subadditive and additive.

Journal ArticleDOI
TL;DR: It is concluded that entropy should, first and foremost, be regarded as a technique to expand the methods of statistical inference and hypothesis testing, rather than one of theory construction.
Abstract: The framework of Bayesian inference is proposed as a structure for unifying those highly disparate approaches to entropy modelling that have appeared in geography to date, and is used to illuminate the possibilities and shortcomings of some of these models. The inadequacy of most descriptive entropy statistics for measuring the information in a spatially-autocorrelated map is described. The contention that entropy maximization in itself provides theoretical justification for spatial models is critically evaluated. It is concluded that entropy should, first and foremost, be regarded as a technique to expand our methods of statistical inference and hypothesis testing, rather than one of theory construction.

Journal ArticleDOI
Donald R. Moscato
TL;DR: In this paper, the use of the entropy measure, H = −Σpᵢ log pᵢ, is investigated as a plausible alternative to current procedure, and an analysis of the results will compare the entropy approach with existing practice.
Abstract: Given the objective of determining the degree of part commonality in a product line, this study discusses several methods of analysis which are possible. The use of the entropy measure, H = −Σpᵢ log pᵢ, is investigated as a plausible alternative to current procedure. The methodology will incorporate a simulation of several types of distributions of part usage which might occur in a typical application. An analysis of the results will compare the entropy approach with existing practice.


Journal ArticleDOI
TL;DR: In this paper, a model of the objective complexity of production systems involving a single person and usually some machine is developed, which can be described and measured in terms of a physical component and a mental component.
Abstract: A model of the objective complexity of production systems involving a single person and usually some machine, is developed. Objective complexity can be described and measured in terms of a physical component and a mental component. It is defined as task entropy and measured in terms of information processing rates. Empirical data are offered to validate the model and to postulate a channel capacity theorem. This is the maximum working capacity or total demand which can be placed on a human being and that fraction of capacity which is mobilized for long working periods. The impact of technology on the two components of complexity, taken separately and in combination, is analyzed and discussed.


Journal ArticleDOI
W. Adams, Jr.
TL;DR: The entropy of the output of three adaptive source encoders, AΔM, APCM, and ADPCM, is measured for speech inputs at low sampling rates, and the source encoder output entropies are correlated with the results of informal listening tests.
Abstract: The entropy of the output of three adaptive source encoders, AΔM, APCM, and ADPCM, is measured for speech inputs at low sampling rates. An attempt is made to correlate the source encoder output entropies, normalized to the channel bit rate, with the results of informal listening tests. It was hypothesized that the channel bit rate utilization factor would predict subjective ranking of the source encoders. The results of measurements do not support this hypothesis; an intuitive explanation is offered.
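The measurement itself is straightforward to sketch (a toy 2-bit quantizer on a synthetic signal stands in for the paper's AΔM/APCM/ADPCM encoders): estimate the empirical entropy of the output symbols and normalize by the channel bit rate.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(8000) / 8000.0
x = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)  # toy "speech"

bits = 2                                            # channel rate: 2 bits/sample
levels = np.clip(np.floor((x + 1.5) / 3.0 * 2**bits), 0, 2**bits - 1)
_, counts = np.unique(levels, return_counts=True)
p = counts / counts.sum()                           # empirical symbol distribution
H = -np.sum(p * np.log2(p))                         # output entropy, bits/sample
print(f"output entropy {H:.2f} bits/sample, utilization {H / bits:.2f}")
```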


Journal ArticleDOI
TL;DR: The purpose of this paper is to study fixed-point smoothing problems from the viewpoint of information theory; it proves that the necessary and sufficient condition for maximizing the mutual information between a state and a smoothed estimate is to minimize the entropy of the smoothed estimation error.
Abstract: The purpose of this paper is to study fixed-point smoothing problems from the viewpoint of information theory. For a linear stochastic system, it is proved that the necessary and sufficient condition for maximizing the mutual information between a state and the smoothed estimate is to minimize the entropy of the smoothed estimation error. Based on this relation between the mutual information and the error entropy, the optimal fixed-point smoothing estimator for a discrete-time linear system is derived. Furthermore, a similar estimator for a continuous-time linear system is also considered by an analogous approach.
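The link between mutual information and error entropy rests on the Gaussian identity h(e) = ½·log((2πe)ⁿ·det P), so minimizing the error entropy amounts to minimizing the determinant of the error covariance P; a numeric illustration with made-up covariances:

```python
import numpy as np

def gaussian_entropy(P):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance P."""
    n = P.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(P))

P_filtered = np.array([[2.0, 0.3], [0.3, 1.0]])   # made-up filtering covariance
P_smoothed = np.array([[1.2, 0.2], [0.2, 0.7]])   # smoothing shrinks the covariance
print(gaussian_entropy(P_filtered), gaussian_entropy(P_smoothed))  # entropy drops
```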

Journal ArticleDOI
Bruce G. Taylor
01 Aug 1976
TL;DR: In this paper, the information properties of delta-coded speech with channel-encoding applications are analyzed, and predictive coding techniques are evaluated by comparison with the relative entropy of Markov process approximations to the message-generating process for orders up to 9.
Abstract: The paper presents an analysis of the information properties of delta-coded speech with channel-encoding applications. Predictive coding techniques for reducing the entropy of the average distribution of the signal elements are described, and evaluated by comparison with computations of the relative entropy of Markov process approximations to the message-generating process, for orders up to 9. The optimal digital fixed-structure predictors are established for orders up to 7, and the optimal group codes are established for block lengths of 2 to 6 elements. The redundancy is shown to be typically about one half, and a predictor success probability of 0.9 is attainable with a practical 6th-order discrete structure. The entropies of the sequences, which are generated by modulo-2 addition of the predictions and source elements, are found to be much closer to the process entropies than are the corresponding performance characteristics for group encoding. In order to achieve message compression, for encoding the predictor error sequence, 5-element group encoding (attaining a compression factor of 0.5 with a 300-word buffer) is found to be superior to run-length encodings. The combination transformation is much more efficient than direct exact coding of blocks of source elements.
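A toy version of the combination transformation (the stream and the predictor below are invented, and far simpler than the paper's order-7 structures): predict each bit from its k predecessors, combine prediction and source bit by modulo-2 addition, and compare the empirical entropies of the source and residual streams.

```python
import numpy as np
from collections import Counter, defaultdict

def bit_entropy(bits):
    """Empirical zeroth-order entropy of a bit sequence, in bits/symbol."""
    c = Counter(bits)
    p = np.array(list(c.values()), dtype=float) / len(bits)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(2)
src = [0, 1] * 500                                  # idealized delta-coded stream
src = [b ^ int(rng.random() < 0.1) for b in src]    # 10% random "surprises"

k = 3                                               # predictor order (toy value)
table = defaultdict(Counter)
for i in range(k, len(src)):
    table[tuple(src[i - k:i])][src[i]] += 1         # learn context -> next-bit counts

residual = []
for i in range(k, len(src)):
    pred = table[tuple(src[i - k:i])].most_common(1)[0][0]
    residual.append(src[i] ^ pred)                  # modulo-2 combination

print(bit_entropy(src), bit_entropy(residual))      # residual entropy is lower
```

The residual stream is mostly zeros, so its entropy falls well below the roughly 1 bit/symbol of the raw stream, which is the redundancy that the paper's group encoding then exploits.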


Journal ArticleDOI
TL;DR: In this paper, a model of local breakdown is proposed for calculating the life span of load-carrying technical-grade rubber parts, based on the thermodynamics of irreversible processes and on the entropy criterion of breakdown.
Abstract: A model of local breakdown is proposed for calculating the life span of load-carrying technical-grade rubber parts, based on the thermodynamics of irreversible processes and on the entropy criterion of breakdown. The entropy hypothesis has been proved experimentally, and the numerical value of the critical specific entropy has been obtained for grade IRP-1347 rubber, this value being constant for a given material.