
Showing papers on "Entropy (information theory) published in 1975"


Journal ArticleDOI
TL;DR: An application is the construction of a uniformly universal sequence of codes for countable memoryless sources, in which the nth code has a ratio of average codeword length to source rate bounded by a function of n for all sources with positive rate.
Abstract: Countable prefix codeword sets are constructed with the universal property that assigning messages in order of decreasing probability to codewords in order of increasing length gives an average codeword length, for any message set with positive entropy, less than a constant times the optimal average codeword length for that source. Some of the sets also have the asymptotically optimal property that the ratio of average codeword length to entropy approaches one uniformly as entropy increases. An application is the construction of a uniformly universal sequence of codes for countable memoryless sources, in which the nth code has a ratio of average codeword length to source rate bounded by a function of n for all sources with positive rate; the bound is less than two for n = 0 and approaches one as n increases.
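As a concrete illustration of such a countable prefix codeword set, here is a minimal Python sketch of the Elias gamma code for the positive integers. It is offered as one representative member of this family (with messages assumed to be indexed 1, 2, 3, ... in order of decreasing probability), not as the paper's exact construction.

```python
def elias_gamma_encode(n: int) -> str:
    """Elias gamma codeword: floor(log2 n) zeros followed by n in binary.

    Messages are assumed to be indexed 1, 2, 3, ... in order of decreasing
    probability, so shorter codewords are assigned to likelier messages.
    The codeword length is 2*floor(log2 n) + 1, i.e. within a constant
    factor of log2 n for every n, which is the universal property at work.
    """
    if n < 1:
        raise ValueError("the gamma code is defined for positive integers")
    binary = bin(n)[2:]
    return "0" * (len(binary) - 1) + binary


def elias_gamma_decode(codeword: str) -> int:
    """Decode a single gamma codeword: count leading zeros, then read that many + 1 bits."""
    zeros = len(codeword) - len(codeword.lstrip("0"))
    return int(codeword[zeros:2 * zeros + 1], 2)


for n in (1, 2, 5, 17, 100):
    cw = elias_gamma_encode(n)
    assert elias_gamma_decode(cw) == n
    print(n, cw)
```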

1,306 citations


Journal ArticleDOI
TL;DR: In this article, a new spectral estimate, called the maximum entropy method, was described, which maximizes the information of a signal under the constraint that the estimated autocorrelation function of the signal is the Fourier transform of the spectral power density.
Abstract: A new spectral estimate, called the maximum entropy method, is described. This estimate was originated by John Parker Burg for use in seismic wave analysis. In the maximum entropy method the entropy, or information, of a signal is maximized under the constraint that the estimated autocorrelation function of the signal is the Fourier transform of the spectral power density. The spectral estimates are calculated in two ways: (1) by minimization of the error power to obtain the coefficients of the prediction error filter, as suggested by Burg, and (2) by a direct solution of the matrix equation using an algorithm due to Norman Levinson. For comparison a Blackman-Tukey technique, calculated with a Hamming window, is used also. We illustrate these three methods by applying them to a composite signal consisting of four sinusoids of unit amplitude: one each at high and low frequencies and two at moderate frequencies with respect to the Nyquist frequency, to which is added white noise of 0.5 amplitude. Results are shown to indicate that the best correspondence with the input spectrum is provided by the Burg technique. Applications of the maximum entropy method to geomagnetic micropulsations reveal complex multiplet structure in the Pc 4, 5 range. Such structure, not previously resolved by conventional techniques, has been predicted by a recent theory of magnetospheric resonances. In a period range 7 orders of magnitude longer than micropulsation periods, analysis of annual sunspot means shows that the 11-yr band is composed of at least three distinct lines. With each of these lines is associated a harmonic sequence. Long periods of the order of 100 yr also are revealed.
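For readers who want to experiment with the method, the following Python sketch implements the textbook Burg recursion for the prediction error filter coefficients and the associated maximum entropy (autoregressive) spectrum. It follows the conventional forward/backward error formulation; the toy test signal is an assumption of this sketch, not the composite signal used in the paper.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: AR coefficients a_1..a_M and prediction error power.

    The prediction error filter is [1, a_1, ..., a_M]; a textbook sketch,
    not the authors' code. Assumes order << len(x).
    """
    x = np.asarray(x, dtype=float)
    E = float(np.dot(x, x)) / len(x)   # zero-lag prediction error power
    a = np.zeros(0)                    # AR coefficients found so far
    f = x[1:].copy()                   # forward prediction errors
    b = x[:-1].copy()                  # backward prediction errors
    for _ in range(order):
        # reflection coefficient minimizing forward + backward error power
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.concatenate((a + k * a[::-1], [k]))   # Levinson-type update
        E *= 1.0 - k * k
        # update error sequences for the next order
        f, b = f[1:] + k * b[1:], b[:-1] + k * f[:-1]
    return a, E

def max_entropy_spectrum(a, E, freqs, dt=1.0):
    """Maximum entropy (AR) spectral estimate at the given frequencies."""
    k = np.arange(1, len(a) + 1)
    A = 1.0 + np.exp(-2j * np.pi * np.outer(freqs, k) * dt) @ a
    return E * dt / np.abs(A) ** 2

# toy demo (assumed example): two close sinusoids in white noise
rng = np.random.default_rng(0)
t = np.arange(512)
x = np.sin(2 * np.pi * 0.10 * t) + np.sin(2 * np.pi * 0.13 * t) + 0.5 * rng.standard_normal(512)
coeffs, err_power = burg_ar(x, order=20)
spectrum = max_entropy_spectrum(coeffs, err_power, freqs=np.linspace(0.0, 0.5, 256))
```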

100 citations


Journal ArticleDOI
TL;DR: A generalization of the Shannon-McMillan Theorem for the action of an amenable group on a probability space was obtained in this article, and interesting properties of the limit function were derived.
Abstract: A generalization of the Shannon-McMillan Theorem ($L^1$ version) is obtained for the action of an amenable group on a probability space, thereby settling a conjecture of Pickel and Stepin. Interesting properties of the limit function are derived. The entropy of an action of an amenable group is defined.

97 citations


Journal ArticleDOI
Te Sun Han1
TL;DR: In this article, a study of linear dependence relations among Shannon's and McGill's mutual informations as well as the lattice-theoretic description of them is presented, where the concept of entropy vector is defined as a functional on the set of admissible probability distributions by which the problem can be transformed to that of investigating the algebraic structure of the (correlative) entropy space.
Abstract: A study of linear dependence relations among Shannon's and McGill's (multiple) mutual informations, as well as a lattice-theoretic description of them, is presented. The concept of entropy vector is defined as a functional on the set of admissible probability distributions, by which the problem can be transformed to that of investigating the algebraic structure of the (correlative) entropy space. First, we give the bases of this space as well as several basis transformations and thus determine the dimension. Next, a set of admissible values which McGill's informations can simultaneously take is obtained by establishing the elementary distributions, where, as a by-product, a certain nonlinear dependence property is clarified. Finally, considerations on the duality of the lattice of random variables lead to the concept of dual (multiple) mutual informations, from which the duals of several theorems are derived.
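For orientation, McGill's multiple (interaction) information can be written, in one common sign convention (conventions differ in the literature, and the paper's own definitions should be taken as authoritative), as

$$I(X_1;X_2;X_3) \;=\; I(X_1;X_2) \;-\; I(X_1;X_2\mid X_3),$$

and, more generally, by inclusion-exclusion over joint entropies,

$$I(X_1;\dots;X_n) \;=\; -\sum_{\emptyset\neq T\subseteq\{1,\dots,n\}} (-1)^{|T|}\, H(X_T).$$

These are exactly the kinds of linear combinations of entropies whose dependence relations the entropy-space formulation makes explicit.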

78 citations


Journal ArticleDOI
01 Mar 1975-Genetics
TL;DR: The concept of entropy of a genotype is introduced and it is shown that in a random mating population in Hardy-Weinberg equilibrium and under slow selection, the rate of change of entropy is equal to the genetic variance in entropy minus the covariance in entropy and reproductive potential.
Abstract: This paper studies the properties of a new class of demographic parameters for age-structured populations and analyzes the effect of natural selection on these parameters. Two new demographic variables are introduced: the entropy of a population and the reproductive potential. The entropy of a population measures the variability of the contribution of the different age classes to the stationary population. The reproductive potential measures the mean of the contribution of the different age classes to the Malthusian parameter. The Malthusian parameter is precisely the difference between the entropy and the reproductive potential. The effect of these demographic variables on changes in gene frequency is discussed. The concept of entropy of a genotype is introduced and it is shown that in a random mating population in Hardy-Weinberg equilibrium and under slow selection, the rate of change of entropy is equal to the genetic variance in entropy minus the covariance in entropy and reproductive potential. This result is an information theoretic analog of Fisher's fundamental theorem of natural selection.
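As a hedged orientation on how such a decomposition can arise (using the standard Euler-Lotka setup; the paper's own sign and scaling conventions, under which the Malthusian parameter is a difference of the two quantities, should be taken from the text): with survivorship $l(a)$, fecundity $m(a)$, and Malthusian parameter $r$ defined by $\int e^{-ra}\,l(a)\,m(a)\,da = 1$, the weights $p(a) = e^{-ra}\,l(a)\,m(a)$ form a probability density over age, and one obtains

$$ rT \;=\; H + \Phi, \qquad H = -\int p(a)\log p(a)\,da, \quad \Phi = \int p(a)\log\bigl(l(a)m(a)\bigr)\,da, \quad T = \int a\,p(a)\,da, $$

so the growth rate splits into an entropy term, measuring the spread of the age-class contributions, and a reproductive-potential term, measuring their mean.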

43 citations


Journal ArticleDOI
Prem Nath
TL;DR: A new measure L(α), called average code length of order α, has been defined and its relationship with Renyi's entropy has been discussed and a coding theorem for discrete noiseless channels has been proved.
Abstract: A new measure L(α), called average code length of order α, has been defined and its relationship with Renyi's entropy has been discussed. Using L(α), a coding theorem for discrete noiseless channels has been proved.
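For context, Rényi's entropy of order α and the closely related exponentiated average codeword length (Campbell's form) are, with logarithms to the base D of the code alphabet,

$$ H_\alpha(P) \;=\; \frac{1}{1-\alpha}\,\log_D \sum_i p_i^{\alpha}, \qquad \alpha>0,\ \alpha\neq 1, $$

$$ L(t) \;=\; \frac{1}{t}\,\log_D \sum_i p_i\,D^{\,t\ell_i} \;\ge\; H_\alpha(P), \qquad \alpha=\frac{1}{1+t}, $$

where the $\ell_i$ are codeword lengths of a uniquely decodable code. Nath's measure L(α) is defined in the paper itself and may differ in detail from the Campbell form shown here; the inequality illustrates the kind of noiseless coding theorem involved.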

38 citations




Book ChapterDOI
01 Jan 1975
TL;DR: In this paper, a general form for densities which maximize the entropy in a class of distributions having specified values for the expectations of certain functions of the random variables is given, and characterizations of several well-known distributions are obtained.
Abstract: A general form is given for densities which maximize the entropy in a class of distributions having specified values for the expectations of certain functions of the random variables. Characterizations of several well-known distributions are obtained.
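The general form in question is the exponential-family density: if the constraints are $E[T_i(X)] = t_i$, $i=1,\dots,k$, on a support $S$, the entropy-maximizing density (when it exists) is

$$ f(x) \;=\; \exp\Bigl(\lambda_0 + \sum_{i=1}^{k}\lambda_i T_i(x)\Bigr), \qquad x\in S, $$

with the $\lambda_i$ chosen so that the constraints hold. Two familiar special cases: fixing $E[X]$ on $S=[0,\infty)$ gives the exponential distribution, and fixing $E[X]$ and $E[X^2]$ on $S=\mathbb{R}$ gives the normal distribution.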

26 citations


Journal ArticleDOI
TL;DR: This paper responds to two justifications given for the method, which has been applied to distributions of income, of traffic, of stock-price changes, and of types of brand-article purchases.
Abstract: In a growing body of literature, available partial knowledge is used to estimate the prior probability distribution p ≡ (p_1, ..., p_n) by maximizing entropy H(p) ≡ −Σ_i p_i log p_i, subject to constraints on p which express that partial knowledge. The method has been applied to distributions of income, of traffic, of stock-price changes, and of types of brand-article purchases. We shall respond to two justifications given for the method: (α) It is “conservative,” and therefore good, to maximize “uncertainty,” as (uniquely) represented by the entropy parameter. (β) One should apply the mathematics of statistical thermodynamics, which implies that the most probable distribution has highest entropy. Reason (α) is rejected. Reason (β) is valid when “complete ignorance” is defined in a particular way and both the constraint and the estimator's loss function are of certain kinds.
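As a small illustration of the mechanics (a generic sketch under an assumed single mean constraint, not code from the paper): the maximizer of H(p) subject to Σ_i p_i = 1 and a prescribed mean has the Gibbs form p_i ∝ exp(−λ v_i), and the multiplier λ can be found numerically.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_with_mean(values, target_mean):
    """Maximum-entropy distribution on `values` with a prescribed mean.

    Maximizing H(p) = -sum_i p_i log p_i subject to sum_i p_i = 1 and
    sum_i values_i * p_i = target_mean gives p_i proportional to
    exp(-lam * values_i); lam is found by root-finding on the mean.
    """
    v = np.asarray(values, dtype=float)

    def mean_for(lam):
        w = np.exp(-lam * (v - v.mean()))   # centred for numerical stability
        p = w / w.sum()
        return float(p @ v)

    lam = brentq(lambda l: mean_for(l) - target_mean, -50.0, 50.0)
    w = np.exp(-lam * (v - v.mean()))
    return w / w.sum()

# the familiar loaded-die illustration: faces 1..6 constrained to mean 4.5
p = maxent_with_mean(np.arange(1, 7), target_mean=4.5)
print(p.round(4), float(p @ np.arange(1, 7)))
```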

25 citations


Journal ArticleDOI
TL;DR: This paper proposes a modification of the initial Shannon information theory which takes into account the point of view of the receiver, and obtains the concept of negative information which may explain the practical phenomenon of information loss.
Abstract: This paper is the continuation of two articles which appeared in this journal, and presents an approach to the thermodynamics of general systems via information theory. It is composed of two parts. In the first part we propose a modification of the initial Shannon information theory which takes into account the point of view of the receiver. The concepts of 'effective entropy' and 'effective information' are introduced, and the relation between effective information and Shannon information is exhibited. We then obtain the concept of negative information, which may explain the practical phenomenon of information loss. In the second part of the paper, the interpretation of the results of the first part provides a model of state equation which would apply in the thermodynamics of general systems. The counterpart of the perfect gas, the perfect system, is investigated via evolution principles, and it is emphasized that the results so obtained agree with some practical phenomena. Some suggestions for further research a...

Journal ArticleDOI
TL;DR: This concise paper describes experiments with a television source encoder which consists of a differential PCM encoder followed by entropy coding and has the desirable property that it produces low noise in quiet areas of the picture and higher noise in busy areas of the picture.
Abstract: This concise paper describes experiments with a television source encoder which consists of a differential PCM encoder followed by entropy coding. This encoder converts analog television signals into a digital bit stream for digital transmission or storage. When optimized, this type of system is known to perform very close to the rate distortion bound. The differential PCM encoder uses a 16-level quantizer in low-entropy areas of the picture (quiet areas) but switches to a 6-level quantizer in high-entropy (busy) areas of the picture, which tend to fill up the buffer. This strategy avoids buffer overflow and has the desirable property that it produces low noise in quiet areas of the picture and higher noise in busy areas of the picture.
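A minimal sketch of the feedback structure described above, in Python; the previous-sample predictor, the step sizes, and the activity test are assumptions of this sketch and stand in for the paper's 16-level/6-level quantizers and buffer-occupancy switching rule.

```python
import numpy as np

def dpcm_encode(samples, busy_threshold=0.5, fine_step=0.05, coarse_step=0.25):
    """Toy DPCM loop with a switchable quantizer.

    Each sample is predicted by the previous reconstructed value; the
    prediction error is quantized with a fine step in quiet regions and a
    coarse step in busy regions (a crude stand-in for switching from a
    16-level to a 6-level quantizer as the transmission buffer fills).
    Returns the quantized prediction errors, which would then be fed to
    an entropy coder.
    """
    recon = 0.0
    quantized_errors = []
    for x in np.asarray(samples, dtype=float):
        err = x - recon                                      # prediction error
        step = coarse_step if abs(err) > busy_threshold else fine_step
        q = step * round(err / step)                         # mid-tread quantizer
        quantized_errors.append(q)
        recon += q                                           # decoder-tracked reconstruction
    return np.array(quantized_errors)
```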

Book
01 Jul 1975
TL;DR: In this article, an approach to geographical hypothesis testing based on the concept of expected information is presented, and the use of the spatial entropy formula in fitting continuous population density functions to cities is explored and some comparative tests with other methods of estimation are presented.
Abstract: This paper presents an approach to geographical hypothesis testing based on the concept of expected information. The expected information formula and its relationship to other entropy formulas are first introduced, and this concept is then used to test various hypotheses concerning the distribution of population and its density in the New York, London, and Los Angeles regions. A related method of analysis, based on the idea of deriving equivalent forms of the system in which entropy is maximized and expected information minimized, is then presented; this provides alternative ways in which the various hypotheses can be tested. Finally, the use of the spatial entropy formula in fitting continuous population density functions to cities is explored and some comparative tests with other methods of estimation are presented. The ease with which the entropy estimator method can be used in this manner is then offset against its disadvantages, and in conclusion these techniques are drawn together and evaluated.
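For reference, two of the quantities involved, written in one common form (the paper's own notation and normalization may differ): Kullback's expected information of a posterior distribution $q$ relative to a prior $p$, and a spatial entropy that weights each zone $i$ by its size $\Delta x_i$,

$$ I(q:p) \;=\; \sum_i q_i\log\frac{q_i}{p_i}, \qquad H \;=\; -\sum_i p_i\log\frac{p_i}{\Delta x_i}. $$

Hypotheses about population distributions can then be posed as priors $p$ and judged by the expected information of the observed distribution relative to them.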

Journal ArticleDOI
31 Oct 1975-Science
TL;DR: Under the Buckley Amendment, colleges and universities forfeit their federal support for research, as well as for students, from the Office of Education if they do not comply with the regulatory requirements of access to student files imposed by legislation introduced by the Senator from New York.
Abstract: follow that the end justifies the means in this case either. To use the vernacular of judicial dissent, I would have thought that the one member of the United States Senate wearing the Conservative Party label would have been especially alert to the evil of expanding federal regulatory power beyond its constitutional bounds simply because the spending power opens the gate. I doubt if anyone would assert that the student records of local schools and colleges are within the reach of direct federal criminal law. Under the Buckley Amendment, however, we forfeit our federal support for research as well as students from the Office of Education if we do not comply with the regulatory requirements of access to student files imposed by legislation introduced by the Senator from New York.

Journal ArticleDOI
TL;DR: The paper discusses the concept of entropy in hypothesis selection, and the conflict with maximum-likelihood methods, and in Markovian processes, wherein there is some conflict with implied steady-state behaviour.
Abstract: This paper deals with the irrelevance of entropy to decision. The entropy measure fails to discriminate between states and their significance, as is indicated by a study of a reconnaissance example for which the entropy measure has been suggested. If the reduction of uncertainty is an explicit objective, then it is important to examine the properties of uncertainty measures, and it is asked whether some of those that uniquely characterize entropy are relevant; other measures share some of these properties. The paper then discusses the concept of entropy in hypothesis selection, and the conflict with maximum-likelihood methods, and in Markovian processes, wherein there is some conflict with implied steady-state behaviour.

Journal ArticleDOI
Toby Berger
TL;DR: The study of information-singularity contributes to a more thorough understanding of the mathematical nature of information generation and elucidates the manner in which generation of information by a time series is critically dependent on the detailed behavior of the sample functions of its spectral representation.
Abstract: Random processes that generate information slower than linearly with time are termed information-singular. The study of information-singularity contributes to a more thorough understanding of the mathematical nature of information generation. Specifically, it elucidates the manner in which generation of information by a time series is critically dependent on the detailed behavior of the sample functions of its spectral representation. The main theorem states that any random sequence whose spectral representation has stationary independent increments with no Brownian motion component is information-singular in the mean-squared sense. The concept of information-singularity can be construed as a means for discriminating between deterministic and nondeterministic processes. It is felt that information-singularity fulfills this discriminating function in a physically more satisfying manner than does the classical Hilbert space theory of linear and nonlinear prediction. The desire for a still more satisfying discriminant motivates investigation of the class of random processes that retain their information-singularity even when corrupted by additive noise. In the case of strictly stationary processes, the discussion focuses on the relationship between information-singularity and zero entropy. Lastly, some alternative definitions of information-singularity are considered and several open problems are identified.

Journal ArticleDOI
TL;DR: This paper uses a relativistic formulation of the classical information theory of Shannon to describe the dynamics of a system as a function of its external entropy, and defines the concepts of 'organizability', 'own external entropy' and 'universe organizability'.
Abstract: In a preceding article which appeared in this journal we modified the classical information theory of Shannon (mainly by defining the concepts of 'effective information' and 'negative information') and in this way inferred a thermodynamic model for describing general systems. In the present paper we improve this approach by using a relativistic formulation which explicitly describes the basic relativistic feature of the information notion. After a careful analysis of the 'internal entropy' and 'external entropy' concepts, we apply these notions to the universe and show that the latter exhibits a universal constant. We introduce a Riemannian space to describe the dynamics of a system as a function of its external entropy, and the Lorentz transformation is given. We define the concepts of 'organizability', 'own external entropy' and 'universe organizability'. Composition laws for the organizability are given. Lastly we define the relativistic dynamics of general systems and some e...

Journal ArticleDOI
TL;DR: In this paper, a minimax entropy approach for estimating the reflection coefficients of a stationary random process from a short observation interval is proposed, which has a lattice type digital ladder structure.

Journal ArticleDOI
TL;DR: The generalized Boltzmann-Gibbs-Shannon entropy is characterized, which includes the discrete Shannon entropy and the continuous Boltzmann-Gibbs entropy as special cases.
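The two special cases referred to are, in the standard reading, the discrete and continuous entropies

$$ H \;=\; -\sum_i p_i\log p_i \qquad\text{and}\qquad H \;=\; -\int f(x)\log f(x)\,dx, $$

both of which can be viewed as the entropy of a distribution relative to a reference measure (counting measure in the first case, Lebesgue measure in the second); a generalized Boltzmann-Gibbs-Shannon entropy of this relative type contains both, though the paper's exact definition should be taken from the text.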

Journal ArticleDOI
TL;DR: Six functional concepts named detection limit, firmness, accuracy, precision, efficiency and cost are defined, and the redundancy of entropy is an indicator of the usefulness of an analytical method for routine analysis.
Abstract: The usefulness of information theory for the critical assessment of different analytical methods is shown. Six functional concepts named detection limit, firmness, accuracy, precision, efficiency and cost are defined. Their values, calculated for any multicomponent analytical procedure, are substituted into the expressions for the amount of information (I) and entropy (H). The comparison results so obtained should be supplemented by the information redundancy, which is a more useful criterion for assessing a method given the difficulty of interpreting absolute I and H values. The information redundancy value provides information about the sensitivity, resistance against chemical and physical agents, precision, accuracy, speed and cost of an analytical procedure. The redundancy of entropy is an indicator of the usefulness of an analytical method for routine analysis.
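A toy sketch of the sort of bookkeeping involved; the formulas below use the standard Shannon entropy and the usual relative redundancy R = 1 − H/H_max, and the probabilities are hypothetical, so the paper's own definitions of I, H and information redundancy may differ in detail.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def redundancy(p):
    """Relative redundancy R = 1 - H/H_max for n equally conceivable outcomes."""
    return 1.0 - entropy(p) / np.log2(len(p))

# hypothetical prior/posterior outcome probabilities for one analytical determination
prior = [0.25, 0.25, 0.25, 0.25]        # before the analysis
posterior = [0.90, 0.05, 0.03, 0.02]    # after the analysis
info_gained = entropy(prior) - entropy(posterior)   # amount of information I
print(info_gained, redundancy(posterior))
```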

Book ChapterDOI
01 Jan 1975
TL;DR: This note examines characterizations based on second-order entropy, via a theorem which closely parallels Theorem 13.2.1 of Kagan, Linnik and Rao [2].
Abstract: For certain distributions there exist information-theoretic characterizations based on the maximization of their Shannon entropy subject to certain conditions. This note examines characterizations based on second-order entropy, via a theorem which closely parallels Theorem 13.2.1 of Kagan, Linnik and Rao [2].
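Here "second-order entropy" is presumably Rényi's entropy of order 2,

$$ H_2(P) \;=\; -\log\sum_i p_i^{2} \qquad\text{or, for a density,}\qquad H_2(f) \;=\; -\log\int f(x)^2\,dx, $$

so the characterizations maximize $H_2$ rather than the Shannon entropy subject to the given side conditions.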

Journal Article
TL;DR: The measures of inaccuracy generalize the concept of entropy in Communication theory and are useful in Coding theory, Information theory and Inference, etc.
Abstract: In some special cases it helps in characterizing the measures of inaccuracy, and this application is pointed out in the next section. The measures of inaccuracy generalize the concept of entropy in communication theory and are useful in coding theory, information theory, inference, etc.

Journal ArticleDOI
01 Nov 1975
TL;DR: In this paper, the inverse Fourier transform of the maximum entropy spectrum is evaluated and shown to correspond to a reasonable nonzero extension of the autocorrelation function.
Abstract: The technique and the relative advantages of maximum entropy spectrum analysis have been discussed by Burg [1], [2]. Evaluation of the inverse Fourier transform of the maximum entropy spectrum shows that this method does, indeed, correspond to a reasonable nonzero extension of the autocorrelation function.

ReportDOI
25 Jun 1975
TL;DR: This tutorial paper describes the maximum entropy spectrum and the Burg technique for computing the prediction error power and prediction error filter coefficients in the associated spectral estimation formula.
Abstract: This tutorial paper describes the maximum entropy spectrum and the Burg technique for computing the prediction error power and prediction error filter coefficients in the associated spectral estimation formula. The maximum entropy spectrum is identical to the autoregressive spectral estimator. Also included in this paper is a discussion of the K-line spectrum, which is the wavenumber analogue of the frequency-domain maximum entropy spectrum, and the Burg technique modifications necessary for its implementation. The purpose of this paper is to provide a complete and self-contained account of the main features of the maximum entropy spectrum. Since many of the relevant mathematical derivations are not found in the formal published literature, they are incorporated in this paper. Supporting material and various sidelights of the maximum entropy spectrum appear in the appendices.
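For completeness, the spectral estimation formula in question has the standard autoregressive form: with prediction error power $P_M$, prediction error filter coefficients $a_1,\dots,a_M$, and sampling interval $\Delta t$,

$$ \hat{P}(f) \;=\; \frac{P_M\,\Delta t}{\Bigl|\,1+\sum_{k=1}^{M} a_k\,e^{-i2\pi f k\Delta t}\Bigr|^{2}}, \qquad |f|\le\frac{1}{2\Delta t}, $$

which is exactly the AR($M$) spectral estimator referred to above; the Burg technique supplies $P_M$ and the $a_k$ without windowing the data or the autocorrelation estimates.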

Journal ArticleDOI
TL;DR: It was to strike an information-theoretic balance between classification purity and predictive uncertainty that the entropy minimax method was formulated, and this method, which has been applied to numerous real-world data sets, involves finding a partition of feature space for which Si, the expected value of the conditional classification entropy, is a minimum.
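A minimal sketch of the quantity being minimized, assuming an empirical (sample-frequency) estimate over a fixed candidate partition of feature space; the search over partitions and the balancing against predictive uncertainty are the substance of the method itself and are not reproduced here.

```python
import numpy as np

def expected_conditional_entropy(cell_ids, labels):
    """Expected conditional classification entropy of a feature-space partition.

    cell_ids[i] is the partition cell containing sample i and labels[i] is
    its class; the function returns sum_c P(cell c) * H(class | cell c),
    the average classification entropy that an entropy-minimax style
    procedure seeks to minimize over candidate partitions.
    """
    cell_ids = np.asarray(cell_ids)
    labels = np.asarray(labels)
    n = len(labels)
    total = 0.0
    for c in np.unique(cell_ids):
        cls = labels[cell_ids == c]
        _, counts = np.unique(cls, return_counts=True)
        p = counts / counts.sum()
        total += (len(cls) / n) * float(-(p * np.log2(p)).sum())
    return total

# hypothetical example: two cells, one pure and one mixed
print(expected_conditional_entropy([0, 0, 0, 1, 1, 1], ["a", "a", "a", "a", "b", "b"]))
```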


Journal ArticleDOI
TL;DR: In this article, the information content of the statement that the transverse momenta in production scattering are strongly cut is quantitatively defined and compared with the statistical mechanicla entropy both in the old limit of a pion gas at high temperature and in the new Hagedorn limit with bounded temperature.
Abstract: The information content of the statement that the transverse momenta in production scattering are strongly cut is quantitatively defined. This measure of information is compared with the statistical mechanical entropy both in the old limit of a pion gas at high temperature and in the new Hagedorn limit with bounded temperature. A criterion for classifying statistical countings is given.

Journal ArticleDOI
TL;DR: The object of this paper is to characterize an information theoretic measure associated with three probability distributions known as 'Information Improvement' through a functional equation which arises by considering the additive property of the measure.
Abstract: The object of this paper is to characterize an information-theoretic measure, associated with three probability distributions, known as 'Information Improvement', through a functional equation which arises by considering the additive property of the measure. This measure has been extensively used in economic analysis.

INFORMATION THEORY; ENTROPY; INACCURACY; INFORMATION; INFORMATION IMPROVEMENT; FUNCTIONAL EQUATION
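For reference, one common form of the measure (Theil's information improvement, written here in terms of Kerridge's inaccuracy; the paper characterizes a measure of this type through its additivity): if $P$ is the realized distribution and a prediction $Q$ is revised to $R$, then

$$ I(P;\,Q\to R) \;=\; \sum_i p_i\log\frac{r_i}{q_i} \;=\; K(P,Q)-K(P,R), \qquad K(P,Q) = -\sum_i p_i\log q_i . $$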