
Showing papers on "Entropy (information theory)" published in 1978


Journal ArticleDOI
TL;DR: In this paper, a generalization to the multichannel case of the well-known Burg maximum entropy technique for spectral estimation is presented. The extension is derived by first obtaining the proper generalization of the scalar reflection coefficients to the multichannel (or matrix) case.
Abstract: We present a generalization to the multichannel case of the well-known Burg maximum entropy technique for spectral estimation. The extension is derived by first obtaining the proper generalization of the scalar reflection coefficients to the multichannel (or matrix) case.
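
As a point of reference for the scalar technique being generalized, here is a minimal sketch of the Burg recursion (our own illustrative Python, not the paper's multichannel algorithm): each reflection coefficient is chosen to minimize the summed forward and backward prediction-error power, and the prediction-error filter is updated by the Levinson recursion.

```python
import numpy as np

def burg_ar(x, order):
    """Scalar Burg recursion (illustrative sketch)."""
    f = np.asarray(x, dtype=float)  # forward prediction errors
    b = f.copy()                    # backward prediction errors
    a = np.array([1.0])             # prediction-error filter, a[0] = 1
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        # Reflection coefficient minimizing forward + backward error power
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        a = np.append(a, 0.0) + k * np.append(a, 0.0)[::-1]  # Levinson update
        f, b = fp + k * bp, bp + k * fp
    return a  # spectral estimate is proportional to 1 / |A(e^{jw})|^2

# Example: fit a 4th-order model to 32 samples of a sinusoid
print(burg_ar(np.cos(2 * np.pi * 0.1 * np.arange(32)), 4))
```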

255 citations


Journal ArticleDOI
TL;DR: A “hierarchical structure” of probabilistic dependence relations is proposed where it is shown that any symmetric correlation associated with a nonnegative entropy is decomposed into pairwise conditional and/or nonconditional correlations.
Abstract: A study of nonnegativity “in general” in the symmetric (correlative) entropy space as well as discussions of some related problems is presented. The main result is summarized as Theorems 4.1 and 5.3, which give the necessary and sufficient condition for an element of the symmetric (correlative) entropy space to be nonnegative. In particular, Theorem 4.1 may be regarded as establishing a mathematical foundation for information-theoretic analysis of multivariate symmetric correlation. On the basis of these results, we propose a “hierarchical structure” of probabilistic dependence relations where it is shown that any symmetric correlation associated with a nonnegative entropy is decomposed into pairwise conditional and/or nonconditional correlations. A systematic duality existing in the set of nonnegative entropies is also considerably clarified.
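
As an illustration of why nonnegativity is a nontrivial condition here (our own example, not the paper's): the three-way symmetric correlation of binary variables with Z = X XOR Y is negative.

```python
import itertools
import numpy as np

def H(p):
    """Shannon entropy (bits) of a probability array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution of (X, Y, Z): X, Y fair coins, Z = X XOR Y
p = np.zeros((2, 2, 2))
for x, y in itertools.product([0, 1], repeat=2):
    p[x, y, x ^ y] = 0.25

Hx, Hy, Hz = H(p.sum(axis=(1, 2))), H(p.sum(axis=(0, 2))), H(p.sum(axis=(0, 1)))
Hxy, Hxz, Hyz = H(p.sum(axis=2)), H(p.sum(axis=1)), H(p.sum(axis=0))

# Three-way symmetric correlation (interaction information)
I3 = Hx + Hy + Hz - Hxy - Hxz - Hyz + H(p)
print(I3)  # -1.0 bit: this element of the symmetric entropy space is negative
```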

231 citations


Journal ArticleDOI
TL;DR: The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence).
Abstract: A quantity called the finite-state complexity is assigned to every infinite sequence of elements drawn from a finite set. This quantity characterizes the largest compression ratio that can be achieved in accurate transmission of the sequence by any finite-state encoder (and decoder). Coding theorems and converses are derived for an individual sequence without any probabilistic characterization, and universal data compression algorithms are introduced that are asymptotically optimal for all sequences over a given alphabet. The finite-state complexity of a sequence plays a role similar to that of entropy in classical information theory (which deals with probabilistic ensembles of sequences rather than an individual sequence). For a probabilistic source, the expectation of the finite-state complexity of its sequences is equal to the source's entropy. The finite-state complexity is of particular interest when the source statistics are unspecified.
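
The universal algorithms referred to are of the incremental-parsing type. A minimal LZ78-style sketch (illustrative, not the paper's exact construction): the sequence is split into the shortest phrases not seen before, and the phrase count governs the achievable compression ratio.

```python
def incremental_parse(s):
    """Split s into the shortest phrases not seen before (LZ78-style).
    Each phrase is coded as (index of its longest proper prefix, new symbol)."""
    dictionary = {"": 0}
    phrases, w = [], ""
    for ch in s:
        if w + ch in dictionary:
            w += ch                     # extend the current phrase
        else:
            phrases.append((dictionary[w], ch))
            dictionary[w + ch] = len(dictionary)
            w = ""
    if w:                               # flush a final incomplete phrase
        phrases.append((dictionary[w[:-1]], w[-1]))
    return phrases

print(incremental_parse("aababbbaaabaabba"))
```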

202 citations


01 Aug 1978
TL;DR: An algorithm for solving the underlying least-squares problem directly, without forcing a Toeplitz structure on the model, leads to more accurate frequency determination for short sample harmonic processes, and it is computationally efficient and numerically stable.
Abstract: Experience with the maximum entropy method of spectral analysis suggests that it can produce inaccurate frequency estimates of short sample sinusoidal data, and it sometimes produces calculated values for the filter coefficients that are unduly contaminated by rounding errors. Consequently, this report develops an algorithm for solving the underlying least-squares problem directly, without forcing a Toeplitz structure on the model. This approach leads to more accurate frequency determination for short sample harmonic processes, and our algorithm is computationally efficient and numerically stable. The algorithm can also be applied to two other versions of the linear prediction problem. A FORTRAN program is supplied.
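
A minimal sketch of the direct least-squares (covariance-method) formulation, in our own notation: build the data matrix explicitly and solve with an orthogonalization-based routine, rather than forcing Toeplitz structure and using a fast recursion.

```python
import numpy as np

def lp_covariance(x, p):
    """Covariance-method linear prediction: least-squares fit of
    x[n] ~ sum_k a[k] * x[n-1-k], solved directly from the data matrix."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    A = np.column_stack([x[p - k - 1 : N - k - 1] for k in range(p)])
    b = x[p:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)  # orthogonalization-based, stable
    return a

# Example: a short sinusoidal record, the regime where this approach helps
t = np.arange(24)
print(lp_covariance(np.cos(2 * np.pi * 0.2 * t + 0.7), 4))
```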

124 citations


Journal ArticleDOI
TL;DR: In this article, a variational principle where the Lagrange multipliers of a trial distribution are used as variational parameters is discussed as an efficient, practical route to the determination of the distribution of maximal entropy.
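
A minimal sketch of that variational principle, with an assumed toy constraint (a die whose mean must be 4.5; the example is ours, not the paper's): the maximal-entropy distribution is p_i ∝ exp(-λ g_i), and the multiplier λ is found by minimizing the convex dual ln Z(λ) + λ⟨g⟩, so the multipliers serve directly as variational parameters.

```python
import numpy as np
from scipy.optimize import minimize

g = np.arange(1, 7, dtype=float)  # constraint function: the face value
target = 4.5                      # required expectation <g>

def dual(lam):
    """Convex dual ln Z(lambda) + lambda * <g>; its minimizer is the
    Lagrange multiplier of the maximum-entropy distribution."""
    return np.log(np.sum(np.exp(-lam[0] * g))) + lam[0] * target

lam = minimize(dual, x0=[0.0]).x[0]
p = np.exp(-lam * g)
p /= p.sum()
print(p, p @ g)  # maximal-entropy distribution with mean ~4.5
```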

82 citations


Journal ArticleDOI
Steven Kay
TL;DR: Using maximum entropy power spectral estimation, the estimate of the frequency of a sinusoid in white noise has been shown to be very sensitive to the initial sinusoidal phase as discussed by the authors. This phase dependence can be reduced by replacing the real data by its analytic form, reducing the sampling rate by a factor of two, and employing the power spectral estimate for complex data.
Abstract: Using maximum entropy power spectral estimation, the estimate of the frequency of a sinusoid in white noise has been shown to be very sensitive to the initial sinusoidal phase. This phase dependence can be significantly reduced by replacing the real data by its analytic form, reducing the sampling rate by a factor of two, and employing the power spectral estimate for complex data.
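
A minimal sketch of the preprocessing step described (parameter values are ours): form the analytic signal, whose spectrum is one-sided, then halve the sampling rate before applying a complex-data maximum entropy estimator.

```python
import numpy as np
from scipy.signal import hilbert

t = np.arange(64)
x = np.cos(2 * np.pi * 0.12 * t + 1.0)  # real sinusoid, arbitrary phase

z = hilbert(x)   # analytic form: x + j*Hilbert{x}, no negative frequencies
z2 = z[::2]      # reduce the sampling rate by a factor of two
# z2 would then be passed to the maximum entropy estimator for complex data.
```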

61 citations


Journal ArticleDOI
01 Nov 1978
TL;DR: An additional recursive formula is presented which simplifies and reduces the computational load of the Burg algorithm.
Abstract: Two points recently brought up in this journal concerning the performance of maximum entropy spectral analysis are discussed. First, an additional recursive formula is presented which simplifies and reduces the computational load of the Burg algorithm. Second, attention is drawn to some recent results in the geophysics literature related to the proper selection of prediction filter length.

34 citations



Journal ArticleDOI
TL;DR: Information gain, entropy and redundancy measures were used to analyse human monitoring of a Gaussian autoregressive process and found that there was a rapid loss of information following a sample.
Abstract: Information gain, entropy and redundancy measures were used to analyse human monitoring of a Gaussian autoregressive process. Particular emphasis was placed on the values of these measures when sampling occurred, yielding the following results: (i) the subjects appeared to ‘tolerate’ a relatively high degree of uncertainty (large entropy) before they decided to take a sample, the entropy asymptotes ranging from 2.12 to 4.24 bits/sample; (ii) the entropy at sampling instants increased significantly as the variance of the monitored process increased; (iii) the ratio of the cost of the process exceeding certain threshold limits to the cost of sampling had no significant effect on the sampling entropy, while the interaction of this cost ratio with the process variance had a significant effect on the absolute scatter of the sampling entropy; (iv) there was a rapid loss of information following a sample; (v) sampling redundancy ranged from 1% to 18% with an overall mean of 10%; (vi) neither the pr...
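
As an aid to interpreting the reported bits-per-sample figures (our own gloss, not part of the study): the differential entropy of a Gaussian with variance σ² is 0.5 log₂(2πeσ²) bits, so entropy rises by half a bit per doubling of variance, consistent with finding (ii).

```python
import numpy as np

def gaussian_entropy_bits(sigma2):
    """Differential entropy (bits) of a Gaussian with variance sigma2."""
    return 0.5 * np.log2(2 * np.pi * np.e * sigma2)

print(gaussian_entropy_bits(1.0))  # ~2.05 bits
print(gaussian_entropy_bits(4.0))  # ~3.05 bits: +1 bit for 4x the variance
```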

25 citations


Journal ArticleDOI
TL;DR: The task entropy had a significant influence on the relationship between the performance (cycle time) and practice (the number of repetitions or cycle number) and the model parameters for this relationship were found to be significantly and linearly related to the task entropy.
Abstract: This experimental study analysed the effects of the complexity (entropy) of a repetitive task on the human learning function for the task. The results from 20 subjects revealed that the task entropy had a significant influence on the relationship between the performance (cycle time) and practice (the number of repetitions or cycle number). The model parameters for this relationship were found to be significantly and linearly related to the task entropy, the nominal rate of which varied within the maximum range of 0.9 to 6.1 bits/s.

19 citations


Journal ArticleDOI
TL;DR: It is shown that, if f is bounded on an arbitrarily small nonvanishing interval contained in (0, 1), then f ≡ S where S is Shannon's measure of entropy on a 2-event space, and this answers in the affirmative some closely related questions of J. Aczél and Z. Daróczy.
Abstract: Let f be an information function. We show that, if f is bounded on an arbitrarily small nonvanishing interval contained in (0, 1), then f ≡ S where S is Shannon's measure of entropy on a 2-event space. This answers in the affirmative some closely related questions of J. Aczél and Z. Daróczy, and at the same time provides even more evidence for the Aczél-Daróczy nonnegativity question. As corollaries, we obtain a strengthened form of Lee's theorem, an improvement to a theorem of Daróczy, and an extension to the author's previous result. Furthermore, several different proofs for completing the argument on the rationals are given, including one due to P. Erdős (private communication).
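
For reference, the standard definitions the abstract relies on (notation ours): an information function satisfies f(1/2) = 1 together with the fundamental equation of information, and S is Shannon's entropy on a 2-event space.

```latex
\[
  f(x) + (1-x)\, f\!\left(\tfrac{y}{1-x}\right)
  = f(y) + (1-y)\, f\!\left(\tfrac{x}{1-y}\right),
  \qquad x, y \in [0,1),\; x + y \le 1,
\]
\[
  S(x) = -x \log_2 x - (1-x) \log_2 (1-x).
\]
```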

Journal ArticleDOI
TL;DR: It is shown that, for the spectral analysis of radar clutter, reliable short-term spectral estimates can be obtained with a small number of data points, while keeping a good resolution capability.
Abstract: A brief review of the maximum entropy method for spectral analysis of complex signals is presented. It is shown that, for the spectral analysis of radar clutter, reliable short-term spectral estimates can be obtained with a small number of data points, while keeping a good resolution capability.

Journal ArticleDOI
Ove Frank1
TL;DR: By specifying the stochastic model in various ways it is shown how the decrease in entropy caused by the publication of a frequency distribution can be determined and interpreted.

Journal ArticleDOI
TL;DR: In this paper, an empirical application of an information theoretic approach to spatial hypothesis testing is presented, which employs the concept of expected information to test hypotheses concerning the distribution of urban population and population density in San Antonio for the years 1960 and 1970.
Abstract: This paper presents an empirical application of an information theoretic approach to spatial hypothesis testing. Following the lead of Batty [1], this study employs the concept of expected information to test hypotheses concerning the distribution of urban population and population density in San Antonio for the years 1960 and 1970. Cast for the first time in a longitudinal context, the work is chiefly concerned with the relative advantages, both theoretical and methodological, of certain entropy measures. Specifically, comparisons are made between the Shannon and the Kullback formulations. In this context of comparison, problems closely linked to what has been called the “entropy paradox” are identified and explained, suggesting important qualitative differences between these two measures.
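
A minimal sketch of the two formulations being compared, using invented placeholder zonal shares (not the San Antonio data): Shannon's entropy describes the dispersion of a single distribution, while Kullback's expected information measures the divergence of an observed distribution from a prior one.

```python
import numpy as np

def shannon(p):
    """Shannon entropy (bits) of shares p."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kullback(p, q):
    """Kullback's expected information (bits) of shares p relative to prior q."""
    m = p > 0
    return np.sum(p[m] * np.log2(p[m] / q[m]))

p1960 = np.array([0.40, 0.30, 0.20, 0.10])  # hypothetical zonal shares
p1970 = np.array([0.30, 0.30, 0.25, 0.15])

print(shannon(p1970))           # dispersion of the 1970 distribution
print(kullback(p1970, p1960))   # information in 1970 relative to 1960
```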


Journal ArticleDOI
TL;DR: A method for the quality evaluation of radiographic images in terms of entropy is presented and the performance of tank development is found to be superior to that of automatic processor development.
Abstract: A method for the quality evaluation of radiographic images in terms of entropy is presented. By this method, the image quality can be synthetically evaluated by a single number. The method presented is used to calculate the amount of information contributed by the image of a uniform lucite step-wedge object. The new method is also applied to the evaluation of development processes. The calculated results show that the information quantities conveyed by tank-developed and automatic-processor-developed images are 1.76 and 1.51 bits per image on the average (the maximum information quantity that can be transmitted is 2.32 bits per image). The performance of tank development is found to be superior to that of automatic processor development.
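
A minimal sketch of the histogram-entropy idea (our reading, with an assumed five-step wedge for illustration): the maximum information per image is log₂ of the number of distinguishable steps, log₂ 5 ≈ 2.32 bits, matching the stated ceiling.

```python
import numpy as np

def histogram_entropy_bits(counts):
    """Entropy (bits) of an image's gray-level histogram: a single number
    summarizing how much of the object's information the image conveys."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A 5-step wedge recorded perfectly: five equally likely levels
print(histogram_entropy_bits([1, 1, 1, 1, 1]))   # log2(5) ~ 2.32 bits
# Imperfect development blurs steps together and lowers the entropy
print(histogram_entropy_bits([2, 1, 1, 1]))      # < 2.32 bits
```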

Journal ArticleDOI
S.M. Macgill
TL;DR: In this article, a new approach to input-output multiplier methods relevant to situations using rectangular input-output tables (situations where a sector-product distinction can be made) is introduced. It makes use of entropy-maximising principles, using constraints expressing basic technologies of production and taking base-year intersectoral transactions as prior values for the expected future values of those transactions.

Journal ArticleDOI
TL;DR: In this paper, the entropy of multicomponent mixtures is examined and two reference states of unique entropy values are defined, the completely mixed and the separated states, and the position of any system relative to these fixed points is accounted for in terms of four entropy terms, one for chemical species separation and three for physical or spatial separations.
Abstract: The entropy of multicomponent mixtures is examined and two reference states of unique entropy values are defined, the completely mixed and the separated states. The position of any system relative to these fixed points is accounted for in terms of four entropy terms, one for chemical species separation and three for physical or spatial separations. An apt comparison of separations must acknowledge the possibility of these four distinct processes. An alternative approach to the evaluation of a separation is based on the reduction of the information in a matrix of pairwise criteria of separation to a single point in vector space. This vector distance as a measure of separation is illustrated for differential migration processes.
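
A minimal sketch of the chemical-species term (one of the four entropy terms named; the three spatial terms are not shown), in units of the gas constant R and with our own mole fractions:

```python
import numpy as np

def species_mixing_entropy(x):
    """Ideal entropy of mixing per mole, in units of R: -sum x_i ln x_i.
    Zero at the separated reference state, maximal when completely mixed."""
    x = np.asarray(x, dtype=float)
    x = x[x > 0]
    return -np.sum(x * np.log(x))

print(species_mixing_entropy([1.0]))        # 0.0   : separated reference state
print(species_mixing_entropy([0.5, 0.5]))   # ln 2  : completely mixed state
```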

Journal Article
TL;DR: A new technique integrating concepts from cluster analysis and information theory was applied to the classification of Michigan hospitals, producing a hierarchy of classifications that yields the most reasonable groupings of hospitals.
Abstract: A new technique integrating concepts from cluster analysis and information theory was applied to the classification of Michigan hospitals. First, a number of cost-related variables that describe the hospitals and their surroundings were used in a cluster analysis to produce a hierarchy of classifications. Then for each classification, the within-group entropy was computed for each group of hospitals and averaged over the classification. Finally, this average entropy was used as an aid to judgment in deciding which of the many classifications in the hierarchy yields the most reasonable groupings of hospitals.
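
A minimal sketch of the evaluation step as we read it, with hypothetical counts rather than the Michigan data: each group's entropy over a categorical cost-related variable is weighted by group size and averaged over the classification, lower averages indicating tighter groupings.

```python
import numpy as np

def avg_within_group_entropy(groups):
    """Size-weighted average of within-group entropies (bits).
    groups: list of per-group category counts."""
    total = sum(sum(g) for g in groups)
    avg = 0.0
    for counts in groups:
        p = np.asarray(counts, dtype=float)
        n = p.sum()
        p = p[p > 0] / n
        avg += (n / total) * (-np.sum(p * np.log2(p)))
    return avg

# Two hypothetical hospital groups, category counts within each group
print(avg_within_group_entropy([[8, 1, 1], [1, 9]]))  # ~0.70 bits
```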

Journal ArticleDOI
TL;DR: In this paper, it was shown that if σ is the shift on sequences of {0, 1} and τ is the entropy zero transformation used by Ornstein in constructing a counter-example to Pinsker's conjecture, then the skew-product transformation T defined by T(x, y) = (σx, τ^{x_0}y) is Bernoulli.
Abstract: We show that if σ is the shift on sequences of {0, 1} and τ is the entropy zero transformation used by Ornstein in constructing a counter-example to Pinsker's conjecture, then the skew-product transformation T defined by T(x, y) = (σx, τ^{x_0}y) is Bernoulli. This T is conditionally mixing with respect to the independent generator for σ, a partition with full entropy.


Journal ArticleDOI
TL;DR: Using an error entropy estimation lower bound, which is independent of any estimation procedure, conditions for which identification cannot be made with certainty are presented and examples involving non-Gaussian statistics are used to illustrate the efficiency of the error entropy adaptive identification algorithm.
Abstract: Information-theoretic concepts are utilized to develop a procedure for identifying a parameter of a stochastic linear discrete time dynamic scalar system based on noisy linear measurements of the system's state. After various simplifying approximations, the derived error entropy identification algorithm reduces to an on-line adaptive identification algorithm that is similar in many respects to well-established identification techniques. Conditions under which the developed on-line adaptive algorithm identifies the system with certainty are presented. Using an error entropy estimation lower bound, which is independent of any estimation procedure, conditions for which identification cannot be made with certainty are also presented. Examples involving non-Gaussian statistics are used to illustrate the efficiency of the error entropy adaptive identification algorithm as well as to compare it with several other identification procedures.
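
The abstract does not reproduce the bound itself; a standard error entropy bound of this type, valid for any estimator θ̂(y) and hence independent of the estimation procedure, is

```latex
\[
  E\!\left[(\theta - \hat{\theta}(y))^2\right]
  \;\ge\; \frac{1}{2\pi e}\, e^{2 h(\theta \mid y)},
\]
```

where h(θ | y) is the conditional differential entropy of the parameter given the measurements; a conditional entropy bounded away from -∞ therefore rules out identification with certainty.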

Journal ArticleDOI
TL;DR: Some properties of the entropy of the distribution resulting from a random deletion of a stochastic point process are given, with special reference to the difference between the entropies of the original and the deleted point processes.

Journal ArticleDOI
TL;DR: This note records an observation on the relationship between the independent works of Hellerman on a measure of computational work and of Cook and Flynn on the average minimum cost of logical networks.
Abstract: This note records an observation on the relationship between the independent works of Hellerman on a measure of computational work and of Cook and Flynn on the average minimum cost of logical networks.

Book
01 Feb 1978
TL;DR: Several intuitive design approaches and also a general design philosophy based upon the generation of fake processes, i.e., finite entropy processes which are close to the process one wishes to compress, are presented.
Abstract: Recent results in information theory promise the existence of tree and trellis data compression systems operating near the Shannon theoretical bound, but provide little indication of how actually to design such systems. Presented here are several intuitive design approaches and also a general design philosophy based upon the generation of fake processes, i.e., finite entropy processes which are close (in the generalized Ornstein distance) to the process one wishes to compress. Most of the design procedures can be used for a wide class of sources. Performance is evaluated, via simulations, for memoryless, autoregressive and moving average Gaussian sources and compared to traditional data compression systems. The new schemes typically provide 1-2 dB improvement in performance over the traditional schemes at a rate of 1 bit/symbol. The inevitable increase in complexity is moderate in most cases.

Dissertation
01 Nov 1978
TL;DR: In this article, the authors describe a new approach to steady-state forecasting models based on Bayesian principles and information theory, which extends beyond the constraints of normality and linearity required in all existing forecasting methods.
Abstract: This thesis describes a new approach to steady-state forecasting models based on Bayesian principles and Information Theory. Shannon's entropy function and Jaynes' principle of maximum entropy are the essential results borrowed from Information Theory and are extensively used in the model formulation. The Bayesian Entropy Forecasting (BEF) models obtained in this way extend beyond the constraints of normality and linearity required in all existing forecasting methods. In this sense, it reduces in the normal case to the well-known Harrison and Stevens steady-state model. Examples of such models are presented, including the Poisson-gamma process, the Binomial-Beta process and the Truncated Normal process. For all of these, numerical applications using real and simulated data are shown, including further analyses of the epidemic data of Cliff et al. (1975).

Journal ArticleDOI
M. Nathanson
TL;DR: In this paper, a dynamic entropy-maximising Markovian framework for projecting trip and location distributions is suggested in conjunction with an earlier proposal of a Bayes chain based on α = 1 information minimisation.
Abstract: The contrast between entropy measured in thermodynamics and information theory is discussed in terms of its implications for trip distribution and related urban-modelling procedures based on entropy maximisation or information minimisation. Models of spatial interaction using α = 2 entropy measures are shown to be improper because of the limited properties of these measures. A dynamic entropy-maximising Markovian framework for projecting trip and location distributions is suggested in conjunction with an earlier proposal of a ‘Bayes chain’ based on α = 1 information minimisation.
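
As context for the α = 1 (Shannon) case the paper favours, a minimal sketch of the entropy-maximising, doubly constrained trip-distribution model (toy numbers, not from the paper): T_ij ∝ exp(-β c_ij), balanced to the origin and destination totals by iterative proportional fitting.

```python
import numpy as np

def doubly_constrained(O, D, c, beta, iters=200):
    """Entropy-maximising trip distribution:
    T_ij = A_i O_i B_j D_j exp(-beta c_ij), with balancing factors A, B
    found by iterative proportional fitting."""
    O, D = np.asarray(O, dtype=float), np.asarray(D, dtype=float)
    K = np.exp(-beta * np.asarray(c, dtype=float))
    A, B = np.ones_like(O), np.ones_like(D)
    for _ in range(iters):
        A = 1.0 / (K @ (B * D))        # satisfy the origin totals
        B = 1.0 / ((A * O) @ K)        # satisfy the destination totals
    return (A * O)[:, None] * K * (B * D)[None, :]

T = doubly_constrained([100, 200], [150, 150], [[1, 2], [2, 1]], beta=0.5)
print(T.sum(axis=1), T.sum(axis=0))    # reproduces O and D
```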

19 Jul 1978
TL;DR: O'Kelly as discussed by the authors empirically tests an entropy maximizing model of retail location and consumer behaviour.
Abstract: Title: Empirical Tests of an Entropy Maximizing Model of Retail Location and Consumer Behaviour, Author: Morton E. O'Kelly, Location: Thode