Showing papers on "Entropy (information theory) published in 1983"


Journal ArticleDOI
TL;DR: In this paper, the authors give an entropic criterion for triviality of the boundary, prove an analogue of Shannon's theorem for entropy, obtain a boundary triviality criterion in terms of the limit behavior of convolutions, and prove a conjecture of Furstenberg on the existence of a non-degenerate measure with trivial boundary on any amenable group.
Abstract: The paper is devoted to a study of the exit boundary of random walks on discrete groups and related topics. We give an entropic criterion for triviality of the boundary, prove an analogue of Shannon's theorem for entropy, obtain a boundary triviality criterion in terms of the limit behavior of convolutions, and prove a conjecture of Furstenberg on the existence of a non-degenerate measure with trivial boundary on any amenable group. We directly connect Kesten's and Følner's amenability criteria by considering the spectral measure of the Markov transition operator. Finally, we give various examples, some of which disprove old conjectures.

677 citations
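The entropic criterion can be stated compactly. A LaTeX sketch of the standard formulation, with notation assumed rather than taken verbatim from the paper:

```latex
% Asymptotic entropy of a random walk (G, mu); standard notation.
\[
  h(G,\mu) \;=\; \lim_{n\to\infty} \frac{H(\mu^{*n})}{n},
  \qquad
  H(\nu) \;=\; -\sum_{g \in G} \nu(g)\,\log \nu(g),
\]
% Entropic criterion: the exit (Poisson) boundary of the walk is
% trivial if and only if the asymptotic entropy vanishes:
\[
  \partial(G,\mu) \ \text{is trivial} \iff h(G,\mu) = 0 .
\]
```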


Journal ArticleDOI
TL;DR: The quantum limits to information flow are explored in this article by developing the central concept of a medium comprising several channels through which the information flows; applications are made to the energy cost of computing and to the maximum rate of cooling attainable in any one channel.
Abstract: The quantum limits to information flow are explored, developing the central concept of a medium comprising several channels through which the information flows. In each channel there is an inequality between information flow I and energy flow E: I² ≤ πE/(3ℏ ln²2).

222 citations
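The single-channel bound is usually quoted in the following form; this LaTeX sketch states it with assumed units (bits per second for the information flow), and the exact normalisation in the paper may differ:

```latex
% Single-channel bound between information flow \dot I (bits/s) and
% energy flow \dot E (watts); sketch of the commonly quoted form.
\[
  \dot I^{\,2} \;\le\; \frac{\pi \dot E}{3\hbar \ln^2 2},
\]
% so the maximum bit rate for a given power grows only as the square
% root of the power:
\[
  \dot I_{\max} \;=\; \frac{1}{\ln 2}\sqrt{\frac{\pi \dot E}{3\hbar}} .
\]
```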


Journal ArticleDOI
TL;DR: Algorithms for automatic thresholding of grey levels (without reference to the histogram) are described; the 'index of fuzziness' and the 'entropy' of a fuzzy set are minimised when the crossover point of an S-function corresponds to a boundary level between different regions in image space.

148 citations
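A minimal Python sketch of the idea, assuming the standard S-function and a De Luca-Termini fuzzy entropy; the function names, bandwidth parameter, and scan range are illustrative assumptions, not the paper's:

```python
import numpy as np

def s_function(x, a, b, c):
    """Zadeh's standard S-function; b is the crossover point (mu = 0.5)."""
    y = np.zeros_like(x, dtype=float)
    left = (x > a) & (x <= b)
    right = (x > b) & (x <= c)
    y[left] = 2 * ((x[left] - a) / (c - a)) ** 2
    y[right] = 1 - 2 * ((x[right] - c) / (c - a)) ** 2
    y[x > c] = 1.0
    return y

def fuzzy_entropy(mu):
    """Shannon-type entropy of a fuzzy set, averaged per pixel."""
    mu = np.clip(mu, 1e-12, 1 - 1e-12)
    return float(np.mean(-mu * np.log(mu) - (1 - mu) * np.log(1 - mu)))

def fuzzy_threshold(image, bandwidth=10):
    """Scan the crossover point b; the minimising b is taken as threshold."""
    g = image.ravel().astype(float)
    best_b, best_h = None, np.inf
    for b in range(int(g.min()) + bandwidth, int(g.max()) - bandwidth):
        h = fuzzy_entropy(s_function(g, b - bandwidth, b, b + bandwidth))
        if h < best_h:
            best_b, best_h = b, h
    return best_b
```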


Journal ArticleDOI
TL;DR: In this paper, an interpretation of Akaike's minimum AIC procedure as maximization of the expected entropy of the predictive distribution is exploited for the modeling and prediction of time series with trend and seasonal mean value functions and stationary covariances.
Abstract: An interpretation of Akaike's minimum AIC procedure as maximization of the expected entropy of the predictive distribution is exploited for the modeling and prediction of time series with trend and seasonal mean value functions and stationary covariances. The best one-step-ahead and best twelve-step-ahead prediction models under the AIC criterion can differ, and the different models exhibit the relative optimality properties for which they were designed. The results are related to open questions on optimal trend estimation and optimal seasonal adjustment of time series.

107 citations
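A small Python illustration of AIC-based order selection for an autoregressive model, in the spirit of minimum-AIC modeling; the least-squares fit and all names are illustrative assumptions, not Akaike's procedure:

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike's criterion: AIC = -2 log L + 2k, with k free parameters."""
    return -2.0 * log_likelihood + 2.0 * k

def ar_aic(y, max_order=12):
    """Pick an AR order by minimising AIC under a Gaussian likelihood."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    scores = {}
    for p in range(1, max_order + 1):
        # Least-squares fit of an AR(p) model (illustrative).
        X = np.column_stack([y[p - i - 1 : n - i - 1] for i in range(p)])
        r = y[p:] - X @ np.linalg.lstsq(X, y[p:], rcond=None)[0]
        sigma2 = np.mean(r ** 2)
        loglik = -0.5 * (n - p) * (np.log(2 * np.pi * sigma2) + 1)
        scores[p] = aic(loglik, p + 1)  # +1 for the noise variance
    return min(scores, key=scores.get), scores
```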


Journal ArticleDOI
TL;DR: It is shown that Shannon's entropy as well as two other related characteristic functions can express the local behaviour and overall relationships of species.
Abstract: The use of mathematical methods based on Shannon's entropy function is proposed for evaluating the consequences of sampling-unit size and for studying vegetation succession. The concept of diversity is extended to sets of phytosociological relevés under the term florula diversity. It is shown that Shannon's entropy, as well as two other related characteristic functions, can express the local behaviour and overall relationships of species. Characteristic areas are defined in terms of the maxima and minima of these functions. Several study areas yielded the data used in the examples. Some theoretical problems of the methods are discussed, and a computer program, written in FORTRAN, is described.

91 citations
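A one-function Python sketch of the Shannon diversity computation underlying florula diversity; the example counts are hypothetical:

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon entropy H' = -sum p_i ln p_i over species proportions."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log(p)))

# Hypothetical relevé: counts of individuals per species in one sampling unit.
print(shannon_diversity([12, 7, 3, 1, 1]))  # larger H' = more diverse florula
```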


Journal ArticleDOI
TL;DR: A conceptual repeated sampling experiment is considered for evaluating a predictive distribution used to describe such future observations and leads to an asymptotic likelihood principle that gives a small-sample justification for the use of entropy for evaluating parameter estimation as well as model order and structure determination procedures.
Abstract: The objective of inferring stochastic models from a set of data is to obtain the best description, by means of a probability model, of the statistical behaviour of future samples of the process. A conceptual repeated sampling experiment is considered for evaluating a predictive distribution used to describe such future observations; it leads to an asymptotic likelihood principle. Considerations of likelihood and sufficiency lead to the use of entropy, or the Kullback-Leibler information, as the natural measure of how well a predictive distribution approximates the actual distribution in repeated samples. This gives a small-sample justification for the use of entropy in evaluating parameter estimation as well as model order and structure determination procedures.

67 citations
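The link between entropy and the Kullback-Leibler information can be made explicit; a LaTeX sketch with assumed notation:

```latex
% Kullback-Leibler information between the true density g and a
% predictive density \hat f (standard definition):
\[
  I(g;\hat f) \;=\; \int g(x)\,\log\frac{g(x)}{\hat f(x)}\,dx
  \;=\; \underbrace{\int g \log g \, dx}_{-\,\text{entropy of } g}
        \;-\; \mathbb{E}_g\!\left[\log \hat f(X)\right].
\]
% The first term does not depend on \hat f, so ranking predictive
% distributions by expected log-likelihood in repeated samples is
% equivalent to ranking them by Kullback-Leibler information.
```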


Journal ArticleDOI
TL;DR: An algorithm is given for determining the topological entropy of a unimodal map of the interval from its kneading sequence, and it is shown that this algorithm converges exponentially in the number of letters of the kneading sequence.
Abstract: We give an algorithm for determining the topological entropy of a unimodal map of the interval given its kneading sequence. We also show that this algorithm converges exponentially in the number of letters of the kneading sequence.

50 citations
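The quantity being computed can be illustrated numerically through the lap-number characterization of topological entropy, h = lim (1/n) log lap(fⁿ). This Python sketch is a brute-force check of that quantity, not the paper's kneading-sequence algorithm:

```python
import numpy as np

def lap_count(f, n, grid=200001):
    """Number of monotone laps of the n-th iterate of f on [0, 1]."""
    x = np.linspace(0.0, 1.0, grid)
    y = x.copy()
    for _ in range(n):
        y = f(y)
    dy = np.diff(y)
    signs = np.sign(dy[dy != 0])
    # laps = number of sign changes of the slope, plus one
    return 1 + int(np.sum(signs[1:] != signs[:-1]))

# Illustrative check on the full logistic map f(x) = 4x(1-x),
# whose topological entropy is log 2.
f = lambda x: 4.0 * x * (1.0 - x)
for n in (4, 8, 12):
    print(n, np.log(lap_count(f, n)) / n)  # approaches log 2 ~ 0.693
```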


Journal ArticleDOI
TL;DR: In this article, the authors proposed frequency-magnitude relations from the principle of maximum entropy as null hypotheses, to be tested by the empirical regional or global seismicity data, respectively.
Abstract: Summary. The entropy S for a continuous distribution p(x) is defined by This expression, however, is a measure of uncertainty relative to the coordinate x so that the probability distribution p(x) generated from the principle of maximum entropy depends on the choice of x. Only when the chosen parameter actually has a uniform prior distribution, can we expect the generated distribution to conform with the empirical data. For a physical system in which the independent variable x is measured to only limited accuracy, the prior distribution m(x) can be shown to be inversely proportional to the measurement error of x. A parameter with uniform prior distribution, then, is one that can be measured with equal accuracy throughout its range. In this context, the magnitude of an earthquake is such a parameter because using this parameter in the principle of maximum entropy leads to the empirically determined Gutenberg-Richter frequency-magnitude relation. Other proposed frequency-magnitude relations can also be generated from the principle of maximum entropy by imposing appropriate constraints. However, it is emphasized that such relations are generated from the principle as null hypotheses, to be tested by the empirical regional or global seismicity data.

49 citations
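A LaTeX sketch of the maximum-entropy derivation referred to, with assumed notation; the constraint is a fixed mean magnitude over m ≥ m_min:

```latex
% Maximising S = -\int p(m)\,\ln p(m)\,dm over m \ge m_{\min}, subject
% to normalisation and a fixed mean magnitude \bar m, gives the
% exponential density
\[
  p(m) \;=\; \beta\, e^{-\beta (m - m_{\min})},
  \qquad \beta = \frac{1}{\bar m - m_{\min}},
\]
% which is the Gutenberg-Richter relation in cumulative form:
\[
  \log_{10} N(\ge m) \;=\; a - b\,m,
  \qquad b = \beta / \ln 10 .
\]
```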


Journal ArticleDOI
TL;DR: A derivation of the method from the principle of minimum cross entropy is given, and the method is compared to minimum cross-entropy spectral analysis, of which it is a generalization.
Abstract: This paper presents a new information-theoretic method for simultaneously estimating a number of power spectra when a prior estimate of each is available and new information is obtained in the form of values of the autocorrelation function of their sum. One application of this method is the separate estimation of the spectra of a signal and additive noise, based on autocorrelations of the signal plus noise. A derivation of the method from the principle of minimum cross entropy is given, and the method is compared to minimum cross-entropy spectral analysis, of which it is a generalization. Some basic mathematical properties are discussed. Three numerical examples are included, two based on synthetic spectra and one based on actual speech data.

41 citations
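The optimization behind the method can be summarized as follows; a LaTeX sketch with assumed notation, not the paper's exact formulation:

```latex
% Given prior estimates p_k(f) for K independent signal spectra, choose
% posterior estimates q_k(f) minimising the total cross entropy
\[
  \min_{q_1,\dots,q_K} \;\sum_{k=1}^{K} \int q_k(f)\,
      \log\frac{q_k(f)}{p_k(f)}\,df
\]
% subject to the measured autocorrelations of the sum of the signals:
\[
  \int \Big(\textstyle\sum_{k} q_k(f)\Big)\, e^{\,i 2\pi f \tau_m}\, df
  \;=\; R(\tau_m), \qquad m = 0,\dots,M .
\]
% With K = 1 this reduces to minimum cross-entropy spectral analysis.
```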


Journal ArticleDOI
TL;DR: In this article, entropy is interpreted as a prior probability for least-squares nonlinear regression; the measure proves to be a transformation of the R2 statistic but, unlike the latter, diminishes rapidly as the number of fitting parameters increases.
Abstract: Interpreting entropy as a prior probability suggests a universal but “purely empirical” measure of “goodness of fit.” This allows statistical techniques to be used in situations where the correct theory, and not just its parameters, is still unknown. As developed illustratively for least-squares nonlinear regression, the measure proves to be a transformation of the R2 statistic. Unlike the latter, however, it diminishes rapidly as the number of fitting parameters increases.

Journal ArticleDOI
TL;DR: The present paper attempts to explain each of the entropies in its proper perspective by making a comparative assessment of the various measures proposed.
Abstract: A large number of measures of entropy have been proposed by Hartley [21], Shannon [46], Renyi [43], Havrda and Charvat [22], Aczel and Daroczy [5], Kapur [25, 26, 27], Rathie [42], Behara and Chawla [11], Sharma and Taneja [47] and others. These do not measure the same entity. Moreover, the definitions of the various measures have been motivated by quite different considerations. The use of the same word 'entropy' for so many intrinsically different entities is confusing and unfortunate. The present paper attempts to place each of the entropies in its proper perspective by making a comparative assessment of the various measures proposed.
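For reference, three of the measures being compared, in their standard forms (a sketch; indexing and normalisation conventions vary across the cited papers):

```latex
% For a distribution p = (p_1,\dots,p_n):
\[
  H_{\text{Shannon}}(p) = -\sum_i p_i \log p_i,
  \qquad
  H^{R}_{\alpha}(p) = \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha},
\]
\[
  H^{HC}_{\alpha}(p) = \frac{1}{2^{1-\alpha}-1}\Big(\sum_i p_i^{\alpha} - 1\Big),
  \qquad \alpha > 0,\ \alpha \neq 1,
\]
% where H^R is Renyi's entropy and H^{HC} is Havrda and Charvat's;
% both recover Shannon's entropy in the limit \alpha \to 1.
```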

Journal ArticleDOI
TL;DR: A new concept of air-monitoring network design based on Shannon's entropy is introduced, with the objectives of optimum estimation performance and a multivariate discrete entropy measure; the concept does not, however, guarantee the location of stations in adversely affected subregions.

Journal ArticleDOI
01 Sep 1983
TL;DR: Both the truncation method of distribution and the ensemble dependency analysis are informative for clarifying the statistical characteristics of an interval sequence with a skewed distribution in a heterogeneous time series.
Abstract: A measure of simplified dependency is introduced, representing Markovian characteristics based on Shannon's entropy and conditional entropy under the Gaussian assumption. It is considered to be the most concise measure for expressing the higher-order statistical properties of a time series and, in this regard, to be superior to a correlation or spectral measure. Simplified dependency is shown to be closely related to the prediction error in the autoregressive analysis of a time series and to be applicable also to non-Gaussian processes. Both the truncation method of distribution and the ensemble dependency analysis are informative for clarifying the statistical characteristics of an interval sequence with a skewed distribution in a heterogeneous time series. These techniques serve to clarify the neural modulation mechanism.

Journal ArticleDOI
TL;DR: In this article, a recursive procedure is presented for estimating, directly from a data set, the matrix coefficients representing a multichannel (vector-valued) stationary time series; an AR data model which considers all possible inter-relationships between the component channels is obtained.
Abstract: Autoregressive (AR) data models are implied in many analytical procedures used in the description and interpretation of geophysical measurements. Predictive deconvolution, prediction filtering, and spectral estimation based upon the maximum entropy and maximum likelihood criteria are among those procedures which imply AR models. Predictive deconvolution, as a method for determining a seismic reflection series or for targeting an anomaly in profile potential-field data, has been broadly applied. Maximum entropy spectral analysis has been found particularly useful in searching for the presence of harmonics in short segments of geophysical data. We describe a new recursive procedure for estimating, directly from a data set, the matrix coefficients representing a multichannel (vector-valued) stationary time series. An AR data model which considers all possible inter-relationships between the component channels is obtained. In several examples, multichannel geophysical data sets are modeled for deconvolution and maximum entropy multispectral analysis.
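A minimal Python sketch of fitting a vector AR model by least squares; this is an illustrative estimator showing the shape of the problem, not the recursive procedure of the paper:

```python
import numpy as np

def fit_var(X, p):
    """Least-squares fit of a vector AR(p) model X_t = sum_k A_k X_{t-k} + e_t.

    X: (T, d) array, one row per time step.  Returns the list of (d, d)
    coefficient matrices A_1..A_p and the residual covariance.
    """
    T, d = X.shape
    Z = np.hstack([X[p - k - 1 : T - k - 1] for k in range(p)])  # (T-p, p*d)
    Y = X[p:]                                                    # (T-p, d)
    B = np.linalg.lstsq(Z, Y, rcond=None)[0]                     # (p*d, d)
    A = [B[k * d : (k + 1) * d].T for k in range(p)]
    E = Y - Z @ B
    return A, np.cov(E.T)
```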

Journal ArticleDOI
TL;DR: In this paper, the expected information gain as a result of life testing n units for time t is calculated for the time transformed exponential model and a utility function based on entropy, and it is shown that the expected entropy is concave increasing in n and a transform of the test time t.
Abstract: Expected information gain as a result of life testing n units for time t is calculated for the time-transformed exponential model and a utility function based on entropy. We show that the expected information gain is concave increasing in n and a transform of the test time t. A computer program for calculating expected entropy for the Weibull distribution model is given. This may provide practical guidance in designing life test experiments.
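The entropy building block has a closed form for the Weibull model; a Python sketch using the standard formula, not the paper's own program:

```python
import numpy as np

def weibull_entropy(shape_k, scale_lam):
    """Differential entropy of a Weibull(k, lambda) lifetime distribution.

    Standard closed form, with gamma the Euler-Mascheroni constant:
    H = gamma * (1 - 1/k) + ln(lambda / k) + 1  (in nats).
    """
    gamma = 0.5772156649015329
    return gamma * (1.0 - 1.0 / shape_k) + np.log(scale_lam / shape_k) + 1.0

print(weibull_entropy(1.0, 1.0))  # exponential special case: 1 nat
```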

Journal ArticleDOI
TL;DR: A two-stage nested recursive estimation procedure which separates effects of singly-constrained entropy models is introduced, and functional forms of the quantity component are related to the nature of the choice process, and their indices determined to minimize the Kullback information gain between the model destination-probabilities and observed destinations.
Abstract: In singly-constrained entropy models, the observed response of the system represents trade-offs between macrospatial accessibility measures from zones to facilities and more microspatial and other intrinsic attributes of attractiveness within the facilities themselves. In this paper, a two-stage nested recursive estimation procedure which separates these effects is introduced. For origin-constrained models, the first stage involves determination of the conditional probabilities of zone of origin given destination which best explain the origin-specific and spatial-interaction constraints. In the second stage, destination probabilities are estimated using Kullback's method, so as to satisfy the intrinsic destination-specific quality constraints, to minimize the divergence from residual destination-probabilities representing the quantity or choice-statistical component of attractiveness, and to maintain the value of the Legendre transform of the conditional entropy evaluated in the first stage. Functional forms of the quantity component are related to the nature of the choice process, and their indices are determined so as to minimize the Kullback information gain between the model destination-probabilities and the observed destination-probabilities.

Journal ArticleDOI
TL;DR: A fully nonlinear (adaptive) DPCM system based on a composite model of the image fragment is described; the description takes into consideration the mutual influence of the prediction and quantization parts of the system.
Abstract: This paper presents a DPCM system for storage or transmission (via a noiseless channel) of a moving (3-D) TV image. A fully nonlinear (adaptive) DPCM system based on a composite model of the image fragment is described. The description takes into consideration the mutual influence of the prediction and quantization parts of the system. New results are presented concerning the probability density functions of the image luminance and the prediction error. The paper also addresses two types of system optimization: 1) minimization of the mean-square quantization error (MSQE); 2) minimization of the MSQE when the entropy of the quantizer output is below a specified level. The results of the optimization determine the parameters of the system.
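A minimal Python sketch of a scalar DPCM loop, showing why the quantizer sits inside the prediction loop (the mutual influence noted in the abstract); the predictor coefficient and step size are illustrative, and the paper's system is an adaptive 3-D predictor, not this sketch:

```python
import numpy as np

def dpcm_encode(x, predictor=0.95, step=4.0):
    """Encode a 1-D signal; quantise the prediction error, not the signal.

    The quantiser is inside the loop so encoder and decoder both predict
    from the *reconstructed* signal and stay in lockstep.
    """
    x = np.asarray(x, dtype=float)
    codes = np.empty(len(x), dtype=int)
    recon = 0.0
    for t in range(len(x)):
        e = x[t] - predictor * recon                 # prediction error
        codes[t] = int(np.round(e / step))           # uniform quantiser index
        recon = predictor * recon + codes[t] * step  # decoder's view
    return codes

def dpcm_decode(codes, predictor=0.95, step=4.0):
    recon, out = 0.0, []
    for c in codes:
        recon = predictor * recon + c * step
        out.append(recon)
    return np.array(out)
```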

Journal ArticleDOI
TL;DR: The design of robust block quantizers when the number of quantization levels is large is formulated as a two-person game, and it is shown that for convex families of signal probability density functions there is a saddle-point solution.
Abstract: In this paper we consider the design of robust block quantizers when the number of quantization levels is large. The rth-power distortion measure is utilized through the convenient expression developed by Bennett and Gersho. The robust design is formulated as a two-person game, and it is shown that for convex families of signal probability density functions there is a saddle-point solution. The evaluation of the robust solution amounts to determining the maximum s-norm element in the class of signal densities. We then develop specific solutions for three classes of pdf: a) the class specified by generalized moment constraints, b) the class of ε-contaminated densities, which has been a popular model in robust signal detection, and c) the class specified by upper and lower bounds on the probability density function of the signal. For high-quality quantization under fixed output entropy, the quantizer is uniform and the resulting distortion is an increasing function of the source entropy. The least favorable distribution is then the one having maximum entropy. For the ε-contaminated family and the "banded" family (c), we derive the maxentropic distributions.
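The high-rate expression referred to can be sketched in LaTeX (assumed notation); it shows why the minimax design reduces to an s-norm maximization:

```latex
% Bennett-Gersho high-rate distortion for a scalar quantiser with N
% levels, point density \lambda(x), and r-th power distortion:
\[
  D \;\approx\; \frac{1}{(r+1)\,2^{r}\,N^{r}}
      \int \frac{f(x)}{\lambda(x)^{r}}\,dx ,
\]
% minimised by \lambda(x) \propto f(x)^{1/(1+r)}, giving
\[
  D_{\min} \;\approx\; \frac{\|f\|_{1/(1+r)}}{(r+1)\,2^{r}\,N^{r}},
  \qquad
  \|f\|_{s} = \Big(\int f(x)^{s}\,dx\Big)^{1/s},
\]
% so the robust (minimax) design amounts to finding the maximum
% s-norm density in the class, with s = 1/(1+r).
```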

Journal ArticleDOI
TL;DR: A hybrid differential pulse-code-modulation (DPCM) system is presented to demonstrate the application of fuzzy set theory in data compression.
Abstract: A hybrid differential pulse-code-modulation (DPCM) system is presented to demonstrate the application of fuzzy set theory in data compression. The system involves entropy coding, a Laplacian quantiser and fuzzy enhancement operations applied to the DPCM signal. A saving of about 0.5 bit is obtained when the DPCM signal is quantised to 32 levels and then Huffman-coded.
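A short Python sketch of the Huffman-coding step mentioned, applied to quantiser indices; illustrative only, not the coder used in the paper:

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code table from a sequence of quantiser indices."""
    freq = Counter(symbols)
    # Heap entries: [weight, tiebreak, partial code table].
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, [w1 + w2, counter, merged])
        counter += 1
    return heap[0][2]  # symbol -> bitstring
```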

Journal ArticleDOI
TL;DR: When population constraints were imposed on residential location models, it was found that the model which developed naturally from the approach taken contained as a special case the model proposed by Dacey and Norcliffe and not the Wilson model.
Abstract: This paper derives several well-known spatial models in a framework based upon the laws of conditional probability analysis. In particular, it relates the structure of some existing models of trip distribution, elementary residential location, and residential location with capacity constraints to either the multinomial or the hypergeometric probability distribution. The major changes from traditional methods for developing these models concern the derivation and form of the objective function for each interaction model. This alternative analysis reaches a wider audience than one familiar only with entropy methods and leads to several improvements in generality. Further, when population constraints are imposed on residential location models, the model which develops naturally from the approach taken in the paper contains as a special case the model proposed by Dacey and Norcliffe, and not the Wilson model.

03 Aug 1983
TL;DR: Two algorithms are presented that implement multisignal minimum cross entropy spectral analysis (MCESA), a method for estimating the power spectrum of one or more independent signals when a prior estimate for each is available and new information is obtained in the form of values of the autocorrelation function of their sum.
Abstract: Two algorithms are presented that implement multisignal minimum cross-entropy spectral analysis (MCESA), a method for estimating the power spectrum of one or more independent signals when a prior estimate for each is available and new information is obtained in the form of values of the autocorrelation function of their sum. Single-signal MCESA is included as a special case. One of the algorithms is slow but general: the prior spectrum estimates and the resulting (posterior) spectrum estimates are represented by discrete frequency approximations with arbitrarily spaced frequencies, and the autocorrelation values may be given at arbitrarily spaced lags. The other algorithm is considerably faster and applies to an important special case: the prior and posterior spectrum estimates are of the all-pole form that results from maximum entropy (or linear predictive) spectral analysis, and the autocorrelation values are given at equispaced lags beginning at zero.

Proceedings ArticleDOI
01 Apr 1983
TL;DR: Results of a new spectrum-analysis method that estimates a number of power spectra when a prior estimate of each is available and new information is obtained in the form of values of the auto-correlation function of their sum are presented.
Abstract: This paper presents results of a new spectrum-analysis method that estimates a number of power spectra when a prior estimate of each is available and new information is obtained in the form of values of the auto-correlation function of their sum. The method applies for instance when one obtains autocorrelation measurements for a signal with independent additive interference, and one has prior estimates of the signal and noise spectra. By incorporating prior estimates for both spectra, the method offers considerable flexibility for tailoring an estimator to the characteristics of a signal or noise. The new method, a generalization of Minimum-Cross-Entropy Spectrum Analysis (MCESA) and Maximum Entropy Spectrum Analysis (MESA), is called Multisignal MCESA. Its theoretical basis is reviewed, and results of experimental tests of an implementation are presented. The test data comprise digitized samples of speech corrupted with helicopter noise and tone interference.

Book ChapterDOI
01 Jan 1983
TL;DR: The paper is concerned with information-theoretic analysis and with the identification of static and dynamic systems.
Abstract: The paper is concerned with information-theoretic analysis and with the identification of static and dynamic systems.