
Showing papers on "Entropy (information theory)" published in 1986


Journal ArticleDOI
TL;DR: Quantities are defined operationally which qualify as measures of complexity of patterns arising in physical situations, and are essentially the Shannon information needed to specify not individual patterns, but either measure-theoretic or algebraic properties of ensembles of patterns arising in a priori translationally invariant situations.
Abstract: Quantities are defined operationally which qualify as measures of complexity of patterns arising in physical situations. Their main features, distinguishing them from previously used quantities, are the following: (1) they are measure-theoretic concepts, more closely related to Shannon entropy than to computational complexity; and (2) they are observables related to ensembles of patterns, not to individual patterns. Indeed, they are essentially the Shannon information needed to specify not individual patterns, but either measure-theoretic or algebraic properties of ensembles of patterns arising in a priori translationally invariant situations. Numerical estimates of these complexities are given for several examples of patterns created by maps and by cellular automata.

665 citations
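
These complexities are ensemble quantities estimated from Shannon information. As a rough illustration of the kind of estimate involved (not the authors' definitions), the sketch below computes the Shannon block entropy per site of a pattern generated by elementary cellular automaton rule 110; the rule, lattice size, and block lengths are arbitrary choices.

```python
import math
from collections import Counter

def ca_step(row, rule=110):
    """One synchronous update of an elementary CA with periodic boundaries."""
    n = len(row)
    return [(rule >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
            for i in range(n)]

def block_entropy(row, k):
    """Shannon entropy (bits) of the empirical distribution of length-k blocks."""
    n = len(row)
    blocks = Counter(tuple(row[(i + j) % n] for j in range(k)) for i in range(n))
    return -sum((c / n) * math.log2(c / n) for c in blocks.values())

row = [0] * 400
row[200] = 1                 # single seed cell
for _ in range(400):         # let the transient die out
    row = ca_step(row)

for k in (1, 2, 4, 8):
    # entropy per site H_k / k decreases toward the entropy rate of the pattern
    print(f"k={k}: H_k/k = {block_entropy(row, k) / k:.3f} bits/site")
```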


Journal ArticleDOI
TL;DR: A strengthened central limit theorem for densities is established in this article, showing monotone convergence in the sense of relative entropy, which is stronger than the classical central limit theorem for densities.
Abstract: A strengthened central limit theorem for densities is established, showing monotone convergence in the sense of relative entropy.

412 citations
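
A numerical sketch of the phenomenon, assuming a unit-variance uniform starting density, the usual doubling scheme X ↦ (X + X′)/√2, and a simple grid discretization (all implementation choices, not the paper's argument): the relative entropy to the standard normal should decrease at every step.

```python
import numpy as np

x = np.linspace(-8, 8, 4001)
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)       # standard normal density

# centered, unit-variance uniform density on [-sqrt(3), sqrt(3)]
f = np.where(np.abs(x) <= np.sqrt(3), 1 / (2 * np.sqrt(3)), 0.0)

def kl_to_gaussian(f):
    """Relative entropy D(f || phi) on the grid."""
    mask = f > 0
    return np.sum(f[mask] * np.log(f[mask] / phi[mask])) * dx

for step in range(5):
    print(f"step {step}: D(f_n || phi) = {kl_to_gaussian(f):.6f}")
    # density of (X + X') / sqrt(2): self-convolve, then rescale the axis
    conv = np.convolve(f, f) * dx                    # density of X + X'
    xs = np.linspace(2 * x[0], 2 * x[-1], len(conv))  # support of the sum
    f = np.sqrt(2) * np.interp(np.sqrt(2) * x, xs, conv)
```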


Journal ArticleDOI
TL;DR: The proposed picture compressibility is shown to possess the properties that one would expect and require of a suitably defined concept of two-dimensional entropy for arbitrary probabilistic ensembles of infinite pictures.
Abstract: Distortion-free compressibility of individual pictures, i.e., two-dimensional arrays of data, by finite-state encoders is investigated. For every individual infinite picture I, a quantity ρ(I) is defined, called the compressibility of I, which is shown to be the asymptotically attainable lower bound on the compression ratio that can be achieved for I by any finite-state information-lossless encoder. This is demonstrated by means of a constructive coding theorem and its converse that, apart from their asymptotic significance, might also provide useful criteria for finite and practical data-compression tasks. The proposed picture compressibility is also shown to possess the properties that one would expect and require of a suitably defined concept of two-dimensional entropy for arbitrary probabilistic ensembles of infinite pictures. While the definition of ρ(I) allows the use of different machines for different pictures, the constructive coding theorem leads to a universal compression scheme that is asymptotically optimal for every picture. The results are readily extendable to data arrays of any finite dimension.

217 citations
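
The paper's finite-state construction is considerably more subtle than this, but the flavor can be shown with an incremental (LZ78-style) parse of a row-by-row scan of a binary picture; the scan order, bit-cost formula, and test images below are simplifications and stand-ins, not the paper's scheme.

```python
import math
import random

def lz78_phrases(bits):
    """Incremental (LZ78) parsing: number of distinct phrases in a bit string."""
    dictionary, node, count = {}, (), 0
    for b in bits:
        if (node, b) in dictionary:
            node = dictionary[(node, b)]
        else:
            count += 1
            dictionary[(node, b)] = count
            node = ()
    return count + (1 if node != () else 0)

def compression_ratio(image):
    """Crude per-picture compression ratio from a row-by-row scan."""
    bits = [px for row in image for px in row]
    c = lz78_phrases(bits)
    coded_bits = c * (math.log2(c) + 1)   # standard LZ78 cost estimate
    return coded_bits / len(bits)

# a highly regular picture compresses far better than a random one
regular = [[(i // 4 + j // 4) % 2 for j in range(64)] for i in range(64)]
noisy = [[random.randint(0, 1) for _ in range(64)] for _ in range(64)]
print(f"regular: {compression_ratio(regular):.2f}, noisy: {compression_ratio(noisy):.2f}")
```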



Journal ArticleDOI
01 Sep 1986
TL;DR: The terms index of fuzziness, entropy, and π-ness are used to define an index of feature evaluation in pattern recognition problems in terms of their intraclass and interclass measures.
Abstract: The terms index of fuzziness, entropy, and π-ness, which give measures of fuzziness in a set, are used to define an index of feature evaluation in pattern recognition problems in terms of their intraclass and interclass measures. The index value decreases as the reliability of a feature in characterizing and discriminating different classes increases. The algorithm developed has been implemented on vowel and plosive identification problems using formant frequencies and different S and π membership functions.

115 citations
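
A minimal sketch of the ingredients named in the abstract: Zadeh's S and π membership functions and the linear index of fuzziness of the resulting fuzzy set. The formant values, crossover points, and bandwidth below are made up for illustration, and the feature-evaluation index itself (which combines intraclass and interclass measures) is not reproduced.

```python
def s_function(x, a, b, c):
    """Zadeh's S membership function, rising smoothly from 0 (at a) to 1 (at c)."""
    if x <= a: return 0.0
    if x >= c: return 1.0
    if x <= b: return 2 * ((x - a) / (c - a)) ** 2
    return 1 - 2 * ((x - c) / (c - a)) ** 2

def pi_function(x, c, width):
    """Zadeh's pi function: peak 1 at c, falling to 0 at distance `width`."""
    if x <= c:
        return s_function(x, c - width, c - width / 2, c)
    return 1 - s_function(x, c, c + width / 2, c + width)

def linear_index_of_fuzziness(memberships):
    """2/n times the total distance to the nearest crisp set {0, 1}."""
    n = len(memberships)
    return (2 / n) * sum(min(m, 1 - m) for m in memberships)

# example: first-formant frequencies (Hz) of one vowel class, fuzzified around the mean
f1 = [310, 340, 290, 360, 330, 300]
mean = sum(f1) / len(f1)
mu = [pi_function(x, c=mean, width=100) for x in f1]
print(f"index of fuzziness: {linear_index_of_fuzziness(mu):.3f}")
```

A good feature yields low fuzziness within each class and clear separation between the membership profiles of different classes; the paper's index combines exactly these intraclass and interclass quantities.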


Journal ArticleDOI
TL;DR: The authors extended the original use of entropy by Amorocho and Espildora as a measure of uncertainty in hydrologic data and the reduction in that uncertainty due to application of a model.

106 citations


Journal ArticleDOI
TL;DR: This paper reviews the essentials of the Brooks-Wiley theory, and gives an account of hierarchical physical information systems within which the theory can be interpreted, and shows how the major conceptual objections can be answered.
Abstract: Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate-determining factor. Evolution is driven by non-equilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely “capture” at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the probability of sharing genetic information. Critics of the Brooks-Wiley theory argue that its authors have abused terminology from information theory and thermodynamics. In this paper I review the essentials of the theory, and give an account of hierarchical physical information systems within which the theory can be interpreted. I then show how the major conceptual objections can be answered.

94 citations


Journal ArticleDOI
TL;DR: In this paper, the optimality conditions obtained in [1] for dynamic compensation in the presence of state-, control-, and measurement-dependent noise were applied to a series of increasingly robust control designs for the example considered in [2].
Abstract: This note presents an application of the optimality conditions obtained in [1] for dynamic compensation in the presence of state-, control-, and measurement-dependent noise. By solving these equations, which represent a fundamental generalization of standard steady-state LQG theory, a series of increasingly robust control designs is obtained for the example considered in [2].

82 citations


Journal ArticleDOI
TL;DR: In this article, the principle of maximum entropy (POME) was employed to develop a procedure for deriving a number of frequency distributions used in hydrology; the procedure requires specification of constraints and maximization of entropy, and is thus a solution of a classical optimization problem.

74 citations
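
The POME route can be illustrated by the simplest textbook case (not one of the paper's specific hydrologic derivations): maximizing entropy subject to unit mass and a fixed mean on the positive half-line yields the exponential distribution.

```latex
\max_f \; -\int_0^\infty f \ln f \, dx
\quad \text{s.t.} \quad
\int_0^\infty f \, dx = 1, \qquad \int_0^\infty x f \, dx = \mu
\;\Longrightarrow\;
f(x) = e^{-1-\lambda_0-\lambda_1 x} = \frac{1}{\mu}\, e^{-x/\mu}, \quad x \ge 0 .
```

Richer hydrologic distributions follow the same route with additional constraints; for instance, constraining both E[x] and E[ln x] yields the gamma distribution.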


Journal ArticleDOI
TL;DR: It has been shown that the normalized measures of a given probability distribution are much closer to one another than the corresponding absolute measures of entropy.
Abstract: A number of “normalized” measures of entropy have been obtained to measure the “intrinsic” uncertainty of a probability distribution. Their graphs have been drawn and it has been shown that the normalized measures of a given probability distribution are much closer to one another than the corresponding absolute measures of entropy.

52 citations
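
A small numerical sketch of the claim, with each measure normalized by its own maximum (attained at the uniform distribution). The particular families used here (Shannon, Rényi, Havrda-Charvát) and the example distribution are illustrative choices; the paper's exact set of measures may differ.

```python
import numpy as np

p = np.array([0.5, 0.25, 0.15, 0.07, 0.03])
n = len(p)

def shannon(q): return -np.sum(q * np.log(q))
def renyi(q, a): return np.log(np.sum(q ** a)) / (1 - a)
def havrda_charvat(q, a): return (np.sum(q ** a) - 1) / (2 ** (1 - a) - 1)

uniform = np.full(n, 1 / n)
measures = [("Shannon", shannon),
            ("Renyi a=0.5", lambda q: renyi(q, 0.5)),
            ("Renyi a=2", lambda q: renyi(q, 2.0)),
            ("H-C a=0.5", lambda q: havrda_charvat(q, 0.5)),
            ("H-C a=2", lambda q: havrda_charvat(q, 2.0))]

for name, H in measures:
    # absolute values spread widely; values normalized by each measure's own
    # maximum cluster much more tightly, which is the paper's observation
    print(f"{name:12s} absolute {H(p):.3f}   normalized {H(p) / H(uniform):.3f}")
```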


Journal ArticleDOI
TL;DR: The results indicate that weighting the entropy value will increase the investment performance of the entropy risk measure.
Abstract: This paper explores a problem with the use of the entropy measure of investment performance. The commonly used entropy measure ignores the dispersion of security frequency classes used in the calculation of entropy. Therefore, state-value weighting of the entropy is proposed and tested using a portfolio selection heuristic algorithm. The results indicate that weighting the entropy value will increase the investment performance of the entropy risk measure.
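
The proposal resembles a Belis-Guiaşu-style weighted entropy, H_w = -Σ w_i p_i log p_i, with state values as weights. The toy sketch below is an assumption-laden illustration: the return series, binning, and weighting rule are made up, and the paper's frequency classes and portfolio-selection heuristic are not reproduced.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def weighted_entropy(p, w):
    """Weighted entropy: frequency classes with larger state values count more."""
    mask = p > 0
    return -np.sum(w[mask] * p[mask] * np.log(p[mask]))

returns = np.random.default_rng(0).normal(0.01, 0.05, 500)  # toy return series
counts, edges = np.histogram(returns, bins=10)
p = counts / counts.sum()
midpoints = (edges[:-1] + edges[1:]) / 2
w = np.abs(midpoints) / np.abs(midpoints).sum()  # weight classes by |state value|

print(f"plain entropy:    {entropy(p):.4f}")
print(f"weighted entropy: {weighted_entropy(p, w):.4f}")
```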

Journal ArticleDOI
TL;DR: A visually weighted suboptimal quantization scheme is developed to take into account the relative importance of different transform coefficients to the human visual system.
Abstract: Source encoding of images for bandwidth compression has become attractive in recent years because of decreasing hardware costs. By combining the source encoding approach with transform coding techniques, it is possible to obtain good image quality at low data rates. The general aspects of such a system are presented. The design of the quantizer for transform coefficients, which is the major source of error associated with the compression process, is considered using a visual fidelity criterion and subject to the constraint that the entropy of the quantizer be a prespecified quantity. A visually weighted suboptimal quantization scheme is developed to take into account the relative importance of different transform coefficients to the human visual system.
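
A compact sketch of the general idea: uniform quantization of 2-D DCT coefficients with step sizes inversely tied to a visual weight that decays with spatial frequency. The weight function, base step, and random test block below are arbitrary stand-ins; the paper's weighting, suboptimal quantizer design, and entropy constraint are more refined.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(1)
block = rng.normal(128, 30, (8, 8))           # stand-in for an 8x8 image block

coeffs = dctn(block, norm='ortho')

# visual weight: low spatial frequencies matter more to the human visual system
u, v = np.meshgrid(np.arange(8), np.arange(8), indexing='ij')
weight = 1.0 / (1.0 + u + v)

base_step = 16.0
steps = base_step / weight                    # coarser steps where the eye is tolerant
quantized = np.round(coeffs / steps) * steps  # uniform quantizer per coefficient

reconstructed = idctn(quantized, norm='ortho')
mse = np.mean((block - reconstructed) ** 2)
print(f"MSE after visually weighted quantization: {mse:.2f}")
```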

Journal ArticleDOI
TL;DR: Applications of entropy minimax are summarized in three major areas: meteorology, engineering/materials science, and medicine/biology, covering both discrete patterns and continuous patterns employing concepts of potential functions and fuzzy entropies.
Abstract: Applications of entropy minimax are summarized in three major areas: meteorology, engineering/materials science, and medicine/biology. The applications cover both discrete patterns in multidimensional spaces of mixed quantitative and qualitative variables, and continuous patterns employing concepts of potential functions and fuzzy entropies. Major achievements of entropy minimax modeling include the first long-range weather forecasting models with statistical reliability significantly above chance verified on independent data, the first models of fission gas release and nuclear fuel failure under commercial operating conditions with significant and independently verified statistical reliability, and the first prognosis models in coronary artery disease and in non-Hodgkin's lymphoma with significant predictability verified on independent data. In addition, applications of entropy minimization and maximization separately are reviewed, including feature selection, unsupervised classification, and probability estimation...

Journal ArticleDOI
TL;DR: A new theory applicable to data treatment is briefly presented; it derives a mathematical model of data disturbed by uncertainty whose statistical model may be unknown or even unjustifiable.

Journal ArticleDOI
TL;DR: It is shown that ME is also a special case of the MDL criterion; maximizing the entropy subject to some constraints on the underlying probability function is identical to minimizing the code length required to represent all possible i.i.d. realizations of the random variable such that the sample frequencies satisfy those given constraints.
Abstract: The Maximum Entropy (ME) and Maximum Likelihood (ML) criteria are the bases for two approaches to statistical inference problems. A new criterion, called the Minimum Description Length (MDL), has been recently introduced. This criterion generalizes the ML method so it can be applied to more general situations, e.g., when the number of parameters is unknown. It is shown that ME is also a special case of the MDL criterion; maximizing the entropy subject to some constraints on the underlying probability function is identical to minimizing the code length required to represent all possible i.i.d. realizations of the random variable such that the sample frequencies (or histogram) satisfy those given constraints.
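
The identity rests on the fact that the code length of an i.i.d. string with a given histogram is, to first order, n times the entropy of that histogram. The sketch below finds the maximum-entropy distribution under a mean constraint by exponential tilting (the classic dice example; bisection over the Lagrange multiplier is an implementation choice, not the paper's method), then reports the corresponding minimal code length.

```python
import numpy as np
from scipy.optimize import brentq

values = np.arange(1, 7)   # faces of a die
target_mean = 4.5          # constraint: E[X] = 4.5

def tilted(lam):
    """Exponential-family tilt of the uniform distribution on the faces."""
    w = np.exp(lam * values)
    return w / w.sum()

# solve for the multiplier that meets the mean constraint
lam = brentq(lambda l: tilted(l) @ values - target_mean, -10, 10)
p = tilted(lam)

entropy = -np.sum(p * np.log2(p))
n = 1000
print(f"maxent distribution: {np.round(p, 4)}")
# minimal code length for n i.i.d. symbols whose histogram meets the constraint
print(f"code length ~ n * H = {n * entropy:.1f} bits for n = {n}")
```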

Journal ArticleDOI
TL;DR: It is demonstrated through simulations that the overflow/underflow problem can be practically eliminated at the cost of a negligible increase in the average distortion of the adaptive system.
Abstract: Variable-length codes can be used in entropy coding the outputs of an optimum entropy-constrained quantizer. Transmitting these codes over a synchronous channel, however, requires a buffer connecting the entropy coder to the channel. In a practical application, this buffer is of finite size and hence might overflow or underflow. To alleviate this difficulty, we use an adaptive scheme in which the quantizer parameters are changed successively according to the state of the buffer. Rate-distortion performance of optimum entropy-constrained quantizers in conjunction with this adaptive scheme is studied for the class of generalized Gaussian sources. It is demonstrated through simulations that the overflow/underflow problem can be practically eliminated at the cost of a negligible increase in average distortion. Furthermore, it is shown that the efficiency of this system is more pronounced at high rates and for more broad-tailed source densities. Easily computable upper and lower bounds on the average distortion of the adaptive system are developed.
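
A toy simulation of the feedback idea only: the quantizer step size is coarsened when the buffer fills and refined when it drains. The source model, step-size ladder, thresholds, and the crude code-length model below are arbitrary stand-ins for the paper's entropy-constrained design.

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.laplace(0, 1, 20000)     # broad-tailed stand-in source

buffer_bits, capacity = 0.0, 4096.0
channel_rate = 3.0                     # bits per sample drained by the channel
steps = [0.25, 0.5, 1.0, 2.0, 4.0]     # quantizer step-size ladder
level = 2
overflows = 0

for x in samples:
    q = np.round(x / steps[level]) * steps[level]
    # crude model of the entropy coder's output length: cost grows as step shrinks
    bits = max(1.0, np.log2(8.0 / steps[level]) + abs(q) / 2)
    buffer_bits = buffer_bits + bits - channel_rate
    if buffer_bits > capacity:
        overflows += 1
        buffer_bits = capacity
    # feedback: coarser quantizer when filling, finer when draining
    if buffer_bits > 0.75 * capacity and level < len(steps) - 1:
        level += 1
    elif buffer_bits < 0.25 * capacity and level > 0:
        level -= 1
    buffer_bits = max(buffer_bits, 0.0)

print(f"overflows: {overflows}, final step: {steps[level]}")
```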

Journal ArticleDOI
TL;DR: In this paper, it was shown that the product of the effective widths of the intensity functions in the space and the spatial-frequency domains takes its minimum value for a wave field with a Gaussian-shaped cross-spectral density function.
Abstract: It is shown that, among all partially coherent wave fields having the same informational entropy, the product of the effective widths of the intensity functions in the space and the spatial-frequency domains takes its minimum value for a wave field with a Gaussian-shaped cross-spectral density function. Furthermore, it is shown how this minimum value is related to the informational entropy and how this informational entropy is related to other quantities that can measure the overall degree of coherence of the partially coherent wave field.

Book ChapterDOI
01 Jan 1986
TL;DR: In this paper, the information dimension, metric entropy, and Lyapunov exponents associated with the limit sets in phase space were determined for numerically integrable dynamical systems such as iterated maps and systems of ordinary differential equations.
Abstract: In order to characterise quantitatively the behaviour of dissipative dynamical systems we have to determine the values of information dimension, metric entropy and Lyapunov exponents associated with the limit sets in phase space. For numerically integrable dynamical systems such as iterated maps and systems of ordinary differential equations, methods are available which lead to the determination of Lyapunov exponents with an accuracy generally depending only on the power of the utilized computer. The values of information dimension and metric entropy then follow by applying the conjectured formulas relating them to the Lyapunov exponents.
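
A sketch of this pipeline for the Hénon map (a standard example, not necessarily one used in the chapter): tangent-space iteration with QR re-orthonormalization gives the Lyapunov exponents, after which the conjectured relations give the Kaplan-Yorke (information) dimension and, via Pesin's formula (an equality for SRB measures), the metric entropy as the sum of positive exponents.

```python
import numpy as np

a, b = 1.4, 0.3   # classic Henon parameters

def step(x, y):
    return 1 - a * x * x + y, b * x

def jacobian(x, y):
    return np.array([[-2 * a * x, 1.0], [b, 0.0]])

x, y = 0.1, 0.1
Q = np.eye(2)
sums = np.zeros(2)
n = 100_000
for _ in range(n):
    Q = jacobian(x, y) @ Q
    Q, R = np.linalg.qr(Q)               # re-orthonormalize the tangent frame
    sums += np.log(np.abs(np.diag(R)))   # accumulate local stretching rates
    x, y = step(x, y)

lam = np.sort(sums / n)[::-1]            # Lyapunov exponents, descending
entropy = np.sum(lam[lam > 0])           # Pesin-type formula for metric entropy
dim = 1 + lam[0] / abs(lam[1])           # Kaplan-Yorke dimension (2-D, lam0 > 0 > lam1)
print(f"exponents {lam}, metric entropy ~ {entropy:.3f}, dimension ~ {dim:.3f}")
```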

Journal ArticleDOI
TL;DR: The large sample properties of a new class of histogram estimators, whose derivation is based on an information-theoretic criterion (the maximum entropy principle, which preserves the observed mass and mean), are studied.
Abstract: The large sample properties of a new class of histogram estimators, whose derivation is based on an information-theoretic criterion (the maximum entropy principle, which preserves the observed mass and mean), are studied. The pointwise strong consistency, the pointwise asymptotic normality, and the rate of convergence to normality are investigated. The asymptotic mean square error (MSE) of these estimates is also compared with that of the histogram based on spacings, the classical k-nearest neighbor estimator, the kernel estimator, and the generalized k-nearest neighbor density estimator.

Proceedings ArticleDOI
01 Apr 1986
TL;DR: A new Maximum Entropy pole-zero spectral estimation method is described, designed to match given correlation and cepstral values, yet achieve the maximum possible entropy.
Abstract: We describe a new Maximum Entropy pole-zero spectral estimation method. The model is designed to match given correlation and cepstral values, yet achieve the maximum possible entropy. The solution is based on solving a generalized symmetric almost-Toeplitz eigenvalue problem. We characterize this solution, present a fast computational algorithm, and give examples.

Journal ArticleDOI
TL;DR: Low-entropy diffeomorphisms are found in every isotopy class on S³×S³, as mentioned in this paper.
Abstract: Diffeomorphisms of low entropy are found in every isotopy class on S³×S³.

Journal ArticleDOI
TL;DR: Experiments with a method for inference of Markov networks are described, using dynamic programming to search for string alignments, which cause high-probability landmark substrings to emerge by reinforcement as the training samples are processed.
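
The dynamic-programming alignment ingredient is standard; below is a minimal Needleman-Wunsch-style global alignment score (the scoring parameters are arbitrary, and the reinforcement of landmark substrings over training samples is not shown).

```python
def align(s, t, match=2, mismatch=-1, gap=-1):
    """Global alignment score by dynamic programming (Needleman-Wunsch)."""
    m, n = len(s), len(t)
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1): D[i][0] = i * gap
    for j in range(1, n + 1): D[0][j] = j * gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            hit = match if s[i - 1] == t[j - 1] else mismatch
            D[i][j] = max(D[i - 1][j - 1] + hit,   # substitute or match
                          D[i - 1][j] + gap,       # delete
                          D[i][j - 1] + gap)       # insert
    return D[m][n]

print(align("entropy", "enthalpy"))
```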

Journal ArticleDOI
01 Jan 1986
TL;DR: The results of different fuzzy clustering algorithms are dealt with collectively in a formal framework of probabilistic set theory in order to interpret the structure of data.
Abstract: The results of different fuzzy clustering algorithms are dealt with collectively in a formal framework of probabilistic set theory in order to interpret the structure of data. Special attention is paid to calculation of entropy of the fuzzy clusters detected by various grouping methods. Two numerical examples illustrate applicability of the proposed way of cluster evaluation.
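
The entropy-of-a-fuzzy-partition ingredient is straightforward; a sketch of the partition entropy of a membership matrix follows (the probabilistic-set framework of the paper is richer than this single number).

```python
import numpy as np

def partition_entropy(U):
    """Entropy of a fuzzy partition: rows = data points, columns = clusters.
    0 for a crisp partition; log(c) for maximally ambiguous memberships."""
    U = np.clip(U, 1e-12, 1.0)
    return -np.mean(np.sum(U * np.log(U), axis=1))

crisp = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
fuzzy = np.array([[0.6, 0.4], [0.5, 0.5], [0.7, 0.3]])
print(f"crisp: {partition_entropy(crisp):.3f}, fuzzy: {partition_entropy(fuzzy):.3f}")
```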

Journal ArticleDOI
TL;DR: The entropy measure is introduced as a means of characterizing images, and its use is clarified.
Abstract: An introduction to the entropy measure as a means of characterizing images, and a clarification of the use of the entropy measure.

Journal ArticleDOI
TL;DR: In this article, the authors used quantum mechanical estimates to bound the growth rates of small perturbations in a system of reaction-diffusion equations, and obtained bounds on the entropy and Hausdorff dimension of any attracting set.
Abstract: Under general assumptions we give upper bounds, proportional to the volume of the domain, on the temporal and spatial complexity of solutions to systems of reaction-diffusion equations. These notions measure instantaneous rather than asymptotic complexity, but time-averaged versions of the estimates give bounds on the entropy and Hausdorff dimension of any attracting set. The techniques involve the use of quantum mechanical estimates to bound the growth rates of small perturbations.

Journal ArticleDOI
TL;DR: In this article, an unbiased estimator of the uncertainty associated with a variable is proposed for both sampling with replacement and sampling without replacement, based on the entropy of order α = 2 proposed by Havrda and Charvat.
Abstract: This paper is concerned with the problem of estimating the uncertainty associated with a variable in a finite population. The study of this problem leads to the following conclusion: the classical measure of uncertainty, Shannon's entropy, is not suitable for sampling from finite populations; nevertheless, by using the entropy of order α = 2, proposed by Havrda and Charvat, one can define an unbiased estimator of the uncertainty associated with the variable, both for sampling with replacement and for sampling without replacement. This conclusion is illustrated by an example.
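
For order α = 2 the entropy is a function of Σ p_i², and Σ n_i(n_i-1)/(n(n-1)) is an unbiased estimator of Σ p_i² under sampling with replacement, which is what makes an unbiased entropy estimate possible. A sketch for the with-replacement case (normalization constants for order-2 entropy vary across the literature; the quadratic entropy 1 - Σ p_i² is used here, and the without-replacement case is not shown):

```python
import numpy as np
from collections import Counter

def quadratic_entropy_unbiased(sample):
    """Unbiased estimate of 1 - sum(p_i^2) from a with-replacement sample."""
    n = len(sample)
    counts = np.array(list(Counter(sample).values()))
    sum_p2_hat = np.sum(counts * (counts - 1)) / (n * (n - 1))  # unbiased for sum p_i^2
    return 1.0 - sum_p2_hat

rng = np.random.default_rng(3)
p = np.array([0.5, 0.3, 0.2])
true_value = 1 - np.sum(p ** 2)   # 0.62
estimates = [quadratic_entropy_unbiased(rng.choice(3, size=50, p=p))
             for _ in range(2000)]
print(f"true {true_value:.4f}, mean of estimates {np.mean(estimates):.4f}")
```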


Journal ArticleDOI
TL;DR: A formalism is described which does not require any a priori knowledge of the order parameters but rather allows us to determine these as well as the slaved modes and the emerging patterns; it is applicable also to non-physical systems such as neural nets.
Abstract: The maximum entropy principle allows one to make guesses about the distribution function of systems by maximizing the information entropy under given constraints. In a previous paper we succeeded in formulating appropriate constraints for systems undergoing nonequilibrium phase transitions, but we had to confine our treatment to the order parameters. In this paper we describe a formalism which does not require any a priori knowledge of the order parameters but rather allows us to determine these as well as the slaved modes and the emerging patterns. The method is applicable also to non-physical systems such as neural nets. Our approach allows us to reconsider the Landau theory of phase transitions from a new point of view. A guess is made on the Fokker-Planck equation underlying the processes which give rise to stationary distribution functions of a single order parameter.

Proceedings ArticleDOI
01 Apr 1986
TL;DR: A method based on a Linear Prediction approach is proposed for obtaining the missing part in the projection and results showing the effectiveness of this method in extracting significantly more information from truncated projections are presented.
Abstract: Unambiguous reconstruction is not possible with truncated projections. In order to reduce the ambiguity in the reconstructed image, several authors have suggested that a simple 'completion' involving extrapolation of the truncated projection is sufficient. In this paper a method based on a Linear Prediction approach is proposed for obtaining the missing part in the projection. Reconstruction is then carried out using the Convolution Backprojection (CBP) method [1]. Simulation results showing the effectiveness of this method in extracting significantly more information from truncated projections are presented.
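
A sketch of the completion step alone: autoregressive coefficients fitted to the known part of a projection by least squares, then run forward to extrapolate the missing tail. The test signal, model order, and truncation point are arbitrary, and the subsequent convolution backprojection is not shown.

```python
import numpy as np

def lp_extrapolate(known, order, n_extra):
    """Fit an AR(order) model to `known` by least squares, then extrapolate."""
    X = np.column_stack([known[order - k - 1 : len(known) - k - 1]
                         for k in range(order)])
    y = known[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)   # prediction coefficients
    out = list(known)
    for _ in range(n_extra):
        # predict the next sample from the previous `order` samples
        out.append(np.dot(a, out[-1 : -order - 1 : -1]))
    return np.array(out)

t = np.linspace(0, 1, 128)
projection = np.exp(-((t - 0.5) / 0.15) ** 2)   # smooth stand-in projection
truncated = projection[:96]                     # last 32 samples missing
completed = lp_extrapolate(truncated, order=8, n_extra=32)
print(f"max completion error: {np.max(np.abs(completed[96:] - projection[96:])):.4f}")
```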

Journal ArticleDOI
TL;DR: Following these methods, a lower bound for the topological entropy of a differentiable map F: ℝⁿ → ℝⁿ possessing a snap-back repeller is obtained.
Abstract: In this paper we present a generalization to higher dimensions of the techniques for computation of the entropy of graphs in dimension one. Following these methods, we obtain a lower bound for the topological entropy of a differentiable map F: ℝⁿ → ℝⁿ possessing a snap-back repeller.