
Showing papers on "Entropy (information theory) published in 1979"


15 Mar 1979
TL;DR: Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values; these algorithms have exhibited performance only slightly above measured entropy values when applied to real data with stationary characteristics over the measurement span.
Abstract: Some practical adaptive techniques for the efficient noiseless coding of a broad class of data sources are developed and analyzed. Algorithms are designed for coding discrete memoryless sources which have a known symbol probability ordering but unknown probability values. These algorithms apply to a wide range of practical problems because most real data sources can be simply transformed into this form by appropriate preprocessing. The algorithms have exhibited performance only slightly above measured entropy values when applied to real data with stationary characteristics over the measurement span. Performance considerably below the measured average data entropy may be observed when data characteristics change over the measurement span.

410 citations
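
The core idea above, rank preprocessing followed by per-block selection among a few simple codes, can be sketched in a few lines of Python. This is an illustrative reconstruction, not the paper's exact algorithm; the symbol ordering, sample block, and the small set of Golomb-Rice code options are invented for the example.

```python
# Illustrative sketch: map symbols to their rank in the known
# probability ordering, then pick the cheapest Golomb-Rice parameter
# per block. Not the paper's exact algorithm; data are made up.
def to_ranks(block, order):
    rank = {s: i for i, s in enumerate(order)}
    return [rank[s] for s in block]

def rice_len(r, k):
    # code length of rank r under Rice parameter k (k=0 is unary)
    return (r >> k) + 1 + k

def best_code(ranks, k_options=range(4)):
    costs = {k: sum(rice_len(r, k) for r in ranks) for k in k_options}
    k = min(costs, key=costs.get)
    return k, costs[k]

block = "aaabacaabbaacaaa"
order = "abc"                      # known ordering: 'a' most probable
ranks = to_ranks(block, order)
k, bits = best_code(ranks)
print(f"chose k={k}: {bits / len(block):.2f} bits/symbol")
```

Because the code parameter is re-chosen per block, the coder adapts to the unknown probability values while relying only on the known ordering.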


Journal ArticleDOI
TL;DR: In this article, an objective procedure for the detection of outliers is given by using Akaike's information criterion and numerical illustrations are given, using data from Grubbs [8] and Tietjen and Moore [7].
Abstract: An objective procedure for the detection of outliers is given by using Akaike's information criterion. Numerical illustrations are given, using data from Grubbs [8] and Tietjen and Moore [7].

57 citations
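
A hedged sketch of the idea: fit a normal model to all observations, then, for each candidate point, a model that gives that observation its own mean parameter (so it is fitted exactly) at the cost of one extra parameter, and compare AICs. The data and the exact parameter accounting here are illustrative; the paper's criterion may differ in detail.

```python
import numpy as np

def aic(ss, n, k):
    # AIC of a normal fit up to an additive constant:
    # n*log(sigma^2_hat) + 2*(number of parameters)
    return n * np.log(ss / n) + 2 * k

x = np.array([3.1, 2.9, 3.0, 3.2, 2.8, 9.5])   # 9.5 looks suspicious
n = len(x)
base = aic(np.sum((x - x.mean())**2), n, k=2)  # mean + variance
for i in range(n):
    rest = np.delete(x, i)
    ss = np.sum((rest - rest.mean())**2)       # candidate fitted exactly
    a = aic(ss, n, k=3)                        # one extra mean parameter
    verdict = "outlier?" if a < base else ""
    print(f"x[{i}]={x[i]:4.1f}: AIC {a:7.2f} vs {base:7.2f} {verdict}")
```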


Journal ArticleDOI
TL;DR: An iterative algorithm for direct three-dimensional reconstruction from cone-beam projection data is presented; it is based on the maximum entropy criterion and uses an efficient parametrization of the cone-beam projection geometry.

37 citations


01 Jan 1979
TL;DR: This paper presents techniques for inferring migration flows by migrant category from some available aggregate data and proves the convergence of the general iterative procedure of which the well-known RAS and entropy methods are special cases.
Abstract: This paper presents techniques for inferring migration flows by migrant category from some available aggregate data. The data are in the form of marginal totals of migration flow matrices or prior information on certain cell values. A generalized estimation procedure is presented which incorporates both maximum likelihood and chi-square estimates. The duality results of the optimizing problems rely on the decomposition principle of Rockafellar. We prove the convergence of the general iterative procedure of which the well-known RAS and entropy methods are special cases. The validity of the methods is tested by comparison of estimates and observations for Austria and Sweden, using chi-square and absolute percentage deviation test statistics. The techniques are then applied to infer age-specific migration flows for Bulgaria. Algorithms and FORTRAN computer programs are also given.

19 citations
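
As a concrete illustration of the RAS/entropy special case, the sketch below runs iterative proportional fitting: a prior flow matrix is alternately row- and column-scaled until it reproduces the observed marginal totals. All numbers are invented; the paper's generalized procedure also covers chi-square estimates, which are not shown.

```python
# Minimal iterative proportional fitting (RAS / entropy method).
import numpy as np

def ipf(prior, row_totals, col_totals, iters=200, tol=1e-10):
    m = prior.astype(float).copy()
    for _ in range(iters):
        m *= (row_totals / m.sum(axis=1))[:, None]   # R step: fix rows
        m *= (col_totals / m.sum(axis=0))[None, :]   # S step: fix columns
        if np.allclose(m.sum(axis=1), row_totals, atol=tol):
            break
    return m

prior = np.array([[10, 5, 5], [2, 8, 10], [6, 6, 8]])  # prior flow info
rows = np.array([30.0, 20.0, 25.0])   # observed out-migration totals
cols = np.array([25.0, 20.0, 30.0])   # observed in-migration totals
print(np.round(ipf(prior, rows, cols), 2))
```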


Journal ArticleDOI
TL;DR: A combinatorial approach is proposed for proving the classical source coding theorems for a finite memoryless stationary source (giving achievable rates and the error probability exponent) and provides a sound heuristic justification for the widespread appearance of entropy and divergence (Kullback's discrimination) in source coding.
Abstract: A combinatorial approach is proposed for proving the classical source coding theorems for a finite memoryless stationary source (giving achievable rates and the error probability exponent). This approach provides a sound heuristic justification for the widespread appearance of entropy and divergence (Kullback's discrimination) in source coding. The results are based on the notion of composition class: a set made up of all the distinct source sequences of a given length which are permutations of one another. The asymptotic growth rate of any composition class is precisely an entropy. For a finite memoryless stationary source all members of a composition class have equal probability; the probability of any given class is therefore equal to the number of sequences in the class times the probability of an individual sequence in the class. The number of different composition classes is algebraic in block length, whereas the probability of a composition class is exponential, and the probability exponent is a divergence. Thus if a codeword is assigned to all sequences whose composition classes have rate less than some rate R, the probability of error is asymptotically the probability of the most probable composition class of rate greater than R. This is expressed in terms of a divergence. No use is made either of the law of large numbers or of Chebyshev's inequality.

16 citations
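
The counting argument is easy to verify numerically for a binary source: the code below compares the exact probability exponent of a composition class with the divergence D(q||p), and its size exponent with the entropy h(q). The block length and source parameter are arbitrary choices for illustration.

```python
from math import comb, log

def h(q):                          # binary entropy, nats
    return -q * log(q) - (1 - q) * log(1 - q)

def d(q, p):                       # binary divergence D(q||p), nats
    return q * log(q / p) + (1 - q) * log((1 - q) / (1 - p))

n, p = 200, 0.3                    # block length, source parameter
for k in (40, 60, 90):             # composition classes: k ones in n
    q = k / n
    size = comb(n, k)                          # class cardinality
    prob = size * p**k * (1 - p)**(n - k)      # exact class probability
    print(f"q={q:.2f}: (1/n)log|class|={log(size)/n:.3f} vs h(q)={h(q):.3f}; "
          f"-(1/n)log P={-log(prob)/n:.4f} vs D(q||p)={d(q, p):.4f}")
```

There are only n+1 classes (polynomial in n), while each class probability decays exponentially with exponent D(q||p), which is the heart of the combinatorial proof.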


Journal ArticleDOI
TL;DR: The fuzzy entropy is applied to the seal impression problem to measure the subjective value of information under the condition of uncertainty and the effectiveness of the former method is 2.32 times higher than that of the latter method, provided that the cost of information is equal.

16 citations


Journal ArticleDOI
TL;DR: The second law of thermodynamics provides an analytic framework for the assessment of the potential displacement of fossil fuels by solar energy as discussed by the authors, and the most promising areas are those which have entropy levels corresponding to the entropy level of the solar resource as converted to heat in various types of solar collectors.
Abstract: The second law of thermodynamics provides an analytic framework for the assessment of the potential displacement of fossil fuels by solar energy. The most promising areas are those which have entropy levels corresponding to the entropy level of the solar resource as converted to heat in various types of solar collectors. Since the entropy of solar heat can be partitioned by the means of collection (e.g., by the collector concentration ratio), solar energy can be matched much more precisely to many tasks at temperatures up to 300°C than can fossil fuels, which are low-entropy sources now widely misused for high-entropy tasks.

15 citations
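
A back-of-the-envelope illustration of the matching argument (numbers invented): summarizing the "entropy level" of heat at absolute temperature T by the Carnot quality factor 1 - T0/T shows how the choice of collector changes the thermodynamic quality of solar heat.

```python
# Carnot quality factor 1 - T0/T as a crude proxy for the "entropy
# level" of heat delivered at temperature T; illustrative values only.
T0 = 300.0                                    # ambient temperature, K
collectors = [("flat plate", 350.0),
              ("evacuated tube", 450.0),
              ("concentrating", 573.0)]       # roughly 300 degrees C
for name, T in collectors:
    print(f"{name:15s} T = {T:5.0f} K  quality = {1 - T0 / T:.2f}")
```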


01 Jan 1979
TL;DR: Application of BARC image data compression to the Galileo orbiter mission of Jupiter is considered and it is noted that the compressor can also be operated as a floating rate noiseless coder by simply not altering the input data quantization.
Abstract: A block adaptive rate controlled (BARC) image data compression algorithm is described. It is noted that in the algorithm's principal rate controlled mode, image lines can be coded at selected rates by combining practical universal noiseless coding techniques with block adaptive adjustments in linear quantization. Compression of any source data at chosen rates of 3.0 bits/sample and above can be expected to yield visual image quality with imperceptible degradation. Exact reconstruction will be obtained if the one-dimensional difference entropy is below the selected compression rate. It is noted that the compressor can also be operated as a floating rate noiseless coder by simply not altering the input data quantization. Here, the universal noiseless coder ensures that the code rate is always close to the entropy. Application of BARC image data compression to the Galileo orbiter mission of Jupiter is considered.

14 citations
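
The lossless-reconstruction condition quoted above is easy to check per image line: compute the empirical entropy of the first differences and compare it with the selected rate. A minimal sketch with made-up sample values (not the BARC algorithm itself):

```python
import numpy as np

def diff_entropy(line):
    """Empirical entropy (bits/sample) of first differences."""
    d = np.diff(np.asarray(line, dtype=int))
    _, counts = np.unique(d, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

line = [12, 13, 13, 14, 13, 12, 12, 13, 14, 15, 14, 13]
rate = 3.0                              # chosen rate, bits/sample
H = diff_entropy(line)
print(f"difference entropy = {H:.2f} bits/sample; "
      f"{'lossless feasible' if H <= rate else 'quantization needed'} at {rate}")
```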


Proceedings ArticleDOI
01 Apr 1979
TL;DR: The extrapolation (prediction) process under the maximum entropy condition is shown to correspond to the most random extension or to the maximization of the mean square prediction error conditioned on using the optimum predictor.
Abstract: Using the ideas from one-dimensional (1-D) maximum entropy spectral estimation, we derive a 2-D spectral estimator by actually extrapolating the 2-D sampled autocorrelation function. The method used here is to maximize the entropy of a set of random variables. The extrapolation (prediction) process under the maximum entropy condition is shown to correspond to the most random extension or to the maximization of the mean square prediction error conditioned on using the optimum predictor. The 2-D extrapolation must be terminated by the investigator. The Fourier transform of the extrapolated autocorrelation function is our 2-D spectral estimator. A specific algorithm for estimating the 2-D spectrum is presented. The algorithm has been programmed and computer examples are presented.

9 citations
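
For intuition, here is the 1-D analogue of the procedure under the standard assumptions: given a few autocorrelation lags, the maximum entropy spectrum is the all-pole (AR) spectrum whose coefficients solve the Yule-Walker equations, and extending the autocorrelation with the optimal predictor realizes the "most random" extrapolation. The lags are invented; the paper's 2-D algorithm is more involved.

```python
import numpy as np

def maxent_spectrum(r, nfft=512):
    """1-D maximum entropy spectrum from autocorrelation lags r[0..p]."""
    p = len(r) - 1
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, -np.asarray(r[1:]))    # Yule-Walker AR coeffs
    sigma2 = r[0] + np.dot(a, r[1:])              # prediction error power
    w = np.exp(-2j * np.pi * np.outer(np.arange(nfft) / nfft,
                                      np.arange(1, p + 1)))
    return sigma2 / np.abs(1 + w @ a)**2

r = [1.0, 0.7, 0.3, 0.05]            # made-up autocorrelation lags
S = maxent_spectrum(r)
print("peak at normalized frequency", np.argmax(S[:256]) / 512)
```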


Book ChapterDOI
01 Jan 1979
TL;DR: The use of entropy as a basis for object/image reconstruction procedures is not new, but with the appearance of new, faster algorithms, their actual use for the reconstruction of objects from ‘real’ data is likely to increase.
Abstract: The use of entropy as a basis for object/image reconstruction procedures is not new, but with the appearance of new, faster algorithms, the actual use of these algorithms for the reconstruction of objects from ‘real’ data is likely to increase.

8 citations


Journal ArticleDOI
TL;DR: In this article, a general expression for a recursion formula which describes a random walk with coupled modes is given, where the random walker is specified by the jumping probabilities P+ and P− which depend on the modes.
Abstract: A general expression for a recursion formula which describes a random walk with coupled modes is given. In this system, the random walker is specified by the jumping probabilities P+ and P− which depend on the modes. The transition probability between the modes is expressed by a jumping probability R(ij) (or r_ij). With the aid of this recursion formula, spatial structures of the steady state of a coupled random walk are studied. By introducing a Liapunov function and entropy, it is shown that the stability condition for the present system can be expressed as the principle of extremum entropy production.
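
A hedged simulation sketch of such a walk: the walker jumps +1 or -1 with mode-dependent probabilities P+ (and P− = 1 − P+), and switches mode with transition probabilities r_ij. All numerical values are invented, not taken from the paper.

```python
import random

P_plus = {0: 0.7, 1: 0.3}            # jump-right probability per mode
r = {0: {0: 0.9, 1: 0.1},            # mode transition probabilities r_ij
     1: {0: 0.2, 1: 0.8}}

def step(x, mode):
    x += 1 if random.random() < P_plus[mode] else -1
    mode = 0 if random.random() < r[mode][0] else 1
    return x, mode

random.seed(1)
x, mode = 0, 0
for _ in range(10000):
    x, mode = step(x, mode)
print("final position:", x)          # drift reflects time spent per mode
```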

Proceedings ArticleDOI
09 Jan 1979
TL;DR: A new measure of scene content based on the concept of structural entropy is presented, which utilizes unary and binary relationships extracted from a relational representation of a scene.
Abstract: A new measure of scene content based on the concept of structural entropy is presented. The measure utilizes unary and binary relationships extracted from a relational representation of a scene. Bounds on the entropy measure are dependent only on the number of unary and binary relations. Experimental results illustrating the concepts developed in the paper are presented.
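
One plausible reading of such a measure (a sketch, not the paper's exact definition): take Shannon entropies of the distributions of unary and binary relation labels extracted from the scene, so that the bound depends only on the numbers of distinct relations. The relation lists below are invented.

```python
from collections import Counter
from math import log2

def entropy(labels):
    counts = Counter(labels)
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

unary = ["bright", "dark", "bright", "textured", "dark"]   # per region
binary = ["left-of", "above", "left-of", "adjacent"]       # per region pair
H = entropy(unary) + entropy(binary)
bound = log2(len(set(unary))) + log2(len(set(binary)))
print(f"structural entropy = {H:.2f} bits (bound = {bound:.2f} bits)")
```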

Journal ArticleDOI
TL;DR: A simple iterative dual algorithm for maximum entropy image restoration that involves fewer parameters than conventional minimization in the image space and results for Fourier synthesis with inadequate phantom data are given.
Abstract: A simple iterative dual algorithm for maximum entropy image restoration is presented. The dual algorithm involves fewer parameters than conventional minimization in the image space. Mini-computer test results for Fourier synthesis with inadequate phantom data are given.
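
The dual idea can be sketched for a generic linear (e.g., Fourier-type) constraint set: maximizing -Σ x log x subject to A x = b gives x = exp(-1 - Aᵀλ), leaving only the m dual variables λ to adjust, which is why the dual involves fewer parameters than image-space minimization. The toy matrix, data, and step size below are assumptions, not the paper's phantom experiment.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 8                           # image pixels, measurements
A = rng.standard_normal((m, n)) / n    # toy "measurement" matrix
x_true = np.abs(rng.standard_normal(n))
b = A @ x_true                         # data to be fitted exactly

lam = np.zeros(m)                      # dual variables, one per datum
for _ in range(4000):
    x = np.exp(-1.0 - A.T @ lam)       # primal image implied by duals
    lam += 20.0 * (A @ x - b)          # gradient step on the dual
print("constraint residual:", float(np.linalg.norm(A @ x - b)))
```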


Journal ArticleDOI
TL;DR: In this paper, a conditionalized version of the Friedman-Ornstein result on Markov processes is used to study the way in which a factor generated by a finite length stationary coding sits in a Markov process.
Abstract: Using Thouvenot’s relativized isomorphism theory, the author develops a conditionalized version of the Friedman-Ornstein result on Markov processes. This relativized statement is used to study the way in which a factor generated by a finite length stationary coding sits in a Markov process. All such factors split off if they are maximal in entropy. Moreover, one can show that if a finite coding factor fails to split off, it is relatively finite in a larger factor which either generates or itself splits off.

Journal ArticleDOI
Ellen Hisdal
TL;DR: This paper deals with the quantity of information acquired when the prior probabilities of a binary source are learned from a sequence of N source symbols or Bernoulli trials.
Abstract: When Shannon presents the formula for the entropy of a memoryless source, he presupposes that the prior probabilities of the different source symbols are known. This paper deals with the quantity of information acquired when the prior probabilities of a binary source are learned from a sequence of N source symbols or Bernoulli trials. Two learning methods are considered: maximum likelihood estimation of a parameter ϑ by calculation of the relative frequency, and calculation of the posterior probability density for ϑ. For both methods the acquired information behaves as (1/2) log N + const. for large N.
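
The stated growth law is easy to check numerically for the Bayesian method: with a uniform prior on the parameter, the information acquired after N trials (prior minus posterior differential entropy) behaves like (1/2) log N plus a constant. The choice of a typical outcome k = N/2 is a simplification made here for illustration.

```python
import numpy as np
from scipy.special import betaln, digamma

def beta_entropy(a, b):
    """Differential entropy of Beta(a, b) in nats (closed form)."""
    return (betaln(a, b) - (a - 1) * digamma(a) - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

for N in (10, 100, 1000, 10000):
    k = N // 2                                  # typical count, theta = 1/2
    info = 0.0 - beta_entropy(k + 1, N - k + 1) # uniform prior: entropy 0
    print(f"N={N:6d}: info = {info:6.3f} nats, 0.5*ln N = "
          f"{0.5 * np.log(N):6.3f}")
    # the gap between the two columns settles to a constant, as stated
```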

Journal ArticleDOI
TL;DR: In this article, a method for evaluating multiple choice examinations is developed by measuring the entropy state of a student's exam above the "thermal background level" (the most probable state).
Abstract: Considering a student's examination as one state of a system which consists of the set of all possible examinations, a new method for evaluating multiple choice examinations is developed by measuring the entropy state of a student's exam above the "thermal background level" (the most probable state). The entropy grade is shown to contain standard scoring as a special case when both the number of questions given and the number of possible selections become large. The entropy grade is also shown to be related to the information it takes a student to go from one state of understanding to another state of understanding. The student's state of understanding is characterized either by his personal probability that his selections will be correct or by the experimentally measured probability that the student's selections were correct.
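
In that spirit, a toy version of an entropy-style grade (an interpretation, not the paper's exact formula): score an exam by how far, in log-probability, the student's number of correct answers lies above the most probable state under random guessing.

```python
from math import comb, log2

def entropy_grade(correct, questions, choices):
    p = 1.0 / choices
    def logprob(k):   # log2 probability of k correct under guessing
        return (log2(comb(questions, k))
                + k * log2(p) + (questions - k) * log2(1 - p))
    background = max(range(questions + 1), key=logprob)  # most probable k
    return logprob(background) - logprob(correct)        # bits above it

for score in (5, 10, 15, 20):
    print(f"{score}/20 correct: {entropy_grade(score, 20, 4):6.2f} bits")
```

With 20 four-choice questions the background state is 5 correct, which scores zero; grades grow with distance above it, loosely mirroring the paper's picture of scoring against the most probable state.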


Proceedings ArticleDOI
04 Jun 1979
TL;DR: The primary emphasis is to obtain realizable decompositions using readily-implementable metrics, as well as focus upon suitable partitioning alternatives in terms of identifying mathematically consistent criteria for structural decomposition.
Abstract: The decomposition of a metric space into successive subregions exhibiting distinctive characteristics is a problem of broad application. In pattern classification, the object is to partition the space such that pattern classes are easily separable; that is, so that each subregion of the partition contains predominantly samples of only one class. In piecewise-constant approximation the decompositions produced contain samples whose values are sufficiently close to allow approximation with a specified degree of accuracy. In defining software it is quite often necessary to derive a structural model of a computer program which contains modules, i.e., partitions exhibiting the flow relations or connectivities among the elements (statements) in a program. The subsequent analysis and manipulation of the structural model produces useful design alternatives that enhance the operational qualities of the software generated in terms of program control, logic paths, data transfer and other relevant software issues. The basic feasibility of this approach has been demonstrated by numerous investigators [1-5]. However, the analytical and diagnostic tools for performing structural decompositions require further refinement and development. For example, the metrics usually used [6, 7] for defining the topology of a given software structure are primarily single-attribute measures. Although the entropy metric proposed in this paper is metrizable in terms of its hypergraph representation [8], the extension to a multi-attribute unique formulation is, as yet, elusive. This is because an all-purpose problem-independent metric space places unrealizable constraints on the structure it proposes to define. Thus, as Koontz et al. [9] point out, even when a metric is given and a structure well known, the notion of neighboring points cannot be rigorously defined for finite point sets from a computational point of view, since the simplest Euclidean distance measure must be scaled by a factor indicating its own respective distance to the nearest neighbor in order to avoid overlapping and ambiguous regions. Although, conceptually, the construction of a neighborhood and the determination of the limit point of a sequence of real numbers is a widely used idea, a more fundamental requirement for metrizable hyper-spaces is that of specifying the existence of a limit point of a set. The resultant necessary and sufficient conditions for identifying metrizable spaces are given by Hausdorff [10]. However, equivalent normalizations and the use of discrete semi-metrics over a restricted space have precluded some of these inherent problems in the quest for such a unique, multi-attribute metric. Thus, the primary emphasis is to obtain realizable decompositions using readily implementable metrics, as well as to focus upon suitable partitioning alternatives in terms of identifying mathematically consistent criteria for structural decomposition.

Journal ArticleDOI
TL;DR: In this article, the integration of public values into collective action and policy must be based on a study of the public interest as an entropic process, and willingness-tocompromise and consistency of values are characterized as analogous to the concept of entropy.
Abstract: The fundamental premise of this paper is that the integration of public values into collective action and policy must be based on a study of the public interest as an entropic process. To perform such a study, willingness-to-compromise and consistency of values are characterized as analogous to the concept of entropy. Hence high entropy is taken to imply unwillingness-to-compromise and consistency of values, while low entropy indicates that public values are in a state of flux. The assessment of the need for entropy reversal and the assessment of the entropic indeterminateness are suggested as focal points of research. A methodology is suggested and tested through a case study. Findings from the case study are also interpreted as illuminating the contributions of the approach and providing supportive arguments for a participatory, systematic, anticipatory and qualitative study of policy options and public values.

Journal ArticleDOI
TL;DR: In this paper, the concept of entropy is applied to the measurement of the extent of information in the communication theory of Wiener and Shannon, which is a convenient measure of the uncertainty or unpredictability of a system or of a process containing an element of contingency.
Abstract: This study was inspired by the concept of entropy as applied to the measurement of the extent of information in the communication theory of Wiener and Shannon. Like its precursor, entropy in statistical mechanics, it is a convenient measure of the uncertainty or unpredictability of a system or of a process containing an element of contingency. By keeping in mind the fact that in human circles unpredictability is often allowed to pass under the sacred name of liberty, it is possible to perceive an opening for mathematics into a domain which was until now inaccessible to the mathematician. It was four years ago that we made a first attempt to enter that field in our “Essay on the Mathematical Theory of Freedom”, presented to the Royal Statistical Society of London.

Book ChapterDOI
01 Jan 1979
TL;DR: The well known connection between entropy and Shannon's definition of information finds a generalization in the connection between Kullback's information gain and thermodynamic quantities which are essential for entropy production, for the stability criterion of Glansdorff and Prigogine, or for the probability and dynamics of fluctuations in a steady state.
Abstract: The well known connection between entropy and Shannon's definition of information finds a generalization in the connection between Kullback's information gain and thermodynamic quantities which are essential for entropy production, for the stability criterion of Glansdorff and Prigogine, or for the probability and dynamics of fluctuations in a steady state. Another generalization leads to a set of ordered correlation measures. In particular, the measure of second order leads to specific heat and generalizations which show a characteristic critical behaviour in nonequilibrium phase transitions.

Journal ArticleDOI
TL;DR: The “pattern” of entropy descriptions for ontogenetic and phylogenetic changes is shown to be different, and the latter is consistent with the Prigogine-Glansdorff principle for irreversible thermodynamic processes.
Abstract: Selection at constant selective pressures results in the optimization of the average productivity within the system and an increase in the information content. The entropy increase through evolutionary time is, therefore, minimized. The “pattern” of entropy descriptions for ontogenetic (developmental) and phylogenetic (evolutionary) changes is shown to be different, and the latter is consistent with the Prigogine-Glansdorff principle for irreversible thermodynamic processes.

Proceedings ArticleDOI
01 Apr 1979
TL;DR: The proposed method is based on analyzing the equally spaced quasi-phoneme string produced by the acoustic preprocessing and the phonemic labeling, which consists of the most probable phonemic candidates and their reliabilities.
Abstract: A new method to perform the phonemic segmentation of speech is introduced. The proposed method is based on analyzing the equally spaced quasi-phoneme string produced by the acoustic preprocessing and the phonemic labeling. The string consists of the most probable phonemic candidates and their reliabilities. The probabilities of different phonemes and the entropy of the string are evaluated at each analysis interval. The minima of the entropy indicate the steady-state regions of the phonemes. The performance of the entropy method is compared with that of two other segmentation methods.
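
A minimal sketch of the entropy criterion, with invented frame posteriors: compute the entropy of the phoneme-candidate distribution at each analysis interval and take local minima as steady-state regions.

```python
import numpy as np

def frame_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# rows = analysis frames, columns = phoneme candidate probabilities
frames = np.array([
    [0.90, 0.05, 0.05],   # confident candidate -> low entropy
    [0.80, 0.15, 0.05],
    [0.40, 0.40, 0.20],   # transition region   -> high entropy
    [0.10, 0.85, 0.05],   # confident candidate -> low entropy
    [0.15, 0.80, 0.05],
])
H = np.array([frame_entropy(f) for f in frames])
minima = [i for i in range(1, len(H) - 1) if H[i] < H[i-1] and H[i] < H[i+1]]
print("entropies:", np.round(H, 2), "steady-state frames:", minima)
```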

01 Jul 1979
TL;DR: In this paper, the entropy maximizing method is used to estimate interregional migration flow matrices for the whole population or subgroups of the population, when the available data are in an aggregated form.
Abstract: The collection of disaggregated data is in most economic areas an expensive as well as a time-consuming procedure. If real data could be replaced by estimations from data on a highly aggregated level, much effort could be saved. The entropy maximizing method can be used to estimate interregional migration flow matrices for the whole population or subgroups of the population, when the available data are in an aggregated form. This means estimating the elements of matrices in which individuals are classified according to two or more discrete variables. Matrices of this form are called contingency tables. In this paper we present the entropy-maximizing method and test its validity for different levels of data aggregation. The tests are carried out by means of information theory and the chi-square distribution. For the tests we have used data from two of the countries that produce disaggregated data, Sweden and Austria.


Journal ArticleDOI
TL;DR: A noiseless encoding scheme is presented for sources with memory which has as its symbol set the ordinal numbers of the symbol probabilities and has entropy smaller than that of the adjoint source.
Abstract: A noiseless encoding scheme is presented for sources with memory. The source is modeled by an mth-order finite ergodic Markov process. A secondary source is derived which has as its symbol set the ordinal numbers of the symbol probabilities. The secondary source is approximately memoryless and has entropy smaller than that of the adjoint source.
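
A hedged numerical sketch of the rank transformation (toy first-order chain, m = 1; transition probabilities invented): each symbol is replaced by the rank of its conditional probability given the previous symbol, and the empirical entropy of the rank stream comes out below that of the adjoint (first-order symbol) distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.7, 0.2, 0.1],     # toy first-order transition matrix
              [0.1, 0.8, 0.1],
              [0.3, 0.1, 0.6]])
# ranks[i, s] = ordinal position of symbol s within row i (0 = likeliest)
ranks = np.argsort(np.argsort(-P, axis=1), axis=1)

def entropy_bits(seq):
    _, c = np.unique(seq, return_counts=True)
    p = c / c.sum()
    return float(-(p * np.log2(p)).sum())

x = [0]
for _ in range(50000):
    x.append(rng.choice(3, p=P[x[-1]]))
x = np.asarray(x)
secondary = ranks[x[:-1], x[1:]]   # ordinal-number (rank) source
print(f"adjoint entropy    : {entropy_bits(x):.3f} bits/symbol")
print(f"rank-source entropy: {entropy_bits(secondary):.3f} bits/symbol")
```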

Journal ArticleDOI
TL;DR: In this article, the authors examined possible relationships between network entropy of commodity flow and variation in unit price of freight transportation service and found that there is an inverse relationship between variation in the unit price and network entropy.