
Showing papers on "Entropy (information theory)" published in 1971


Journal ArticleDOI
Suguru Arimoto
TL;DR: A new definition of generalized information measures is introduced so as to investigate the finite-parameter estimation problem and yields a class of generalized entropy functions which are useful for treating the error-probability of decision and the other equivocation measures in the same framework.
Abstract: A new definition of generalized information measures is introduced so as to investigate the finite-parameter estimation problem. This definition yields a class of generalized entropy functions which is useful for treating the error-probability of decision and other equivocation measures, such as Shannon's logarithmic measure, in the same framework and, in particular, for deriving upper bounds to the error-probability. A few inequalities between these equivocation measures are presented, including an extension of Fano's inequality.
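As an illustration of the quantities involved, the sketch below computes a Shannon entropy, a Rényi entropy (one classical member of the family of generalized entropy functions; the paper's own definition is not reproduced here), and the equivocation bound given by Fano's inequality. The distribution and error probability are made up.

```python
import math

def shannon_entropy(p):
    """Shannon entropy of a probability vector, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha, a classical generalization of
    Shannon's measure (alpha -> 1 recovers the Shannon entropy)."""
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

def fano_bound(p_error, alphabet_size):
    """Fano's inequality: H(X|Y) <= h(Pe) + Pe * log2(|X| - 1)."""
    h = 0.0
    if 0 < p_error < 1:
        h = -p_error * math.log2(p_error) - (1 - p_error) * math.log2(1 - p_error)
    return h + p_error * math.log2(alphabet_size - 1)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))    # 1.75 bits
print(renyi_entropy(p, 2))   # collision entropy, never exceeds Shannon's
print(fano_bound(0.1, 4))    # equivocation bound for error probability 0.1
```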

201 citations


Journal ArticleDOI
TL;DR: An adaptive variable-length coding system is presented; although developed primarily for the proposed Grand Tour missions, many features of this system indicate a much wider applicability.
Abstract: An adaptive variable-length coding system is presented. Although developed primarily for the proposed Grand Tour missions, many features of this system clearly indicate a much wider applicability. Using sample-to-sample prediction, the coding system produces output rates within 0.25 bit/picture element (pixel) of the one-dimensional difference entropy for entropy values ranging from 0 to 8 bit/pixel. This is accomplished without the necessity of storing any code words. Performance improvements of 0.5 bit/pixel can be simply achieved by utilizing previous line correlation. A Basic Compressor, using concatenated codes, adapts to rapid changes in source statistics by automatically selecting one of three codes to use for each block of 21 pixels. The system adapts to less frequent, but more dramatic, changes in source statistics by adjusting the mode in which the Basic Compressor operates on a line-to-line basis. Furthermore, the compression system is independent of the quantization requirements of the pulse-code modulation system.
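A minimal sketch of the block-adaptive idea, under stated assumptions: each sample is predicted from its predecessor, residuals are mapped to non-negative integers, and the cheapest of three candidate codes is chosen per block of 21 pixels. The block size comes from the abstract; the particular candidate codes and the 2-bit code identifier are illustrative, not the paper's concatenated codes.

```python
def zigzag(d):
    """Map signed prediction residuals to non-negative integers."""
    return 2 * d if d >= 0 else -2 * d - 1

def code_lengths(block):
    """Bit cost of three candidate codes for one block: unary (cheap for
    near-zero residuals), a split code, and a 9-bit raw fallback
    (9 bits suffice for any zigzagged 8-bit sample difference)."""
    unary = sum(v + 1 for v in block)
    split = sum((v >> 2) + 1 + 2 for v in block)  # unary quotient + 2-bit remainder
    raw = 9 * len(block)
    return unary, split, raw

def compress_cost(samples, block_size=21):
    """Total cost when each block uses its cheapest code (+ 2-bit code ID)."""
    residuals = [zigzag(b - a) for a, b in zip(samples, samples[1:])]
    total = 0
    for i in range(0, len(residuals), block_size):
        total += min(code_lengths(residuals[i:i + block_size])) + 2
    return total

pixels = [128, 129, 127, 130, 131, 131, 200, 60, 61, 62] * 10
print(compress_cost(pixels), "bits for", len(pixels), "8-bit samples")
```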

185 citations


Journal ArticleDOI
P. E. Hart
01 Jan 1971
TL;DR: In this paper, it was shown that when the number of firms is large enough to use statistical distribution theory, the classical statistical measures are superior to the entropy and the redundancy; when the number of firms is small, both are inferior to the traditional measures of concentration derived from the cumulative concentration curve.
Abstract: In recent years, economists have begun to use the entropy, or redundancy, of a size distribution to measure the extent to which business is concentrated in the control of giant firms. This paper compares these new measures derived from information theory with the classical statistical measures of dispersion and with traditional measures of business concentration derived from the cumulative concentration curve. It shows that when the number of firms is large enough to use statistical distribution theory, the classical statistical measures are superior to the entropy or the redundancy. When the number of firms is small, the entropy is superior to the redundancy, but both are inferior to the traditional measures of concentration derived from the cumulative concentration curve. Consequently, there is little point in using the information theory measures to measure business concentration.
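For concreteness, the sketch below computes the entropy and redundancy of a hypothetical distribution of market shares alongside two traditional concentration statistics; the Herfindahl index and the four-firm concentration ratio stand in for the paper's classical measures, and the shares are made up.

```python
import math

shares = [0.40, 0.20, 0.15, 0.10, 0.05, 0.05, 0.03, 0.02]  # firm market shares

entropy = -sum(s * math.log2(s) for s in shares if s > 0)
redundancy = math.log2(len(shares)) - entropy    # shortfall from maximum entropy
herfindahl = sum(s * s for s in shares)
cr4 = sum(sorted(shares, reverse=True)[:4])      # four-firm concentration ratio

print(f"entropy     {entropy:.3f} bits (max {math.log2(len(shares)):.3f})")
print(f"redundancy  {redundancy:.3f} bits")
print(f"Herfindahl  {herfindahl:.3f}")
print(f"CR4         {cr4:.2f}")
```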

154 citations


Journal ArticleDOI
TL;DR: In this paper, the authors defined the notion of the entropy of a "data source" and defined the absolute epsilon entropy, which is the amount of capacity needed when storage of experiments is allowed before transmission.
Abstract: This article studies efficient data transmission, or "data compression", from the standpoint of the theory of epsilon entropy. The notion of the entropy of a "data source" is defined. This quantity gives a precise measure of the amount of channel capacity necessary to describe a data source to within a given fidelity, epsilon, with probability one, when each separate "experiment" must be transmitted without storage from experiment to experiment. We also define the absolute epsilon entropy of a source, which is the amount of capacity needed when storage of experiments is allowed before transmission. The absolute epsilon entropy is shown to be equal to Shannon's rate distortion function evaluated for zero distortion, when suitable identifications are made. The main result is that the absolute epsilon entropy and the epsilon entropy have a ratio close to one if either is large. Thus, very little can be saved by storing the results of independent experiments before transmission.
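The epsilon entropy studied in the paper is a probabilistic quantity defined for data sources; the sketch below shows only the simpler deterministic covering-number version for a bounded interval, to fix the intuition of "bits needed to describe a value to within epsilon".

```python
import math

def epsilon_entropy_interval(length, eps):
    """Deterministic epsilon entropy of an interval: log2 of the number
    of balls of radius eps needed to cover it."""
    return math.log2(math.ceil(length / (2 * eps)))

for eps in (0.1, 0.01, 0.001):
    bits = epsilon_entropy_interval(1.0, eps)
    print(f"eps = {eps:g}: {bits:.2f} bits to locate a point in [0, 1] within eps")
```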

47 citations




Journal ArticleDOI
TL;DR: The family of pseudolinear tuple grammars is considered; these grammars have rational structure generating functions that are easily computed by an algorithm, so strong statements can be made about the entropy of the tuple languages they generate.
Abstract: The structure generating function of a language enumerates the number of distinct words contained in the language with respect to their length. Given any unambiguous tuple grammar, a method is described which yields a system of equations whose unique solution is the structure generating function. The entropy (channel capacity) is an important information-theoretic quantity associated with a language. Since the entropy depends directly on the number of words contained in a language, the structure generating function of a tuple language makes it possible to compute the entropy of the tuple language. The family of pseudolinear tuple grammars is considered. These grammars have rational structure generating functions that are easily computed by an algorithm, and strong statements can be made about the entropy of the tuple languages they generate.
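A small worked example of the link between word counts and entropy, using an unambiguous regular language (binary strings with no '11' substring) in place of a tuple grammar from the paper: the counts obey a linear recurrence, of the kind a rational structure generating function encodes, and the entropy is the logarithm of its dominant root.

```python
import math

def word_counts(n_max):
    """a_n = number of binary strings of length n containing no '11';
    the counts satisfy a_n = a_{n-1} + a_{n-2}, reflecting a rational
    structure generating function."""
    a = [1, 2]                         # lengths 0 and 1
    for _ in range(2, n_max + 1):
        a.append(a[-1] + a[-2])
    return a

n = 40
counts = word_counts(n)
approx = math.log2(counts[n]) / n            # (1/n) * log2(number of words)
exact = math.log2((1 + math.sqrt(5)) / 2)    # log2 of the dominant root
print(f"empirical {approx:.4f} bits/symbol, exact {exact:.4f}")
```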

15 citations


Journal ArticleDOI
01 Oct 1971
TL;DR: In this paper, a quantitative measure for the extent of separation, based on the specific entropy, is derived, and two functions are shown to be suitably related to this measure to qualify as resolution functions.
Abstract: A quantitative measure for the extent of separation, based on the specific entropy, is derived. Two functions are shown to be suitably related to this measure to qualify as resolution functions. The first is Glueckauf's impurity ratio; the second is a new function termed the purity index. The relationship between the latter and Rony's extent of separation is demonstrated.
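Neither Glueckauf's impurity ratio nor the paper's purity index is reproduced here; the sketch below computes only the specific entropy of mixing on which such measures rest, together with a simple entropy-based separation index, for a hypothetical split of a 50/50 binary feed into two product streams.

```python
import math

def mixing_entropy(x):
    """Specific (dimensionless, per-mole) entropy of mixing of a binary
    stream with mole fraction x of component A."""
    if x <= 0 or x >= 1:
        return 0.0
    return -(x * math.log(x) + (1 - x) * math.log(1 - x))

# Feed split into two streams: fractions f1, f2 of the feed with
# compositions x1, x2. All values are illustrative.
f1, x1 = 0.5, 0.9     # stream enriched in A
f2, x2 = 0.5, 0.1     # stream enriched in B

before = mixing_entropy(0.5)
after = f1 * mixing_entropy(x1) + f2 * mixing_entropy(x2)
# 0 = no separation achieved, 1 = complete separation.
print(f"separation index = {1 - after / before:.3f}")
```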

14 citations


Journal ArticleDOI
TL;DR: Based on the concept of entropy, this note justifies the use of several well-known distributions for the value of a system parameter in the absence of any knowledge except its range.
Abstract: In the simulation of discrete systems, such as those using the popular GPSS simulator, the following problem is often encountered: What probability distribution should we employ for the value of a system parameter in the absence of any knowledge except its range? Based on the concept of entropy, this note provides justification for the use of several well-known distributions in various cases.
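A quick numerical check of the note's central point, assuming the parameter's range has been discretized: among distributions supported on the same range, the uniform attains the maximum entropy, which is why it is the natural choice when only the range is known.

```python
import math

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

n = 10                                 # parameter range discretized into n values
uniform = [1.0 / n] * n
triangular = [2 * (i + 1) / (n * (n + 1)) for i in range(n)]
geometric = [0.5 ** (i + 1) for i in range(n)]
total = sum(geometric)
geometric = [x / total for x in geometric]

# With nothing known but the range, the uniform maximizes the entropy.
for name, p in (("uniform", uniform), ("triangular", triangular),
                ("geometric", geometric)):
    print(f"{name:10s} H = {entropy(p):.3f} bits (max = {math.log2(n):.3f})")
```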

9 citations


Journal ArticleDOI
TL;DR: An error estimate is derived for the measured entropy and information rate which depends only upon the number of possible events and not upon the numerical values of their probabilities, to answer the question of how often experiments should be repeated independently in order to come sufficiently close to the exact values of H and T.
Abstract: Experimental determination of the entropy H and information rate T of experiments or information sources relies on measurements of the relative frequencies of events and thus furnishes only approximations to the exact values of H and T, which are defined by probabilities rather than relative frequencies. We derive an error estimate for the measured entropy and information rate which depends only upon the number of possible events, and not upon the numerical values of their probabilities, and thereby answer the question of how often experiments should be repeated independently in order that the measured entropy and information rate come sufficiently close to the exact values of H and T with probability sufficiently close to 1.
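The paper's bound is analytic; the sketch below merely demonstrates the phenomenon by Monte Carlo, showing how the plug-in (relative-frequency) entropy estimate approaches the true entropy as the number of independent repetitions grows. The source distribution is made up.

```python
import math, random
from collections import Counter

def entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

true_p = [0.5, 0.25, 0.15, 0.10]
events = range(len(true_p))
H_true = entropy(true_p)

random.seed(0)
for n in (100, 1_000, 10_000, 100_000):
    sample = random.choices(list(events), weights=true_p, k=n)
    freqs = Counter(sample)
    H_hat = entropy([freqs[e] / n for e in events])
    print(f"n = {n:6d}   H_hat = {H_hat:.4f}   error = {abs(H_hat - H_true):.4f}")
print(f"true H = {H_true:.4f} bits")
```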

9 citations


Journal ArticleDOI
TL;DR: The purpose of this paper is to investigate how closely real-world networks tend through time toward a postulated ideal, using a technique based on information statistics.
Abstract: The purpose of this paper is to investigate how closely real-world networks tend through time toward a postulated ideal. A comparison of these networks can be made by utilizing a technique based on information statistics. These statistics enable the level of entropy in any network to be measured. By comparing the entropy of an ideal network with that of a real-world network, a statistical index of redundancy is obtained. This redundancy index indicates how closely real-world networks approach the ideal. An application of the technique is presented for a regional network in the greater Grey-Bruce area of southwestern Ontario.
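The paper's information statistics are not reproduced here; the sketch below assumes one plausible reading, comparing the entropy of the distribution of links across nodes in a real network with that of an ideal, evenly connected network of the same size, and reporting the shortfall as a redundancy index. Both networks are hypothetical.

```python
import math

def link_share_entropy(degrees):
    """Entropy (bits) of the distribution of links across nodes."""
    total = sum(degrees)
    return -sum((d / total) * math.log2(d / total) for d in degrees if d > 0)

ideal = [3, 3, 3, 3, 3, 3]    # hypothetical ideal: links spread evenly
real = [6, 4, 3, 2, 2, 1]     # hypothetical real network, same node count

h_ideal = link_share_entropy(ideal)
h_real = link_share_entropy(real)
print(f"redundancy index = {1 - h_real / h_ideal:.3f}")   # 0 = matches the ideal
```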

Journal ArticleDOI
TL;DR: The author gives an explicit proof valid for an ideal prediction and adds a negative proof for general experimental conditions, implying that the general validity of the lower bound must be denied.

Journal ArticleDOI
TL;DR: In this paper, an elementary model defining maximum entropy populations for a set of nodes is developed; the nodes are connected by a simple transportation network to a central point where all workplaces are concentrated.
Abstract: An elementary model defining maximum entropy populations for a set of nodes is developed. These nodes are connected by a simple transportation network to a central point where all workplaces are concentrated. A congestion cost function is defined for network arcs. The model then yields an equilibrium solution that identifies nodal populations as an entropic function of the total cost of the journey to work.
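In the standard entropy-maximizing formulation that such models follow, nodal populations come out proportional to exp(-beta * cost), with beta calibrated so that the allocation meets a total journey-to-work cost budget. The sketch below implements that textbook version with made-up costs; it is not the paper's exact model.

```python
import math

costs = [1.0, 2.0, 3.5, 5.0, 8.0]   # hypothetical journey-to-work cost per node
total_pop = 10_000
cost_budget = 3.0                   # target mean cost per person

def populations(beta):
    """Maximum-entropy allocation: population share of node i is
    proportional to exp(-beta * cost_i)."""
    w = [math.exp(-beta * c) for c in costs]
    z = sum(w)
    return [total_pop * x / z for x in w]

def mean_cost(beta):
    return sum(p * c for p, c in zip(populations(beta), costs)) / total_pop

# mean_cost decreases in beta, so calibrate beta by bisection.
lo, hi = 0.0, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (lo, mid) if mean_cost(mid) < cost_budget else (mid, hi)

print([round(p) for p in populations(lo)], f"beta = {lo:.3f}")
```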

Journal ArticleDOI
TL;DR: An analysis of the rank‐frequency distribution of the EURATOM‐thesaurus was carried out and an exponential function was used, which may provide a criterion to ‘revise’ some zones of thesauri.
Abstract: An analysis of the rank-frequency distribution of the EURATOM thesaurus was carried out. Zipf's law (a hyperbolic function) was not found to be suitable for this distribution, so an exponential law was used instead. The total entropy of the thesaurus calculated by means of this exponential function was found to be in good agreement with the actual entropy of the thesaurus. The exponential function may provide a criterion to ‘revise’ some zones of thesauri.
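A sketch of the fitting step under stated assumptions: hypothetical rank-frequency counts, a least-squares fit of the exponential law f(r) = A * exp(-b * r) on log frequencies, and the entropy of the fitted distribution.

```python
import math

# Hypothetical rank-frequency counts for descriptor usage in a thesaurus.
freqs = [900, 610, 420, 280, 190, 130, 90, 60, 40, 27]
ranks = range(1, len(freqs) + 1)

# Least-squares fit of log f = log A - b*r (exponential law, not Zipf's).
n = len(freqs)
sx = sum(ranks)
sy = sum(math.log(f) for f in freqs)
sxx = sum(r * r for r in ranks)
sxy = sum(r * math.log(f) for r, f in zip(ranks, freqs))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b, A = -slope, math.exp((sy - slope * sx) / n)

fitted = [A * math.exp(-b * r) for r in ranks]
total = sum(fitted)
entropy = -sum((f / total) * math.log2(f / total) for f in fitted)
print(f"fit: f(r) = {A:.0f} * exp(-{b:.3f} r); entropy = {entropy:.3f} bits")
```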


Journal ArticleDOI
R. Rink
TL;DR: A three-alternative step-function process on two dimensions, with step size constrained in both directions, is considered, and the maximum specific entropy is found to be only 1.02 bits/interval.
Abstract: A three-alternative step-function process on two dimensions, with step size constrained in both directions, is considered. The maximum specific entropy is found to be only 1.02 bits/interval, compared with the 1.59 bits/interval that would be obtained if step size were constrained in only one direction.
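The standard route to such figures is to take log2 of the spectral radius of the matrix of allowed transitions; the unconstrained three-alternative process gives log2 3 ≈ 1.585 bits/interval, matching the abstract's 1.59. The constraint matrix below is a hypothetical stand-in, not the paper's, so it does not reproduce the 1.02 figure.

```python
import math

def max_entropy_rate(adj, iters=200):
    """Maximum specific entropy of a constrained process, in bits/interval:
    log2 of the spectral radius of the allowed-transition matrix, found
    here by power iteration (adequate for small nonnegative matrices)."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return math.log2(lam)

free = [[1, 1, 1] for _ in range(3)]   # three alternatives, unconstrained
# Hypothetical constraint: no 'up' step after 'up', no 'down' after 'down'
# (NOT the constraint used in the paper).
constrained = [[0, 1, 1],
               [1, 1, 1],
               [1, 1, 0]]

print(f"unconstrained: {max_entropy_rate(free):.3f} bits/interval")
print(f"constrained:   {max_entropy_rate(constrained):.3f} bits/interval")
```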
