
Showing papers on "Entropy (information theory) published in 1969"


Journal ArticleDOI
TL;DR: It is shown that it is important to take the channel into consideration when designing the quantizer even when the system is not constrained to operate in real time.
Abstract: We consider the transmission of numerical data over a noisy channel. Two sources of error exist. The first is the quantizer where the input data is mapped into a finite set of rational numbers and the second is the channel which includes the encoder, transmitter, transmission medium, receiver, and decoder. For any given probability density on the input data and any given channel matrix, we determine the quantization values and transition levels which minimize the total mean-square error. We also determine the best quantizer structure under the constraint that quantization values and transition levels be equally spaced. For the special case of a noiseless channel both results reduce to those of Max [2]. As an example we consider the case of Gaussian input data and phase-shift keyed (PSK) transmission in additive white Gaussian noise. The transmitter is both peak and average power limited, and the system operates in real time. Both the natural and Gray codes are considered. The mean-square error, quantizer entropy, channel capacity, and information rate are computed for the system using the optimum uniform quantizer. Finally, we show that it is important to take the channel into consideration when designing the quantizer even when the system is not constrained to operate in real time.
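The joint quantizer/channel optimization described above can be sketched numerically. The following is a minimal grid-based iteration, not the authors' derivation: the density grid, helper name, iteration count, and example channel matrix are illustrative assumptions, and with an identity (noiseless) channel it reduces to the ordinary Lloyd-Max iteration.

```python
import numpy as np

def channel_aware_quantizer(grid, pdf, channel, n_iter=100):
    """Grid-based sketch: fit reconstruction levels that minimize the
    end-to-end mean-square error when the quantizer index is sent over a
    noisy channel with channel[i, j] = P(receive j | send i).
    With channel = identity this is ordinary Lloyd-Max quantization."""
    p = pdf / pdf.sum()                          # discretized input density
    n = channel.shape[0]
    y = np.linspace(grid.min(), grid.max(), n)   # initial reconstruction levels
    for _ in range(n_iter):
        # encoder: send the index i minimizing sum_j P(j|i) * (x - y_j)^2
        d = (grid[:, None] - y[None, :]) ** 2    # squared error per (x, j)
        expected = d @ channel.T                 # expected distortion per (x, i)
        idx = expected.argmin(axis=1)
        # decoder: y_j = E[x * P(j | sent index)] / E[P(j | sent index)]
        w = channel[idx, :] * p[:, None]
        y = (w * grid[:, None]).sum(axis=0) / np.maximum(w.sum(axis=0), 1e-12)
    d = (grid[:, None] - y[None, :]) ** 2
    mse = (p * (d @ channel.T).min(axis=1)).sum()
    return y, mse

# e.g. Gaussian input, 4-level quantizer, symmetric index-error channel (assumed)
grid = np.linspace(-5, 5, 2001)
pdf = np.exp(-grid ** 2 / 2)
eps = 0.05
channel = np.full((4, 4), eps / 3) + (1 - 4 * eps / 3) * np.eye(4)
print(channel_aware_quantizer(grid, pdf, channel))
```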

165 citations


Book ChapterDOI
01 Jan 1969
TL;DR: In this paper, the authors present an entropy analysis of feedback control systems and argue that the entropy function, which has proved effective for evaluating the performance of communication systems, can be extended to estimation, feedback control, and adaptive control systems.
Abstract: This chapter presents entropy analysis of feedback control systems. Effective use of the entropy (or uncertainty) function proved to be a very formidable tool for evaluating the performance of communication systems. It is perhaps unfortunate that this function offered such powerful solutions to this particular type of problem, because as a result, information theory has remained the private preserve of coding theorists and very little conclusive work has been accomplished in extending the original concepts to other applications. Estimation, feedback control, and adaptive control systems are members of such a class, and clearly many of the components of these systems can be described (at least verbally) as information transformations. More important, common to these types of systems is the use of a sensor or measuring device that is necessary if the desired system performance is to be obtained. This sensor generally monitors the behavior of some process and transmits information about that behavior to another device. The application of entropy techniques to the feedback control problem has led to two important results, the first being a separation theorem and the second being an information theoretic interpretation of the feedback process.

50 citations



Journal ArticleDOI
TL;DR: In this article, finite segments of infinite chains of classical coupled harmonic oscillators are treated as models of thermodynamic systems in contact with a heat bath, and the Liouville function p is reduced by integrating over the "outside" variables to a function pN of the variables of the N-particle segment that is the thermodynamic system.
Abstract: Finite segments of infinite chains of classical coupled harmonic oscillators are treated as models of thermodynamic systems in contact with a heat bath, i.e., canonical ensembles. The Liouville function p for the infinite chain is reduced by integrating over the "outside" variables to a function pN of the variables of the N-particle segment that is the thermodynamic system. The reduced Liouville function pN, which is calculated from the dynamics of the infinite chain and the statistical knowledge of the coordinates and momenta at t = 0, is a time-dependent probability density in the 2N-dimensional phase space of the system. A Gibbs entropy defined in terms of pN measures the evolution of knowledge of the system (more accurately, the growth of missing pertinent information) in the sense of information theory. As t → ∞, energy is equipartitioned, the entropy evolves to the value expected from equilibrium statistical mechanics, and pN evolves to an equilibrium distribution function. The simple chain exhibits diffusion in coordinate space, i.e., Brownian motion, and the diffusivity is shown to depend only on the initial distribution of momenta (not of coordinates) in the heat bath. The harmonically bound chain, in the limit of weak coupling, serves as an excellent model for the approach to equilibrium of a canonical ensemble of weakly interacting particles.
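As a small illustration of the quantity being tracked, the Gibbs (information) entropy of a probability density can be computed directly from samples of that density. The Gaussian distribution below is an assumption for illustration only, not the paper's chain model; it simply shows the entropy growing as the density spreads, in agreement with the closed form.

```python
import numpy as np

def gibbs_entropy(p, dx):
    """Differential Gibbs/Shannon entropy -integral(p ln p), with k_B = 1,
    for a density sampled on a uniform grid with spacing dx."""
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * dx

# entropy of a Gaussian grows with its spread; compare with 0.5*ln(2*pi*e*sigma^2)
x = np.linspace(-50, 50, 20001)
dx = x[1] - x[0]
for sigma in (1.0, 2.0, 4.0):
    p = np.exp(-x ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
    print(sigma, gibbs_entropy(p, dx), 0.5 * np.log(2 * np.pi * np.e * sigma ** 2))
```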

19 citations


Journal ArticleDOI
Ralph A. Evans1
TL;DR: Since entropy is the negative of information, the principle of maximum entropy for the assignment of estimated probabilities is renamed, and the formalisms for discrete and continuous random variables are described and illustrated.
Abstract: The principle of maximum entropy for assignment of estimated probabilities is intriguing. Since entropy is the negative of information, the principle was renamed. The formalisms for discrete and continuous random variables are described and illustrated. The problem of complete ignorance is discussed and the concept of quasi-ignorance is introduced as a substitute. Constraints on the probabilities, beyond the minimum, are a source of consternation and a stumbling block to the application of the principle. Many questions are raised, but no answers are given. At present, the principle cannot be invoked to solve practical problems in reliability. Theoreticians should apply themselves to find realistic tractable constraints that do not involve logical contradictions.
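A minimal sketch of the discrete formalism referred to above: assign the maximum-entropy distribution consistent with a single mean constraint by solving for the Lagrange multiplier numerically. The six-sided-die example, the value set, and the bisection range are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def maxent_die(mean, values=np.arange(1, 7)):
    """Maximum-entropy distribution on 'values' with a prescribed mean,
    found by bisecting on the Lagrange multiplier of p_i ~ exp(-lam * x_i).
    (Illustrative die example; values and search range are assumptions.)"""
    def mean_of(lam):
        w = np.exp(-lam * values)
        p = w / w.sum()
        return float((p * values).sum())
    lo, hi = -50.0, 50.0                 # mean_of is decreasing in lam
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > mean:
            lo = mid                     # mean too high -> need larger lam
        else:
            hi = mid
    w = np.exp(-0.5 * (lo + hi) * values)
    return w / w.sum()

p = maxent_die(4.5)
print(p)
print("entropy:", -(p * np.log(p)).sum())
```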

16 citations


Journal ArticleDOI
B. L. Gurevich1
01 Jan 1969
TL;DR: In this article, differentiation is measured with the Hartley measure of information, which applies when every possible outcome of an event is equally likely, and with the Shannon measure of entropy, which takes into consideration the probability that a given event will have a particular outcome.
Abstract: The preceding preliminary paper is expanded into a full-fledged theory of differentiation. Two types are considered: feature-based differentiation and areal differentiation, both in their geographical and nongeographical contexts. Each type is examined both in a broad sense, without weighting, and in a narrow sense, with the values of a selected feature or other weights attached to the parts of a whole. Measures of differentiation are then borrowed from information theory. Differentiation in the broad sense (without weights) can be measured by the simpler Hartley measure of information transmission, which applies when any one of the distinct possible outcomes of an event is equally likely to occur. Differentiation in the narrow sense (weighted) can be measured by the more complicated Shannon measure of entropy, which takes into consideration the probability that a given event will have a particular outcome.
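The two borrowed measures can be stated compactly. The short computation below is only an illustration of the distinction; the four-part region and its area shares are assumed values, not the paper's data.

```python
import numpy as np

def hartley(n):
    """Hartley measure: log2 of the number of equally likely parts."""
    return np.log2(n)

def shannon(weights):
    """Shannon entropy (bits) of the shares; weights need not be normalized."""
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# e.g. a region split into 4 parts: unweighted vs. weighted by area share
print(hartley(4))                      # 2.0 bits
print(shannon([0.4, 0.3, 0.2, 0.1]))   # < 2.0 bits, reflecting unequal shares
```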

13 citations


Journal ArticleDOI
TL;DR: In this paper, a pseudomolecular model for studying the vibrations of a crystal with a point defect is used to express the ratio of the products of the perturbed and unperturbed frequencies exactly in terms of the molecular frequencies.

9 citations


Journal ArticleDOI
01 Apr 1969
TL;DR: The variability of Renyi’s entropy in the continuous case under changes of co-ordinate system, and the invariance of transinformation of order α and type β under linear transformations, are discussed.
Abstract: In a recent paper, we defined entropy of order α and type β. For β = 1, it reduces to Renyi’s entropy of order α, while for α = 1, β = 1, and a complete probability distribution, it reduces to Shannon’s definition of entropy. In this paper we have studied some properties of this generalised entropy. We have also discussed the variability of Renyi’s entropy in the continuous case under changes of co-ordinate system and the invariance of transinformation of order α and type β under linear transformations.
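For reference, Rényi's entropy of order α, which the generalised measure reduces to when β = 1, is easy to compute directly; the sketch below (which does not reproduce the order-α-and-type-β form itself, and uses an assumed example distribution) shows the Shannon limit as α → 1.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (nats); alpha -> 1 recovers Shannon."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -(p * np.log(p)).sum()            # Shannon entropy
    return np.log((p ** alpha).sum()) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
for a in (0.5, 0.99, 1.0, 2.0):
    print(a, renyi_entropy(p, a))                # values approach Shannon as a -> 1
```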

6 citations


Journal ArticleDOI
TL;DR: In this paper, a mathematical theory of communication was applied to psychological testing and a formula for the entropy of a test was derived assuming an information transmission from the source of the postulated true score distribution to the destination, i.e. the observer.
Abstract: The mathematical theory of communication was applied to psychological testing, and a formula for the entropy of a test was derived by assuming an information transmission from the source, the postulated true score distribution, to the destination, i.e. the observer. Two test score models, a simplified bivariate normal model of true and observed scores and the binomial error model, were employed to calculate the entropy of the actual test data. The amount of information transmitted per subject was found to be surprisingly low. Under these test models, the rate of information transmission varied monotonically with test reliability. However, some of the results suggested that a more detailed analysis of the relationship between test structure and its entropy might clarify the meaning of the new measure.
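Under a bivariate normal model the information transmitted per subject has a closed form in terms of the true-observed correlation. The sketch below assumes that the squared correlation equals the test reliability, which is an illustrative reading of that model rather than the paper's exact formula; it shows why the per-subject figure comes out low for typical reliabilities.

```python
import numpy as np

def bits_per_subject(reliability):
    """Mutual information (bits) between true and observed scores for a
    bivariate normal model in which the squared true-observed correlation
    equals the reliability: I = -0.5 * log2(1 - reliability)."""
    return -0.5 * np.log2(1.0 - reliability)

for rel in (0.50, 0.80, 0.90, 0.95):
    print(rel, round(bits_per_subject(rel), 2))   # e.g. 0.90 -> about 1.66 bits
```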

3 citations



Journal ArticleDOI
J. Park1
TL;DR: The equivocation and rate are computed for single-error detection and single-error correction double-error detection codes with and without feedback.
Abstract: Comparison is made between one-way communication links using binary block codes and a similar link with decision feedback. The known reduction in word error probability is shown. Earlier papers consider only the encoding entropy rate. In this paper the equivocation and rate are computed for single-error detection and single-error correction double-error detection codes with and without feedback.
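For orientation only (the code-specific equivocation computed in the paper is not reproduced here), the per-symbol equivocation and capacity of the underlying binary symmetric channel follow directly from the binary entropy function; the crossover probability below is an assumed value for illustration.

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

q = 0.01                                   # assumed crossover probability
print("equivocation per bit:", h2(q))      # H(X|Y) for equiprobable inputs
print("capacity per bit:", 1 - h2(q))
```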

Journal ArticleDOI
TL;DR: The product epsilon entropy of mean-continuous Gaussian processes is studied; that is, a given mean-continuous Gaussian process on the unit interval is expanded into its Karhunen expansion.
Abstract: 0. Summary. This paper studies the product epsilon entropy of mean-continuous Gaussian processes. That is, a given mean-continuous Gaussian process on the unit interval is expanded into its Karhunen expansion. Along the kth eigenfunction axis, a partition by intervals of length εk is made, and the entropy of the resulting discrete distribution is noted. The infimum of the sum over k of these entropies, subject to the constraint that Σk εk² ≤ ε², is the product epsilon entropy of the process. It is shown that the best partition to take along each eigenfunction axis is the one in which 0 is the midpoint of an interval in the partition. Furthermore, the product epsilon entropy is finite if and only if Σk λk log λk⁻¹ is finite, where λk is the kth eigenvalue of the process. When the above series is finite, the values of εk which achieve the product entropy are found. Asymptotic expressions for the product epsilon entropy are derived in some special cases. The problem arises in the theory of data compression, which studies the efficient representation of random data with prescribed accuracy. 1. Introduction. This paper is motivated by the problem of data compression, the efficient representation of data for the purpose of information transmission. We shall consider the case in which the data to be represented consist of a sample function from a Gaussian process X(t) on the unit interval which is mean-continuous; i.e. E[X(s) - X(t)]² → 0 as s → t, for all t. Our basic problem is how to transmit (over a noiseless channel) information as to which sample function of X occurred. We assume that the recipient of the transmitted data has full knowledge of the statistics of the process. In particular he knows the Karhunen expansion [1] of the process; namely
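A numerical sketch of the per-axis quantity defined in the summary: the entropy contributed along one eigenfunction axis when an N(0, λk) coordinate is partitioned into length-εk intervals with 0 at the midpoint of the central interval. The eigenvalues, εk values, and truncation of the partition are illustrative assumptions, and no optimization of the εk under the budget constraint is attempted here.

```python
from math import erf, sqrt, log

def axis_entropy(lam, eps, kmax=500):
    """Entropy (nats) of an N(0, lam) coordinate partitioned into intervals
    of length eps, with 0 at the midpoint of the central interval."""
    sd = sqrt(lam)
    cdf = lambda x: 0.5 * (1.0 + erf(x / (sd * sqrt(2.0))))
    edges = [eps * (k + 0.5) for k in range(-kmax, kmax + 1)]
    cells = [cdf(b) - cdf(a) for a, b in zip(edges[:-1], edges[1:])]
    cells += [cdf(edges[0]), 1.0 - cdf(edges[-1])]   # the two tail cells
    return -sum(p * log(p) for p in cells if p > 0)

# the product epsilon entropy sums such terms over k, with sum(eps_k**2) <= eps**2;
# here the eigenvalues and eps_k are simply fixed for illustration.
lams = [1.0, 0.25, 0.0625]
epss = [0.8, 0.4, 0.2]
print(sum(axis_entropy(l, e) for l, e in zip(lams, epss)))
```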

Journal ArticleDOI
TL;DR: The rudiments of information theoretic methods are introduced and companion papers dealing with solutions of some reliability problems using information theory approaches are preluded.
Abstract: The rudiments of information theoretic methods are introduced and companion papers dealing with solutions of some reliability problems using information theoretic approaches are preluded. Major elements of the communication system are outlined from an information processing point of view. Information is quantified following the work of Shannon. The concepts of uncertainty, self- and mutual information, and entropy are developed as seen at the encoder, channel, and decoder. Channel modeling is demonstrated using a binary symmetric channel as an example. Channel capacity is derived by maximizing transinformation. Elements of coding are described and Shannon's fundamental theorem of discrete noiseless coding is stated. The fundamental relations governing unique decipherability and irreducibility are given and demonstrated by examples. Code efficiency and redundancy are quantified and discussed as system parameters subject to tradeoff. Applications of information theoretic methods in various disciplines are discussed with emphasis on reliability and maintainability. Two unresolved reliability problem areas are identified where, potentially, information theoretic approaches may present a viable solution.
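As a compact illustration of the capacity step described above (deriving capacity by maximizing transinformation over the input distribution), the sketch below scans input probabilities for a binary symmetric channel and compares the maximum with the closed form 1 - H(q); the crossover probability is an assumed value for illustration.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, with h2(0) = h2(1) = 0."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def transinformation(p0, q):
    """I(X;Y) in bits for a binary symmetric channel with crossover q
    and input distribution P(X=0) = p0: I = H(Y) - H(Y|X)."""
    py0 = p0 * (1 - q) + (1 - p0) * q
    return h2(py0) - h2(q)

q = 0.1                                    # assumed crossover probability
p_grid = np.linspace(0.0, 1.0, 1001)
rates = [transinformation(p, q) for p in p_grid]
print("capacity (scan):", max(rates))      # maximized at p0 = 0.5
print("closed form 1 - h2(q):", 1 - h2(q))
```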