Journal Article

Some inequalities satisfied by the quantities of information of Fisher and Shannon

01 Jun 1959 - Information and Control (Academic Press) - Vol. 2, Iss. 2, pp. 101-112
TL;DR: A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of Shannon's "entropy power"; an inequality connecting the two quantities constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables.
Abstract: A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the "entropy power" of Shannon (1949, p. 60). This can be inferred from two facts: (1) Both quantities satisfy inequalities that bear a certain resemblance to each other. (2) There is an inequality connecting the two quantities. This last result constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables. Two of these relations are used to give a direct proof of an inequality of Shannon (1949, p. 63, Theorem 15). Proofs are not elaborated fully. Details will be given in a doctoral thesis that is in preparation.
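In modern notation (the symbols below are today's standard choices, assumed here rather than taken from the paper), the two quantities can be written, for a random variable X with differentiable density f and differential entropy h(X), as

$$ J(X) = \int \frac{f'(x)^2}{f(x)}\,dx, \qquad N(X) = \frac{1}{2\pi e}\, e^{2h(X)}, $$

and the inequality connecting them, now usually called Stam's inequality, reads

$$ J(X)\, N(X) \ge 1, $$

with equality exactly when X is Gaussian.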
Citations
Book
02 Jan 2013
TL;DR: In this book, the author provides a detailed treatment of optimal transport, including its basic properties, cyclical monotonicity and Kantorovich duality, and three examples of coupling techniques.
Abstract: Contents: Couplings and changes of variables; Three examples of coupling techniques; The founding fathers of optimal transport; Qualitative description of optimal transport; Basic properties; Cyclical monotonicity and Kantorovich duality; The Wasserstein distances; Displacement interpolation; The Monge-Mather shortening principle; Solution of the Monge problem I: global approach; Solution of the Monge problem II: local approach; The Jacobian equation; Smoothness; Qualitative picture; Optimal transport and Riemannian geometry; Ricci curvature; Otto calculus; Displacement convexity I; Displacement convexity II; Volume control; Density control and local regularity; Infinitesimal displacement convexity; Isoperimetric-type inequalities; Concentration inequalities; Gradient flows I; Gradient flows II: qualitative properties; Gradient flows III: functional inequalities; Synthetic treatment of Ricci curvature; Analytic and synthetic points of view; Convergence of metric-measure spaces; Stability of optimal transport; Weak Ricci curvature bounds I: definition and stability; Weak Ricci curvature bounds II: geometric and analytic properties.

5,524 citations

Book
16 Jan 2012
TL;DR: This book gives a comprehensive treatment of network information theory and its applications, providing the first unified coverage of both classical and recent results, including successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying.
Abstract: This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

2,442 citations

Journal Article
TL;DR: In this article, the authors show that the derivative of the mutual information with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics.
Abstract: This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It shows a new formula that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output. That is, the derivative of the mutual information (nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: For any input signal with finite power, the causal filtering MMSE achieved at SNR is equal to the average value of the noncausal smoothing MMSE achieved with a channel whose SNR is chosen uniformly distributed between 0 and SNR.
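Written out (a standard restatement in today's notation, not a quotation from the paper): for $Y = \sqrt{\mathrm{snr}}\,X + W$ with $W \sim \mathcal{N}(0,1)$ independent of $X$,

$$ \frac{d}{d\,\mathrm{snr}}\, I\big(X;\, \sqrt{\mathrm{snr}}\, X + W\big) = \tfrac{1}{2}\, \mathrm{mmse}(\mathrm{snr}), \qquad \mathrm{mmse}(\mathrm{snr}) = \mathbb{E}\big[(X - \mathbb{E}[X \mid Y])^2\big], $$

with the mutual information measured in nats.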

1,129 citations

Posted Content
TL;DR: A new formula is shown that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output; this relationship has an unexpected consequence in continuous-time nonlinear estimation.
Abstract: This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It shows a new formula that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output. That is, the derivative of the mutual information (nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: For any input signal with finite power, the causal filtering MMSE achieved at SNR is equal to the average value of the noncausal smoothing MMSE achieved with a channel whose signal-to-noise ratio is chosen uniformly distributed between 0 and SNR.

966 citations

Journal Article
TL;DR: This survey explains the relationship between the Brunn-Minkowski inequality and other inequalities in geometry and analysis, and discusses some applications.
Abstract: In 1978, Osserman [124] wrote an extensive survey on the isoperimetric inequality. The Brunn-Minkowski inequality can be proved in a page, yet quickly yields the classical isoperimetric inequality for important classes of subsets of R^n, and deserves to be better known. This guide explains the relationship between the Brunn-Minkowski inequality and other inequalities in geometry and analysis, and some applications.
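For reference, the inequality itself (standard statement, not quoted from the survey): for nonempty compact sets $A, B \subset \mathbb{R}^n$,

$$ \mathrm{vol}(A + B)^{1/n} \ge \mathrm{vol}(A)^{1/n} + \mathrm{vol}(B)^{1/n}, $$

where $A + B = \{a + b : a \in A,\ b \in B\}$ is the Minkowski sum.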

921 citations

References
Journal Article
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.
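The limiting process can be made explicit (a modern gloss, not Shannon's own notation): for a sufficiently regular density $f$, let $X_\Delta$ denote $X$ quantized into bins of width $\Delta$; then

$$ H(X_\Delta) + \log \Delta \;\to\; h(X) = -\int f(x)\, \log f(x)\, dx \quad \text{as } \Delta \to 0, $$

so the discrete entropy itself diverges like $-\log \Delta$, while the differential entropy $h(X)$ appears as the stable limiting part.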

65,425 citations

Journal Article
TL;DR: The Mathematical Theory of Communication was originally published as a paper on communication theory more than fifty years ago and has since gone through four hardcover and sixteen paperback printings.
Abstract: Scientific knowledge grows at a phenomenal pace--but few books have had as lasting an impact or played as important a role in our modern world as The Mathematical Theory of Communication, published originally as a paper on communication theory more than fifty years ago. Republished in book form shortly thereafter, it has since gone through four hardcover and sixteen paperback printings. It is a revolutionary work, astounding in its foresight and contemporaneity. The University of Illinois Press is pleased and honored to issue this commemorative reprinting of a classic.

15,525 citations

Book
01 Jan 1937

2,577 citations

Book
01 Jan 1931

2,384 citations

Journal Article
01 Jan 1938 - Nature
TL;DR: This review of Introduction to the Theory of Fourier Integrals by Prof. E. C. Titchmarsh finds that the new book comes up to the high standard of his earlier text-book on the theory of functions.
Abstract: SINCE the publication of Prof. Zygmund's "Trigonometric Series" in 1935, there has been considerable demand for another book dealing with trigonometric integrals. Prof. Titchmarsh's book meets this demand. He is already well known to students of mathematics by his text-book on the theory of functions, and his new book comes up to the high standard of the former. Introduction to the Theory of Fourier Integrals. By Prof. E. C. Titchmarsh. Pp. x + 390. (Oxford: Clarendon Press; London: Oxford University Press, 1937.) 17s. 6d. net.

746 citations