Journal ArticleDOI

Information theoretic inequalities

TLDR
The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.
Abstract
The role of inequalities in information theory is reviewed, and the relationship of these inequalities to inequalities in other branches of mathematics is developed. The simple inequalities for differential entropy are applied to the standard multivariate normal to furnish new and simpler proofs of the major determinant inequalities in classical mathematics. The authors discuss differential entropy inequalities for random subsets of samples. These inequalities, when specialized to multivariate normal variables, provide the determinant inequalities that are presented. The authors focus on the entropy power inequality (including the related Brunn-Minkowski, Young's, and Fisher information inequalities) and address various uncertainty principles and their interrelations.
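For orientation, the entropy power inequality referred to above is usually stated as follows (a standard formulation, supplied here for reference rather than quoted from the article): for independent random vectors X and Y in R^n with differential entropies h(X) and h(Y), the entropy power N(X) = (1/(2πe)) e^{2h(X)/n} satisfies

\[
N(X+Y) \;\ge\; N(X) + N(Y),
\]

with equality when X and Y are Gaussian with proportional covariance matrices.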


Citations
Book

Optimal Transport: Old and New

TL;DR: In this book, the authors give a detailed account of the basic properties of optimal transport, including cyclical monotonicity and Kantorovich duality, together with three examples of coupling techniques.
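For reference, the Kantorovich duality mentioned in this summary is usually stated as follows (a standard formulation under suitable assumptions on the cost function c; it is not quoted from the book):

\[
\inf_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, d\pi(x,y)
\;=\;
\sup_{\varphi(x) + \psi(y) \,\le\, c(x,y)} \left( \int \varphi \, d\mu + \int \psi \, d\nu \right),
\]

where \Pi(\mu,\nu) denotes the set of couplings with marginals \mu and \nu.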
Journal ArticleDOI

The Brunn-Minkowski inequality

TL;DR: In this article, the relationship of the Brunn-Minkowski inequality to other inequalities in geometry and analysis is surveyed, along with some of its applications.
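For reference, the Brunn-Minkowski inequality in its standard form (supplied here for orientation, not quoted from the survey) states that for nonempty compact sets A, B \subset \mathbb{R}^n, with A + B their Minkowski sum,

\[
\operatorname{vol}(A+B)^{1/n} \;\ge\; \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}.
\]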
Journal ArticleDOI

A Statistical Measure of Complexity

TL;DR: In this article, a measure of complexity based on a probabilistic description of physical systems is proposed; it can be applied to many physical situations and to different descriptions of a given system.
Journal ArticleDOI

Stabilizability of Stochastic Linear Systems with Finite Feedback Data Rates

TL;DR: By inductive arguments employing the entropy power inequality of information theory, and a new quantizer error bound, an explicit expression for the infimum stabilizing data rate is derived, under very mild conditions on the initial state and noise probability distributions.
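For orientation, the data-rate condition in this line of work is usually quoted in the following form (the paper's exact statement and assumptions may differ): for a linear system with state matrix A, the infimum stabilizing data rate is

\[
R_{\min} \;=\; \sum_{|\lambda_i(A)| \ge 1} \log_2 |\lambda_i(A)|,
\]

the sum running over the unstable eigenvalues of A.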
References
Journal ArticleDOI

A mathematical theory of communication

TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Book

Inequalities: Theory of Majorization and Its Applications

TL;DR: In this book, doubly stochastic matrices and Schur-convex functions are used to study matrix functions arising in matrix factorizations, compounds, direct products, and M-matrices.
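For reference, the standard definitions behind this summary (not quoted from the book): a vector x \in \mathbb{R}^n is majorized by y, written x \prec y, when

\[
\sum_{i=1}^{k} x_{[i]} \;\le\; \sum_{i=1}^{k} y_{[i]} \quad (k = 1,\dots,n-1),
\qquad
\sum_{i=1}^{n} x_{[i]} \;=\; \sum_{i=1}^{n} y_{[i]},
\]

where x_{[1]} \ge \dots \ge x_{[n]} are the components of x in decreasing order. Equivalently, x \prec y exactly when x = Dy for some doubly stochastic matrix D, and a Schur-convex function \varphi is one satisfying \varphi(x) \le \varphi(y) whenever x \prec y.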
Journal ArticleDOI

General properties of entropy

TL;DR: This paper discusses properties of entropy, together with related concepts such as relative entropy, skew entropy, and dynamical entropy, with reference to their implications in statistical mechanics and to systems with infinitely many degrees of freedom.
Journal ArticleDOI

Inequalities in Fourier analysis