Open Access Journal Article (DOI)

Strong Data Processing Inequalities for Input Constrained Additive Noise Channels

TLDR
In this article, it is shown that for additive noise channels with an input cost constraint, saturating the channel, i.e., driving $I(W;Y)$ close to the maximum possible, forces $I(W;X)\to\infty$; explicit, order-optimal bounds are given for the additive Gaussian noise channel with quadratic cost constraint.
Abstract
This paper quantifies the intuitive observation that adding noise reduces available information by means of nonlinear strong data processing inequalities. Consider the random variables $W\to X\to Y$ forming a Markov chain, where $Y = X + Z$ with $X$ and $Z$ real valued, independent, and $X$ bounded in $L_{p}$-norm. It is shown that $I(W; Y) \le F_{I}(I(W;X))$ with $F_{I}(t) < t$ whenever $t > 0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A related question is to characterize for what couplings $(W, X)$ the mutual information $I(W; Y)$ is close to the maximum possible. To that end we show that in order to saturate the channel, i.e., for $I(W; Y)$ to approach capacity, it is mandatory that $I(W; X)\to \infty$ (under suitable conditions on the channel). A key ingredient for this result is a deconvolution lemma which shows that post-convolution total variation distance bounds the pre-convolution Kolmogorov–Smirnov distance. Explicit bounds are provided for the special case of the additive Gaussian noise channel with quadratic cost constraint. These bounds are shown to be order optimal. For this case, simplified proofs are provided leveraging Gaussian-specific tools such as the connection between information and estimation (I-MMSE) and Talagrand's information-transportation inequality.
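As a concrete numerical illustration (a sketch under assumed parameters, not taken from the paper): for the Gaussian case with quadratic cost $E[X^2] \le P$, take the extreme coupling $W = X$ uniform on $\{\pm\sqrt{P}\}$, so $I(W;X) = \log 2$ nats. The script below estimates $I(W;Y)$ by numerical integration and compares it with $I(W;X)$ and with the capacity $\tfrac{1}{2}\log(1+P)$.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's proof): AWGN channel
# Y = X + Z with Z ~ N(0, 1) and quadratic cost E[X^2] <= P.  Coupling:
# W = X uniform on {+sqrt(P), -sqrt(P)}, so I(W;X) = log 2 nats.
P = 1.0
a = np.sqrt(P)

def gauss(y, mu):
    return np.exp(-(y - mu) ** 2 / 2.0) / np.sqrt(2.0 * np.pi)

# The output density of Y is a two-component Gaussian mixture.
y = np.linspace(-12.0, 12.0, 400001)
dy = y[1] - y[0]
p_y = 0.5 * gauss(y, a) + 0.5 * gauss(y, -a)

# I(W;Y) = h(Y) - h(Y|X) = h(Y) - h(Z), all entropies in nats.
h_y = -np.sum(p_y * np.log(p_y + 1e-300)) * dy
h_z = 0.5 * np.log(2.0 * np.pi * np.e)

I_WX = np.log(2.0)
I_WY = h_y - h_z
C = 0.5 * np.log(1.0 + P)  # AWGN capacity under the cost constraint

print(f"I(W;X) = {I_WX:.4f} nats")
print(f"I(W;Y) = {I_WY:.4f} nats  (strict loss: strong data processing)")
print(f"C      = {C:.4f} nats  (capacity is not approached)")
```

With $P = 1$ the binary coupling loses roughly half of its information to the noise and still falls short of capacity, consistent with the abstract's claim that saturating the channel requires couplings with $I(W;X)\to\infty$.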



Citations
Journal Article (DOI)

On the Robustness of Information-Theoretic Privacy Measures and Mechanisms

TL;DR: In this article, the discrepancy is studied between the privacy-utility guarantees computed for the empirical distribution, which is used to design the privacy mechanism, and those for the true distribution, which the mechanism experiences in practice.
Journal Article (DOI)

Privacy With Estimation Guarantees

TL;DR: In this article, an estimation-theoretic analysis of the privacy-utility trade-off is presented: an analyst should be able to reconstruct (in a mean-squared error sense) certain functions of the data (utility), while other, private functions should not be reconstructable with distortion below a certain threshold (privacy).
Posted Content

Privacy with Estimation Guarantees

TL;DR: This work proposes a convex program to compute privacy-assuring mappings when the functions to be disclosed and hidden are known a priori and the data distribution is known, and it derives lower bounds on the minimum mean-squared error of estimating a target function from the disclosed data.
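A toy illustration of this estimation-theoretic viewpoint (a hedged sketch; the jointly Gaussian model, variable names, and parameter values below are assumptions, not the cited paper's convex program): disclose a noisy version of a useful attribute and compare the MMSE of reconstructing it against the MMSE of inferring a correlated private attribute.

```python
# Toy privacy-utility sketch with an assumed jointly Gaussian model:
# S is the useful attribute, X a correlated private attribute, and the
# released data is Y = S + N with independent Gaussian noise N.
# For jointly Gaussian (U, Y): mmse(U | Y) = Var(U) - Cov(U, Y)^2 / Var(Y).
rho = 0.6       # assumed correlation between private X and useful S
sigma2 = 0.5    # assumed variance of the disclosure noise

var_s, var_x = 1.0, 1.0
var_y = var_s + sigma2
cov_sy = var_s   # Cov(S, S + N) = Var(S)
cov_xy = rho     # Cov(X, S + N) = Cov(X, S)

mmse_s = var_s - cov_sy ** 2 / var_y   # utility: small error is good
mmse_x = var_x - cov_xy ** 2 / var_y   # privacy: large error is good

print(f"MMSE of useful S from Y : {mmse_s:.3f}")
print(f"MMSE of private X from Y: {mmse_x:.3f}")
```

Increasing sigma2 raises both errors, tracing out the trade-off; the cited work instead optimizes the disclosure mapping directly via a convex program rather than adding independent noise.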
Journal Article (DOI)

On Data-Processing and Majorization Inequalities for f-Divergences with Applications

Igal Sason · 21 Oct 2019
TL;DR: The main results are first stated without proofs and then illustrated with further related analytical results, interpretations, and information-theoretic applications.
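For context, the baseline inequality this line of work refines is the standard data-processing inequality for f-divergences: passing two distributions through a common channel cannot increase their f-divergence. A minimal statement (standard material, not a result specific to the cited paper):

```latex
% Data-processing inequality for f-divergences (standard form).
% For convex f with f(1) = 0, distributions P_X, Q_X, and a channel
% (Markov kernel) K producing P_Y = K P_X and Q_Y = K Q_X:
\[
  D_f(P_Y \,\|\, Q_Y) \;\le\; D_f(P_X \,\|\, Q_X),
  \qquad
  D_f(P \,\|\, Q) \;=\; \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}Q .
\]
```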
Journal Article (DOI)

Estimation in Poisson Noise: Properties of the Conditional Mean Estimator

TL;DR: In this article, the conditional mean estimator of a random variable observed in Poisson noise is investigated, with the signal scaling coefficient and the dark current as explicit parameters of the noise model, and several identities in terms of derivatives are established.
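A small sketch of the estimator studied there (the uniform grid prior and parameter values are illustrative assumptions, not the cited paper's setup): with observation $Y \sim \mathrm{Poisson}(aX + \lambda)$, where $a$ is the scaling coefficient and $\lambda$ the dark current, the conditional mean $E[X \mid Y = y]$ follows from Bayes' rule.

```python
import numpy as np
from scipy.stats import poisson

# Conditional mean estimator in Poisson noise: Y ~ Poisson(a*X + lam),
# with scaling coefficient a and dark current lam.  The uniform prior
# over a grid is an illustrative assumption.
a, lam = 2.0, 0.5
x = np.linspace(0.0, 5.0, 501)          # grid approximating the support of X
prior = np.full(x.size, 1.0 / x.size)   # assumed uniform prior

def conditional_mean(y):
    # Posterior over the grid: prior(x) * P(Y = y | X = x), normalized.
    likelihood = poisson.pmf(y, a * x + lam)
    posterior = prior * likelihood
    return float(np.sum(x * posterior) / np.sum(posterior))

for y in (0, 2, 5, 10):
    print(f"E[X | Y = {y}] ~= {conditional_mean(y):.3f}")
```

Derivative identities of the kind the paper establishes can then be sanity-checked numerically against finite differences of such posterior means.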
References
Book

Elements of information theory

TL;DR: The authors examine the role of entropy, information inequalities, and randomness in the design and construction of codes.
Book

Information Theory and Reliable Communication

TL;DR: This book includes chapters on Coding for Discrete Sources, Techniques for Coding and Decoding, and Source Coding with a Fidelity Criterion.
Book

Foundations of modern probability

TL;DR: The author discusses Markov processes and their ergodic properties, along with their relation to PDEs and potential theory, with a main focus on the convergence of random processes, measures, and sets.
Book

Information Theory: Coding Theorems for Discrete Memoryless Systems

TL;DR: This new edition presents unique discussions of information-theoretic secrecy and of zero-error information theory, including the deep connections of the latter with extremal combinatorics.