Journal Article

A Reformulation of Gaussian Completely Monotone Conjecture: A Hodge Structure on the Fisher Information along Heat Flow

28 Aug 2022 - arXiv.org - abs/2208.13108
TL;DR: In this article, the Gaussian completely monotone conjecture (GCMC) on the complete monotonicity of the Fisher information along the heat flow is reformulated in the form of a log-convex sequence, which can further induce a log-concave sequence.
Abstract: In the past decade, J. Huh solved several long-standing open problems on log-concave sequences in combinatorics. The ground-breaking techniques developed in those works come from algebraic geometry: “We believe that behind any log-concave sequence that appears in nature there is such a Hodge structure responsible for the log-concavity”. A function is called completely monotone if its derivatives alternate in sign; e.g., $e^{-t}$. A fundamental conjecture in mathematical physics and Shannon information theory concerns the complete monotonicity of the Gaussian distribution (GCMC), which states that $I(X + Z_t)$ is completely monotone in $t$, where $I$ is the Fisher information, the random variables $X$ and $Z_t$ are independent, and $Z_t \sim \mathcal{N}(0, t)$ is Gaussian. Inspired by the algebraic-geometry method introduced by J. Huh, GCMC is reformulated in the form of a log-convex sequence. In general, a completely monotone function can admit a log-convex sequence, and a log-convex sequence can further induce a log-concave sequence. The new formulation may guide GCMC to the marvelous temple of algebraic geometry. Moreover, to make GCMC more accessible to researchers from both information theory and mathematics, together with some new findings, a thorough summary of the origin, the implications and further study of GCMC is presented.
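To make the statement concrete, here is a minimal symbolic sketch (my own illustration, not taken from the paper) of the special case where $X$ is itself Gaussian, say $X \sim \mathcal{N}(0, \sigma^2)$ with an assumed $\sigma^2 = 1/2$: then $I(X + Z_t) = 1/(\sigma^2 + t)$, its derivatives alternate in sign, and the absolute derivatives form a log-convex sequence. It assumes sympy is available.

    # Minimal sketch of GCMC in the Gaussian special case (assumption: sigma^2 = 1/2).
    import sympy as sp

    t = sp.symbols('t', positive=True)
    sigma2 = sp.Rational(1, 2)          # assumed variance of X; any positive value works
    I = 1 / (sigma2 + t)                # Fisher information of X + Z_t when X is Gaussian

    derivs = [sp.diff(I, t, n) for n in range(6)]
    signs = [sp.sign(d.subs(t, 1)) for d in derivs]
    print("alternating signs:", signs)  # expect +1, -1, +1, -1, ...

    # log-convexity of the sequence a_n = |d^n I / dt^n| at a fixed t:
    a = [abs(float(d.subs(t, 1))) for d in derivs]
    print("log-convex:", all(a[n-1] * a[n+1] >= a[n]**2 for n in range(1, len(a) - 1)))

Both checks are immediate from the closed form $d^n I / dt^n = (-1)^n \, n! / (\sigma^2 + t)^{n+1}$; the sketch only makes the sign alternation and the induced log-convex sequence visible.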
References
Journal Article
TL;DR: In this paper, a generalized Mrs. Gerber's Lemma (MGL) is proved; the result subsumes the original MGL, simplifies its original proof, and can be applied to the binary broadcast channel to simplify some discussion.
Abstract: Mrs. Gerber's Lemma (MGL) hinges on the convexity of $H(p*H^{-1}(u))$, where $H(u)$ is the binary entropy function. In this work, we prove that $H(p*f(u))$ is convex in $u$ for every $p\in [0,1]$ provided $H(f(u))$ is convex in $u$, where $f : (a, b) \to [0, \tfrac{1}{2}]$. Moreover, our result subsumes MGL and simplifies the original proof. We show that the generalized MGL can be applied in the binary broadcast channel to simplify some discussion.
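A small numerical sketch (my own illustration, not code from the paper) of the quantity MGL hinges on: it checks the convexity of $u \mapsto H(p*H^{-1}(u))$ on a grid for one assumed crossover probability $p = 0.11$, using bisection for $H^{-1}$ restricted to $[0, 1/2]$.

    # Numerical check of the convexity underlying Mrs. Gerber's Lemma (illustrative only).
    import numpy as np

    def H(q):
        q = np.clip(q, 1e-12, 1 - 1e-12)
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def H_inv(u, iters=60):
        # bisection for the inverse of H restricted to [0, 1/2]
        lo, hi = 0.0, 0.5
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if H(mid) < u else (lo, mid)
        return 0.5 * (lo + hi)

    def conv(p, q):                        # binary convolution p * q
        return p * (1 - q) + (1 - p) * q

    p = 0.11                               # assumed crossover probability
    u = np.linspace(0.01, 0.99, 400)
    g = np.array([H(conv(p, H_inv(ui))) for ui in u])
    second_diff = g[:-2] - 2 * g[1:-1] + g[2:]
    print("convex on the grid:", np.all(second_diff >= -1e-9))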

3 citations

Peer Review
TL;DR: In this note, the authors present some known conjectures, together with a few partial results, on successive derivatives of entropy and Fisher information along the heat flow of a random vector.
Abstract: The note presents some known conjectures on successive derivatives of entropy and Fisher information along the heat flow (with a few partial results): the McKean Conjecture, the Completely Monotone Conjecture, the Log-Convexity Conjecture, the Entropy Power Conjecture, and the MMSE Conjecture. Given a random vector $X$ on some probability space $(\Omega, \mathcal{A}, \mathbb{P})$ with values in $\mathbb{R}^n$, and $G$ an independent standard normal vector, consider $X_t = X + \sqrt{t}\,G$, $t > 0$, with probability density $f_t$ with respect to the Lebesgue measure on $\mathbb{R}^n$. That is, if $\mu$ denotes the distribution of $X$ on the Borel sets of $\mathbb{R}^n$ …
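As a rough numerical illustration of the objects in these conjectures (my own construction, not from the note), the sketch below takes $X$ uniform on $\{-1, +1\}$, forms $X_t = X + \sqrt{t}\,G$ by mixing Gaussians of variance $t$, computes $I(X_t)$ by quadrature on an assumed grid, and checks the first two predicted sign alternations (that $I$ is decreasing and convex in $t$).

    # Illustrative check of the first two signs of GCMC along the heat flow.
    import numpy as np

    def fisher_info(t, grid=np.linspace(-12, 12, 20001)):
        # density of X_t: an equal mixture of N(-1, t) and N(+1, t)
        phi = lambda x, m: np.exp(-(x - m) ** 2 / (2 * t)) / np.sqrt(2 * np.pi * t)
        f = 0.5 * (phi(grid, -1.0) + phi(grid, 1.0))
        df = np.gradient(f, grid)
        return np.trapz(df ** 2 / np.maximum(f, 1e-300), grid)

    ts = np.linspace(0.2, 3.0, 30)
    I = np.array([fisher_info(t) for t in ts])
    d1 = np.diff(I)        # first differences: should be negative (I decreasing)
    d2 = np.diff(I, 2)     # second differences: should be positive (I convex)
    print("decreasing:", np.all(d1 < 0), " convex:", np.all(d2 > 0))

The sketch stops at second differences deliberately: higher-order finite differences of numerically integrated quantities are too noisy to say anything reliable about the full conjectured sign alternation.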

2 citations