Journal Article · DOI

A Reformulation of Gaussian Completely Monotone Conjecture: A Hodge Structure on the Fisher Information along Heat Flow

28 Aug 2022 · arXiv.org · abs/2208.13108
TL;DR: In this article, the Gaussian completely monotone conjecture (GCMC) is reformulated in the form of a log-convex sequence, which can in turn induce a log-concave sequence.
Abstract: In the past decade, J. Huh solved several long-standing open problems on log-concave sequences in combinatorics. The ground-breaking techniques developed in those works are from algebraic geometry: “We believe that behind any log-concave sequence that appears in nature there is such a Hodge structure responsible for the log-concavity”. A function is called completely monotone if its derivatives alternate in signs; e.g., $e^{-t}$. A fundamental conjecture in mathematical physics and Shannon information theory is on the complete monotonicity of the Gaussian distribution (GCMC), which states that $I(X+Z_t)$ is completely monotone in $t$, where $I$ is the Fisher information, the random variables $X$ and $Z_t$ are independent, and $Z_t \sim \mathcal{N}(0,t)$ is Gaussian. Inspired by the algebraic-geometry method introduced by J. Huh, GCMC is reformulated in the form of a log-convex sequence. In general, a completely monotone function can admit a log-convex sequence, and a log-convex sequence can further induce a log-concave sequence. The new formulation may guide GCMC to the marvelous temple of algebraic geometry. Moreover, to make GCMC more accessible to researchers from both information theory and mathematics, together with some new findings, a thorough summary of the origin, the implications and further study of GCMC is presented.
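Written out explicitly (a plain restatement of the abstract's notation; any normalization conventions from the paper's footnotes are omitted here), the definition and the conjecture read:

\[
(-1)^n \frac{d^n f(t)}{dt^n} \ge 0 \quad \text{for all integers } n \ge 0 \text{ and all } t > 0
\qquad \text{(complete monotonicity; e.g. } f(t) = e^{-t}\text{)},
\]
\[
\text{GCMC:} \quad t \longmapsto I(X + Z_t) \ \text{is completely monotone in } t,
\qquad Z_t \sim \mathcal{N}(0, t) \ \text{independent of } X.
\]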
References
Journal Article · DOI
TL;DR: A simple proof of the “concavity of entropy power” is given.
Abstract: We give a simple proof of the “concavity of entropy power”.
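For context, the “concavity of entropy power” is Costa's statement; with the usual definition of the entropy power in $\mathbb{R}^n$ (a standard convention, not spelled out in the snippet above), it reads

\[
N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}, \qquad
\frac{d^2}{dt^2}\, N(X + Z_t) \le 0, \quad Z_t \sim \mathcal{N}(0, t I_n) \ \text{independent of } X,
\]

where $h$ denotes the differential entropy.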

123 citations

Journal Article · DOI
A. M. Fink
TL;DR: In this paper, Schoenberg and Cavaretta proved inequalities about various classes of monotone functions which yield, as special cases, pointwise inequalities of the form (1), where the norm is as in (2).

52 citations

Dissertation
01 Jan 2014
TL;DR: A large part of this thesis is based on joint work with Eric Katz and was written under the supervision of Mircea Mustață.
Abstract: you have taught will be invaluable for the remainder of my journey. Special thanks are due to my friend and collaborator Eric Katz. A large part of this thesis is based on our joint work. Lastly, I would like to express my sincere gratitude to Mircea Mustață for being such a great advisor. Thank you.

32 citations

Journal Article · DOI
TL;DR: It is proved that the reciprocal of the Fisher information of a log-concave probability density $X$ in $\mathbb{R}^n$ is concave in $t$ with respect to the addition of a Gaussian noise $Z_t \sim \mathcal{N}(0, t I_n)$; as a byproduct of this result, the third derivative of the entropy power of $X + Z_t$ is nonnegative in $t$.
Abstract: We prove that the reciprocal of the Fisher information of a log-concave probability density $X$ in $\mathbb{R}^n$ is concave in $t$ with respect to the addition of a Gaussian noise $Z_t \sim \mathcal{N}(0, t I_n)$. As a byproduct of this result we show that the third derivative of the entropy power of a log-concave probability density $X$ in $\mathbb{R}^n$ is nonnegative in $t$ with respect to the addition of a Gaussian noise $Z_t$. For log-concave densities this improves Costa’s well-known concavity property of the entropy power (Costa, 1985).
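In symbols, and with the entropy power $N$ as above, the two statements read (for a log-concave density $X$ in $\mathbb{R}^n$ and $Z_t \sim \mathcal{N}(0, t I_n)$ independent of $X$):

\[
\frac{d^2}{dt^2}\, \frac{1}{I(X + Z_t)} \le 0
\qquad \text{and, as a byproduct,} \qquad
\frac{d^3}{dt^3}\, N(X + Z_t) \ge 0.
\]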

22 citations

Journal Article · DOI
09 Mar 2018 · Entropy
TL;DR: This work employs the technique of linear matrix inequalities to show that, when the probability density function of $X + \sqrt{t}\,Z$ is log-concave, McKean’s conjecture holds for orders up to at least five.
Abstract: Let $Z$ be a standard Gaussian random variable, $X$ be independent of $Z$, and $t$ be a strictly positive scalar. For the derivatives in $t$ of the differential entropy of $X + \sqrt{t}\,Z$, McKean noticed that Gaussian $X$ achieves the extreme for the first and second derivatives, among distributions with a fixed variance, and he conjectured that this holds for general orders of derivatives. This conjecture implies that the signs of the derivatives alternate. Recently, Cheng and Geng proved that this alternation holds for the first four orders. In this work, we employ the technique of linear matrix inequalities to show that: firstly, Cheng and Geng’s method may not generalize to higher orders; secondly, when the probability density function of $X + \sqrt{t}\,Z$ is log-concave, McKean’s conjecture holds for orders up to at least five. As a corollary, we also recover Toscani’s result on the sign of the third derivative of the entropy power of $X + \sqrt{t}\,Z$, using a much simpler argument.
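The alternation pattern referred to here, which connects to GCMC through de Bruijn's identity $\frac{\partial}{\partial t}\, h(X + \sqrt{t}\,Z) = \tfrac{1}{2}\, I(X + \sqrt{t}\,Z)$ (the sign convention below follows from the first derivative being nonnegative), can be written as

\[
(-1)^{n+1}\, \frac{d^n}{dt^n}\, h\!\left(X + \sqrt{t}\,Z\right) \ge 0,
\]

established for $n \le 4$ in general (Cheng and Geng) and, per the result above, for $n \le 5$ when the density of $X + \sqrt{t}\,Z$ is log-concave.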

17 citations