Journal ArticleDOI

A Reformulation of Gaussian Completely Monotone Conjecture: A Hodge Structure on the Fisher Information along Heat Flow

28 Aug 2022 - arXiv.org - abs/2208.13108
TL;DR: In this article, the Gaussian completely monotone conjecture (GCMC) is reformulated in the form of a log-convex sequence, which can further induce a log-concave sequence.
Abstract: In the past decade, J. Huh solved several long-standing open problems on log-concave sequences in combinatorics. The ground-breaking techniques developed in those works come from algebraic geometry: “We believe that behind any log-concave sequence that appears in nature there is such a Hodge structure responsible for the log-concavity”. A function is called completely monotone if its derivatives alternate in sign; e.g., e^{-t}. A fundamental conjecture in mathematical physics and Shannon information theory concerns the complete monotonicity of the Gaussian distribution (GCMC), which states that I(X + Z_t) is completely monotone in t, where I is the Fisher information and the random variables X and Z_t are independent, with Z_t ∼ N(0, t) Gaussian. Inspired by the algebraic-geometry method introduced by J. Huh, GCMC is reformulated in the form of a log-convex sequence. In general, a completely monotone function can admit a log-convex sequence, and a log-convex sequence can further induce a log-concave sequence. The new formulation may guide GCMC to the marvelous temple of algebraic geometry. Moreover, to make GCMC more accessible to researchers from both information theory and mathematics, a thorough summary of the origin, the implications, and further study of GCMC is presented, together with some new findings.
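The chain of notions in the abstract can be made concrete as follows; this is a sketch in standard notation, not quoted from the paper. It records the definition of complete monotonicity, the conjecture itself, and the classical route from complete monotonicity to a log-convex sequence via Bernstein's theorem and Cauchy-Schwarz:

```latex
\[
\textbf{Complete monotonicity: } (-1)^n f^{(n)}(t) \ge 0
\quad \text{for all } n \ge 0,\ t > 0; \qquad \text{e.g. } f(t) = e^{-t}.
\]
\[
\textbf{GCMC: } t \mapsto I(X + Z_t) \text{ is completely monotone,}
\qquad Z_t \sim \mathcal{N}(0, t) \text{ independent of } X.
\]
\[
\textbf{Bernstein: } f(t) = \int_0^\infty e^{-ts}\, \mathrm{d}\mu(s)
\;\Longrightarrow\;
a_n := (-1)^n f^{(n)}(t) = \int_0^\infty s^n e^{-ts}\, \mathrm{d}\mu(s),
\]
\[
\textbf{Cauchy--Schwarz: } a_n^2 \le a_{n-1}\, a_{n+1},
\quad \text{i.e. } (a_n)_{n \ge 0} \text{ is log-convex.}
\]
```

A minimal numerical sanity check of the first few predicted derivative signs is also possible. The sketch below assumes X uniform on {-1, +1} (our choice of example, not the paper's); finite differences get noisy at higher orders, so only n = 1, 2, 3 are checked:

```python
import numpy as np

def fisher_information(t, grid=np.linspace(-12.0, 12.0, 20001)):
    """I(X + Z_t) for X ~ Uniform{-1, +1}, Z_t ~ N(0, t), by quadrature."""
    def gauss(y, mu):
        return np.exp(-(y - mu) ** 2 / (2.0 * t)) / np.sqrt(2.0 * np.pi * t)
    p = 0.5 * (gauss(grid, -1.0) + gauss(grid, 1.0))      # density of X + Z_t
    dp = 0.5 * (-(grid + 1.0) / t * gauss(grid, -1.0)     # d/dy of the density
                - (grid - 1.0) / t * gauss(grid, 1.0))
    dy = grid[1] - grid[0]
    return float(np.sum(dp ** 2 / p) * dy)                # I = int (p')^2 / p

ts = np.linspace(0.5, 3.0, 400)
vals = np.array([fisher_information(t) for t in ts])

# GCMC predicts (-1)^n d^n I / dt^n >= 0; check n = 1, 2, 3 by finite differences.
d = vals.copy()
for n in range(1, 4):
    d = np.gradient(d, ts)
    ok = bool(np.all((-1) ** n * d[5:-5] >= 0))  # trim noisy endpoints
    print(f"(-1)^{n} * I^({n})(t) >= 0 on the grid: {ok}")
```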
References
More filters
Book
01 Jan 1981
TL;DR: A second course in stochastic processes covering algebraic methods in Markov chains, ratio theorems of transition probabilities and applications, sums of independent random variables as a Markov chain, order statistics and Poisson processes, continuous-time Markov chains, diffusion processes, compounding stochastic processes, fluctuation theory of partial sums of independent identically distributed random variables, and queueing processes, as discussed by the authors.
Abstract: Preface; Preface to A First Course; Preface to First Edition; Contents of A First Course; Algebraic Methods in Markov Chains; Ratio Theorems of Transition Probabilities and Applications; Sums of Independent Random Variables as a Markov Chain; Order Statistics, Poisson Processes, and Applications; Continuous Time Markov Chains; Diffusion Processes; Compounding Stochastic Processes; Fluctuation Theory of Partial Sums of Independent Identically Distributed Random Variables; Queueing Processes; Miscellaneous Problems; Index

2,987 citations

Book
28 Aug 2008
TL;DR: This book contains a thorough discussion of the classical topics in information theory together with the first comprehensive treatment of network coding, a subject that first emerged within information theory in the mid-1990s and has since diffused into coding theory, computer networks, wireless communications, complexity theory, cryptography, graph theory, etc.
Abstract: This book contains a thorough discussion of the classical topics in information theory together with the first comprehensive treatment of network coding, a subject that first emerged within information theory in the mid-1990s and has since diffused into coding theory, computer networks, wireless communications, complexity theory, cryptography, graph theory, etc. With a large number of examples, illustrations, and original problems, this book is excellent as a textbook or reference book for a senior or graduate-level course on the subject, as well as a reference for researchers in related fields.

932 citations

Journal ArticleDOI
TL;DR: General bounds on the capacity region are obtained for discrete memoryless interference channels and for linear-superposition interference channels with additive white Gaussian noise.
Abstract: An interference channel is a communication medium shared by M sender-receiver pairs. Transmission of information from each sender to its corresponding receiver interferes with the communications between the other senders and their receivers. This corresponds to a frequent situation in communications and defines an M-dimensional capacity region. In this paper, we obtain general bounds on the capacity region for discrete memoryless interference channels and for linear-superposition interference channels with additive white Gaussian noise. The capacity region is determined in special cases.
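For concreteness, a standard two-user instance of the linear-superposition Gaussian model can be written as follows (textbook notation; the paper treats M pairs, and the gains a_{12}, a_{21} and noise levels N_i here are illustrative symbols, not quoted from the paper):

```latex
\[
Y_1 = X_1 + a_{12} X_2 + Z_1, \qquad
Y_2 = a_{21} X_1 + X_2 + Z_2, \qquad
Z_i \sim \mathcal{N}(0, N_i) \text{ independent},
\]
\[
\mathbb{E}[X_i^2] \le P_i, \qquad
\text{capacity region} = \text{closure of the set of achievable rate pairs } (R_1, R_2).
\]
```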

804 citations

Journal ArticleDOI
TL;DR: A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the “entropy power” of Shannon; this constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables.
Abstract: A certain analogy is found to exist between a special case of Fisher's quantity of information I and the inverse of the “entropy power” of Shannon (1949, p. 60). This can be inferred from two facts: (1) both quantities satisfy inequalities that bear a certain resemblance to each other; (2) there is an inequality connecting the two quantities. This last result constitutes a sharpening of the uncertainty relation of quantum mechanics for canonically conjugate variables. Two of these relations are used to give a direct proof of an inequality of Shannon (1949, p. 63, Theorem 15). Proofs are not elaborated fully; details will be given in a doctoral thesis that is in preparation.
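The two facts in the abstract correspond to the following standard inequalities in modern notation (our summary, not quoted from the paper): with h the differential entropy and X, Y independent, the entropy power inequality and the Fisher information inequality resemble each other, and Stam's inequality connects the two quantities, with equality throughout iff the variables are Gaussian:

```latex
\[
N(X) := \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
\underbrace{N(X+Y) \ge N(X) + N(Y)}_{\text{entropy power inequality}}, \qquad
\underbrace{\frac{1}{I(X+Y)} \ge \frac{1}{I(X)} + \frac{1}{I(Y)}}_{\text{Fisher information inequality}},
\]
\[
\text{and the connecting inequality (Stam): } N(X)\, I(X) \ge 1 .
\]
```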

792 citations