Proceedings ArticleDOI

Infinitely Many Information Inequalities

TLDR
No finite set of linear combinations generates all such inequalities; this is shown by explicitly constructing an infinite sequence of new linear information inequalities together with a curve in a special geometric position relative to the halfspaces defined by the inequalities.
Abstract
When finite, the Shannon entropies of all subvectors of a random vector are taken as the coordinates of an entropic point in Euclidean space. A linear combination of the coordinates gives rise to an unconstrained information inequality if it is nonnegative for all entropic points. With at least four variables, no finite set of linear combinations generates all such inequalities. This is proved by constructing explicitly an infinite sequence of new linear information inequalities and a curve in a special geometric position relative to the halfspaces defined by the inequalities. The inequalities are constructed recurrently by adhesive pasting of restrictions of polymatroids, and the curve ranges in the closure of the set of entropic points.
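The entropic-point construction described in the abstract can be sketched numerically. The hypothetical helper `entropic_point` below (an illustration, not code from the paper) maps a finite random vector, given by its equally likely joint outcomes, to the joint entropies of all its nonempty subvectors, and then checks one Shannon-type (submodular) inequality at that point:

```python
from collections import Counter
from itertools import combinations
import math

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of `samples`."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def entropic_point(columns):
    """Joint entropy H(X_A) for every nonempty subset A of variable indices;
    each column lists one variable's value across equally likely outcomes."""
    point = {}
    for size in range(1, len(columns) + 1):
        for A in combinations(range(len(columns)), size):
            point[A] = entropy(list(zip(*(columns[i] for i in A))))
    return point

# Example: X, Y independent fair bits, Z = X XOR Y (four equally likely outcomes).
x = (0, 0, 1, 1)
y = (0, 1, 0, 1)
z = tuple(a ^ b for a, b in zip(x, y))
h = entropic_point([x, y, z])

# One Shannon-type (submodular) inequality: H(X,Y) + H(X,Z) >= H(X,Y,Z) + H(X).
assert h[(0, 1)] + h[(0, 2)] >= h[(0, 1, 2)] + h[(0,)] - 1e-9
```

Every Shannon-type inequality is nonnegative at every such point; the paper's result is that, for four or more variables, infinitely many further linear inequalities of this kind hold as well.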


Citations
Book ChapterDOI

Secret-sharing schemes: a survey

TL;DR: This survey describes the most important constructions of secret-sharing schemes, explains the connections between secret-sharing schemes and monotone formulae and monotone span programs, and presents the known lower bounds on the share size.
Journal ArticleDOI

Axiomatic Characterizations of Information Measures

TL;DR: Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed and the relevance of the axiomatic approach for information theory is discussed.
BookDOI

Holographic Entanglement Entropy

TL;DR: In this paper, the authors review the developments in the past decade on holographic entanglement entropy, a subject that has garnered much attention owing to its potential to teach us about the emergence of spacetime in holography.
Journal ArticleDOI

The Holographic Entropy Cone

TL;DR: This paper initiates a systematic enumeration and classification of the entropy inequalities satisfied by the Ryu-Takayanagi formula for conformal field theory states with smooth holographic dual geometries.
Journal ArticleDOI

Network Coding Theory: A Survey

TL;DR: This article surveys all known fields of network coding theory and leads the reader from the antecedents of network coding theory through to the most recent results, considering also information theory and matroid theory.
References
Book ChapterDOI

Submodular functions and convexity

TL;DR: In continuous optimization, convex functions play a central role: various methods for finding the minimum of a convex function constitute the main body of nonlinear optimization, and even linear programming can be viewed as the optimization of very special (linear) objective functions over very special convex domains (polyhedra).
Book

A First Course in Information Theory

TL;DR: This book provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory.
Journal ArticleDOI

On characterization of entropy function via information inequalities

TL;DR: The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables, which gives a negative answer to a fundamental problem in information theory: the closure Γ̄*_n is strictly smaller than Γ_n whenever n > 3.
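For context, the four-variable inequality referred to here is commonly known as the Zhang-Yeung inequality, the first non-Shannon-type information inequality. One standard equivalent statement (quoted from general knowledge of the literature, not from this page) is:

```latex
2\,I(C;D) \;\le\; I(A;B) + I(A;C,D) + 3\,I(C;D \mid A) + I(C;D \mid B)
```

This inequality holds for all quadruples of discrete random variables yet is not implied by the Shannon-type inequalities, which is why Γ̄*_4 is strictly smaller than Γ_4.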
Journal ArticleDOI

Polymatroidal dependence structure of a set of random variables

TL;DR: The present paper points out that the entropy function h is a β-function, i.e., a monotone non-decreasing and submodular function with h(∅) = 0, and that the pair (E, h) is a polymatroid.
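This observation can be verified directly on small examples. The sketch below (helper names are assumptions for illustration, not from the paper) checks that the empirical entropy function of a joint distribution satisfies the polymatroid axioms: h(∅) = 0, monotonicity, and submodularity, over all pairs of subsets of the ground set:

```python
from collections import Counter
from itertools import chain, combinations
import math

def H(cols, A):
    """Empirical joint Shannon entropy (bits) of the variables indexed by A."""
    if not A:
        return 0.0
    joint = list(zip(*(cols[i] for i in sorted(A))))
    n = len(joint)
    return -sum((c / n) * math.log2(c / n) for c in Counter(joint).values())

def is_polymatroid(cols):
    """Check h(emptyset) = 0, monotonicity, and submodularity of the
    empirical entropy function over all pairs of subsets."""
    ground = frozenset(range(len(cols)))
    subsets = [frozenset(s) for s in chain.from_iterable(
        combinations(ground, r) for r in range(len(ground) + 1))]
    eps = 1e-9
    if H(cols, frozenset()) != 0.0:
        return False
    for A in subsets:
        for B in subsets:
            if A <= B and H(cols, A) > H(cols, B) + eps:    # monotonicity
                return False
            if H(cols, A) + H(cols, B) + eps < H(cols, A | B) + H(cols, A & B):
                return False                                # submodularity
    return True

# Three binary variables with Z = X AND Y, uniform over the four (X, Y) pairs.
cols = [(0, 0, 1, 1), (0, 1, 0, 1), (0, 0, 0, 1)]
assert is_polymatroid(cols)
```

Since the Shannon inequalities guarantee these axioms for every entropy function, the check succeeds for any choice of `cols`; the converse fails, which is what makes the entropy region strictly smaller than the polymatroid cone for four or more variables.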