Posted Content

Conditional Information Inequalities and Combinatorial Applications.

TL;DR: The main result generalizes a version of the conditional Ingleton inequality and yields a new method for proving lower bounds for biclique coverings of bipartite graphs.
Abstract: We show that the inequality $H(A \mid B,X) + H(A \mid B,Y) \le H(A\mid B)$ for jointly distributed random variables $A,B,X,Y$, which does not hold in the general case, holds under some natural condition on the support of the probability distribution of $A,B,X,Y$. This result generalizes a version of the conditional Ingleton inequality: if for some distribution $I(X: Y \mid A) = H(A\mid X,Y)=0$, then $I(A : B) \le I(A : B \mid X) + I(A: B \mid Y) + I(X : Y)$. We present two applications of our result. The first one is the following easy-to-formulate combinatorial theorem: assume that the edges of a bipartite graph are partitioned into $K$ matchings such that for each pair (left vertex $x$, right vertex $y$) there is at most one matching in the partition involving both $x$ and $y$; assume further that the degree of each left vertex is at least $L$ and the degree of each right vertex is at least $R$. Then $K\ge LR$. The second application is a new method to prove lower bounds for biclique coverings of bipartite graphs.
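The combinatorial theorem stated above can be checked by brute force on small instances. The following Python sketch is illustrative only (the graph, the partition into single-edge matchings, and the function names are my assumptions, not taken from the paper); it verifies the hypothesis and the bound $K \ge LR$ on a tight example, the complete bipartite graph $K_{2,3}$.

```python
# Illustrative brute-force check of the bound K >= L*R on a small example.
# The graph and the matching partition below are assumptions chosen for illustration.

def check_partition(left, right, matchings):
    """matchings: list of sets of edges (x, y) that together partition the edge set."""
    edges = [e for m in matchings for e in m]
    assert len(edges) == len(set(edges)), "blocks of the partition must be disjoint"

    # Each block must be a matching: no vertex appears twice inside a block.
    for m in matchings:
        assert len({x for x, _ in m}) == len(m) and len({y for _, y in m}) == len(m)

    # Hypothesis: every pair (left x, right y) is involved in at most one matching.
    for x in left:
        for y in right:
            involving = sum(1 for m in matchings
                            if any(u == x for u, _ in m) and any(v == y for _, v in m))
            assert involving <= 1, f"pair ({x}, {y}) involved in {involving} matchings"

    K = len(matchings)
    L = min(sum(1 for u, _ in edges if u == x) for x in left)   # min left degree
    R = min(sum(1 for _, v in edges if v == y) for y in right)  # min right degree
    print(f"K = {K}, L = {L}, R = {R}, K >= L*R holds: {K >= L * R}")

# Tight example: K_{2,3} with every edge as its own (single-edge) matching,
# so K = 6, each left degree is 3, each right degree is 2, and K = L*R.
left, right = ["a", "b"], [1, 2, 3]
matchings = [{(x, y)} for x in left for y in right]
check_partition(left, right, matchings)
```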
Citations
Journal ArticleDOI
TL;DR: The communication complexity of secret key agreement protocols with public randomness that produce a secret key of maximal length is established, and it is shown that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret that can be established from a tuple of strings (x1, …, xℓ) by ℓ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.

9 citations


Cites methods from "Conditional Information Inequalitie..."

  • ...This technique (Lemmas 4.6 and 5.10) is based on ideas similar to the conditional information inequalities in Kaced and Romashchenko (2013) and Kaced et al. (2015)....


Proceedings ArticleDOI
10 Jul 2018
TL;DR: In this paper, it was shown that the mutual information of any pair of strings $x$ and $y$ is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having $x$ and the complexity profile of the pair and the other one having $y$ and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel, and that if the communication complexity drops below the established threshold then only very short secret keys can be obtained.
Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings $x$ and $y$ is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having $x$ and the complexity profile of the pair and the other one having $y$ and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For $\ell > 2$, the longest shared secret that can be established from a tuple of strings $(x_1, . . . , x_\ell)$ by $\ell$ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold then only very short secret keys can be obtained.

3 citations

01 Jan 2018
TL;DR: In this paper, it was shown that the mutual information of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel.
Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret that can be established from a tuple of strings (x1, …, xℓ) by ℓ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.

2 citations

Posted Content
TL;DR: The communication complexity of secret key agreement protocols with public randomness that produce a secret key of maximal length is established, and it is shown that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings $x$ and $y$ is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having $x$ and the complexity profile of the pair and the other one having $y$ and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For $\ell > 2$, the longest shared secret that can be established from a tuple of strings $(x_1, \ldots , x_\ell)$ by $\ell$ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
References
Book
01 Jan 2002
TL;DR: This book provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory.
Abstract: This book provides an up-to-date introduction to information theory. In addition to the classical topics discussed, it provides the first comprehensive treatment of the theory of I-Measure, network coding theory, Shannon and non-Shannon type information inequalities, and a relation between entropy and group theory. ITIP, a software package for proving information inequalities, is also included. With a large number of examples, illustrations, and original problems, this book is excellent as a textbook or reference book for a senior or graduate level course on the subject, as well as a reference for researchers in related fields.

543 citations

Journal ArticleDOI
01 Jul 1998
TL;DR: The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem in information theory: $\bar{\Gamma}_n^*$ is strictly smaller than $\Gamma_n$ whenever $n > 3$.
Abstract: Given $n$ discrete random variables $\Omega = \{X_1, \ldots, X_n\}$, associated with any subset $\alpha$ of $\{1, 2, \ldots, n\}$ there is a joint entropy $H(X_\alpha)$ where $X_\alpha = \{X_i : i \in \alpha\}$. This can be viewed as a function defined on $2^{\{1, 2, \ldots, n\}}$ taking values in $[0, +\infty)$. We call this function the entropy function of $\Omega$. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that this function is nondecreasing; and the nonnegativity of the conditional mutual information implies that this function has the following property: for any two subsets $\alpha$ and $\beta$ of $\{1, 2, \ldots, n\}$, $H_\Omega(\alpha) + H_\Omega(\beta) \ge H_\Omega(\alpha \cup \beta) + H_\Omega(\alpha \cap \beta)$. These properties are the so-called basic information inequalities of Shannon's information measures. Do these properties fully characterize the entropy function? To make this question more precise, we view an entropy function as a $(2^n - 1)$-dimensional vector where the coordinates are indexed by the nonempty subsets of the ground set $\{1, 2, \ldots, n\}$. Let $\Gamma_n$ be the cone in $\mathbb{R}^{2^n - 1}$ consisting of all vectors which have these three properties when they are viewed as functions defined on $2^{\{1, 2, \ldots, n\}}$. Let $\Gamma_n^*$ be the set of all $(2^n - 1)$-dimensional vectors which correspond to the entropy functions of some sets of $n$ discrete random variables. The question can be restated as: is it true that for any $n$, $\bar{\Gamma}_n^* = \Gamma_n$? Here $\bar{\Gamma}_n^*$ stands for the closure of the set $\Gamma_n^*$. The answer is "yes" when $n = 2$ and $3$, as proved in our previous work. Based on intuition, one may tend to believe that the answer should be "yes" for any $n$. The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem in information theory: $\bar{\Gamma}_n^*$ is strictly smaller than $\Gamma_n$ whenever $n > 3$. While this new inequality gives a nontrivial outer bound to the cone $\bar{\Gamma}_4^*$, an inner bound for $\bar{\Gamma}_4^*$ is also given. The inequality is also extended to any number of random variables.
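As a quick illustration of the basic inequalities described in this abstract, here is a small Python sketch (my own example, not part of the article; the XOR distribution is an arbitrary choice) that computes the entropy function of a three-variable distribution and checks nonnegativity, monotonicity, and submodularity.

```python
# Illustrative check of the basic Shannon-type properties of an entropy function.
# The joint distribution (two fair independent bits and their XOR) is an assumption.
from itertools import product, chain, combinations
from math import log2

n = 3
dist = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}

def subsets(ground):
    return list(chain.from_iterable(combinations(ground, r) for r in range(len(ground) + 1)))

def H(alpha):
    """Joint entropy H(X_alpha) of the coordinates indexed by the tuple alpha."""
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in alpha)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)

ground = tuple(range(n))
h = {a: H(a) for a in subsets(ground)}
eps = 1e-12

# Nonnegativity and monotonicity (nondecreasing in the subset order).
assert all(v >= -eps for v in h.values())
assert all(h[a] <= h[b] + eps for a in h for b in h if set(a) <= set(b))

# Submodularity: H(alpha) + H(beta) >= H(alpha union beta) + H(alpha intersect beta).
for a in h:
    for b in h:
        union = tuple(sorted(set(a) | set(b)))
        inter = tuple(sorted(set(a) & set(b)))
        assert h[a] + h[b] >= h[union] + h[inter] - eps

print("the XOR triple's entropy function satisfies all basic Shannon inequalities")
```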

387 citations

Journal ArticleDOI
01 Jan 1977
TL;DR: The paper studies the problems of determining the minimum number of complete subgraphs of a graph G which include all of the edges of G and the minimum number of complete bipartite subgraphs which cover G; two related problems are shown to be NP-complete.
Abstract: Fundamental questions posed by Boole in 1868 on the theory of sets have in recent years been translated to problems in graph theory. The major problems that this paper deals with are determining the minimum number of complete subgraphs of graph G which include all of the edges of G, and determining the minimum number of complete bipartite subgraphs which cover G. The two problems are of a very similar nature. Determining whether there is a projective plane of order p is a special case of the former problem. The latter problem has a natural translation into matrix theory which yields tight upper and lower bounds. An elementary proof is given for Graham's theorem. Two non-obvious classes are given for which the above problems are easily handled; however, this author doubts that these classes can be extended significantly. Two new problems are shown in this paper to be NP-complete. Finally, several conjectures and unsolved problems are posed within the body of the paper.
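As a concrete toy instance of covering a graph by complete bipartite subgraphs, here is a brute-force Python sketch (illustrative only; the example graph is an assumption, and exhaustive search is viable only for very small graphs, consistent with the NP-completeness mentioned above).

```python
# Illustrative exhaustive search for the minimum biclique cover of a tiny bipartite graph.
# The example graph (K_{3,3} minus a perfect matching) is an assumption for illustration.
from itertools import combinations, product

left, right = ["u1", "u2", "u3"], ["v1", "v2", "v3"]
edges = {(u, v) for u, v in product(left, right) if left.index(u) != right.index(v)}

def nonempty_subsets(vertices):
    return [set(c) for r in range(1, len(vertices) + 1) for c in combinations(vertices, r)]

# Candidate bicliques: pairs (A, B) with every edge of A x B present in the graph.
bicliques = [(A, B)
             for A in nonempty_subsets(left)
             for B in nonempty_subsets(right)
             if all((u, v) in edges for u, v in product(A, B))]

def covers(selection):
    covered = set()
    for A, B in selection:
        covered |= set(product(A, B))
    return covered == edges

# Smallest k for which some k bicliques cover all edges (the bipartite dimension).
for k in range(1, len(edges) + 1):
    if any(covers(sel) for sel in combinations(bicliques, k)):
        print(f"minimum biclique cover of this example uses {k} bicliques")
        break
```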

301 citations


"Conditional Information Inequalitie..." refers background in this paper

  • ...he notions of bipartite partition and bipartite cover play the central role in communication complexity, [14]. The problem of determining the bipartite dimension is NP-hard even for bipartite graphs, [4]. A good approximation or a nontrivial lower bound for the bipartite dimension of some particular classes of graphs may imply substantial progress in various problems of computational complexity, see ...


Journal ArticleDOI
TL;DR: The Vamos network is constructed, and it is proved that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.
Abstract: We define a class of networks, called matroidal networks, which includes as special cases all scalar-linearly solvable networks, and in particular solvable multicast networks. We then present a method for constructing matroidal networks from known matroids. We specifically construct networks that play an important role in proving results in the literature, such as the insufficiency of linear network coding and the unachievability of network coding capacity. We also construct a new network, from the Vamos matroid, which we call the Vamos network, and use it to prove that Shannon-type information inequalities are in general not sufficient for computing network coding capacities. To accomplish this, we obtain a capacity upper bound for the Vamos network using a non-Shannon-type information inequality discovered in 1998 by Zhang and Yeung, and then show that it is smaller than any such bound derived from Shannon-type information inequalities. This is the first application of a non-Shannon-type inequality to network coding. We also compute the exact routing capacity and linear coding capacity of the Vamos network. Finally, using a variation of the Vamos network, we prove that Shannon-type information inequalities are insufficient even for computing network coding capacities of multiple-unicast networks.
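For reference, the non-Shannon-type information inequality of Zhang and Yeung (1998) mentioned here is commonly quoted in the literature in the following form, for any four jointly distributed random variables $A, B, C, D$ (this is the usual textbook form, not text taken from the abstract above):

$$2\,I(C : D) \;\le\; I(A : B) + I(A : C,D) + 3\,I(C : D \mid A) + I(C : D \mid B).$$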

279 citations


Additional excerpts

  • ...Hence $\Delta \le \Delta'$....


Proceedings ArticleDOI
24 Jun 2007
TL;DR: With at least four random variables, no finite set of linear combinations generates all unconstrained information inequalities; this is proved by explicitly constructing an infinite sequence of new linear information inequalities and a curve in a special geometric position relative to the halfspaces defined by the inequalities.
Abstract: When finite, Shannon entropies of all subvectors of a random vector are considered for the coordinates of an entropic point in Euclidean space. A linear combination of the coordinates gives rise to an unconstrained information inequality if it is nonnegative for all entropic points. With at least four variables no finite set of linear combinations generates all such inequalities. This is proved by constructing explicitly an infinite sequence of new linear information inequalities and a curve in a special geometric position to the halfspaces defined by the inequalities. The inequalities are constructed recurrently by adhesive pasting of restrictions of polymatroids and the curve ranges in the closure of a set of the entropic points.

227 citations


Additional excerpts

  • ...Hence $\Delta \le \Delta'$....
