Open Access · Posted Content

Conditional Information Inequalities and Combinatorial Applications.

TLDR
This result generalizes a version of the conditional Ingleton inequality, and the paper presents a new method for proving lower bounds on biclique coverings of bipartite graphs.
Abstract
We show that the inequality $H(A \mid B,X) + H(A \mid B,Y) \le H(A\mid B)$ for jointly distributed random variables $A,B,X,Y$, which does not hold in the general case, holds under a natural condition on the support of the probability distribution of $A,B,X,Y$. This result generalizes a version of the conditional Ingleton inequality: if for some distribution $I(X: Y \mid A) = H(A\mid X,Y)=0$, then $I(A : B) \le I(A : B \mid X) + I(A: B \mid Y) + I(X : Y)$. We present two applications of our result. The first is the following easy-to-formulate combinatorial theorem: assume that the edges of a bipartite graph are partitioned into $K$ matchings such that for each pair (left vertex $x$, right vertex $y$) there is at most one matching in the partition involving both $x$ and $y$; assume further that the degree of each left vertex is at least $L$ and the degree of each right vertex is at least $R$. Then $K\ge LR$. The second application is a new method for proving lower bounds on biclique coverings of bipartite graphs.
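The combinatorial theorem can be checked concretely on a small instance. The sketch below (our own illustration, not from the paper) takes the complete bipartite graph $K_{3,2}$ and partitions its edges into singleton matchings, one edge per matching. Each pair $(x,y)$ is then involved in exactly one matching, the hypotheses hold with $L = 2$ and $R = 3$, and the bound $K \ge LR$ is attained with equality.

```python
from itertools import product

# Illustrative example: K_{n,m} with each edge as its own singleton matching.
# Left vertices have degree m, right vertices have degree n.
n, m = 3, 2
left = [f"x{i}" for i in range(n)]
right = [f"y{j}" for j in range(m)]

# Partition of the edge set into matchings: each matching is a single edge.
matchings = [[(x, y)] for x, y in product(left, right)]
K = len(matchings)

# Hypothesis: for each pair (x, y), at most one matching involves both x and y.
for x, y in product(left, right):
    touching = sum(
        1 for M in matchings
        if any(e[0] == x for e in M) and any(e[1] == y for e in M)
    )
    assert touching <= 1

# Minimum left and right degrees over the whole graph.
L = min(sum(1 for M in matchings for e in M if e[0] == x) for x in left)
R = min(sum(1 for M in matchings for e in M if e[1] == y) for y in right)

# Conclusion of the theorem; here K = L * R = 6, so the bound is tight.
assert K >= L * R
print(K, L, R)  # 6 2 3
```

This example also shows the bound cannot be improved in general: singleton matchings on $K_{n,m}$ always give $K = nm = LR$.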


Citations
Journal Article (DOI)

An Operational Characterization of Mutual Information in Algorithmic Information Theory

TL;DR: The communication complexity of secret key agreement protocols that produce a secret key of maximal length is established for protocols with public randomness, and it is shown that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
Proceedings Article (DOI)

An operational characterization of mutual information in algorithmic information theory

TL;DR: In this paper, it was shown that the mutual information of any pair of strings $x$ and $y$ is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having $x$ and the complexity profile of the pair and the other having $y$ and the same complexity profile, can establish via a probabilistic protocol with interaction on a public channel, and that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.

An operational characterization of mutual information in algorithmic information theory.

TL;DR: In this paper, it was shown that the mutual information of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel.
Posted Content

An operational characterization of mutual information in algorithmic information theory.

TL;DR: The communication complexity of secret key agreement protocols that produce a secret key of maximal length is established for protocols with public randomness, and it is shown that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
References
Posted Content

On essentially conditional information inequalities

TL;DR: In this article, it was shown that some non-Shannon-type conditional information inequalities (including one due to Zhang and Yeung) are essentially conditional, i.e., they cannot be obtained as a direct consequence of any unconditional inequality.