Proceedings ArticleDOI

Multi-Party Protocols, Information Complexity and Privacy.

TL;DR: In this paper, the authors introduce the notion of public information complexity (PIC), a measure that lower-bounds the communication complexity of a protocol and upper-bounds its information complexity; the gap between PIC and information complexity lower-bounds the amount of randomness the protocol uses.
Abstract: We introduce the new measure of Public Information Complexity (PIC) as a tool for the study of multi-party computation protocols, and of quantities such as their communication complexity, or the amount of randomness they require in the context of information-theoretic private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model and show a number of interesting properties and applications of our new notion: the Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and we give an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bit-wise parity of the inputs. The latter result establishes that the amount of randomness needed for a private protocol that computes this function is Omega(n).
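For concreteness, the k-party, n-bit Parity function analyzed in the abstract can be sketched as follows. This is a minimal illustration of the function itself (not of any protocol from the paper), assuming the standard reading that each of k players holds an n-bit string and one player must output the bitwise XOR of all inputs; the function name and input encoding are my own.

```python
# Sketch of the k-party, n-bit bitwise Parity function: each of k
# players holds an n-bit string; the output is the bitwise XOR
# (parity) of all k inputs.
def bitwise_parity(inputs):
    """inputs: list of k equal-length bit strings; returns their bitwise XOR."""
    n = len(inputs[0])
    assert all(len(x) == n for x in inputs)
    result = [0] * n
    for x in inputs:
        for i, bit in enumerate(x):
            result[i] ^= int(bit)
    return "".join(map(str, result))

print(bitwise_parity(["1010", "0110", "1100"]))  # -> "0000"
```

The paper's Omega(n) randomness bound concerns private protocols computing this function, i.e. protocols in which no player learns more about the others' inputs than the output reveals; the sketch above only fixes the input/output behavior being computed.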
Citations
Journal ArticleDOI
TL;DR: Progress to date on manipulating correlated random variables in a distributed setting is surveyed, laying out pertinent measures, achievability results, and limits of performance, and pointing to new directions.
Abstract: The task of manipulating correlated random variables in a distributed setting has received attention in the fields of both Information Theory and Computer Science. Often shared correlations can be converted, using a small amount of communication, into perfectly shared uniform random variables. Such perfect shared randomness, in turn, enables the solution of many tasks. Even the reverse conversion of perfectly shared uniform randomness into variables with a desired form of correlation turns out to be insightful and technically useful. In this article, we describe progress to date on such problems, lay out pertinent measures, achievability results, and limits of performance, and point to new directions.

21 citations

Posted Content
TL;DR: In this article, the authors introduced new models and new information theoretic measures for the study of communication complexity in the natural peer-to-peer, multi-party, number-in-hand setting.
Abstract: We introduce new models and new information theoretic measures for the study of communication complexity in the natural peer-to-peer, multi-party, number-in-hand setting. We prove a number of properties of our new models and measures, and then, in order to exemplify their effectiveness, we use them to prove two lower bounds. The more elaborate one is a tight lower bound of $\Omega(kn)$ on the multi-party peer-to-peer randomized communication complexity of the $k$-player, $n$-bit Disjointness function. The other one is a tight lower bound of $\Omega(kn)$ on the multi-party peer-to-peer randomized communication complexity of the $k$-player, $n$-bit bitwise parity function. Both lower bounds hold when ${n=\Omega(k)}$. The lower bound for Disjointness improves over the lower bound that can be inferred from the result of Braverman et al. (FOCS 2013), which was proved in the coordinator model and can yield a lower bound of $\Omega(kn/\log k)$ in the peer-to-peer model. To the best of our knowledge, our lower bounds are the first tight (non-trivial) lower bounds on communication complexity in the natural {\em peer-to-peer} multi-party setting. In addition to the above results for communication complexity, we also prove, using the same tools, an $\Omega(n)$ lower bound on the number of random bits necessary for the (information theoretic) private computation of the $k$-player, $n$-bit Disjointness function.
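The Disjointness function in the abstract can be made concrete with a short sketch. This assumes the standard definition (not spelled out in the abstract): each of the k players holds an n-bit string, interpreted as a subset of n coordinates, and the output is 1 iff no coordinate is held by all players; the function name and encoding here are illustrative.

```python
# Sketch of the standard k-player, n-bit Disjointness function:
# output 1 iff there is no coordinate where every player's bit is 1
# (i.e. the k sets have empty intersection).
def disjointness(inputs):
    n = len(inputs[0])
    if any(all(x[i] == "1" for x in inputs) for i in range(n)):
        return 0  # some coordinate lies in every player's set
    return 1      # the sets are disjoint

print(disjointness(["1010", "0110"]))  # -> 0 (coordinate 2 is shared)
print(disjointness(["1000", "0110"]))  # -> 1 (no shared coordinate)
```

The $\Omega(kn)$ bound says that, in the peer-to-peer model, any randomized protocol computing this function must exchange a number of bits proportional to the total input size $kn$ (for $n = \Omega(k)$).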

4 citations

Proceedings ArticleDOI
07 Oct 2020
TL;DR: Theoretical research is conducted, from the perspective of information theory, on practical problems of an intelligent ideological online education platform; the traditional model is enhanced to construct an efficient method for segmenting sensitive network data.
Abstract: This paper conducts theoretical research, from the perspective of information theory, on practical problems of an intelligent ideological online education platform. Its highlights are: (1) a tamper-proof data-segmentation method splits sensitive network data into sub-data blocks and obtains keys from hash functions, enhancing the traditional model into an efficient method; (2) a switching control module is added to the SDNC at the control layer to manage node switching, so that information security is well guaranteed; (3) an online platform is constructed to improve efficiency.
References
Proceedings Article
01 Jan 1988
TL;DR: The bounds on t, the number of faulty players that can be tolerated, are tight!
Abstract: Every function of n inputs can be efficiently computed by a complete network of n processors in such a way that: if no faults occur, no set of t < n/2 players gets any additional information (other than the function value); even if Byzantine faults are allowed, no set of t < n/3 players can either disrupt the computation or get additional information. Furthermore, the above bounds on t are tight!

2,298 citations

Proceedings ArticleDOI
01 Jan 1988
TL;DR: It is shown that any reasonable multiparty protocol can be achieved if at least 2n/3 of the participants are honest, and that the secrecy achieved is unconditional.
Abstract: Under the assumption that each pair of participants can communicate secretly, we show that any reasonable multiparty protocol can be achieved if at least 2n/3 of the participants are honest. The secrecy achieved is unconditional. It does not rely on any assumption about computational intractability.

1,663 citations

Proceedings ArticleDOI
30 Apr 1979
TL;DR: The quantity of interest, which measures the information exchange necessary for computing f, is the minimum number of bits exchanged in any algorithm.
Abstract: Let M = {0, 1, 2, ..., m-1}, N = {0, 1, 2, ..., n-1}, and f: M × N → {0, 1} a Boolean-valued function. We will be interested in the following problem and its related questions. Let i ∈ M, j ∈ N be integers known only to two persons P1 and P2, respectively. For P1 and P2 to determine cooperatively the value f(i, j), they send information to each other alternately, one bit at a time, according to some algorithm. The quantity of interest, which measures the information exchange necessary for computing f, is the minimum number of bits exchanged in any algorithm. For example, if f(i, j) = (i + j) mod 2, then 1 bit of information (conveying whether i is odd) sent from P1 to P2 will enable P2 to determine f(i, j), and this is clearly the best possible. The above problem is a variation of a model of Abelson [1] concerning information transfer in distributive computations.
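The one-bit protocol in the abstract's example can be sketched directly. This is a minimal rendering of the protocol the abstract describes for f(i, j) = (i + j) mod 2; the function names are my own.

```python
# One-bit protocol for f(i, j) = (i + j) mod 2:
# P1 sends the parity of i; P2 combines it with the parity of j.
def p1_message(i):
    return i % 2              # the single bit P1 sends to P2

def p2_output(msg, j):
    return (msg + j) % 2      # equals (i + j) mod 2

i, j = 5, 8
print(p2_output(p1_message(i), j))  # -> 1, since (5 + 8) mod 2 = 1
```

Since the output genuinely depends on i, at least one bit must be sent, so this protocol is optimal, matching the abstract's claim that one bit "is clearly the best possible".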

1,349 citations

Journal ArticleDOI
16 Nov 2002
TL;DR: This work presents a new method for proving strong lower bounds in communication complexity based on the notion of the conditional information complexity of a function, and shows that it also admits a direct sum theorem.
Abstract: We present a new method for proving strong lower bounds in communication complexity. This method is based on the notion of the conditional information complexity of a function which is the minimum amount of information about the inputs that has to be revealed by a communication protocol for the function. While conditional information complexity is a lower bound on the communication complexity, we show that it also admits a direct sum theorem. Direct sum decomposition reduces our task to that of proving (conditional) information complexity lower bounds for simple problems (such as the AND of two bits). For the latter, we develop novel techniques based on Hellinger distance and its generalizations.

724 citations

Journal ArticleDOI
TL;DR: This paper studies the depth of noisy decision trees in which each node gives the wrong answer with some constant probability, giving tight bounds for several problems.
Abstract: This paper studies the depth of noisy decision trees in which each node gives the wrong answer with some constant probability. In the noisy Boolean decision tree model, tight bounds are given on the number of queries to input variables required to compute threshold functions, the parity function and symmetric functions. In the noisy comparison tree model, tight bounds are given on the number of noisy comparisons for searching, sorting, selection and merging. The paper also studies parallel selection and sorting with noisy comparisons, giving tight bounds for several problems.

338 citations