Topic

Communication complexity

About: Communication complexity studies the amount of communication required between two or more parties to compute a function whose input is split among them. Over the lifetime, 3870 publications have been published within this topic, receiving 105832 citations.


Papers
Proceedings ArticleDOI
01 Oct 2016
TL;DR: The first true size-space trade-offs for the cutting planes proof system are obtained, where the upper bounds hold for size and total space for derivations with constant-size coefficients, and the lower bounds apply to length and formula space even for derivations with exponentially large coefficients.
Abstract: We obtain the first true size-space trade-offs for the cutting planes proof system, where the upper bounds hold for size and total space for derivations with constant-size coefficients, and the lower bounds apply to length and formula space (i.e., number of inequalities in memory) even for derivations with exponentially large coefficients. These are also the first trade-offs to hold uniformly for resolution, polynomial calculus and cutting planes, thus capturing the main methods of reasoning used in current state-of-the-art SAT solvers. We prove our results by a reduction to communication lower bounds in a round-efficient version of the real communication model of [Krajíček ’98], drawing on and extending techniques in [Raz and McKenzie ’99] and [Göös et al. ’15]. The communication lower bounds are in turn established by a reduction to trade-offs between cost and number of rounds in the game of [Dymond and Tompa ’85] played on directed acyclic graphs. As a by-product of the techniques developed to show these proof complexity trade-off results, we also obtain an exponential separation between monotone-AC^(i-1) and monotone-AC^i, improving exponentially over the superpolynomial separation in [Raz and McKenzie ’99]. That is, we give an explicit Boolean function that can be computed by monotone Boolean circuits of depth log^i n and polynomial size, but for which circuits of depth O(log^(i-1) n) require exponential size.

59 citations
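The two measures traded off above, length (number of derivation steps) and formula space (number of clauses held in memory at once), can be made concrete on a toy resolution refutation. The CNF and derivation below are purely illustrative, not taken from the paper:

```python
# Resolution refutation of the unsatisfiable CNF over variables 1, 2
# containing all four sign patterns. Literals are signed ints.
def resolve(c1, c2, var):
    # Resolve clause c1 (containing var) with c2 (containing -var).
    assert var in c1 and -var in c2
    return frozenset((c1 - {var}) | (c2 - {-var}))

memory = {frozenset(c) for c in [{1, 2}, {1, -2}, {-1, 2}, {-1, -2}]}
steps = [({1, 2}, {1, -2}, 2),    # derive {1}
         ({-1, 2}, {-1, -2}, 2),  # derive {-1}
         ({1}, {-1}, 1)]          # derive the empty clause (contradiction)

length, space = 0, len(memory)
for a, b, v in steps:
    memory.add(resolve(frozenset(a), frozenset(b), v))
    length += 1                     # length = number of resolution steps
    space = max(space, len(memory))  # formula space = peak clauses in memory

print(length, space, frozenset() in memory)  # 3 7 True
```

This naive derivation never erases clauses, so space grows with length; the paper's trade-offs say that for some formulas, keeping space small provably forces the derivation to be much longer.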

Journal ArticleDOI
TL;DR: It is shown that allowing randomization in the protocol can be crucial for obtaining small extended formulations, and it is proved that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.
Abstract: An extended formulation of a polyhedron P is a linear description of a polyhedron Q together with a linear map π such that π(Q) = P. These objects are of fundamental importance in polyhedral combinatorics and optimization theory, and the subject of a number of studies. Yannakakis' factorization theorem (Yannakakis in J Comput Syst Sci 43(3):441–466, 1991) provides a surprising connection between extended formulations and communication complexity, showing that the smallest size of an extended formulation of P equals the nonnegative rank of its slack matrix S. Moreover, Yannakakis also shows that the nonnegative rank of S is at most 2^c, where c is the complexity of any deterministic protocol computing S. In this paper, we show that the latter result can be strengthened when we allow protocols to be randomized. In particular, we prove that the base-2 logarithm of the nonnegative rank of any nonnegative matrix equals the minimum complexity of a randomized communication protocol computing the matrix in expectation. Using Yannakakis' factorization theorem, this implies that the base-2 logarithm of the smallest size of an extended formulation of a polytope P equals the minimum complexity of a randomized communication protocol computing the slack matrix of P in expectation. We show that allowing randomization in the protocol can be crucial for obtaining small extended formulations. Specifically, we prove that for the spanning tree and perfect matching polytopes, small variance in the protocol forces large size in the extended formulation.

59 citations
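The slack matrix at the heart of Yannakakis' theorem is concrete enough to compute directly. A minimal sketch for the unit square [0,1]^2, with facets as rows and vertices as columns (the polytope and names are illustrative):

```python
# Slack matrix of the unit square [0,1]^2 written as A x <= b.
# Entry S[i][j] = b[i] - A[i] . v_j measures how slack facet i is at vertex j.
A = [(-1, 0), (1, 0), (0, -1), (0, 1)]   # -x <= 0, x <= 1, -y <= 0, y <= 1
b = [0, 1, 0, 1]
vertices = [(0, 0), (1, 0), (0, 1), (1, 1)]

S = [[bi - (ax * vx + ay * vy) for (vx, vy) in vertices]
     for (ax, ay), bi in zip(A, b)]
for row in S:
    print(row)
# [0, 1, 0, 1]
# [1, 0, 1, 0]
# [0, 0, 1, 1]
# [1, 1, 0, 0]
```

Every entry is nonnegative, as it must be. Writing S as a sum of its four rows (each a nonnegative rank-1 matrix) gives a trivial nonnegative factorization of size 4, consistent with the square's own 4-inequality description; Yannakakis' theorem says the smallest extended formulation of any polytope is governed by exactly this nonnegative rank.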

DOI
09 Jul 2017
TL;DR: It is proved that the sample complexity is essentially determined by a fundamental operator in the theory of interpolation of Banach spaces, known as Peetre's K-functional; this result stems from an unexpected connection to functional analysis and refined concentration-of-measure inequalities, which arise naturally in the reduction.
Abstract: We present a new methodology for proving distribution testing lower bounds, establishing a connection between distribution testing and the simultaneous message passing (SMP) communication model. Extending the framework of Blais, Brody, and Matulef [15], we show a simple way to reduce (private-coin) SMP problems to distribution testing problems. This method allows us to prove new distribution testing lower bounds, as well as to provide simple proofs of known lower bounds. Our main result is concerned with testing identity to a specific distribution p, given as a parameter. In a recent and influential work, Valiant and Valiant [53] showed that the sample complexity of the aforementioned problem is closely related to the ℓ_{2/3}-quasinorm of p. We obtain alternative bounds on the complexity of this problem in terms of an arguably more intuitive measure and using simpler proofs. More specifically, we prove that the sample complexity is essentially determined by a fundamental operator in the theory of interpolation of Banach spaces, known as Peetre's K-functional. We show that this quantity is closely related to the size of the effective support of p (loosely speaking, the number of supported elements that constitute the vast majority of the mass of p). This result, in turn, stems from an unexpected connection to functional analysis and refined concentration of measure inequalities, which arise naturally in our reduction.

59 citations
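The "effective support" described loosely above can be sketched as a greedy computation: take the heaviest elements of p until they carry a 1 − eps fraction of the mass. The function and toy distribution below are illustrative, not the paper's formal definition; exact Fraction arithmetic avoids floating-point boundary issues:

```python
from fractions import Fraction as F

def effective_support_size(p, eps):
    # Smallest number of elements of p whose total mass is >= 1 - eps,
    # obtained by greedily taking the heaviest elements first.
    total, count = F(0), 0
    for mass in sorted(p, reverse=True):
        if total >= 1 - eps:
            break
        total += mass
        count += 1
    return count

p = [F(1, 2), F(1, 5), F(1, 10), F(1, 10), F(1, 20), F(1, 20)]
print(effective_support_size(p, F(1, 10)))  # 4: the heaviest four reach 9/10
```

Intuitively, a distribution with small effective support should be easier to test identity against, and the paper makes this intuition quantitative via the K-functional.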

Journal ArticleDOI
TL;DR: The main goal of this paper is the comparison of the power of Las Vegas computation with that of deterministic and nondeterministic computation for the complexity measures of one-way communication, ordered binary decision diagrams, and finite automata.
Abstract: The study of the computational power of randomized computations is one of the central tasks of complexity theory. The main goal of this paper is the comparison of the power of Las Vegas computation with that of deterministic and nondeterministic computation. We investigate the power of Las Vegas computation for the complexity measures of one-way communication, ordered binary decision diagrams, and finite automata. (i) For the one-way communication complexity of two-party protocols we show that Las Vegas communication can save at most one half of the deterministic one-way communication complexity. We also present a language for which this gap is tight. (ii) The result (i) is applied to show an at most polynomial gap between determinism and Las Vegas for ordered binary decision diagrams. (iii) For the size (i.e., the number of states) of finite automata we show that the size of Las Vegas finite automata recognizing a language L is at least the square root of the size of the minimal deterministic finite automaton recognizing L. Using a specific language we verify the optimality of this bound. Copyright 2001 Academic Press.

58 citations
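The deterministic one-way communication complexity that result (i) says Las Vegas can at most halve has a simple combinatorial form: it is the ceiling of the log of the number of distinct rows of the communication matrix, since Alice only needs to tell Bob which row her input induces. A small illustrative sketch on the EQUALITY function:

```python
from math import ceil, log2

def one_way_cc(matrix):
    # Deterministic one-way communication complexity of the function whose
    # communication matrix is given: Alice sends the index of her row class.
    distinct_rows = {tuple(row) for row in matrix}
    return ceil(log2(len(distinct_rows)))

n = 3
inputs = range(2 ** n)
eq = [[int(x == y) for y in inputs] for x in inputs]  # EQUALITY on n bits
print(one_way_cc(eq))  # all 2^n rows differ, so n = 3 bits are needed
```

In this language, the paper's bound (i) says a Las Vegas one-way protocol for the same function needs at least half as many bits, and that this factor of two is tight for some language.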

Book ChapterDOI
16 Aug 2015
TL;DR: A general upper bound and the first non-trivial lower bounds for conditional disclosure of secrets are presented, which explain the trade-off between ciphertext and secret key sizes of several existing attribute-based encryption schemes based on the dual system methodology.
Abstract: We initiate a systematic treatment of the communication complexity of conditional disclosure of secrets (CDS), where two parties want to disclose a secret to a third party if and only if their respective inputs satisfy some predicate. We present a general upper bound and the first non-trivial lower bounds for conditional disclosure of secrets. Moreover, we achieve tight lower bounds for many interesting settings of parameters for CDS with linear reconstruction, the latter being a requirement in the application to attribute-based encryption. In particular, our lower bounds explain the trade-off between ciphertext and secret key sizes of several existing attribute-based encryption schemes based on the dual system methodology.

58 citations
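The CDS model can be illustrated with the folklore one-message protocol for the EQUALITY predicate over a prime field; this classic construction is not the paper's general scheme, and the modulus and names are illustrative. Alice holds x, Bob holds y and the secret z, they share randomness (r, w), and the referee sees only the two messages:

```python
import random

P = 101  # small prime modulus, for illustration only

def cds_equality(x, y, z, rng):
    # Shared randomness: r is drawn nonzero here so the demo masks the
    # secret deterministically; the textbook construction draws r from the
    # whole field, which gives perfectly uniform hiding when x != y.
    r = rng.randrange(1, P)
    w = rng.randrange(P)
    msg_a = (x * r + w) % P        # Alice's single message
    msg_b = (y * r + w + z) % P    # Bob's single message
    # Referee's linear reconstruction: msg_b - msg_a = (y - x) * r + z.
    return (msg_b - msg_a) % P

rng = random.Random(0)
print(cds_equality(7, 7, 42, rng))        # equal inputs: recovers 42
print(cds_equality(7, 8, 42, rng) == 42)  # unequal: z masked by (y - x) * r
```

Note the reconstruction is a linear function of the messages, matching the "linear reconstruction" regime for which the paper proves its tight lower bounds.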


Network Information
Related Topics (5)
- Upper and lower bounds: 56.9K papers, 1.1M citations (84% related)
- Encryption: 98.3K papers, 1.4M citations (82% related)
- Network packet: 159.7K papers, 2.2M citations (81% related)
- Server: 79.5K papers, 1.4M citations (81% related)
- Wireless network: 122.5K papers, 2.1M citations (80% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023        19
2022        56
2021       161
2020       165
2019       149
2018       141