
Communication complexity

About: Communication complexity studies the minimum amount of communication needed to compute a function whose input is split between two or more parties. Over the lifetime, 3870 publications have been published within this topic, receiving 105832 citations.
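To make the topic concrete: the textbook example is the EQUALITY problem, in which Alice and Bob each hold an n-bit string and must decide whether the strings are equal. Any deterministic protocol needs n bits of communication, while the classic randomized fingerprinting protocol gets by with O(log n) bits. The sketch below simulates that protocol; all function names are illustrative, and the trial-division primality test is only adequate for the small moduli used here.

```python
import random

def is_prime(m: int) -> bool:
    """Trial division; adequate for the small moduli used below."""
    if m < 2:
        return False
    i = 2
    while i * i <= m:
        if m % i == 0:
            return False
        i += 1
    return True

def random_prime(upper: int) -> int:
    """Rejection-sample a prime in [2, upper]."""
    while True:
        p = random.randint(2, upper)
        if is_prime(p):
            return p

def equality_protocol(x: int, y: int, n: int) -> bool:
    """One-round randomized protocol for EQUALITY on n-bit inputs.

    Alice sends (p, x mod p) for a random prime p <= n^2: O(log n) bits,
    versus the n bits any deterministic protocol must send. If x != y,
    then p divides |x - y| for at most n of the roughly n^2 / ln(n^2)
    primes below n^2, so the error probability is O(log n / n).
    """
    p = random_prime(max(n * n, 4))       # Alice's random prime
    fingerprint = x % p                   # Alice's O(log n)-bit message
    return y % p == fingerprint           # Bob's local test

# Two 64-bit inputs that differ in a single position.
n = 64
x = random.getrandbits(n)
y = x ^ (1 << 5)
print(equality_protocol(x, x, n))  # True: never errs on equal inputs
print(equality_protocol(x, y, n))  # False with high probability
```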


Papers
Posted Content
TL;DR: The first large lower bounds on the bounded-error quantum communication complexity of functions for which a polynomial quantum speedup is possible are proved.
Abstract: We prove new lower bounds for bounded-error quantum communication complexity. Our methods are based on the Fourier transform of the considered functions. First we generalize a method for proving classical communication complexity lower bounds developed by Raz to the quantum case. Applying this method we give an exponential separation between bounded-error quantum communication complexity and nondeterministic quantum communication complexity. We develop several other lower bound methods based on the Fourier transform, notably showing that √(s̄(f)/log n), for the average sensitivity s̄(f) of a function f, yields a lower bound on the bounded-error quantum communication complexity of f((x AND y) XOR z), where x is a Boolean word held by Alice and y, z are Boolean words held by Bob. We then prove the first large lower bounds on the bounded-error quantum communication complexity of functions for which a polynomial quantum speedup is possible. For all the functions we investigate, the only previously applied general lower bound method, based on discrepancy, yields bounds that are O(log n).

65 citations
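For readers outside the area, the sensitivity quantity in the abstract is standard; restated under the usual definitions (a sketch in the abstract's notation, with Q_ε denoting bounded-error quantum communication complexity):

```latex
% Average sensitivity of f : {0,1}^n -> {0,1}: the expected number of
% coordinates whose flip changes the value of f.
\[
  \bar{s}(f) \;=\; \mathbb{E}_{x \in \{0,1\}^n}
  \bigl|\{\, i \in [n] : f(x) \neq f(x \oplus e_i) \,\}\bigr|
\]
% The abstract's lower bound in this notation:
\[
  Q_{\varepsilon}\bigl(f((x \wedge y) \oplus z)\bigr)
  \;=\; \Omega\!\left(\sqrt{\bar{s}(f)/\log n}\right)
\]
```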

Journal ArticleDOI
TL;DR: The main innovative aspect of the proof is a simple combinatorial argument showing that the rectangle covering number of the unique-disjointness matrix is at least 1.5^n, and hence that the nondeterministic communication complexity of unique disjointness is at least 0.58n, which slightly improves on the previously best known lower bounds of 1.24^n and 0.31n, respectively.
Abstract: We establish that the extension complexity of the n×n correlation polytope is at least 1.5^n by a short proof that is self-contained except for using the fact that every face of a polyhedron is the intersection of all facets it is contained in. The main innovative aspect of the proof is a simple combinatorial argument showing that the rectangle covering number of the unique-disjointness matrix is at least 1.5^n, and thus the nondeterministic communication complexity of the unique-disjointness predicate is at least 0.58n. We thereby slightly improve on the previously best known lower bounds 1.24^n and 0.31n, respectively.

65 citations
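The step from the covering bound to the 0.58n figure is a one-line calculation, using the standard fact that nondeterministic communication complexity equals the logarithm of the rectangle covering number up to rounding:

```latex
% cov(M) = minimum number of 1-monochromatic rectangles covering the
% 1-entries of M; N(M) = nondeterministic communication complexity.
\[
  N(\mathrm{UDISJ}_n) \;\ge\; \log_2 \mathrm{cov}(\mathrm{UDISJ}_n)
  \;\ge\; \log_2 \bigl(1.5^{\,n}\bigr)
  \;=\; n \log_2 1.5 \;\approx\; 0.585\, n
\]
```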

Posted Content
TL;DR: In this paper, the authors prove that disallowing after-the-fact removal is necessary for subquadratic-communication Byzantine agreement (BA), and give subquadratic BA protocols with near-optimal resilience and expected constant rounds under standard cryptographic assumptions and a public-key infrastructure.
Abstract: As Byzantine Agreement (BA) protocols find application in large-scale decentralized cryptocurrencies, an increasingly important problem is to design BA protocols with improved communication complexity. A few existing works have shown how to achieve subquadratic BA under an adaptive adversary. Intriguingly, they all make a common relaxation about the adaptivity of the attacker: if an honest node sends a message and then gets corrupted in some round, the adversary cannot erase the message that was already sent; henceforth we say that such an adversary cannot perform "after-the-fact removal". By contrast, many (super-)quadratic BA protocols in the literature can tolerate after-the-fact removal. In this paper, we first prove that disallowing after-the-fact removal is necessary for achieving subquadratic-communication BA. Next, we show new subquadratic binary BA constructions (of course, assuming no after-the-fact removal) that achieve near-optimal resilience and expected constant rounds under standard cryptographic assumptions and a public-key infrastructure (PKI) in both synchronous and partially synchronous settings. In comparison, all known subquadratic protocols make additional strong assumptions such as random oracles or the ability of honest nodes to erase secrets from memory, and even with these strong assumptions, no prior work can achieve the above properties. Lastly, we show that some setup assumption is necessary for achieving subquadratic multicast-based BA.

65 citations
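The subquadratic protocols discussed here typically rest on committee sampling: each node privately elects itself into a small committee, for example via a verifiable random function (VRF), and only committee members multicast in a given step. The toy sketch below is not the paper's construction; it uses a plain hash in place of a VRF to show why self-election cuts a round's message cost from Θ(n²) to O(committee_size · n). It is also exactly the style of design that fails against after-the-fact removal, since an adversary who can un-send messages may corrupt the few elected nodes and erase what they said, which is what the paper's lower bound formalizes.

```python
import hashlib

def committee_lottery(secret_key: bytes, round_id: int, n: int,
                      committee_size: int) -> bool:
    """Hash-based self-election lottery (illustrative stand-in for a VRF).

    Each node locally evaluates a pseudorandom value for the round and
    joins the committee iff the value falls below a threshold chosen so
    that about `committee_size` of the n nodes are elected. Only elected
    nodes multicast, so a round costs O(committee_size * n) messages
    instead of the O(n^2) of all-to-all protocols.
    """
    digest = hashlib.sha256(secret_key + round_id.to_bytes(8, "big")).digest()
    value = int.from_bytes(digest, "big") / float(1 << 256)  # uniform in [0, 1)
    return value < committee_size / n

# Toy run: with 10,000 nodes and a target committee of ~30, only a few
# dozen nodes speak per round rather than all 10,000.
n, target = 10_000, 30
keys = [i.to_bytes(4, "big") for i in range(n)]  # stand-ins for secret keys
elected = sum(committee_lottery(k, round_id=1, n=n, committee_size=target)
              for k in keys)
print(f"{elected} of {n} nodes elected (target ~{target})")
```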

Journal ArticleDOI
TL;DR: The main result regarding the first attempt is negative: one cannot use this method to prove superpolynomial lower bounds for formula size. The main result regarding the second attempt is a "direct-sum" theorem for two-round communication complexity.
Abstract: It is possible to view communication complexity as the minimum solution of an integer programming problem. This integer programming problem is relaxed to a linear programming problem, and from it information regarding the original communication complexity question is deduced. A particularly appealing avenue this opens is the possibility of proving lower bounds on the communication complexity (which is a minimization problem) by exhibiting upper bounds on the maximization problem defined by the dual of the linear program. This approach works very neatly in the case of nondeterministic communication complexity. In this case a special case of Lovász's fractional cover measure is obtained. Through it the amortized nondeterministic communication complexity is completely characterized. The power of the approach is also illustrated by proving lower and upper bounds on the nondeterministic communication complexity of various functions. In the case of deterministic complexity the situation is more complicated. Two attempts are discussed and some results using each of them are obtained. The main result regarding the first attempt is negative: one cannot use this method for proving superpolynomial lower bounds for formula size. The main result regarding the second attempt is a "direct-sum" theorem for two-round communication complexity.

65 citations
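The nondeterministic case sketched above can be made explicit. Under the standard definitions (this is a sketch of the framework, not the paper's exact notation), the fractional relaxation of the rectangle covering problem and its dual are:

```latex
% R ranges over f-monochromatic rectangles; (x, y) over input pairs.
% Primal: the fractional cover number.
\[
  \kappa^*(f) \;=\; \min \sum_R w_R
  \quad\text{s.t.}\quad
  \sum_{R \,\ni\, (x,y)} w_R \;\ge\; 1 \;\;\forall (x,y),
  \qquad w \ge 0
\]
% Dual: weight the inputs so that no rectangle carries total weight
% more than 1. Any feasible dual solution lower-bounds kappa^*(f),
% and taking logarithms yields a nondeterministic communication
% complexity lower bound.
\[
  \kappa^*(f) \;=\; \max \sum_{(x,y)} \mu_{x,y}
  \quad\text{s.t.}\quad
  \sum_{(x,y) \,\in\, R} \mu_{x,y} \;\le\; 1 \;\;\forall R,
  \qquad \mu \ge 0
\]
```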

Proceedings ArticleDOI
25 Jun 2012
TL;DR: This paper revisits the communication complexity of large-scale 3D fast Fourier transforms (FFTs) and asks what impact trends in current architectures will have on FFT performance at exascale, developing analytical models of memory-hierarchy and network traffic to make predictions.
Abstract: This paper revisits the communication complexity of large-scale 3D fast Fourier transforms (FFTs) and asks what impact trends in current architectures will have on FFT performance at exascale. We analyze both memory hierarchy traffic and network communication to derive suitable analytical models, which we calibrate against current software implementations; we then evaluate models to make predictions about potential scaling outcomes at exascale, based on extrapolating current technology trends. Of particular interest is the performance impact of choosing high-density processors, typified today by graphics co-processors (GPUs), as the base processor for an exascale system. Among various observations, a key prediction is that although inter-node all-to-all communication is expected to be the bottleneck of distributed FFTs, intra-node communication, expressed precisely in terms of the relative balance among compute capacity, memory bandwidth, and network bandwidth, will play a critical role.

64 citations
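A stripped-down version of such an analytical model is easy to state. The sketch below is a first-order model with assumed hardware parameters, not the paper's calibrated one: it charges the usual 5·N·log2(N) flop count for compute, a few sweeps over the local data for memory traffic, and one exchange of the full local volume per all-to-all transpose phase.

```python
import math

def fft3d_time_model(n: int, p: int,
                     flops_per_node: float = 1e13,   # assumed 10 TFLOP/s
                     mem_bw_per_node: float = 2e11,  # assumed 200 GB/s
                     net_bw_per_node: float = 1e10,  # assumed 10 GB/s
                     bytes_per_elem: int = 16):      # complex double
    """First-order time model for a distributed 3D FFT of size n^3 on p nodes.

    All parameters are assumptions for illustration. Compute uses the
    standard 5 * N * log2(N) flop count for an N-point FFT (N = n^3),
    memory traffic assumes ~3 sweeps over the local data, and the
    network term charges each node its full local volume once per
    all-to-all transpose phase (2 phases in a pencil decomposition).
    """
    total = n ** 3
    local_bytes = total * bytes_per_elem / p
    t_compute = 5 * total * math.log2(total) / (p * flops_per_node)
    t_memory = 3 * local_bytes / mem_bw_per_node
    t_network = 2 * local_bytes / net_bw_per_node
    return t_compute, t_memory, t_network

# Toy prediction for a 4096^3 FFT on 4096 nodes: the all-to-all term
# dominates, consistent with the abstract's headline observation.
for label, t in zip(("compute", "memory", "network"),
                    fft3d_time_model(n=4096, p=4096)):
    print(f"{label:8s} {t:.4f} s")
```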


Network Information
Related Topics (5)
Upper and lower bounds: 56.9K papers, 1.1M citations, 84% related
Encryption: 98.3K papers, 1.4M citations, 82% related
Network packet: 159.7K papers, 2.2M citations, 81% related
Server: 79.5K papers, 1.4M citations, 81% related
Wireless network: 122.5K papers, 2.1M citations, 80% related
Performance
Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    19
2022    56
2021    161
2020    165
2019    149
2018    141