The BNS Lower Bound for Multiparty Protocols Is Nearly Optimal
TL;DR
The protocol shows that the lower bound for the multi-party communication complexity of the GIP function, given by Babai et al., cannot be improved significantly.
Abstract
We present a multi-party protocol which computes the Generalized Inner Product (GIP) function, introduced by Babai et al. (1989, in "Proceedings, 21st ACM STOC," pp. 1-11). Our protocol shows that the lower bound for the multi-party communication complexity of the GIP function, given by Babai et al., cannot be improved significantly.
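For concreteness, the GIP function has a simple standard definition: each of the k players holds an n-bit string, and GIP is the parity of the number of bit positions where all k strings are 1. The sketch below (the function name `gip` is illustrative, not from the paper) computes this value directly; the paper's contribution is a cheap *protocol* for it, not the function itself.

```python
def gip(inputs: list[list[int]]) -> int:
    """Generalized Inner Product: inputs[i] is player i's n-bit
    string, given as a list of 0/1 values of equal length n."""
    n = len(inputs[0])
    total = 0
    for j in range(n):
        # AND of all players' bits at position j
        if all(x[j] == 1 for x in inputs):
            total += 1
    return total % 2  # parity of the all-ones columns

# Example with k = 3 players and n = 4 bits:
# columns 0 and 2 are all ones, so GIP = 2 mod 2 = 0.
print(gip([[1, 0, 1, 1],
           [1, 1, 1, 0],
           [1, 0, 1, 0]]))
```

In the "number on the forehead" model studied by Babai et al., player i sees every string except its own, which is what makes a protocol with communication far below n possible.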
Citations
Book
Communication Complexity
Eyal Kushilevitz, Noam Nisan
TL;DR: This chapter surveys the theory of two-party communication complexity and presents results regarding the following models of computation: • Finite automata • Turing machines • Decision trees • Ordered binary decision diagrams • VLSI chips • Networks of threshold gates.
Book
Boolean Function Complexity: Advances and Frontiers
TL;DR: In this article, a comprehensive description of basic lower bound arguments, covering many of the gems of this complexity Waterloo that have been discovered over the past several decades, right up to results from the last year or two, is given.
Journal ArticleDOI
Multiparty protocols, pseudorandom generators for logspace, and time-space trade-offs
TL;DR: In this paper, the authors prove lower bounds of the form Ω(n · c^{-k}) for the number of bits that need to be exchanged in order to compute some (explicitly given) polynomial-time computable functions.
Proceedings ArticleDOI
On the possibility of faster SAT algorithms
Mihai Patrascu, Ryan Williams
TL;DR: Reductions from the problem of determining the satisfiability of Boolean CNF formulas (CNF-SAT) to several natural algorithmic problems are described, showing that attaining any of the following bounds would improve the state of the art in algorithms for SAT.
Journal ArticleDOI
Zero-error information theory
János Körner, Alon Orlitsky
TL;DR: The problem of the error-free transmission capacity of a noisy channel was posed by Shannon in 1956 and remains unsolved. Nevertheless, partial results for this and similar channel and source coding problems have had a considerable impact on information theory, computer science, and mathematics.
References
Proceedings ArticleDOI
Some complexity questions related to distributive computing (Preliminary Report)
TL;DR: The quantity of interest, which measures the information exchange necessary for computing f, is the minimum number of bits exchanged in any algorithm.
Proceedings ArticleDOI
Multi-party protocols
TL;DR: A new model is studied, in which a collection of processes P_0, …, P_{k-1}, sharing information about a set of integers {a_0, …, a_{k-1}}, communicate to determine a 0-1 predicate of the numbers.
Proceedings ArticleDOI
Lattices, Möbius functions and communication complexity
László Lovász, Michael Saks
TL;DR: A general framework for the study of a broad class of communication problems is developed based on a recent analysis of the communication complexity of graph connectivity, which makes use of combinatorial lattice theory.
Proceedings ArticleDOI
Multiparty protocols and logspace-hard pseudorandom sequences
TL;DR: Lower bounds of the form Ω(n · c^{-k}) are proved for the number of bits that need to be exchanged in order to compute some (explicitly given) functions in P.
Proceedings ArticleDOI
On the power of small-depth threshold circuits
Johan Håstad, Mikael Goldmann
TL;DR: It is proved that there are monotone functions f_k that can be computed by depth-k, linear-size AND/OR circuits but require exponential size to be computed by a depth-(k-1) monotone weighted threshold circuit.