Topic

Communication complexity

About: Communication complexity studies the minimum amount of communication needed to compute a function whose input is distributed among two or more parties. Over the lifetime of the topic, 3,870 publications have been published, receiving 105,832 citations.


Papers
Journal ArticleDOI
TL;DR: A new way of characterizing the complexity of online problems is proposed, which quantifies the amount of problem-relevant information contained in the input as the minimal number of advice bits the algorithm must receive from an oracle that sees the whole input in order to solve the problem optimally.
Abstract: We propose a new way of characterizing the complexity of online problems. Instead of measuring the degradation of the output quality caused by the ignorance of the future, we choose to quantify the amount of additional global information needed for an online algorithm to solve the problem optimally. In our model, the algorithm cooperates with an oracle that can see the whole input. We define the advice complexity of the problem to be the minimal number of bits (normalized per input request, and minimized over all algorithm-oracle pairs) communicated by the oracle to the algorithm in order to solve the problem optimally. Hence, the advice complexity measures the amount of problem-relevant information contained in the input. We introduce two modes of communication between the algorithm and the oracle based on whether the oracle offers advice spontaneously (helper) or on request (answerer). We analyze the Paging and DiffServ problems in terms of advice complexity and deliver upper and lower bounds in both communication modes; in the case of the DiffServ problem in helper mode, the bounds are tight.
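To make the advice model concrete, here is a small Python sketch (illustrative only, not from the paper; the request sequence, cache size, and all function names are made-up choices). An oracle that sees the whole Paging input precomputes, for every eviction, which cache slot to clear (the farthest-in-future rule), and the online algorithm simply follows that advice; the number of advice bits it consumes upper-bounds the advice needed by this particular strategy.

from math import ceil, log2

def oracle_advice(requests, k):
    # Oracle side: sees the whole request sequence and records, for every
    # eviction, which cache slot to evict (farthest-in-future rule).
    cache, advice = [], []
    for i, page in enumerate(requests):
        if page in cache:
            continue
        if len(cache) < k:
            cache.append(page)
            continue
        def next_use(p):
            rest = requests[i + 1:]
            return rest.index(p) if p in rest else len(rest)
        slot = max(range(k), key=lambda s: next_use(cache[s]))
        advice.append(slot)
        cache[slot] = page
    return advice

def online_with_advice(requests, k, advice):
    # Online side: sees requests one at a time and, on each eviction,
    # consumes the next advice value instead of guessing the future.
    cache, faults, it = [], 0, iter(advice)
    for page in requests:
        if page in cache:
            continue
        faults += 1
        if len(cache) < k:
            cache.append(page)
        else:
            cache[next(it)] = page
    return faults

requests = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]   # hypothetical input
k = 3                                             # cache size
advice = oracle_advice(requests, k)
faults = online_with_advice(requests, k, advice)
bits = len(advice) * ceil(log2(k))                # advice bits communicated
print(faults, bits)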

91 citations

Proceedings ArticleDOI
17 May 2008
TL;DR: Lower bounds are proved for the setting in which the order of the items in the stream is chosen not adversarially but uniformly (or near-uniformly) at random from the set of all permutations, which gives stronger evidence for the inherent hardness of streaming problems.
Abstract: We study the communication complexity of evaluating functions when the input data is randomly allocated (according to some known distribution) amongst two or more players, possibly with information overlap. This naturally extends previously studied variable partition models such as the best-case and worst-case partition models [32,29]. We aim to understand whether the hardness of a communication problem holds for almost every allocation of the input, as opposed to holding for perhaps just a few atypical partitions. A key application is to the heavily studied data stream model. There is a strong connection between our communication lower bounds and lower bounds in the data stream model that are "robust" to the ordering of the data. That is, we prove lower bounds for when the order of the items in the stream is chosen not adversarially but rather uniformly (or near-uniformly) from the set of all permutations. This random-order data stream model has attracted recent interest, since lower bounds here give stronger evidence for the inherent hardness of streaming problems. Our results include the first random-partition communication lower bounds for problems including multi-party set disjointness and gap-Hamming-distance. Both are tight. We also extend and improve previous results [19,7] for a form of pointer jumping that is relevant to the problem of selection (in particular, median finding). Collectively, these results yield lower bounds for a variety of problems in the random-order data stream model, including estimating the number of distinct elements, approximating frequency moments, and quantile estimation.
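As a toy illustration of how a random allocation of the input relates to random-order streams (illustrative only; the multiset, seed values, and function names below are arbitrary choices, not the paper's constructions), the following Python sketch streams a fixed multiset in a uniformly random order and then hands a uniformly random prefix to Alice and the remainder to Bob, one simple way a random partition of the input between two players can arise.

import random

def random_order_stream(items, seed=0):
    # Present a fixed multiset of items in a uniformly random order.
    rng = random.Random(seed)
    stream = list(items)
    rng.shuffle(stream)
    return stream

def random_partition(stream, rng):
    # Split the stream between two players at a uniformly random cut point.
    cut = rng.randrange(len(stream) + 1)
    return stream[:cut], stream[cut:]

items = [1, 2, 2, 3, 3, 3, 4]            # fixed input multiset
stream = random_order_stream(items)
alice, bob = random_partition(stream, random.Random(1))
distinct = len(set(stream))              # e.g. the distinct-elements statistic
print(stream, alice, bob, distinct)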

91 citations

Journal ArticleDOI
TL;DR: A new and generic rate-distortion-complexity model is proposed that generates DIA descriptions for image and video decoding algorithms running on various hardware architectures and explicitly models the complexity a generic receiver incurs when decoding a bitstream.
Abstract: Existing research on Universal Multimedia Access has mainly focused on adapting multimedia to the network characteristics while overlooking the receiver capabilities. Alternatively, part 7 of the MPEG-21 standard entitled Digital Item Adaptation (DIA) defines description tools to guide the multimedia adaptation process based on both the network conditions and the available receiver resources. In this paper, we propose a new and generic rate-distortion-complexity model that can generate such DIA descriptions for image and video decoding algorithms running on various hardware architectures. The novelty of our approach is in virtualizing complexity, i.e., we explicitly model the complexity involved in decoding a bitstream by a generic receiver. This generic complexity is translated dynamically into "real" complexity, which is architecture-specific. The receivers can then negotiate with the media server/proxy the transmission of a bitstream having a desired complexity level based on their resource constraints. Hence, unlike in previous streaming systems, multimedia transmission can be optimized in an integrated rate-distortion-complexity setting by minimizing the incurred distortion under joint rate-complexity constraints.
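A minimal Python sketch of the kind of receiver-driven negotiation described above (illustrative only; the operating points, the cycles-per-operation figure, and the budgets are hypothetical numbers, not MPEG-21 DIA syntax): generic complexity is translated into architecture-specific CPU load, and the receiver requests the lowest-distortion bitstream that fits both its rate and complexity constraints.

# Candidate operating points: (rate in kbps, distortion in MSE,
# generic complexity in abstract decoding operations per second).
operating_points = [
    (200, 48.0, 1.0e6),
    (400, 35.0, 2.2e6),
    (800, 27.0, 4.8e6),
    (1600, 22.0, 9.5e6),
]

def real_complexity(generic_ops, cycles_per_op, clock_hz):
    # Translate generic complexity into architecture-specific CPU load (0..1).
    return generic_ops * cycles_per_op / clock_hz

def negotiate(points, max_rate_kbps, max_cpu_load, cycles_per_op, clock_hz):
    # Receiver-side choice: the lowest-distortion point that is feasible
    # under both the rate budget and the translated complexity budget.
    feasible = [
        p for p in points
        if p[0] <= max_rate_kbps
        and real_complexity(p[2], cycles_per_op, clock_hz) <= max_cpu_load
    ]
    return min(feasible, key=lambda p: p[1]) if feasible else None

# A receiver with a 1 Mbps link, a 60% CPU budget, 120 cycles/op at 1 GHz.
print(negotiate(operating_points, 1000, 0.6, 120, 1.0e9))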

91 citations

01 Jan 2002
TL;DR: A suite of lower bound techniques is developed that characterizes the complexity of functions in these models, indicating which problems can be solved efficiently in them, and a powerful method for proving lower bounds for general communication complexity is presented.
Abstract: Numerous massive data sets, ranging from flows of Internet traffic to logs of supermarket transactions, have emerged during the past few years. Their overwhelming size and the typically restricted access to them call for new computational models. This thesis studies three such models: sampling computations, data stream computations, and sketch computations. While most of the previous work focused on designing algorithms in the new models, this thesis revolves around the limitations of the models. We develop a suite of lower bound techniques that characterize the complexity of functions in these models, indicating which problems can be solved efficiently in them. We derive specific bounds for a multitude of practical problems, arising from applications in databases, networking, and information retrieval, such as frequency statistics, selection functions, statistical moments, and distance estimation. We present general, powerful, and easy-to-use lower bound techniques for the sampling model. The techniques apply to all functions and address both oblivious and adaptive sampling. They frequently produce optimal bounds for a wide range of functions. They are stated in terms of new combinatorial and statistical properties of functions, which are easy to calculate. We obtain lower bounds for the data stream and sketch models through one-way and simultaneous communication complexity. We develop lower bounds for the latter via a new information-theoretic view of communication complexity. A highlight of this work is an optimal simultaneous communication complexity lower bound for the important multi-party set-disjointness problem. Finally, we present a powerful method for proving lower bounds for general communication complexity. The method is based on a direct sum property of a new measure of complexity for communication complexity protocols and on a novel statistical view of communication complexity. We use the technique to obtain improved communication complexity and data stream lower bounds for several problems, including multi-party set-disjointness, frequency moments, and Lp distance estimation. These results solve open problems of Alon, Matias, and Szegedy and of Saks and Sun.
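For readers unfamiliar with the problem, the Python sketch below (illustrative only; the instance and names are arbitrary) states the multi-party set-disjointness function mentioned above, together with a unique-intersection promise of the kind often imposed in lower bound proofs.

def set_disjointness(sets):
    # Output 1 if some element belongs to every player's set, else 0.
    return 1 if set.intersection(*sets) else 0

def satisfies_promise(sets):
    # Promise version: the sets are pairwise disjoint except for at most one
    # element that is common to all of them.
    common = set.intersection(*sets)
    if len(common) > 1:
        return False
    stripped = [s - common for s in sets]
    for i in range(len(stripped)):
        for j in range(i + 1, len(stripped)):
            if stripped[i] & stripped[j]:
                return False
    return True

S = [{1, 4, 7}, {2, 4, 8}, {3, 4, 9}]    # three players, common element 4
print(set_disjointness(S), satisfies_promise(S))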

90 citations

Journal ArticleDOI
TL;DR: An in-depth analysis of the media distortion characteristics yields a low-complexity algorithm for optimal flow rate allocation in multipath network scenarios, and shows that a greedy allocation of rate along paths of increasing error probability leads to an optimal solution.
Abstract: We address the problem of joint path selection and source rate allocation in order to optimize the media specific quality of service in streaming of stored video sequences on multipath networks. An optimization problem is proposed in order to minimize the end-to-end distortion, which depends on video sequence dependent parameters, and network properties. An in-depth analysis of the media distortion characteristics allows us to define a low complexity algorithm for an optimal flow rate allocation in multipath network scenarios. In particular, we show that a greedy allocation of rate along paths with increasing error probability leads to an optimal solution. We argue that a network path shall not be chosen for transmission, unless all other available paths with lower error probability have been chosen. Moreover, the chosen paths should be used at their maximum available end-to-end bandwidth. Simulation results show that the optimal flow rate allocation carefully adapts the total streaming rate and the number of chosen paths, to the end-to-end transmission error probability. In many scenarios, the optimal rate allocation provides more than 20% improvement in received video quality, compared to heuristic-based algorithms. This motivates its use in multipath networks, where it optimizes media specific quality of service, and simultaneously saves network resources at the price of a very low computational complexity.
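The greedy rule described above fits in a few lines; the Python sketch below (illustrative only, not the authors' code; the path names, bandwidths, loss probabilities, and target rate are made-up numbers) fills paths in order of increasing error probability, each up to its available end-to-end bandwidth, until a target source rate is reached.

def greedy_allocation(paths, target_rate):
    # paths: list of (name, bandwidth_kbps, error_probability).
    # Use paths in order of increasing error probability, each at full
    # bandwidth, until the target source rate is covered.
    allocation, remaining = {}, target_rate
    for name, bandwidth, _err in sorted(paths, key=lambda p: p[2]):
        if remaining <= 0:
            break
        rate = min(bandwidth, remaining)
        allocation[name] = rate
        remaining -= rate
    return allocation

paths = [
    ("path_a", 300.0, 0.01),   # (name, bandwidth in kbps, loss probability)
    ("path_b", 500.0, 0.05),
    ("path_c", 400.0, 0.02),
]
print(greedy_allocation(paths, 600.0))   # -> {'path_a': 300.0, 'path_c': 300.0}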

90 citations


Network Information
Related Topics (5)
Upper and lower bounds: 56.9K papers, 1.1M citations, 84% related
Encryption: 98.3K papers, 1.4M citations, 82% related
Network packet: 159.7K papers, 2.2M citations, 81% related
Server: 79.5K papers, 1.4M citations, 81% related
Wireless network: 122.5K papers, 2.1M citations, 80% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    19
2022    56
2021    161
2020    165
2019    149
2018    141