Topic

Average-case complexity

About: Average-case complexity is a research topic. Over the lifetime, 1749 publications have been published within this topic receiving 44972 citations.


Papers
Journal ArticleDOI
TL;DR: In this paper, the theory of slice functions is extended, and a monotone representation of each Boolean function is presented whose monotone complexity is at most a factor n larger than the function's circuit complexity.

11 citations
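A rough restatement of this bound, using our own (hypothetical) notation rather than the paper's: write C(f) for the circuit complexity of a Boolean function f, C_m for monotone circuit complexity, and f̂ for the monotone representation constructed from f. Up to lower-order terms, the claim is:

```latex
% Notation is ours: C(f) = circuit complexity, C_m = monotone circuit complexity,
% \hat{f} = the monotone (slice-based) representation of f from the paper.
C_m(\hat{f}) \;\le\; n \cdot C(f)
```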

Journal ArticleDOI
TL;DR: In its final form, the theorem requires that the product of spatial, temporal, and fanin complexities equal or exceed the problem complexity.
Abstract: Given certain simple and well-defined operations and complexity measures, the product of a processor's spatial complexity and temporal complexity must exceed a certain minimum problem complexity if that processor is to solve the problem. Some optical processors violate that condition in a favorable direction (anomalously small temporal complexity). We next extend the requirement to embrace those optical processors. In its final form, the theorem requires that the product of spatial, temporal, and fanin complexities equal or exceed the problem complexity.

11 citations
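Written as an inequality (the symbols are ours, not the paper's): with S, T, and F denoting the spatial, temporal, and fan-in complexities of a processor, and C the minimum problem complexity, the final form of the theorem states:

```latex
% S = spatial, T = temporal, F = fan-in complexity of the processor;
% C = minimum problem complexity (symbols are ours, not the paper's).
S \cdot T \cdot F \;\ge\; C
```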

Proceedings ArticleDOI
15 Jun 2021
TL;DR: In this paper, it was shown that PH is hard on average if UP requires (sub-)exponential worst-case complexity; the proof introduces a universal heuristic scheme whose running time is P-computable average-case polynomial time.
Abstract: A long-standing and central open question in the theory of average-case complexity is to base average-case hardness of NP on worst-case hardness of NP. A frontier question along this line is to prove that PH is hard on average if UP requires (sub-)exponential worst-case complexity. The difficulty of resolving this question has been discussed from various perspectives based on technical barrier results, such as the limits of black-box reductions and the non-existence of worst-case hardness amplification procedures in PH. In this paper, we overcome these barriers and resolve the open question by presenting the following main results: 1. UP ⊈ DTIME(2^{O(n / log n)}) implies DistNP ⊈ AvgP. 2. PH ⊈ DTIME(2^{O(n / log n)}) implies DistPH ⊈ AvgP. 3. NP ⊈ DTIME(2^{O(n / log n)}) implies DistNP ⊈ Avg_P P. Here, Avg_P P denotes P-computable average-case polynomial time, which interpolates between average-case polynomial time and worst-case polynomial time. We complement this result by showing that DistPH ⊈ AvgP if and only if DistPH ⊈ Avg_P P. At the core of all of our results is a new notion of universal heuristic scheme, whose running time is P-computable average-case polynomial time under every polynomial-time samplable distribution. Our proofs are based on the meta-complexity of time-bounded Kolmogorov complexity: we analyze average-case complexity through the lens of worst-case meta-complexity using a new "algorithmic" proof of language compression and weak symmetry of information for time-bounded Kolmogorov complexity.

11 citations
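For readability, the three main implications from the abstract can be set out in standard complexity-theoretic notation as follows:

```latex
\begin{align*}
\mathrm{UP} \not\subseteq \mathrm{DTIME}\bigl(2^{O(n/\log n)}\bigr) &\implies \mathrm{DistNP} \not\subseteq \mathrm{AvgP}\\
\mathrm{PH} \not\subseteq \mathrm{DTIME}\bigl(2^{O(n/\log n)}\bigr) &\implies \mathrm{DistPH} \not\subseteq \mathrm{AvgP}\\
\mathrm{NP} \not\subseteq \mathrm{DTIME}\bigl(2^{O(n/\log n)}\bigr) &\implies \mathrm{DistNP} \not\subseteq \mathrm{Avg}_{\mathrm{P}}\,\mathrm{P}
\end{align*}
```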

Proceedings ArticleDOI
01 Sep 2010
TL;DR: A randomized scheduling algorithm is proposed that stabilizes the system for any admissible traffic satisfying the strong law of large numbers; its O(1) complexity makes it highly scalable and a good choice for future high-speed switch designs.
Abstract: Internet traffic has increased at a very fast pace in recent years. This traffic demand requires that future packet-switching systems be able to switch packets in a very short time, i.e., just a few nanoseconds. Algorithms with lower computation complexity are more desirable for this high-speed switching design. Among the existing algorithms that can achieve 100% throughput for input-queued switches under any admissible Bernoulli traffic, ALGO3 [1] and EMHW [2] have the lowest computation complexity, O(log N), where N is the number of ports in the switch. In this paper, we propose a randomized scheduling algorithm that can also stabilize the system for any admissible traffic satisfying the strong law of large numbers. The algorithm has a complexity of O(1). Since the complexity does not increase with the size of the switch, the algorithm is highly scalable and a good choice for future high-speed switch designs. We also show that the algorithm can be implemented in a distributed way using a low-rate control channel. Simulation results show that the algorithm provides good delay performance compared to algorithms with higher computation complexity.

11 citations
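The paper's own O(1) algorithm is not reproduced here, but the general flavor of randomized "pick-and-compare" scheduling for input-queued switches can be sketched as follows. This is a generic illustration under our own assumptions (all names are hypothetical): in each slot, a uniformly random matching is drawn and kept only if it serves at least as much backlog as the previous schedule. Note that the weight comparison below costs O(N) per slot, so this sketch does not reproduce the O(1) complexity claimed by the paper.

```python
import random

def random_matching(n):
    """Draw a uniformly random one-to-one matching of n inputs to n outputs.

    Returned as a list: entry i is the output port matched to input port i."""
    outputs = list(range(n))
    random.shuffle(outputs)
    return outputs

def matching_weight(matching, voq):
    """Total backlog served by a matching.

    voq[i][j] is the virtual-output-queue length at input i for output j."""
    return sum(voq[i][j] for i, j in enumerate(matching))

def schedule_step(prev_matching, voq):
    """One 'pick-and-compare' scheduling decision (generic sketch, not the
    paper's O(1) algorithm): draw a random candidate matching and keep
    whichever of {candidate, previous schedule} serves more backlog."""
    candidate = random_matching(len(voq))
    if matching_weight(candidate, voq) >= matching_weight(prev_matching, voq):
        return candidate
    return prev_matching

if __name__ == "__main__":
    # Toy example: a 4-port switch with random virtual-output-queue backlogs.
    n = 4
    voq = [[random.randint(0, 5) for _ in range(n)] for _ in range(n)]
    schedule = list(range(n))        # start from the identity matching
    for _ in range(10):              # ten scheduling slots
        schedule = schedule_step(schedule, voq)
    print("schedule after 10 slots:", schedule)
```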

Journal Article
TL;DR: The notion of traditional compression, which can be viewed as compressing protocols that involve only one-way communication, is generalized by designing new compression schemes that compress the communication of interactive processes that do not reveal too much information about their inputs.
Abstract: We prove a direct sum theorem for randomized communication complexity. Ignoring logarithmic factors, our results show that: • Computing n copies of a function requires √n times the communication. • For average-case complexity, given any distribution µ on inputs, computing n copies of the function on n independent inputs sampled according to µ requires √n times the communication for computing one copy. • If µ is a product distribution, computing n copies on n independent inputs sampled according to µ requires n times the communication. We also study the complexity of computing the parity of n evaluations of f, and obtain results analogous to those above. Our results are obtained by designing new compression schemes that can compress the communication in interactive processes that do not reveal too much information about their inputs. This generalizes the notion of traditional compression, which can be viewed as compressing protocols that involve only one-way communication.

11 citations
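Ignoring logarithmic factors, and writing R(f) for the randomized communication complexity of f and D_µ(f) for its distributional complexity under input distribution µ (notation is ours), the direct sum bounds above can be summarized as:

```latex
\begin{align*}
R(f^n) &\gtrsim \sqrt{n}\cdot R(f) && \text{(worst case)}\\
D_{\mu^n}(f^n) &\gtrsim \sqrt{n}\cdot D_{\mu}(f) && \text{(arbitrary distribution } \mu\text{)}\\
D_{\mu^n}(f^n) &\gtrsim n\cdot D_{\mu}(f) && \text{(product distribution } \mu\text{)}
\end{align*}
```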


Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (89% related)
Approximation algorithm: 23.9K papers, 654.3K citations (87% related)
Data structure: 28.1K papers, 608.6K citations (83% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (83% related)
Computational complexity theory: 30.8K papers, 711.2K citations (83% related)
Performance
Metrics
No. of papers in the topic in previous years

Year    Papers
2022    2
2021    6
2020    10
2019    9
2018    10
2017    32