scispace - formally typeset
Topic: Average-case complexity

About: Average-case complexity studies the difficulty of computational problems on inputs drawn from a probability distribution, rather than on worst-case inputs. Over the lifetime, 1749 publications have been published within this topic, receiving 44972 citations.


Papers
Journal ArticleDOI
Michael Rodeh
TL;DR: This work considers the problem of computing the median of a bag of 2n numbers using communicating processes, each holding some of the numbers in its local memory, and gives an algorithm that is optimal up to a constant factor.

54 citations
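The paper above concerns a message-passing protocol; as a rough illustration of the underlying idea, the sketch below finds the median of two locally sorted bags while discarding about half of one bag per round, mirroring the halving that keeps communication low. This is a centralized, single-process rendition, not Rodeh's protocol; the function names are illustrative.

```python
def kth_smallest(a, b, k):
    """Return the 0-indexed k-th smallest element of the union of two
    sorted lists, discarding roughly half of one list per round."""
    while True:
        if not a:
            return b[k]
        if not b:
            return a[k]
        if k == 0:
            return min(a[0], b[0])
        # Probe roughly the (k+1)//2-th element of each list.
        i = min(len(a), (k + 1) // 2)
        j = min(len(b), (k + 1) // 2)
        if a[i - 1] <= b[j - 1]:
            # At most i-1 + j-1 < k elements can precede a[i-1],
            # so a[:i] cannot contain the answer: drop it.
            a, k = a[i:], k - i
        else:
            b, k = b[j:], k - j


def median_of_bags(x, y):
    """Lower median of the multiset union of two bags of numbers,
    as if each bag lived in one process's local memory."""
    a, b = sorted(x), sorted(y)
    total = len(a) + len(b)
    return kth_smallest(a, b, (total - 1) // 2)


print(median_of_bags([5, 1, 9, 3], [2, 8, 4, 6]))  # 4
```

Each round eliminates a constant fraction of the remaining candidates, so only O(log n) probes of each bag are needed, which is the source of the logarithmic communication cost in the distributed setting.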

Proceedings ArticleDOI
06 Oct 1965
TL;DR: The computational complexity of binary sequences as measured by the rapidity of their generation by multitape Turing machines is investigated and a "translational" method which escapes some of the limitations of earlier approaches leads to a refinement of the established hierarchy.
Abstract: This paper investigates the computational complexity of binary sequences as measured by the rapidity of their generation by multitape Turing machines. A "translational" method which escapes some of the limitations of earlier approaches leads to a refinement of the established hierarchy. The previous complexity classes are shown to possess certain translational properties. A related hierarchy of complexity classes of monotonic functions is examined.

53 citations

MonographDOI
24 Aug 2004
TL;DR: Topics include complexity theory from Gödel to Feynman, reductions, quantum computation, circuit and communication lower bounds, proof complexity, pseudorandomness, and probabilistic proof systems.
Abstract: Contents:
Week One: Complexity theory: From Gödel to Feynman. History and basic concepts; Resources, reductions and P vs. NP; Probabilistic and quantum computation; Complexity classes; Space complexity and circuit complexity; Oracles and the polynomial time hierarchy; Circuit lower bounds; "Natural" proofs of lower bounds; Bibliography.
Average case complexity. Average case complexity; Bibliography.
Exploring complexity through reductions. Introduction; PCP theorem and hardness of computing approximate solutions; Which problems have strongly exponential complexity?; Toda's theorem: $PH \subseteq P^{\#P}$; Bibliography.
Quantum computation. Introduction; Bipartite quantum systems; Quantum circuits and Shor's factoring algorithm; Bibliography.
Lower bounds: Circuit and communication complexity. Communication complexity; Lower bounds for probabilistic communication complexity; Communication complexity and circuit depth; Lower bound for directed $st$-connectivity; Lower bound for $FORK$ (continued); Bibliography.
Proof complexity. An introduction to proof complexity; Lower bounds in proof complexity; Automatizability and interpolation; The restriction method; Other research and open problems; Bibliography.
Randomness in computation: Pseudorandomness. Preface; Computational indistinguishability; Pseudorandom generators; Pseudorandom functions and concluding remarks; Appendix; Bibliography.
Pseudorandomness, Part II. Introduction; Deterministic simulation of randomized algorithms; The Nisan-Wigderson generator; Analysis of the Nisan-Wigderson generator; Randomness extractors; Bibliography.
Probabilistic proof systems, Part I. Interactive proofs; Zero-knowledge proofs; Suggestions for further reading; Bibliography.
Probabilistically checkable proofs. Introduction to PCPs; NP-hardness of PCPs; A couple of digressions; Proof composition and the PCP theorem; Bibliography.

53 citations

Proceedings ArticleDOI
30 Oct 1989
TL;DR: An Omega((log n)^2) bound on the probabilistic communication complexity of monotonic st-connectivity is proved, and it is deduced that every nonmonotonic NC^1 circuit for st-connectivity requires a constant fraction of negated input variables.
Abstract: The authors demonstrate an exponential gap between deterministic and probabilistic complexity and between the probabilistic complexity of monotonic and nonmonotonic relations. They then prove, as their main result, an Omega((log n)^2) bound on the probabilistic communication complexity of monotonic st-connectivity. From this they deduce that every nonmonotonic NC^1 circuit for st-connectivity requires a constant fraction of negated input variables.

53 citations
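The lower bound above concerns the st-connectivity decision problem itself. As context only (this is not the communication protocol or circuit construction from the paper), a minimal sketch of the problem, assuming the graph is given as a successor-list dictionary:

```python
from collections import deque

def st_connected(adj, s, t):
    """Decide whether t is reachable from s in a directed graph
    given as {vertex: [successors]}, via breadth-first search."""
    seen, frontier = {s}, deque([s])
    while frontier:
        u = frontier.popleft()
        if u == t:
            return True
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                frontier.append(v)
    return False


g = {0: [1], 1: [2], 2: [], 3: [0]}
print(st_connected(g, 0, 2))  # True
print(st_connected(g, 2, 0))  # False
```

Note that the problem is monotone in the edge set: adding edges can only change the answer from False to True, which is what "monotonic st-connectivity" in the abstract refers to.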

Posted Content
TL;DR: It is shown that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data, and it is proved that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.
Abstract: We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdos-Renyi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity.

53 citations


Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (89% related)
Approximation algorithm: 23.9K papers, 654.3K citations (87% related)
Data structure: 28.1K papers, 608.6K citations (83% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (83% related)
Computational complexity theory: 30.8K papers, 711.2K citations (83% related)
Performance Metrics
No. of papers in the topic in previous years
Year: Papers
2022: 2
2021: 6
2020: 10
2019: 9
2018: 10
2017: 32