Topic

Average-case complexity

About: Average-case complexity is a research topic. Over the lifetime, 1749 publications have been published within this topic receiving 44972 citations.


Papers
Journal ArticleDOI
TL;DR: This work defines space complexity classes in the framework of membrane computing, giving some initial results about their mutual relations and their connection with time complexity classes, and identifying some potentially interesting problems which require further research.
Abstract: We define space complexity classes in the framework of membrane computing, giving some initial results about their mutual relations and their connection with time complexity classes, and identifying some potentially interesting problems which require further research.

33 citations

Proceedings ArticleDOI
13 Oct 1975
TL;DR: Theorems 1 and 2 are proved in Sections 4 and 5; the proof of Theorem 3 follows the same approach and is omitted.
Abstract: 3. Summary of Results: lg lg n - lg lg(k/n) + O(1) if k > n; n/k + lg lg k + O(1) if k < n. We shall prove Theorems 1 and 2 in Sections 4 and 5. The proof of Theorem 3 is in the same vein and will not be given here.

32 citations

Journal ArticleDOI
01 Feb 1973
TL;DR: This paper presents an improved proof of the Blum speed-up theorem which has a straightforward generalization to obtain operator speed-ups and eliminates all priority mechanisms and all but the most transparent appeals to the recursion theorem.
Abstract: Perhaps the two most basic phenomena discovered by the recent application of recursion theoretic methods to the developing theories of computational complexity have been Blum's speed-up phenomena, with its extension to operator speed-up by Meyer and Fischer, and the Borodin gap phenomena, with its extension to operator gaps by Constable. In this paper we present a proof of the operator gap theorem which is much simpler than Constable's proof. We also present an improved proof of the Blum speed-up theorem which has a straightforward generalization to obtain operator speed-ups. The proofs of this paper are new; the results are not. The proofs themselves are entirely elementary: we have eliminated all priority mechanisms and all but the most transparent appeals to the recursion theorem. Even these latter appeals can be eliminated in some "reasonable" complexity measures. Implicit in the proofs is what we believe to be a new method for viewing the construction of "complexity sequences." Unspecified notation follows Rogers [12]. {φi} is any standard indexing of the partial recursive functions. N is the set of all nonnegative integers. {Di} is a canonical indexing of all finite subsets of N: from i we can list Di and know when the listing is completed. Similarly, {Fi} is a canonical indexing of all finite functions defined (exactly) on some initial segment {0, 1, 2, ..., n}. {Φi} is any Blum measure of computational complexity or resource. Specifically, for all i, domain(Φi) = domain(φi), and the ternary relation Φi(x)

32 citations

Journal ArticleDOI
TL;DR: The existence of periodic sequences over a finite field which simultaneously achieve the maximum value (for the given period length) of the linear complexity and of the k-error linear complexity for small values of k is established.
Abstract: We establish the existence of periodic sequences over a finite field which simultaneously achieve the maximum value (for the given period length) of the linear complexity and of the k-error linear complexity for small values of k. This disproves a conjecture of Ding, Xiao, and Shan (1991). The result is of relevance for the theory of stream ciphers.
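The linear complexity discussed in this abstract is the length of the shortest linear feedback shift register (LFSR) that generates a given sequence, and the k-error linear complexity is the minimum linear complexity achievable after changing at most k terms per period. For a binary sequence, the linear complexity can be computed with the Berlekamp-Massey algorithm; the sketch below (function name `berlekamp_massey_gf2` is our own, and this is an illustrative GF(2)-only implementation, not code from the paper):

```python
def berlekamp_massey_gf2(bits):
    """Return the linear complexity of a 0/1 sequence over GF(2)."""
    c = [1]        # current connection polynomial c(x), c[0] = 1
    b = [1]        # previous connection polynomial before last length change
    L = 0          # current linear complexity
    m = 1          # steps since the last length change
    for n, s_n in enumerate(bits):
        # Discrepancy d = s_n + sum_{i=1..L} c_i * s_{n-i}  (mod 2)
        d = s_n
        for i in range(1, L + 1):
            d ^= c[i] & bits[n - i]
        if d == 0:
            m += 1
        else:
            t = c[:]
            # c(x) <- c(x) + x^m * b(x) over GF(2)
            if len(c) < len(b) + m:
                c += [0] * (len(b) + m - len(c))
            for i, bi in enumerate(b):
                c[i + m] ^= bi
            if 2 * L <= n:
                L = n + 1 - L
                b = t
                m = 1
            else:
                m += 1
    return L

# One period of the sequence generated by s_n = s_{n-1} + s_{n-3}:
print(berlekamp_massey_gf2([1, 0, 0, 1, 1, 1, 0]))  # -> 3
```

A brute-force k-error variant would re-run this over every way of flipping up to k bits in one period and take the minimum, which is feasible only for short periods; the paper's interest is in sequences where even such flips cannot reduce the complexity much.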

32 citations

Proceedings ArticleDOI
28 Jun 1994
TL;DR: In this paper, the authors describe three orthogonal complexity measures: parallel time, amount of hardware, and degree of non-uniformity, which together parametrize most complexity classes, and show that the descriptive complexity framework neatly captures these measures using the parameters: quantifier depth, number of variable bits, and type of numeric predicates respectively.
Abstract: We describe three orthogonal complexity measures: parallel time, amount of hardware, and degree of non-uniformity, which together parametrize most complexity classes. We show that the descriptive complexity framework neatly captures these measures using the parameters: quantifier depth, number of variable bits, and type of numeric predicates respectively. A fairly simple picture arises in which the basic questions in complexity theory, solved and unsolved, can be understood as questions about tradeoffs among these three dimensions.

32 citations


Network Information
Related Topics (5)
Time complexity
36K papers, 879.5K citations
89% related
Approximation algorithm
23.9K papers, 654.3K citations
87% related
Data structure
28.1K papers, 608.6K citations
83% related
Upper and lower bounds
56.9K papers, 1.1M citations
83% related
Computational complexity theory
30.8K papers, 711.2K citations
83% related
Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2022	2
2021	6
2020	10
2019	9
2018	10
2017	32