Topic

Average-case complexity

About: Average-case complexity is a research topic. Over the lifetime, 1749 publications have been published within this topic receiving 44972 citations.


Papers
Journal Article
TL;DR: The linear-time Games-Chan algorithm for computing the linear complexity c(s) of a binary sequence s of period 2^n requires knowledge of the full sequence, while the quadratic-time Berlekamp-Massey algorithm requires knowledge of only 2c(s) terms.
Abstract: The linear Games-Chan algorithm for computing the linear complexity c(s) of a binary sequence s of period 2^n requires knowledge of the full sequence, while the quadratic Berlekamp-Massey algorithm only requires knowledge of 2c(s) terms. We show that we can modify the Games-Chan algorithm so that it computes the complexity in linear time knowing only 2c(s) terms. The algorithms of Stamp-Martin and Lauder-Paterson can also be modified, without loss of efficiency, to compute analogues of the k-error linear complexity and of the error linear complexity spectrum for finite binary sequences viewed as initial segments of infinite sequences with period a power of two. Lauder and Paterson apply their algorithm to decoding binary repeated-root cyclic codes of length n = 2^m in O(n(log n)^2) time. We improve on their result, developing a decoding algorithm with O(n) bit complexity.

14 citations
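The core Games-Chan recursion referenced in the abstract can be sketched as follows. This is an illustrative implementation of the classical algorithm (not the paper's modified version), assuming the input is one full period of length 2^n: if the two halves of the sequence agree, the complexity is that of the half; otherwise it is 2^(n-1) plus the complexity of the XOR of the halves.

```python
def games_chan(s):
    """Linear complexity of a binary sequence s whose length (one full
    period) is a power of two. Runs in O(len(s)) bit operations."""
    c = 0
    while len(s) > 1:
        half = len(s) // 2
        left, right = s[:half], s[half:]
        diff = [a ^ b for a, b in zip(left, right)]
        if any(diff):
            # Halves differ: complexity gains 2^(n-1), recurse on the XOR.
            c += half
            s = diff
        else:
            # Halves agree: complexity equals that of one half.
            s = left
    if s[0] == 1:
        c += 1
    return c

# The all-ones period-4 sequence 1,1,1,1,... has linear complexity 1;
# the alternating sequence 1,0,1,0,... has linear complexity 2.
```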

Journal ArticleDOI
TL;DR: The motivation for the use of interval computations in data processing and the basic problems of interval mathematics are explained.
Abstract: Before we start explaining why we need to go beyond interval computations, let us briefly recall our motivation for the use of interval computations in data processing. Traditional data processing methods of numerical mathematics are based on the assumption that we know the exact values of the input quantities. In reality, the data come from measurements, and measurements are never 100% precise; hence, the actual value x of each input quantity may differ from its measurement result x~. In some cases, we know the probabilities of different values of the error Δx = x~ − x, but in most cases, we only know a guaranteed upper bound Δx for the error; in these cases, the only information we have about the (unknown) actual value x is that x belongs to the interval x = [x~ − Δx, x~ + Δx]. One of the basic problems of interval mathematics is, therefore, as follows: given a data processing algorithm f(x1, ..., xn) and n intervals x1, ..., xn, compute the range y of possible values of y = f(x1, ..., xn) when xi ∈ xi.

14 citations
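The basic range problem stated in the abstract can be illustrated with naive interval arithmetic. The helper names below are hypothetical, and the example function y = x1*x2 + x1 is invented for illustration; it shows both how an enclosure is computed and why naive interval evaluation can overestimate the true range when a variable occurs more than once.

```python
def interval_add(a, b):
    # [a0, a1] + [b0, b1] = [a0 + b0, a1 + b1]
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    # The product range is bounded by the four endpoint products.
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Enclosure for y = x1*x2 + x1 with x1 in [1, 2], x2 in [-1, 3].
x1, x2 = (1.0, 2.0), (-1.0, 3.0)
y = interval_add(interval_mul(x1, x2), x1)
```

Here the computed enclosure is [-1, 8], while the true range of x1*(x2 + 1) over these intervals is [0, 8]: naive evaluation ignores the dependency between the two occurrences of x1, which is exactly why computing the tight range y is a hard problem rather than a mechanical one.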

Book ChapterDOI
22 Feb 2007
TL;DR: In this article, it was shown that if a language has a neutral letter and bounded communication complexity in the k-party game for some fixed k then the language is in fact regular.
Abstract: We study languages with bounded communication complexity in the multiparty "input on the forehead" model with worst-case partition. In the two-party case, languages with bounded complexity are exactly those recognized by programs over commutative monoids [19]. This can be used to show that these languages all lie in shallow ACC0. In contrast, we use coding techniques to show that there are languages of arbitrarily large circuit complexity which can be recognized in constant communication by k players for k ≥ 3. However, we show that if a language has a neutral letter and bounded communication complexity in the k-party game for some fixed k, then the language is in fact regular. We give an algebraic characterization of regular languages with this property. We also prove that a symmetric language has bounded k-party complexity for some fixed k iff it has bounded two-party complexity.

14 citations

Journal Article
TL;DR: In 1984, Leonid Levin initiated a theory of average-case complexity as mentioned in this paper, and provided an exposition of the basic definitions suggested by Levin, and discussed some of the considerations underlying these definitions.
Abstract: In 1984, Leonid Levin initiated a theory of average-case complexity. We provide an exposition of the basic definitions suggested by Levin, and discuss some of the considerations underlying these definitions.

14 citations

Journal ArticleDOI
TL;DR: Some recent work on relaxed notions of derandomization that allow the deterministic simulation to err on some inputs are surveyed.
Abstract: A fundamental question in complexity theory is whether every randomized polynomial time algorithm can be simulated by a deterministic polynomial time algorithm (that is, whether BPP=P). A beautiful theory of derandomization was developed in recent years in an attempt to solve this problem. In this article we survey some recent work on relaxed notions of derandomization that allow the deterministic simulation to err on some inputs. We use this opportunity to also provide a brief overview of some results and research directions in "classical derandomization".

14 citations
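To make the BPP-vs-P question above concrete, here is a classic example of a randomized polynomial-time algorithm with bounded error: Freivalds' verification of a matrix product. This sketch is not from the surveyed article; it simply illustrates the kind of algorithm derandomization asks us to simulate deterministically. Checking A·B = C directly costs O(n^3), while the randomized check costs O(trials · n^2) with error probability at most 2^(-trials).

```python
import random

def freivalds(A, B, C, trials=30):
    """Randomized check of whether A @ B == C for n x n matrices.
    If A @ B == C, always returns True; otherwise returns False
    with probability at least 1 - 2**(-trials)."""
    n = len(A)
    for _ in range(trials):
        # Random 0/1 vector r; compare A(Br) with Cr in O(n^2).
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False
    return True
```

A deterministic simulation would have to achieve the same verification guarantee without the random vector r, which is exactly the kind of question the derandomization program studies.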


Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (89% related)
Approximation algorithm: 23.9K papers, 654.3K citations (87% related)
Data structure: 28.1K papers, 608.6K citations (83% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (83% related)
Computational complexity theory: 30.8K papers, 711.2K citations (83% related)
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2022    2
2021    6
2020    10
2019    9
2018    10
2017    32