
Showing papers on "Average-case complexity" published in 1987


Journal ArticleDOI
01 Jul 1987-Nature
TL;DR: Information-based complexity seeks to develop general results about the intrinsic difficulty of solving problems where available information is partial or approximate and to apply these results to specific problems.
Abstract: Information-based complexity seeks to develop general results about the intrinsic difficulty of solving problems where available information is partial or approximate and to apply these results to specific problems. This allows one to determine what is meant by an optimal algorithm in many practical situations, and offers a variety of interesting and sometimes surprising theoretical results.
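
To make the setting concrete, here is a minimal sketch (not from the paper) of a classic information-based complexity example: approximating an integral of f over [0, 1] when the only information about f is n point evaluations. For the class of Lipschitz-1 integrands, the composite midpoint rule is a worst-case optimal use of that partial information, with error at most 1/(4n); the test integrand `f` below is an arbitrary choice of mine.

```python
def midpoint_rule(f, n):
    """Approximate the integral of f over [0, 1] using only n point
    evaluations -- the 'partial information' of the IBC setting."""
    return sum(f((i + 0.5) / n) for i in range(n)) / n

# For Lipschitz-1 integrands the worst-case error of this rule is
# 1/(4n), and no algorithm using n function values can guarantee
# less -- the sense in which the midpoint rule is optimal.
f = lambda x: abs(x - 1.0 / 3.0)      # a Lipschitz-1 test integrand
exact = 5.0 / 18.0                    # its true integral over [0, 1]
for n in (10, 100, 1000):
    err = abs(midpoint_rule(f, n) - exact)
    print(f"n={n:4d}  error={err:.2e}  guarantee={1.0 / (4 * n):.2e}")
```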

647 citations


Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of finding an algorithm that is fast on average with respect to a natural probability distribution on inputs, and consider the Hamiltonian path problem from that point of view.
Abstract: One way to cope with an NP-hard problem is to find an algorithm that is fast on average with respect to a natural probability distribution on inputs. We consider from that point of view the Hamiltonian path problem.
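
As a toy illustration of "fast on average" (assumptions of mine: that the natural distribution is the uniform random graph G(n, 1/2), and a plain backtracking search rather than the paper's algorithm), one can count backtracking nodes over random instances; on dense random graphs the average stays small even though the worst case is exponential.

```python
import random

def random_graph(n, p=0.5, seed=None):
    """Adjacency sets of an Erdos-Renyi random graph G(n, p)."""
    rng = random.Random(seed)
    adj = [set() for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def hamiltonian_path_nodes(adj):
    """Naive backtracking search for a Hamiltonian path; returns the
    number of search-tree nodes visited, the usual cost measure in
    average-case analyses of backtracking."""
    n = len(adj)
    nodes = 0

    def extend(path, used):
        nonlocal nodes
        nodes += 1
        if len(path) == n:
            return True
        return any(extend(path + [v], used | {v})
                   for v in adj[path[-1]] if v not in used)

    for s in range(n):
        if extend([s], {s}):
            break
    return nodes

trials, n = 20, 12
avg = sum(hamiltonian_path_nodes(random_graph(n, seed=t))
          for t in range(trials)) / trials
print(f"average backtracking nodes on G({n}, 1/2): {avg:.1f}")
```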

158 citations


Book ChapterDOI
13 Apr 1987
TL;DR: Stream ciphers are based on pseudorandom key streams, i.e. on deterministically generated sequences of bits with acceptable properties of unpredictability and randomness.
Abstract: Stream ciphers are based on pseudorandom key streams, i.e. on deterministically generated sequences of bits with acceptable properties of unpredictability and randomness (see [4, ch. 9], [9]).
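
A minimal sketch of the idea (not from the chapter): a deterministically generated keystream XORed into the plaintext. The generator here is the textbook 16-bit Fibonacci LFSR with taps 16, 14, 13, 11; as the comment notes, a bare LFSR is linear and therefore predictable, which is exactly why real stream ciphers filter or combine such registers nonlinearly.

```python
def lfsr_keystream(state, nbytes, taps=(0, 2, 3, 5)):
    """Fibonacci LFSR over a 16-bit register (taps of the primitive
    polynomial x^16 + x^14 + x^13 + x^11 + 1); emits key bytes.
    WARNING: a bare LFSR is linear, hence predictable from a short
    output prefix -- it fails the unpredictability requirement."""
    out = bytearray()
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            fb = 0
            for t in taps:                    # feedback = XOR of tap bits
                fb ^= (state >> t) & 1
            byte = (byte << 1) | (state & 1)  # low bit is the output bit
            state = (state >> 1) | (fb << 15)
        out.append(byte)
    return bytes(out)

def stream_xor(seed, data):
    ks = lfsr_keystream(seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"attack at dawn"
ct = stream_xor(0xACE1, msg)                  # seed must be nonzero
assert stream_xor(0xACE1, ct) == msg          # XOR is its own inverse
print(ct.hex())
```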

92 citations



Journal ArticleDOI
TL;DR: The complexity gains of a new algorithm derived from Clifford's theorem are discussed, and new upper bounds are derived, including bounds on the complexity of the underlying group algebras.

47 citations



01 Jun 1987
TL;DR: This article is dedicated to the study of the complexity of computing and deciding selected problems in algebra.

37 citations


Journal ArticleDOI
TL;DR: By reformulating DRA into a parallel computational tree and using a multiple tree-root pipelining scheme, time complexity is reduced to O(nm), while the space complexity is reduced by a factor of 2.
Abstract: Discrete relaxation techniques have proven useful in solving a wide range of problems in digital signal and digital image processing, artificial intelligence, operations research, and machine vision. Much work has been devoted to finding efficient hardware architectures. This paper shows that a conventional hardware design for a Discrete Relaxation Algorithm (DRA) suffers from O(n²m³) time complexity and O(n²m²) space complexity. By reformulating DRA into a parallel computational tree and using a multiple tree-root pipelining scheme, time complexity is reduced to O(nm), while the space complexity is reduced by a factor of 2. For certain relaxation processing, the space complexity can even be decreased to O(nm). Furthermore, a technique for dynamically configuring an architectural wavefront is used, which leads to an O(n)-time, highly concurrent DRA3 architecture.
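
For readers unfamiliar with discrete relaxation, here is a toy sequential version of the label-discarding iteration (a sketch of the general technique, not the paper's hardware formulation; `labels`, `neighbors`, and `compatible` are hypothetical names). Labels without compatible support at some neighbor are deleted until a fixpoint is reached; repeated full sweeps of this kind are the sequential cost the paper's architecture attacks.

```python
def discrete_relaxation(labels, neighbors, compatible):
    """Sequential discrete relaxation labeling (toy sketch).

    labels[i]              -- candidate label set of object i
    neighbors[i]           -- objects constraining object i
    compatible(i, a, j, b) -- True if label a on object i is
                              consistent with label b on object j

    Repeatedly delete labels lacking support at some neighbor."""
    changed = True
    while changed:
        changed = False
        for i, cand in enumerate(labels):
            for a in list(cand):
                if any(not any(compatible(i, a, j, b) for b in labels[j])
                       for j in neighbors[i]):
                    cand.discard(a)       # no support at neighbor j
                    changed = True
    return labels

# Tiny graph-colouring flavoured instance: adjacent objects must differ.
labels = [{0, 1}, {0}, {0, 1, 2}]
neighbors = [[1, 2], [0, 2], [0, 1]]
diff = lambda i, a, j, b: a != b
print(discrete_relaxation(labels, neighbors, diff))   # [{1}, {0}, {2}]
```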

37 citations


Book ChapterDOI
TL;DR: In this article, the authors study the complexity of computing and deciding selected problems in algebra, using algebraical and (algebraically) geometrical methods.
Abstract: This article is dedicated to the study of the complexity of computing and deciding selected problems in algebra. The methods used here are algebraical and (algebraically) geometrical in nature. The problems dealt with may be looked at as mathematical models of questions arising in theoretical computer science. This, however, does not correspond exactly to the historical development of algebraic complexity theory, where aspects of numerical analysis play a role too [Strassen 1984]. As far as practical applications are concerned, it is certainly the development of fast and efficient algorithms which is prominent. Here the mathematician is interested mainly in such algorithms which are based on mathematical problems. But this alone does not constitute complexity theory in its modern sense. It is dedicated to the systematic study of the complexity of problems known to be solvable by algorithms. First, this requires the development of mathematically precise models of complexity, which often are simplifications of the user's conceptions of efficiency, and it includes the question whether a given procedure is optimal.

30 citations


Proceedings ArticleDOI
01 Jan 1987
TL;DR: Improving a result of Mehlhorn and Schmidt, this paper shows that a function f with deterministic communication complexity n² can have Las Vegas communication complexity O(n), which is best possible, because the deterministic complexity cannot be more than the square of the Las Vegas communication complexity for any function.
Abstract: Improving a result of Mehlhorn and Schmidt, a function f with deterministic communication complexity n² is shown to have Las Vegas communication complexity O(n). This is the best possible, because the deterministic complexity cannot be more than the square of the Las Vegas communication complexity for any function.
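
To make the deterministic measure concrete, here is a brute-force evaluator of deterministic communication complexity for tiny 0/1 matrices (purely illustrative, not from the paper; the search is exponential, so only toy sizes are feasible). A protocol is a tree in which the speaking player splits their side of the communication matrix; the cost is the tree depth needed to reach monochromatic rectangles.

```python
from functools import lru_cache
from itertools import combinations

def comm_complexity(M):
    """Exact deterministic communication complexity of a tiny 0/1
    matrix M[x][y], by brute force over all protocol trees."""
    def splits(idx):
        items = tuple(idx)
        for r in range(1, len(items)):
            for part in combinations(items, r):
                yield frozenset(part), idx - frozenset(part)

    @lru_cache(maxsize=None)
    def D(rows, cols):
        vals = {M[x][y] for x in rows for y in cols}
        if len(vals) <= 1:
            return 0                      # monochromatic rectangle
        best = None
        for a, b in splits(rows):         # Alice speaks: rows split
            c = 1 + max(D(a, cols), D(b, cols))
            best = c if best is None or c < best else best
        for a, b in splits(cols):         # Bob speaks: columns split
            c = 1 + max(D(rows, a), D(rows, b))
            best = c if best is None or c < best else best
        return best

    return D(frozenset(range(len(M))), frozenset(range(len(M[0]))))

# Equality on 2-bit inputs (a 4x4 identity matrix) needs 3 bits.
EQ = [[1 if x == y else 0 for y in range(4)] for x in range(4)]
print(comm_complexity(EQ))                # -> 3
```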

27 citations


Journal ArticleDOI
TL;DR: In the worst case setting the complexity remains infinite, whereas in the average case setting the complexity becomes finite if the dimension of the range of a linear operator is at least two.

01 Jan 1987
TL;DR: An explicit combinatorial lower bound for this complexity measure is presented, which leads to an exp(Ω(√n)) lower bound on the complexity of depth-restricted contact schemes computing some natural Boolean functions in NP.
Abstract: A notion of "communication complexity" is used to formally measure the degree to which a Boolean function is "global". An explicit combinatorial lower bound for this complexity measure is presented. In particular, this leads to an exp(Ω(√n)) lower bound on the complexity of depth-restricted contact schemes computing some natural Boolean functions in NP.

Journal ArticleDOI
TL;DR: An exchange algorithm for the solution of minimax optimization problems involving convex functions is presented, and the complexity of this algorithm is shown to be either linear in the number of functions, or at least quadratic in that number.
Abstract: We present an exchange algorithm for the solution of minimax optimization problems involving convex functions. For a certain class of functions, the complexity of this algorithm is shown to be either linear in the number of functions, or at least quadratic in that number.
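
A generic sketch of the exchange idea in one dimension (assumptions of mine: scalar x, the working-set subproblem solved by ternary search, and a simple most-violated-function exchange rule; the paper's algorithm and its complexity analysis are more refined). A small active set is optimized cheaply, then the globally worst function is exchanged in until no function is violated.

```python
def exchange_minimax(fs, lo, hi, tol=1e-9, iters=100):
    """Minimise max_i f_i(x) over [lo, hi] for convex f_i by an
    exchange scheme: optimise over a working subset, then add the
    most violated function and repeat."""
    def minimise(active):
        # max of convex functions is convex, so ternary search works
        g = lambda x: max(fs[i](x) for i in active)
        a, b = lo, hi
        for _ in range(100):
            m1, m2 = a + (b - a) / 3, b - (b - a) / 3
            if g(m1) < g(m2):
                b = m2
            else:
                a = m1
        return (a + b) / 2

    active = {0}
    for _ in range(iters):
        x = minimise(active)
        worst = max(range(len(fs)), key=lambda i: fs[i](x))
        if fs[worst](x) <= max(fs[i](x) for i in active) + tol:
            return x                      # no violated function: done
        active.add(worst)                 # exchange in the violator
    return x

# Three convex pieces whose upper envelope is minimised at x = 0.25.
fs = [lambda x: (x - 1) ** 2, lambda x: x * x + 0.5, lambda x: abs(x)]
x = exchange_minimax(fs, -2.0, 2.0)
print(round(x, 3), round(max(f(x) for f in fs), 3))
```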

Book ChapterDOI
01 Jan 1987
TL;DR: In this paper, the authors study multiway asymmetric tries, derive all factorial moments of the depth of a leaf and the external path length, and solve an open problem of Paige and Tarjan about the average case complexity of improved lexicographical sorting.
Abstract: We study multiway asymmetric tries. Our main interest is to investigate the depth of a leaf and the external path length, however we also formulate and solve a more general problem. We consider a class of properties called additive properties. This class is specified by a common recurrence relation. We give an exact solution of the recurrence, and present an asymptotic approximation. In particular, we derive all (factorial) moments of the depth of a leaf and the external path length. In addition, we solve an open problem of Paige and Tarjan about the average case complexity of the improved lexicographical sorting. These results extend previous analyses by Knuth [12], Flajolet and Sedgewick [6], Jacquet and Regnier [10], and Kirschenhofer and Prodinger [11].
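
An empirical companion to the analysis (a sketch under the usual memoryless-source model; the probabilities and sizes below are arbitrary choices of mine): build tries from biased random strings and measure the external path length, one of the additive properties the paper solves exactly.

```python
import math
import random

def leaf_depths(strings):
    """Depth of every leaf in the trie of the given strings: the trie
    splits on successive symbols until each string sits alone."""
    depths = []

    def insert(group, depth):
        if len(group) == 1:
            depths.append(depth)
            return
        buckets = {}
        for s in group:
            buckets.setdefault(s[depth], []).append(s)
        for sub in buckets.values():
            insert(sub, depth + 1)

    insert(strings, 0)
    return depths

def random_string(rng, probs, length=64):
    # memoryless asymmetric source: symbol i has probability probs[i]
    return tuple(rng.choices(range(len(probs)), weights=probs, k=length))

rng = random.Random(1)
probs = [0.6, 0.3, 0.1]                    # biased three-way alphabet
n, trials = 200, 20
epl = sum(sum(leaf_depths([random_string(rng, probs) for _ in range(n)]))
          for _ in range(trials)) / trials
H = -sum(p * math.log(p) for p in probs)   # source entropy (nats)
print("average external path length:", round(epl, 1))
print("first-order theory n*ln(n)/H:", round(n * math.log(n) / H, 1))
```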

Journal ArticleDOI
TL;DR: Asymptotic results for the sum-of-digits function with respect to the standard Gray code representation of positive integers are established, and the estimates yield two bounds for the average case complexity of Batcher's odd-even merge.
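
The quantities involved are easy to experiment with (a sketch; the connection to Batcher's odd-even merge comparison counts is the paper's contribution and is not computed here): the standard Gray code of n is n XOR (n >> 1), and its sum of digits is the binary digit sum of that value.

```python
def gray(n):
    """Standard (reflected binary) Gray code of n."""
    return n ^ (n >> 1)

def digit_sum(n):
    """Binary sum-of-digits (number of one bits)."""
    return bin(n).count("1")

# Averages over an initial segment [0, N); for N not a power of two
# the Gray-code and binary digit sums drift apart, and it is this
# fluctuating behaviour that asymptotic results of this kind quantify.
N = 100_000
avg_gray = sum(digit_sum(gray(n)) for n in range(N)) / N
avg_bin = sum(digit_sum(n) for n in range(N)) / N
print(f"avg digit sum, Gray code: {avg_gray:.4f}")
print(f"avg digit sum, binary:    {avg_bin:.4f}")
```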

Book ChapterDOI
12 Oct 1987
TL;DR: This work analyses the average case performance of a simple backtracking algorithm for determining all exact-satisfying truth assignments of boolean formulas in conjunctive normal form with r clauses over n variables and shows that the average number of nodes in the backtracking trees of formulas from these classes is bounded by a constant.
Abstract: We analyse the average case performance of a simple backtracking algorithm for determining all exact-satisfying truth assignments of boolean formulas in conjunctive normal form with r clauses over n variables. A truth assignment exact-satisfies a formula if in each clause exactly one literal is set to true. We show: if formulas are chosen by generating clauses independently, where each variable occurs in a clause either unnegated with probability p, or negated with probability q, or not at all with probability 1−p−q (p, q > 0, p+q ≦ 1), then the average number of nodes in the backtracking trees of formulas from these classes is bounded by a constant, for all n ∈ ℕ, if r ≧ ln 2/(pq) is chosen. (In the case p = q = 1/3 the result holds for all r ≧ 6.)
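
A small experiment in the spirit of the analysis (a sketch with hypothetical helper names, not the authors' code): clauses are generated under the stated distribution, and a partial assignment is pruned as soon as some clause has two true literals or can no longer reach exactly one, which is what exact-satisfaction requires.

```python
import random

def random_formula(n, r, p, q, rng):
    """r clauses over n variables: each variable occurs positively
    with probability p, negatively with probability q, else not at
    all -- the distribution assumed in the paper."""
    formula = []
    for _ in range(r):
        clause = []
        for v in range(n):
            u = rng.random()
            if u < p:
                clause.append((v, True))
            elif u < p + q:
                clause.append((v, False))
        formula.append(clause)
    return formula

def count_nodes(formula, n):
    """Backtracking over variables 0..n-1, counting search-tree nodes
    while enumerating all exact-satisfying assignments."""
    nodes = 0

    def viable(assign):
        for clause in formula:
            true = undecided = 0
            for v, sign in clause:
                if v >= len(assign):
                    undecided += 1
                elif assign[v] == sign:
                    true += 1
            if true >= 2 or true + undecided == 0:
                return False              # 'exactly one true' is dead
        return True

    def walk(assign):
        nonlocal nodes
        nodes += 1
        if not viable(assign) or len(assign) == n:
            return
        for bit in (False, True):
            walk(assign + [bit])

    walk([])
    return nodes

rng = random.Random(0)
n, p, q = 30, 1 / 3, 1 / 3
r = 6                                     # the abstract's threshold for p=q=1/3
trials = 30
avg = sum(count_nodes(random_formula(n, r, p, q, rng), n)
          for _ in range(trials)) / trials
print(f"average backtracking nodes (n={n}, r={r}): {avg:.1f}")
```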

Book ChapterDOI
01 Jan 1987
TL;DR: One of the main ways of attacking the famous P =? NP problem and its associates consists in considering some classical tools from recursive function theory (different kinds of reducibilities, relativization, immunity, and so on) in complexity-bounded forms.
Abstract: One of the main ways of attacking the famous P =? NP problem and its associates consists in considering some classical tools from recursive function theory (different kinds of reducibilities, relativization, immunity, and so on) in complexity-bounded forms.