
Showing papers on "Average-case complexity published in 1973"


Proceedings ArticleDOI
30 Apr 1973
TL;DR: A programming language for the partial computable functionals is used as the basis for a definition of the computational complexity of functionals (type-2 functions), and an axiomatic account in the spirit of Blum is provided.
Abstract: A programming language for the partial computable functionals is used as the basis for a definition of the computational complexity of functionals (type-2 functions). An axiomatic account in the spirit of Blum is then provided. The novel features of this approach are justified by applying it to problems in abstract complexity, specifically operator speed-up, and by using it to define the elusive notion of the polynomial degree of an arbitrary function. New results are obtained for these degrees.

56 citations


Journal ArticleDOI
01 Feb 1973
TL;DR: This paper presents an improved proof of the Blum speed-up theorem which has a straightforward generalization to obtain operator speed-ups and eliminates all priority mechanisms and all but the most transparent appeals to the recursion theorem.
Abstract: Perhaps the two most basic phenomena discovered by the recent application of recursion-theoretic methods to the developing theories of computational complexity have been Blum's speed-up phenomena, with its extension to operator speed-up by Meyer and Fischer, and the Borodin gap phenomena, with its extension to operator gaps by Constable. In this paper we present a proof of the operator gap theorem which is much simpler than Constable's proof. We also present an improved proof of the Blum speed-up theorem which has a straightforward generalization to obtain operator speed-ups. The proofs of this paper are new; the results are not. The proofs themselves are entirely elementary: we have eliminated all priority mechanisms and all but the most transparent appeals to the recursion theorem. Even these latter appeals can be eliminated in some "reasonable" complexity measures. Implicit in the proofs is what we believe to be a new method for viewing the construction of "complexity sequences." Unspecified notation follows Rogers [12]. {φ_i} is any standard indexing of the partial recursive functions. N is the set of all nonnegative integers. {D_i} is a canonical indexing of all finite subsets of N: from i we can list D_i and know when the listing is completed. Similarly, {F_i} is a canonical indexing of all finite functions defined (exactly) on some initial segment {0, 1, 2, …, n}. {Φ_i} is any Blum measure of computational complexity or resource. Specifically, for all i, domain Φ_i = domain φ_i, and the ternary relation Φ_i(x) = y is recursive.
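The Blum measure conditions named at the end of this abstract can be written out explicitly; the following is a standard formulation of Blum's two axioms (a sketch for orientation, not quoted from the paper itself):

```latex
% Let (\varphi_i)_{i \in N} be a standard indexing of the partial
% recursive functions. A family (\Phi_i)_{i \in N} of partial
% functions is a Blum complexity measure iff:
%
% (1) Same domains: \Phi_i(x) is defined exactly when \varphi_i(x) is.
\forall i, x:\quad \varphi_i(x)\!\downarrow \;\iff\; \Phi_i(x)\!\downarrow
% (2) Decidable graph: the ternary relation below is recursive.
\{\, (i, x, y) \;:\; \Phi_i(x) = y \,\} \text{ is a recursive set.}
```

For example, taking Φ_i(x) to be the number of steps the i-th Turing machine runs on input x satisfies both axioms, while taking Φ_i = φ_i in general does not, since the graph of φ_i need not be decidable.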

32 citations



Proceedings ArticleDOI
Paul Young
15 Oct 1973
TL;DR: This work considers the extent to which it is possible, given a program p for computing a function f, to find an optimal program p' which also computes f and is either provably equivalent to p or else provably an optimal program.

Abstract: We consider the extent to which it's possible, given a program p for computing a function, f, to find an optimal program p' which also computes f and is either provably equivalent to p or else provably an optimal program. Our methods and problems come chiefly from abstract recursion-theoretic complexity theory, but some of our results may be viewed as directly challenging the intuitive interpretation of earlier results in the area.

1 citation