
Showing papers by "Juris Hartmanis published in 1971"


Journal ArticleDOI
TL;DR: It is the author's conviction that by now this theory is an essential part of the theory of computation, and that in the future it will be an important theory which will permeate much of the theoretical work in computer science.
Abstract: The purpose of this paper is to outline the theory of computational complexity which has emerged as a comprehensive theory during the last decade. This theory is concerned with the quantitative aspects of computations and its central theme is the measuring of the difficulty of computing functions. The paper concentrates on the study of computational complexity measures defined for all computable functions and makes no attempt to survey the whole field exhaustively nor to present the material in historical order. Rather it presents the basic concepts, results, and techniques of computational complexity from a new point of view from which the ideas are more easily understood and fit together as a coherent whole. It is clear that a viable theory of computation must deal realistically with the quantitative aspects of computing and must develop a general theory which studies the properties of possible measures of the difficulty of computing functions. Such a theory must go beyond the classification of functions as computable and noncomputable, or elementary and primitive recursive, etc. It must concern itself with computational complexity measures which are defined for all possible computations and which assign a complexity to each computation which terminates. Furthermore, this theory must eventually reflect some aspects of real computing to justify its existence by contributing to the general development of computer science. During the last decade, considerable progress has been made in the development of such a theory dealing with the complexity of computations. It is our conviction that by now this theory is an essential part of the theory of computation, and that in the future it will be an important theory which will permeate much of the theoretical work in computer science. Our purpose in this paper is to outline the recently developed theory of computational complexity by presenting its central concepts, results, and techniques.
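The "complexity measures defined for all computable functions" that the abstract refers to are standardly formalized by Blum's axioms (the abstract itself does not spell them out, so the following is a sketch of that standard formalization, not a quotation from the paper):

```latex
% A Blum complexity measure assigns to each program (index) $i$,
% with partial computable function $\varphi_i$, a partial computable
% "step-counting" function $\Phi_i$ satisfying two axioms:
%
% (1) $\Phi_i(x)$ is defined exactly when $\varphi_i(x)$ is defined:
\operatorname{dom}(\Phi_i) = \operatorname{dom}(\varphi_i)
%
% (2) The cost predicate is decidable: the relation
M(i, x, y) \;=\; \begin{cases} 1 & \text{if } \Phi_i(x) = y \\ 0 & \text{otherwise} \end{cases}
% is a total computable function of $(i, x, y)$.
```

Time (number of steps) and space (tape cells used) on Turing machines are the canonical examples of measures satisfying both axioms.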

134 citations


Journal ArticleDOI
TL;DR: In this note, natural reference sets are presented which belong to the complete degrees at each level of the arithmetic hierarchy and provide simple methods of determining the degrees of unsolvability for several well-known problems.

13 citations


Proceedings ArticleDOI
03 May 1971
TL;DR: A model is given for the study of quantitative problems about formal translations from one programming language into another, and some initial results are derived about the speed of programs produced by translations.
Abstract: The purpose of this paper is to give a model for the study of quantitative problems about formal translations from one programming language into another, as well as derive some initial results about the speed of programs produced by translations. The paper also contains a new speed-up result which shows that in any computational complexity measure there exist functions which have arbitrarily large speed-ups but that the size of the speed-up programs must grow non-computably fast.
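The speed-up phenomenon mentioned in the abstract follows the pattern of Blum's speed-up theorem; a sketch of that classical statement (an assumption about the form of the result, since the abstract gives no formal statement) is:

```latex
% For any Blum measure $\Phi$ and any total computable function $r$,
% there is a total computable function $f$ such that: for every
% program $i$ computing $f$, there exists another program $j$
% computing $f$ which is $r$-faster almost everywhere:
\forall i\,[\varphi_i = f] \;\Rightarrow\; \exists j\,[\varphi_j = f
  \;\wedge\; r(x, \Phi_j(x)) \le \Phi_i(x) \text{ for almost all } x]
```

The paper's additional twist, per the abstract, is that no computable bound relates the size of program $i$ to the size of the faster program $j$: the speed-up programs must grow non-computably fast.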

7 citations