
Showing papers on "Time complexity" published in 1975


Journal ArticleDOI
TL;DR: The problem of finding a longest common subsequence of two strings has previously been solved in quadratic time and space; an algorithm is presented which solves it in quadratic time and linear space.
Abstract: The problem of finding a longest common subsequence of two strings has been solved in quadratic time and space. An algorithm is presented which will solve this problem in quadratic time and in linear space.
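The linear-space technique this abstract describes is Hirschberg's divide-and-conquer; a minimal Python sketch (function and variable names are illustrative, not taken from the paper) might look like:

```python
def lcs_lengths(a, b):
    # Last row of the LCS length table for a vs b, kept in O(len(b)) space.
    prev = [0] * (len(b) + 1)
    for x in a:
        curr = [0]
        for j, y in enumerate(b, 1):
            curr.append(prev[j - 1] + 1 if x == y else max(prev[j], curr[j - 1]))
        prev = curr
    return prev

def hirschberg(a, b):
    # Divide-and-conquer LCS: quadratic time, linear space.
    if not a:
        return ""
    if len(a) == 1:
        return a if a in b else ""
    mid = len(a) // 2
    # LCS lengths of the left half against every prefix of b,
    # and of the reversed right half against every suffix of b.
    l1 = lcs_lengths(a[:mid], b)
    l2 = lcs_lengths(a[mid:][::-1], b[::-1])
    # Split b where the combined prefix/suffix LCS length is maximal.
    k = max(range(len(b) + 1), key=lambda j: l1[j] + l2[len(b) - j])
    return hirschberg(a[:mid], b[:k]) + hirschberg(a[mid:], b[k:])
```

Only two length rows are ever stored, so the working space is linear in the shorter string while the total work stays quadratic.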

1,164 citations


Journal ArticleDOI
01 Jan 1975-Networks
TL;DR: A large class of classical combinatorial problems, including most of the difficult problems in the literature of network flows and computational graph theory, are shown to be equivalent, in the sense that either all or none of them can be solved in polynomial time.
Abstract: A large class of classical combinatorial problems, including most of the difficult problems in the literature of network flows and computational graph theory, are shown to be equivalent, in the sense that either all or none of them can be solved in polynomial time. Moreover, they are solvable in polynomial time if and only if every problem solvable by a polynomial-depth backtrack search is also solvable by a polynomial-time algorithm. The technique of deriving these results through problem transformations is explained, and some comments are made as to the probable effect of these results on research in the field of combinatorial algorithms.

613 citations


Proceedings ArticleDOI
13 Oct 1975
TL;DR: It is proved that a meeting function always exists if no teachers or classes have time constraints, and the multi-commodity integral flow problem is shown to be NP-complete even when the number of commodities is two.
Abstract: A very primitive version of Gotlieb's timetable problem is shown to be NP-complete, and therefore all the common timetable problems are NP-complete. A polynomial-time algorithm is given for the case in which all teachers are binary. It is proved that a meeting function always exists if no teachers or classes have time constraints. The multi-commodity integral flow problem is shown to be NP-complete even if the number of commodities is two. This is true both in the directed and undirected cases. Finally, the two-commodity real flow problem in undirected graphs is shown to be solvable in polynomial time. The time bound is O(|V|^2|E|).

417 citations


Journal ArticleDOI
TL;DR: The worst-case time complexity of algorithms for multiprocessor computers with binary comparisons as the basic operations is investigated and the algorithm for finding the maximum is shown to be optimal for all values of k and n.
Abstract: The worst-case time complexity of algorithms for multiprocessor computers with binary comparisons as the basic operations is investigated. It is shown that for the problems of finding the maximum, sorting, and merging a pair of sorted lists, if n, the size of the input set, is not less than k, the number of processors, speedups of at least $O(k/\log \log k)$ can be achieved with respect to comparison operations. The algorithm for finding the maximum is shown to be optimal for all values of k and n.
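For contrast with the paper's $O(k/\log \log k)$ speedup, the naive pairwise tournament below simulates k processors, each making one comparison per round; with k ≥ n/2 it needs about log₂ n rounds, which is what Valiant's method improves on. This is a hypothetical illustration of the comparison-round cost model, not the paper's algorithm.

```python
def tournament_max(xs, k):
    # Naive parallel maximum: each pass pairs up the surviving candidates,
    # and the pass's comparisons are packed into rounds of at most k
    # simultaneous comparisons. Returns (maximum, number of rounds).
    rounds = 0
    cand = list(xs)
    while len(cand) > 1:
        nxt = []
        i = 0
        while i + 1 < len(cand):
            nxt.append(max(cand[i], cand[i + 1]))  # one comparison
            i += 2
        if i < len(cand):
            nxt.append(cand[i])  # odd element advances for free
        comps = len(cand) // 2
        rounds += -(-comps // k)  # ceiling division: rounds used this pass
        cand = nxt
    return cand[0], rounds
```

With n = 8 and k = 4 processors this takes 3 rounds (log₂ 8), whereas the paper's doubly-logarithmic bound is asymptotically far smaller.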

412 citations


Journal ArticleDOI
TL;DR: Several variations of the single fault detection problem for combinational logic circuits are examined, and deciding whether single faults are detectable by input-output (I/O) experiments is shown to be polynomially complete: a polynomial-time detection algorithm exists if and only if one exists for problems such as the traveling salesman and knapsack problems.
Abstract: We look at several variations of the single fault detection problem for combinational logic circuits and show that deciding whether single faults are detectable by input-output (I/O) experiments is polynomially complete, i.e., there is a polynomial time algorithm to decide if these single faults are detectable if and only if there is a polynomial time algorithm for problems such as the traveling salesman problem, knapsack problem, etc.

265 citations


Journal ArticleDOI
TL;DR: The present algorithm, based on the Knuth-Morris-Pratt algorithm, solves the problem of recognizing the initial leftmost nonvoid palindrome of a string in time proportional to the length N of the palindrome, and an extension allows one to recognize the initial odd or even palindrome of length 2 or greater.
Abstract: Despite significant advances in linear-time scanning algorithms, particularly those based wholly or in part on either Cook's linear-time simulation of two-way deterministic pushdown automata or Weiner's algorithm, the problem of recognizing the initial leftmost nonvoid palindrome of a string in time proportional to the length N of the palindrome, examining no symbols other than those in the palindrome, has remained open. The present algorithm solves this problem, assuming that addition of two integers less than or equal to N may be performed in a single operation. Like the Knuth-Morris-Pratt algorithm, it runs in time independent of the size of the input alphabet. The algorithm as presented finds only even palindromes. However, an extension allows one to recognize the initial odd or even palindrome of length 2 or greater. Other easy extensions permit the recognition of strings (wwR)* of even palindromes and of all the initial palindromes. It appears possible that further extension may be used to show that (wwR)* is in a sense recognizable in real time on a reasonably defined random access machine. KEY WORDS AND PHRASES: linear-time algorithm, on-line recognition, palindrome CR CATEGORIES: 5.22, 5.25, 5.30
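A direct check for the initial even palindrome makes clear what the paper improves on; the function below is a naive quadratic-time baseline, not the linear-time algorithm of the abstract.

```python
def initial_even_palindrome(s):
    # Shortest nonempty even-length palindromic prefix, found by testing
    # each even prefix directly. O(N^2) in the worst case, and it may
    # examine symbols beyond the palindrome, unlike the paper's algorithm.
    for k in range(2, len(s) + 1, 2):
        prefix = s[:k]
        if prefix == prefix[::-1]:
            return prefix
    return None
```

The point of the 1975 result is to find this prefix while reading only the symbols inside it and in time proportional to its length.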

171 citations


Proceedings ArticleDOI
05 May 1975
TL;DR: The CELLAR algorithm is presented, and ESSCP with WI < WC = WD = ∞ and 0 < WS < ∞, suitably encoded, is shown to be NP-complete.
Abstract: The Extended String-to-String Correction Problem [ESSCP] is defined as the problem of determining, for given strings A and B over alphabet V, a minimum-cost sequence S of edit operations such that S(A) = B. The sequence S may make use of the operations Change, Insert, Delete, and Swap, each of constant cost WC, WI, WD, and WS respectively. Swap permits any pair of adjacent characters to be interchanged. The principal results of this paper are: (1) a brief presentation of an algorithm (the CELLAR algorithm) which solves ESSCP in time O(|A|·|B|·|V|^s), where s = min(4WC, WI+WD)/WS + 1; (2) presentation of polynomial time algorithms for the cases (a) WS = 0, (b) WS > 0, WC = WI = WD =
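The flavor of edit-distance computation with a Swap operation can be seen in the restricted (Damerau-Levenshtein-style) dynamic program below, where a swapped pair is not edited again; this is a simplified illustration with the costs as parameters, not the CELLAR algorithm of the paper.

```python
def edit_distance_with_swap(a, b, wc=1, wi=1, wd=1, ws=1):
    # Restricted edit distance over Change/Insert/Delete/Swap, where Swap
    # exchanges adjacent characters and swapped characters are not edited
    # further. A simplification of ESSCP, not Wagner's CELLAR algorithm.
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i * wd          # delete the whole prefix of a
    for j in range(n + 1):
        d[0][j] = j * wi          # insert the whole prefix of b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            change = 0 if a[i - 1] == b[j - 1] else wc
            d[i][j] = min(d[i - 1][j] + wd,        # delete a[i-1]
                          d[i][j - 1] + wi,        # insert b[j-1]
                          d[i - 1][j - 1] + change)
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + ws)  # swap
    return d[m][n]
```

With unit costs, "ab" becomes "ba" for cost 1 via a single swap instead of cost 2 via two changes.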

117 citations


Journal ArticleDOI
TL;DR: The recognition speed of context-free languages (CFL's) using arrays of finite state machines is considered, and it is shown that CFL's can be recognized by 2-dimensional arrays in linear time and by 1-dimensional arrays in time $n^2$.
Abstract: The recognition speed of context-free languages (CFL's) using arrays of finite state machines is considered. It is shown that CFL's can be recognized by 2-dimensional arrays in linear time and by 1-dimensional arrays in time $n^2$.

74 citations


Proceedings ArticleDOI
Ken Kennedy1
01 Jan 1975
TL;DR: A new approach to global program data flow analysis which constructs a "node listing" for the control flow graph is discussed and a simple algorithm which uses a node listing to determine the live variables in a program is presented.
Abstract: A new approach to global program data flow analysis which constructs a "node listing" for the control flow graph is discussed and a simple algorithm which uses a node listing to determine the live variables in a program is presented. This algorithm combined with a fast node listing constructor due to Aho and Ullman has produced an O(n log n) algorithm for live analysis. The utility of the node-listing method is demonstrated by an examination of the class of graphs for which "short" listings exist. This class is quite similar to the class of graphs for "understandable" programs.
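For comparison, live variables are classically computed by iterating the dataflow equations LIVE_in(n) = use(n) ∪ (LIVE_out(n) − def(n)), LIVE_out(n) = ∪ LIVE_in(s) over successors s, to a fixed point. The sketch below shows that standard iterative method (all names are illustrative), not Kennedy's node-listing algorithm.

```python
def live_variables(cfg, use, defs):
    # cfg:  node -> list of successor nodes
    # use:  node -> set of variables read before any write in the node
    # defs: node -> set of variables written in the node
    # Iterate the live-variable equations until nothing changes.
    live_in = {n: set() for n in cfg}
    changed = True
    while changed:
        changed = False
        for n in cfg:
            # LIVE_out(n) is the union of LIVE_in over successors.
            out = set().union(*(live_in[s] for s in cfg[n])) if cfg[n] else set()
            new_in = use[n] | (out - defs[n])
            if new_in != live_in[n]:
                live_in[n] = new_in
                changed = True
    return live_in
```

The node-listing idea in the paper replaces this unordered iteration with a precomputed visit order that guarantees fast convergence.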

66 citations


Journal ArticleDOI
TL;DR: Well-known results are applied to a new problem, posed by E. Berlekamp, of saving computation time when sorting sets of numbers of the form X + Y; the main appeal of the new results, aside from their practical uses, is the diversity of the old ideas behind them.
Abstract: This paper consists of the application of well-known results to a new problem. The problem, posed by E. Berlekamp, was to save computation time in sorting sets of numbers of the form X + Y. The main appeal of the new results, aside from their practical uses, is the diversity of the old ideas behind them. It is also interesting to see how the modeling of data and computing affects the results: when the model is sorting n-tuples of real numbers with binary comparisons, the best result we can do is n² log₂ n comparisons, which is shown to be sharp under certain circumstances; when the model is sorting n-tuples of integers in the range 0 to n − 1 on a random-access stored-program computer, the time complexity is shown to be on the order of n(log₂ n).
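The integer model in the abstract admits radix-style sorting; as an illustration of that idea (not necessarily the paper's construction), n-tuples with components in the range 0 to n − 1 can be sorted by repeated stable counting sort:

```python
def radix_sort_tuples(tuples, n):
    # Sort equal-length tuples whose components lie in range(n) by a
    # stable counting sort on each position, last position first
    # (classic least-significant-digit radix sort).
    arr = list(tuples)
    if not arr:
        return arr
    width = len(arr[0])
    for pos in reversed(range(width)):
        buckets = [[] for _ in range(n)]
        for t in arr:
            buckets[t[pos]].append(t)   # stable: preserves earlier order
        arr = [t for bucket in buckets for t in bucket]
    return arr
```

Each pass costs time proportional to the number of tuples plus n, so no binary comparisons are needed at all; this is the kind of model-dependence the abstract highlights.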

38 citations


Journal ArticleDOI
TL;DR: It is shown that the problem of determining whether an arbitrary context-free grammar is a member of some easily parsed subclass of grammars such as the LR(k) grammars is NP-complete when k is expressed in unary.
Abstract: The problem of determining whether an arbitrary context-free grammar is a member of some easily parsed subclass of grammars such as the LR(k) grammars is considered. The time complexity of this problem is analyzed both when k is considered to be a fixed integer and when k is considered to be a parameter of the test. In the first case, it is shown that for every k there exists an $O(n^{k+2})$ algorithm for testing the LR(k) property, where n is the size of the grammar in question. On the other hand, if both k and the subject grammar are problem parameters, then the complexity of the problem depends very strongly on the representation chosen for k. More specifically, it is shown that this problem is NP-complete when k is expressed in unary. When k is expressed in binary the problem is complete for nondeterministic exponential time. These results carry over to many other parameterized classes of grammars, such as the LL(k), strong LL(k), SLR(k), LC(k), and strong LC(k) grammars.

Journal ArticleDOI
TL;DR: An additive degree of freedom is defined, which turns out to be an exact measure of the complexity of computation of a family F of linear forms in r variables over a field.
Abstract: The notion of a linear algorithm to compute a family F of linear forms in r variables over a field is defined. Ways to save additions are investigated by analyzing the combinatorial aspects of linear dependences between subrows of a given matrix F. Further, an additive degree of freedom is defined, which turns out to be an exact measure of the complexity of computation of F.

Journal ArticleDOI
TL;DR: A bounded-workspace copying algorithm for arbitrary list structures is given that operates in linear time, requires no tag bits, and is applicable to fixed- or variable-size nodes.
Abstract: A bounded-workspace copying algorithm for arbitrary list structures is given. This algorithm operates in linear time and does not require tag bits. The best previous bounded-workspace copying algorithms achieved n^2 time without tag bits and n log n time with one tag. The only restriction on the algorithm given here is that the copy must be placed into a contiguous section of memory. The method is applicable to fixed- or variable-size nodes.

Proceedings ArticleDOI
13 Oct 1975
TL;DR: Valiant's decision procedure for equivalence of deterministic finite-turn pushdown machines is improved upon and an algorithm for constructing a suitable function which determines the replacements is obtained.
Abstract: In this paper Valiant's decision procedure for equivalence of deterministic finite-turn pushdown machines is improved upon. The improved equivalence test is: given two machines, one constructs a pushdown machine that simulates them simultaneously and accepts a string iff it is accepted by exactly one of them. The given machines are equivalent iff the simulating pda accepts the empty language. The simulating machine uses its pushdown store to hold the contents of the stores of the two simulated machines. In order to prevent the tops of the two stores from getting too far apart, the simulating machine sometimes replaces the contents of one of the stores. Valiant [1, 2] has proved the existence of a suitable function which determines the replacements. The crux of this paper is an algorithm for constructing such a function. The existence of such an algorithm enables us to calculate an upper bound on the time complexity of the improved algorithm. We obtain the results that equivalence of deterministic finite-turn pushdown automata can be tested in super-exponential time and equivalence of deterministic two-tape automata can be tested in exponential time.

Proceedings ArticleDOI
05 May 1975
TL;DR: Several procedures based on (not necessarily regular) resolution for checking whether a formula in CN3 is contradictory are considered and all of the proposed procedures which are forced to run in polynomial time do not always work and do not identify all contradictory formulas.
Abstract: Several procedures based on (not necessarily regular) resolution for checking whether a formula in CN3 is contradictory are considered. The procedures use various methods of bounding the size of the clauses which are generated. The following results are obtained: 1) All of the proposed procedures which are forced to run in polynomial time do not always work—i.e., they do not identify all contradictory formulas. 2) Those which always work must run in exponential time. The exponential lower bounds for these procedures do not follow from Tseitin's lower bound for regular resolution since these procedures also allow nonregular resolution trees.

Proceedings ArticleDOI
01 Jan 1975
TL;DR: It is shown that both the upper and the lower bounds on the time complexity of the circularity test for Knuth's attribute grammars are exponential functions of the size of the grammar description.
Abstract: It is shown that both the upper and the lower bounds on the time complexity of the circularity test for Knuth's attribute grammars are exponential functions of the size of the grammar description. This result implies the "intractability" of the circularity test in the sense that the implementation of a general algorithm is not feasible. Another significance of this result is that this is one of the first problems actually arising in practice which has been proven to be of exponential time complexity.