Topic

Turing machine

About: Turing machine is a research topic. Over its lifetime, 5,017 publications have been published on this topic, receiving 125,257 citations. The topic is also known as: deterministic Turing machine.


Papers
Journal ArticleDOI
TL;DR: The amount of storage needed to simulate a nondeterministic tape-bounded Turing machine on a deterministic Turing machine is investigated, and a specific set is produced, namely the set of all codings of threadable mazes, such that if there is any set which distinguishes nondeterministic tape complexity classes from deterministic tape complexity classes, then this is one such set.
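A minimal sketch of the storage-saving idea behind such deterministic simulations, treating "threading a maze" as plain graph reachability and recursing on a midpoint so that only the recursion stack needs to be kept. The graph encoding, function name, and example maze below are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not from the paper): decide reachability by recursing on
# a midpoint instead of storing a whole path, so storage grows with the
# recursion depth rather than with the path length.

def reachable(adj, u, v, steps):
    """True if v can be reached from u in at most `steps` moves."""
    if steps == 0:
        return u == v
    if steps == 1:
        return u == v or v in adj.get(u, ())
    half = (steps + 1) // 2
    return any(
        reachable(adj, u, w, half) and reachable(adj, w, v, steps - half)
        for w in adj  # try every node as a potential midpoint
    )

# Hypothetical maze encoded as an adjacency map: cell -> cells reachable in one move.
maze = {"start": ["a"], "a": ["b"], "b": ["exit"], "exit": []}
print(reachable(maze, "start", "exit", len(maze)))  # True: the maze is threadable
```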

1,414 citations

Journal ArticleDOI
TL;DR: A sequence of restrictions is studied that limits grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources; these restrictions are shown to be increasingly heavy.
Abstract: A grammar can be regarded as a device that enumerates the sentences of a language. We study a sequence of restrictions that limit grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources (finite automata). These restrictions are shown to be increasingly heavy in the sense that the languages that can be generated by grammars meeting a given restriction constitute a proper subset of those that can be generated by grammars meeting the preceding restriction. Various formulations of phrase structure description are considered, and the source of their excess generative power over finite state sources is investigated in greater detail.
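As a concrete illustration of the "excess generative power" over finite state sources that the abstract refers to (my own example, not from the paper): the phrase-structure rule S -> a S b | ab generates {a^n b^n : n >= 1}, a language no finite state source can generate, since recognizing it requires matching unboundedly many a's against b's.

```python
# Illustrative sketch: derive strings from the context-free rule S -> a S b | ab
# and check membership in {a^n b^n : n >= 1}. Names and helpers are my own.
import re

def derive(n):
    """Apply S -> a S b exactly n-1 times, then S -> ab."""
    s = "S"
    for _ in range(n - 1):
        s = s.replace("S", "aSb")
    return s.replace("S", "ab")

def in_language(w):
    """Membership check for {a^n b^n : n >= 1}."""
    m = re.fullmatch(r"(a+)(b+)", w)
    return bool(m) and len(m.group(1)) == len(m.group(2))

print(derive(3))               # aaabbb
print(in_language(derive(3)))  # True
print(in_language("aabbb"))    # False
```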

1,330 citations

Posted Content
Alex Graves, Greg Wayne, Ivo Danihelka
TL;DR: Neural Turing Machines, as discussed by the authors, extend the capabilities of neural networks by coupling them to external memory resources, which they can interact with by attentional processes; the combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end.
Abstract: We extend the capabilities of neural networks by coupling them to external memory resources, which they can interact with by attentional processes. The combined system is analogous to a Turing Machine or Von Neumann architecture but is differentiable end-to-end, allowing it to be efficiently trained with gradient descent. Preliminary results demonstrate that Neural Turing Machines can infer simple algorithms such as copying, sorting, and associative recall from input and output examples.
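A minimal sketch of the kind of differentiable, attention-based memory read the abstract describes: a read head attends over an external memory matrix with a softmax over similarities, so the read is trainable by gradient descent. The shapes, names, and the cosine-similarity scoring choice are assumptions for illustration, not the paper's exact equations.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=5.0):
    """Read from memory (N slots x M features) with a content-based key.

    Attention weights come from a sharpened cosine similarity between the key
    and every memory row; the read vector is the weighted sum of rows.
    """
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    weights = softmax(beta * sims)      # soft, differentiable addressing
    return weights @ memory, weights

memory = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
read_vec, w = content_read(memory, key=np.array([1.0, 0.1]))
print(w.round(3))         # most weight on the first memory slot
print(read_vec.round(3))  # read vector pulled toward the first slot
```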

1,328 citations

Posted Content
TL;DR: This paper introduced pointer networks (Ptr-Net), which learn the conditional probability of an output sequence whose elements are discrete tokens corresponding to positions in an input sequence, using attention as a pointer to select a member of the input sequence as the output.
Abstract: We introduce a new neural architecture to learn the conditional probability of an output sequence with elements that are discrete tokens corresponding to positions in an input sequence. Such problems cannot be trivially addressed by existent approaches such as sequence-to-sequence and Neural Turing Machines, because the number of target classes in each step of the output depends on the length of the input, which is variable. Problems such as sorting variable sized sequences, and various combinatorial optimization problems belong to this class. Our model solves the problem of variable size output dictionaries using a recently proposed mechanism of neural attention. It differs from the previous attention attempts in that, instead of using attention to blend hidden units of an encoder to a context vector at each decoder step, it uses attention as a pointer to select a member of the input sequence as the output. We call this architecture a Pointer Net (Ptr-Net). We show Ptr-Nets can be used to learn approximate solutions to three challenging geometric problems -- finding planar convex hulls, computing Delaunay triangulations, and the planar Travelling Salesman Problem -- using training examples alone. Ptr-Nets not only improve over sequence-to-sequence with input attention, but also allow us to generalize to variable size output dictionaries. We show that the learnt models generalize beyond the maximum lengths they were trained on. We hope our results on these tasks will encourage a broader exploration of neural learning for discrete problems.
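A minimal sketch of "attention as a pointer" as described above: scores between the decoder state and each encoder state are normalized into a distribution over input positions, so the set of possible outputs grows with the input length rather than being a fixed vocabulary. Dimensions, parameter names, and the additive scoring form are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                            # hidden size (assumed)
enc = rng.normal(size=(5, d))    # encoder states, one per input element
dec = rng.normal(size=d)         # current decoder state

# Additive scoring: u_j = v . tanh(W1 e_j + W2 d)
W1, W2, v = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=d)
scores = np.tanh(enc @ W1.T + dec @ W2.T) @ v

probs = np.exp(scores - scores.max())
probs /= probs.sum()             # distribution over the 5 input positions
pointer = int(probs.argmax())    # the "pointed-at" input element

print(probs.round(3), "-> points to input position", pointer)
```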

1,082 citations

Journal ArticleDOI
TL;DR: A recognition algorithm is exhibited whereby an arbitrary string over a given vocabulary can be tested for containment in a given context-free language, and the algorithm is shown to complete in a number of steps proportional to the “cube” of the number of symbols in the tested string.
Abstract: A recognition algorithm is exhibited whereby an arbitrary string over a given vocabulary can be tested for containment in a given context-free language. A special merit of this algorithm is that it is completed in a number of steps proportional to the “cube” of the number of symbols in the tested string. As a byproduct of the grammatical analysis, required by the recognition algorithm, one can obtain, by some additional processing not exceeding the “cube” factor of computational complexity, a parsing matrix—a complete summary of the grammatical structure of the sentence. It is also shown how, by means of a minor modification of the recognition algorithm, one can obtain an integer representing the ambiguity of the sentence, i.e., the number of distinct ways in which that sentence can be generated by the grammar. The recognition algorithm is then simulated on a Turing Machine. It is shown that this simulation likewise requires a number of steps proportional to only the “cube” of the test string length.
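A minimal sketch of a cubic-time recognizer of the kind the abstract describes: a table indexed by substrings of the input is filled bottom-up (a CYK-style dynamic program over a grammar in Chomsky normal form). The toy palindrome grammar and helper names are my own illustration, not the paper's.

```python
# Cell (i, length) holds every nonterminal deriving the substring of `word`
# starting at i with that length; the triple loop over length, start position,
# and split point gives the "cube" bound.

def recognize(word, start, unary, binary):
    """True iff `word` is derivable from `start` (grammar in Chomsky normal form).

    unary:  {terminal: nonterminals A with A -> terminal}
    binary: {(B, C): nonterminals A with A -> B C}
    """
    n = len(word)
    table = [[set() for _ in range(n)] for _ in range(n)]  # table[i][length-1]
    for i, ch in enumerate(word):
        table[i][0] = set(unary.get(ch, ()))
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            for split in range(1, length):
                for B in table[i][split - 1]:
                    for C in table[i + split][length - split - 1]:
                        table[i][length - 1] |= binary.get((B, C), set())
    return start in table[0][n - 1]

# Toy CNF grammar for even-length palindromes over {a, b}:
# S -> A X | B Y | A A | B B,  X -> S A,  Y -> S B,  A -> a,  B -> b
unary = {"a": {"A"}, "b": {"B"}}
binary = {
    ("A", "X"): {"S"}, ("B", "Y"): {"S"},
    ("A", "A"): {"S"}, ("B", "B"): {"S"},
    ("S", "A"): {"X"}, ("S", "B"): {"Y"},
}
print(recognize("abba", "S", unary, binary))  # True
print(recognize("abab", "S", unary, binary))  # False
```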

1,075 citations


Network Information
Related Topics (5)
Time complexity           36K papers     879.5K citations   89% related
Data structure            28.1K papers   608.6K citations   87% related
Approximation algorithm   23.9K papers   654.3K citations   84% related
Concurrency               13K papers     347.1K citations   84% related
Directed graph            12.2K papers   302.4K citations   84% related
Performance
Metrics
No. of papers in the topic in previous years
Year   Papers
2023   56
2022   139
2021   128
2020   148
2019   168
2018   141