scispace - formally typeset
Topic

Pushdown automaton

About: Pushdown automaton is a research topic. Over its lifetime, 1,868 publications have been published within this topic, receiving 35,399 citations.
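As a concrete illustration of the topic (a minimal sketch, not drawn from any paper listed below), a pushdown automaton for the canonical context-free language a^n b^n can be simulated in a few lines of Python:

```python
def accepts_anbn(s: str) -> bool:
    """Simulate a minimal PDA for the language { a^n b^n : n >= 0 }.

    The finite control has two phases (reading a's, then b's); the
    stack pushes a marker per 'a' and pops one per 'b'.  Acceptance
    is by empty stack at the end of the input.
    """
    stack = []
    reading_bs = False  # finite control: have we started reading b's?
    for ch in s:
        if ch == "a":
            if reading_bs:
                return False     # an 'a' after a 'b' is not in the language
            stack.append("A")    # push
        elif ch == "b":
            reading_bs = True
            if not stack:
                return False     # more b's than a's
            stack.pop()          # pop
        else:
            return False         # alphabet is {a, b}
    return not stack             # equal counts iff stack is empty
```

The stack is exactly what separates this machine from a finite automaton: no finite-state device can count unboundedly many a's.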


Papers
Book ChapterDOI
11 Oct 2011
TL;DR: A novel canonical automaton model, based on register automata, that can easily be used to specify protocol or program behavior; it is not only easier to comprehend than previous proposals but can also be exponentially more succinct.
Abstract: We present a novel canonical automaton model, based on register automata, that can easily be used to specify protocol or program behavior. More concretely, register automata are reminiscent of control flow graphs: they comprise a finite control structure, assignments, and conditionals, allowing values from an infinite domain to be assigned to registers (variables) and compared for equality. A major contribution is the definition of a canonical automaton representation of any language recognizable by a deterministic register automaton, by means of a Nerode congruence. Not only is this canonical form easier to comprehend than previous proposals, but it can also be exponentially more succinct. Key to the canonical form is the symbolic treatment of data languages, which overcomes the structural restrictions in previous formalisms and opens the way to new practical applications.

27 citations
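The register-automaton model summarized above can be illustrated with a minimal sketch (a hypothetical encoding, not the paper's formalism): a finite control plus registers holding values from an infinite domain, where the only data operations are assignment and equality comparison.

```python
def first_equals_last(seq) -> bool:
    """Run a tiny deterministic register automaton accepting nonempty
    sequences (over an arbitrary infinite domain) whose last element
    equals the first.

    As in the model above, the machine is a finite control plus
    registers, and the only data operations are assignment and
    equality tests; here a single register r1 suffices.
    """
    state, r1 = "init", None
    for v in seq:
        if state == "init":
            r1, state = v, "eq"                  # assign first value to r1
        else:
            state = "eq" if v == r1 else "neq"   # equality guard against r1
    return state == "eq"
```

Because the domain is infinite, no finite automaton over a fixed alphabet can recognize this language; the register is what makes it expressible.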

Journal ArticleDOI
TL;DR: An elementary construction of a Büchi automaton with O(16^(n^2)) states that recognizes the complement of the ω-language recognized by a given n-state Büchi automaton is proposed.

27 citations

Journal ArticleDOI
TL;DR: This paper describes a new construction, inspired by Piterman's improvement to Safra's method, that produces an automaton with fewer states, via a nondeterministic automaton having derivatives as states.
Abstract: In an earlier paper, the author used derivatives to construct a deterministic automaton recognizing the language defined by an ω-regular expression. The construction was related to a determinization method invented by Safra. This paper describes a new construction, inspired by Piterman's improvement to Safra's method. It produces an automaton with fewer states. In addition, the presentation and proofs are simplified by going via a nondeterministic automaton having derivatives as states.

27 citations
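The derivative construction mentioned in the abstract has a well-known finite-word analogue, Brzozowski derivatives, which already shows the key idea of "an automaton having derivatives as states." The following sketch (standard textbook material, not the paper's ω-regular construction) computes derivatives over a small regular-expression AST:

```python
# Brzozowski derivatives: the derivative of a language L by symbol a is
# { w : aw in L }.  Repeatedly taking derivatives and checking whether
# the result accepts the empty string decides membership; the distinct
# derivatives are the states of an automaton for L.
from dataclasses import dataclass

class Re: pass

@dataclass(frozen=True)
class Empty(Re): pass          # the empty language
@dataclass(frozen=True)
class Eps(Re): pass            # the empty string
@dataclass(frozen=True)
class Sym(Re):
    c: str
@dataclass(frozen=True)
class Alt(Re):
    l: Re
    r: Re
@dataclass(frozen=True)
class Cat(Re):
    l: Re
    r: Re
@dataclass(frozen=True)
class Star(Re):
    r: Re

def nullable(r: Re) -> bool:
    """Does the language of r contain the empty string?"""
    if isinstance(r, (Eps, Star)): return True
    if isinstance(r, (Empty, Sym)): return False
    if isinstance(r, Alt): return nullable(r.l) or nullable(r.r)
    return nullable(r.l) and nullable(r.r)          # Cat

def deriv(r: Re, a: str) -> Re:
    if isinstance(r, (Empty, Eps)): return Empty()
    if isinstance(r, Sym): return Eps() if r.c == a else Empty()
    if isinstance(r, Alt): return Alt(deriv(r.l, a), deriv(r.r, a))
    if isinstance(r, Star): return Cat(deriv(r.r, a), r)
    # Cat: d_a(LR) = d_a(L)R  (plus d_a(R) when L is nullable)
    first = Cat(deriv(r.l, a), r.r)
    return Alt(first, deriv(r.r, a)) if nullable(r.l) else first

def matches(r: Re, w: str) -> bool:
    for a in w:
        r = deriv(r, a)
    return nullable(r)
```

The paper's contribution lies in lifting this idea to ω-regular languages, where acceptance of infinite words makes determinization (Safra's method and Piterman's refinement) far more involved.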

Book ChapterDOI
06 Sep 1997
TL;DR: The Neural Network Pushdown Automaton (NNPDA) model as discussed by the authors is a hybrid model that couples a recurrent network to an external stack memory to learn a class of context-free grammars.
Abstract: Recurrent neural networks are dynamical network structures which have the capabilities of processing and generating temporal information. To our knowledge the earliest neural network model that processed temporal information was that of McCulloch and Pitts [McCulloch43]. Kleene [Kleene56] extended this work to show the equivalence of finite automata and McCulloch and Pitts' representation of nerve net activity. Minsky [Minsky67] showed that any hard-threshold neural network could represent a finite state automaton and developed a method for actually constructing a neural network finite state automaton. However, many different neural network models can be defined as recurrent; for example see [Grossberg82] and [Hopfield82]. Our focus is on discrete-time recurrent neural networks that dynamically process temporal information and follows in the tradition of dynamically driven (nonautonomous) recurrent network models defined by [Elman90, Jordan86, Narendra90, Pollack91, Tsoi94]. In particular this paper develops a new model, a neural network pushdown automaton (NNPDA), which is a hybrid system that couples a recurrent network to an external stack memory. More importantly, an NNPDA should be capable of learning and recognizing some class of context-free grammars. As such, this model is a significant extension of previous work where neural network finite state automata simulated and learned regular grammars. We explore the capabilities of such a model by inferring automata from sample strings, the problem of grammatical inference. It is important to note that our focus is only on that of inference, not of prediction or translation. We will be concerned with the problem of inferring an unknown system model based on observing sample strings and not on predicting the next string element in a sequence. In some ways, our problem can be thought of as one of system identification [Ljung87].

26 citations
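The coupling described above, a recurrent controller attached to an external stack, can be sketched schematically. This is an untrained toy with hypothetical layer sizes and a hard push/pop decision, not the NNPDA's actual (differentiable) formulation:

```python
import numpy as np

class StackRNN:
    """Schematic stack-augmented recurrent cell in the spirit of the
    NNPDA: the controller state depends on the input, its previous
    state, and the current stack top, and an action head decides
    whether to push, pop, or leave the external stack unchanged."""

    def __init__(self, n_in, n_hid, n_stack, rng):
        self.Wx = rng.normal(0, 0.5, (n_hid, n_in))     # input weights
        self.Wh = rng.normal(0, 0.5, (n_hid, n_hid))    # recurrent weights
        self.Ws = rng.normal(0, 0.5, (n_hid, n_stack))  # stack-top weights
        self.Wa = rng.normal(0, 0.5, (3, n_hid))        # push/pop/no-op scores
        self.Wv = rng.normal(0, 0.5, (n_stack, n_hid))  # pushed-vector head
        self.n_stack = n_stack
        self.h = np.zeros(n_hid)
        self.stack = []                                 # external stack memory

    def step(self, x):
        top = self.stack[-1] if self.stack else np.zeros(self.n_stack)
        self.h = np.tanh(self.Wx @ x + self.Wh @ self.h + self.Ws @ top)
        action = int(np.argmax(self.Wa @ self.h))       # hard decision for clarity
        if action == 0:                                 # push a new stack symbol
            self.stack.append(np.tanh(self.Wv @ self.h))
        elif action == 1 and self.stack:                # pop the top symbol
            self.stack.pop()
        return self.h
```

In the paper's setting the stack actions are continuous so the whole system can be trained by gradient descent; the hard argmax here is only to keep the control flow readable.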

Journal ArticleDOI
TL;DR: The use of pushdown automata (PDA) in statistical machine translation and alignment under a synchronous context-free grammar is described, and a two-pass decoding strategy with a weaker first-pass language model is proposed to address the results of the PDA complexity analysis.
Abstract: This article describes the use of pushdown automata (PDA) in the context of statistical machine translation and alignment under a synchronous context-free grammar. We use PDAs to compactly represent the space of candidate translations generated by the grammar when applied to an input sentence. General-purpose PDA algorithms for replacement, composition, shortest path, and expansion are presented. We describe HiPDT, a hierarchical phrase-based decoder using the PDA representation and these algorithms. We contrast the complexity of this decoder with a decoder based on a finite state automata representation, showing that PDAs provide a more suitable framework to achieve exact decoding for larger synchronous context-free grammars and smaller language models. We assess this experimentally on a large-scale Chinese-to-English alignment and translation task. In translation, we propose a two-pass decoding strategy involving a weaker language model in the first-pass to address the results of PDA complexity analysis. We study in depth the experimental conditions and tradeoffs in which HiPDT can achieve state-of-the-art performance for large-scale SMT.

26 citations


Network Information
Related Topics (5)
Time complexity: 36K papers, 879.5K citations (87% related)
Finite-state machine: 15.1K papers, 292.9K citations (86% related)
Model checking: 16.9K papers, 451.6K citations (84% related)
Concurrency: 13K papers, 347.1K citations (84% related)
String (computer science): 19.4K papers, 333.2K citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2023  14
2022  34
2021  29
2020  52
2019  47
2018  34