scispace - formally typeset
Topic

ω-automaton

About: ω-automaton is a research topic. Over the lifetime, 2299 publications have been published within this topic receiving 68468 citations. The topic is also known as: stream automaton & ω-automata.


Papers

Proceedings ArticleDOI
01 Aug 2012
TL;DR: An improved tableau-based algorithm is proposed, which generates transition-based Büchi automata for given LTL formulas directly, by applying an efficient on-the-fly degeneralization.
Abstract: The translation from LTL formulas to Büchi automata plays a key role in explicit model checking. Most algorithms for obtaining Büchi automata from LTL formulas go through intermediate forms and perform simplifications on the intermediate or final translation product. In this paper an improved tableau-based algorithm is proposed, which generates transition-based Büchi automata directly from given LTL formulas by applying an efficient on-the-fly degeneralization. The algorithm circumvents the intermediate steps and the simplification process that follows, and therefore performs more efficiently. On-the-fly simplifications as well as BDD representations are adopted in the algorithm to achieve a significant reduction both in the size of the resulting automata and in the computational complexity. Experimental results comparing our method with previous work show that our approach yields smaller automata for the formulas commonly found in the literature, especially those containing a large portion of GU- or GF-formulas.

5 citations
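The degeneralization mentioned in the abstract converts a generalized Büchi automaton (one with several acceptance sets) into an ordinary Büchi automaton. As an illustration only, here is a minimal Python sketch of the classical counter-based construction over state-based acceptance sets — not the paper's on-the-fly, transition-based variant; all identifiers are illustrative.

```python
# Classical counter-based degeneralization: turn a generalized Büchi
# automaton with acceptance sets F_0..F_{k-1} into an ordinary Büchi
# automaton. Textbook construction, not the paper's algorithm.

def degeneralize(delta, init, acc_sets):
    """delta: dict mapping state -> list of (symbol, successor);
    init: initial state; acc_sets: list of acceptance sets (sets of
    states). Returns (states, delta', init', accepting) of an
    equivalent Büchi automaton whose states are pairs (q, i), where
    i indexes the acceptance set currently awaited."""
    k = len(acc_sets)
    new_delta, new_states = {}, set()
    frontier = [(init, 0)]
    while frontier:
        q, i = frontier.pop()
        if (q, i) in new_states:
            continue
        new_states.add((q, i))
        # advance the counter once the current state satisfies F_i
        j = (i + 1) % k if q in acc_sets[i] else i
        for sym, q2 in delta.get(q, []):
            new_delta.setdefault((q, i), []).append((sym, (q2, j)))
            frontier.append((q2, j))
    # a run visits every F_i infinitely often iff it is infinitely
    # often in a state of F_{k-1} while awaiting the last set
    accepting = {(q, i) for (q, i) in new_states
                 if i == k - 1 and q in acc_sets[k - 1]}
    return new_states, new_delta, (init, 0), accepting
```

The product blows the state space up by at most a factor of k; the paper's contribution is avoiding a separate pass by degeneralizing on the fly during tableau construction.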

Book ChapterDOI
06 Oct 1997
TL;DR: The class of simple deterministic finite-memory automata is exactly learnable via membership and equivalence queries and the running time is estimated.
Abstract: This paper establishes the learnability of simple deterministic finite-memory automata via membership and equivalence queries. Simple deterministic finite-memory automata are a subclass of the finite-memory automata introduced by Kaminski and Francez [9] as a model generalizing finite automata to infinite alphabets. For arriving at a meaningful learning model, we first prove the equivalence problem for simple deterministic finite-memory automata to be decidable by reducing it to the equivalence problem for finite-state automata. In particular, there exists a decision algorithm taking as input any two simple deterministic finite-memory automata A and B which computes a number k from A and B as well as two finite-state automata MA and MB over a finite alphabet Σ of cardinality k such that A and B are equivalent over all alphabets iff MA and MB are equivalent over Σ. Next, we provide the announced learning algorithm, show its correctness, and analyze its running time. The algorithm is partially based on Angluin's [1] observation table. In particular, for every target and each finite alphabet Σ, the algorithm outputs a hypothesis that is consistent with the target over Σ. Together with the first result mentioned above, we obtain the main result of this paper, i.e., the class of simple deterministic finite-memory automata is exactly learnable via membership and equivalence queries. Finally, the running time is estimated.

5 citations
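Angluin-style observation tables, which the algorithm above adapts, drive learning by keeping the table "closed": every one-letter extension of a row prefix must reproduce a row already present, otherwise the extension becomes a new prefix. A minimal Python sketch of that closedness check under the usual L* assumptions — identifiers are illustrative, and membership queries are modelled as a callback, not the paper's construction:

```python
# Closedness check on an Angluin-style observation table.
# `member(w)` answers a membership query for word w.

def row(prefix, suffixes, member):
    # the row of `prefix`: membership answers for prefix + each suffix
    return tuple(member(prefix + s) for s in suffixes)

def find_unclosed(prefixes, suffixes, alphabet, member):
    """Return a one-letter extension whose row matches no existing
    prefix row (a witness that the table is not closed), or None."""
    prefix_rows = {row(p, suffixes, member) for p in prefixes}
    for p in prefixes:
        for a in alphabet:
            if row(p + a, suffixes, member) not in prefix_rows:
                return p + a
    return None
```

For example, learning "even-length words over {a}" from the one-cell table with prefix "" flags "a" as unclosed; adding "a" as a prefix closes the table.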

01 Jan 2003
TL;DR: This chapter presents a new algorithm for incrementally building minimal acyclic deterministic finite automata, which facilitates the construction of very large automata in limited computer memory where other (nonincremental) algorithms would fail with intermediate data structures too large to fit in memory.
Abstract: This chapter presents a new algorithm for incrementally building minimal acyclic deterministic finite automata. Such minimal automata are a compact representation of a finite set of words (e.g. in a spell checker). The incremental aspect of such algorithms (where the intermediate automaton is minimal) facilitates the construction of very large automata in limited computer memory where other (nonincremental) algorithms would fail with intermediate data structures too large to fit in memory.

5 citations
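The incremental idea can be illustrated with the well-known sorted-input variant of this construction (in the style of Daciuk et al.): as each word arrives, only the part of the previous word's path that the new word no longer shares is minimized against a registry of canonical states, so the intermediate automaton stays minimal. A hedged Python sketch — illustrative names, not the chapter's exact pseudocode:

```python
# Incremental construction of a minimal acyclic DFA from a sorted
# word list (Daciuk-et-al. style sketch; names are illustrative).

class State:
    __slots__ = ("edges", "final")
    def __init__(self):
        self.edges = {}          # char -> State
        self.final = False
    def signature(self):
        # identifies the state's right language; children are already
        # minimized when this runs, so id() is a sound registry key
        return (self.final,
                tuple(sorted((c, id(s)) for c, s in self.edges.items())))

class IncrementalMinDFA:
    def __init__(self):
        self.root = State()
        self.previous = ""
        self.unchecked = []      # (parent, char, child) not yet minimized
        self.registry = {}       # signature -> canonical state

    def insert(self, word):
        assert word > self.previous, "input must be sorted"
        # length of the common prefix with the previous word
        cp = 0
        while (cp < min(len(word), len(self.previous))
               and word[cp] == self.previous[cp]):
            cp += 1
        self._minimize(cp)       # the diverging suffix is now frozen
        node = self.unchecked[-1][2] if self.unchecked else self.root
        for ch in word[cp:]:
            nxt = State()
            node.edges[ch] = nxt
            self.unchecked.append((node, ch, nxt))
            node = nxt
        node.final = True
        self.previous = word

    def finish(self):
        self._minimize(0)

    def _minimize(self, down_to):
        # merge unchecked states (deepest first) with registered twins
        while len(self.unchecked) > down_to:
            parent, ch, child = self.unchecked.pop()
            sig = child.signature()
            if sig in self.registry:
                parent.edges[ch] = self.registry[sig]
            else:
                self.registry[sig] = child

    def accepts(self, word):
        node = self.root
        for ch in word:
            if ch not in node.edges:
                return False
            node = node.edges[ch]
        return node.final
```

Because the registry only ever holds states whose right languages are frozen, the automaton is minimal after every `insert`, which is exactly what keeps peak memory bounded for very large word lists.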


Network Information
Related Topics (5)
Time complexity
36K papers, 879.5K citations
88% related
Data structure
28.1K papers, 608.6K citations
83% related
Model checking
16.9K papers, 451.6K citations
83% related
Approximation algorithm
23.9K papers, 654.3K citations
82% related
Petri net
25K papers, 406.9K citations
82% related
Performance
Metrics
No. of papers in the topic in previous years
Year   Papers
2023   8
2022   19
2020   1
2019   1
2018   5
2017   48