Topic
Automaton
About: Automaton is a research topic. Over its lifetime, 2389 publications have been published on this topic, receiving 53824 citations. The topic is also known as: automata & automated machine.
Papers published on a yearly basis
Papers
TL;DR: Theoretically, it is proved that the automaton converges to the global optimum with a probability arbitrarily close to one, and the simulation result shows that this automata model converges faster than the existing models in the literature.
Abstract: The multi-modal optimization problem is considered. An automaton model with improved learning schemes is proposed to solve the global optimization problem. The numerical simulation shows that the automaton approach is better than the well-known gradient approach, because the gradient approach is easily trapped in local optima. Theoretically, we prove that the automaton converges to the global optimum with a probability arbitrarily close to one. The simulation results also show that our automaton model converges faster than the existing models in the literature.
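The abstract does not spell out the paper's improved learning scheme, but the general mechanism of a variable-structure learning automaton can be sketched. The following is a minimal, hypothetical illustration using the classic linear reward-inaction (L_RI) update, not the authors' scheme; the objective function, the 1-D grid of actions, the step size, and the reward environment are all invented for the example.

```python
import math
import random

def learning_automaton(env, n_actions, step=0.02, iters=30000, seed=0):
    """Minimal linear reward-inaction (L_RI) learning automaton.

    env(a) returns 1 (reward) or 0 (penalty) for action index a.
    On reward, probability mass shifts toward the chosen action;
    on penalty, the action probabilities are left unchanged.
    """
    rng = random.Random(seed)
    p = [1.0 / n_actions] * n_actions
    for _ in range(iters):
        a = rng.choices(range(n_actions), weights=p)[0]
        if env(a):
            p = [(1 - step) * q + (step if i == a else 0.0)
                 for i, q in enumerate(p)]
    return p

# A multi-modal objective on a coarse 1-D grid: a local peak near
# x = 0.2 and the global peak at x = 0.8 (both invented for the demo).
grid = [i / 10 for i in range(10)]
def f(x):
    return 0.6 * math.exp(-80 * (x - 0.2) ** 2) + math.exp(-80 * (x - 0.8) ** 2)

# Stochastic binary environment: reward with probability f(x) / max f.
fmax = max(f(x) for x in grid)
env_rng = random.Random(1)
def env(a):
    return 1 if env_rng.random() < f(grid[a]) / fmax else 0

probs = learning_automaton(env, len(grid))
best = max(range(len(grid)), key=probs.__getitem__)
# With a small enough step, the probability vector concentrates on the
# action with the highest reward probability, i.e. near x = 0.8.
print(grid[best], round(max(probs), 3))
```

A gradient method started near x = 0.2 would climb the local peak and stop; the automaton keeps sampling all actions with nonzero probability, which is what makes escape from local optima possible.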
01 Jan 2016
TL;DR: An inductive algorithm is introduced that computes the expansion of an expression, from which the automaton follows; it is independent of the size of the alphabet and even supports infinite alphabets.
Abstract: We present an algorithm to build an automaton from a rational expression. This approach introduces support for extended weighted expressions. Inspired by derived-term based algorithms, its core relies on a different construct, rational expansions. We introduce an inductive algorithm to compute the expansion of an expression, from which the automaton follows. This algorithm is independent of the size of the alphabet, and actually even supports infinite alphabets. It can easily be adapted to generate deterministic (weighted) automata. These constructs are implemented in Vcsn, a free-software platform dedicated to weighted automata and rational expressions.
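The expansion construct itself is specific to the paper, but the derived-term idea it builds on can be illustrated with classic Brzozowski derivatives: each distinct derivative of the expression becomes a state of the automaton. The sketch below is a minimal unweighted version over an explicit alphabet (unlike the paper's alphabet-independent expansions); the tuple encoding of expressions is invented for the example.

```python
# Expressions as tuples: ('empty',), ('eps',), ('sym', c),
# ('alt', e1, e2), ('cat', e1, e2), ('star', e).
EMPTY, EPS = ('empty',), ('eps',)

def alt(a, b):  # smart constructor: simplify to keep derivatives finite
    if a == EMPTY: return b
    if b == EMPTY: return a
    return a if a == b else ('alt', a, b)

def cat(a, b):
    if a == EMPTY or b == EMPTY: return EMPTY
    if a == EPS: return b
    if b == EPS: return a
    return ('cat', a, b)

def nullable(e):  # does the language of e contain the empty word?
    t = e[0]
    if t in ('eps', 'star'): return True
    if t in ('empty', 'sym'): return False
    if t == 'cat': return nullable(e[1]) and nullable(e[2])
    return nullable(e[1]) or nullable(e[2])  # alt

def deriv(e, c):  # Brzozowski derivative of e with respect to letter c
    t = e[0]
    if t in ('empty', 'eps'): return EMPTY
    if t == 'sym': return EPS if e[1] == c else EMPTY
    if t == 'alt': return alt(deriv(e[1], c), deriv(e[2], c))
    if t == 'star': return cat(deriv(e[1], c), e)
    d = cat(deriv(e[1], c), e[2])  # cat
    return alt(d, deriv(e[2], c)) if nullable(e[1]) else d

def matches(e, word):
    for c in word:
        e = deriv(e, c)
    return nullable(e)

def build_dfa(e, alphabet):
    """Explore derivatives: each distinct derivative is a DFA state."""
    states, trans, todo = {e}, {}, [e]
    while todo:
        s = todo.pop()
        for c in alphabet:
            d = deriv(s, c)
            trans[(s, c)] = d
            if d not in states:
                states.add(d)
                todo.append(d)
    return states, trans

# a(b|c)* : an 'a' followed by any mix of b's and c's.
expr = cat(('sym', 'a'), ('star', alt(('sym', 'b'), ('sym', 'c'))))
print(matches(expr, "abcb"), matches(expr, "ba"))  # → True False
states, trans = build_dfa(expr, "abc")
print(len(states))  # → 3 distinct derived terms
```

Note that `build_dfa` must enumerate the alphabet at each state; the paper's expansions avoid exactly this, which is how they accommodate infinite alphabets.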
01 Jan 2011
TL;DR: It is demonstrated that BNAs provide a useful framework for developing and analysing models and algorithms for structure prediction, and that the complexity of inference with a BNA is bounded by the complexity of inference in the bounded Bayesian Network times the complexity of inference for the equivalent stochastic automaton.
Abstract: This paper proposes a framework which unifies graphical model theory and formal language theory through automata theory. Specifically, we propose Bayesian Network Automata (BNAs) as a formal framework for specifying graphical models of arbitrarily large structures, or equivalently, specifying probabilistic grammars in terms of graphical models. BNAs use a formal automaton to specify how to construct an arbitrarily large Bayesian Network by connecting multiple copies of a bounded Bayesian Network. Using a combination of results from graphical models and formal language theory, we show that, for a large class of automata, the complexity of inference with a BNA is bounded by the complexity of inference in the bounded Bayesian Network times the complexity of inference for the equivalent stochastic automaton. This illustrates that BNAs provide a useful framework for developing and analysing models and algorithms for structure prediction.
18 Sep 1997
TL;DR: The means to directly use rules containing don't-care symbols are presented, leading to smaller memory usage and a faster implementation of local grammars.
Abstract: This article presents a practical implementation of local grammars used in Natural Language Processing. The application of local grammars is a problem of multiple pattern matching. The usual strategy derived from [1] is to build a deterministic automaton of Σ*G, where G is the local grammar, and then compute the intersection (or the difference, in the case of a negative grammar) of this automaton and an automaton representing a tagged text. This is very expensive in terms of memory usage: the automaton for Σ*G already uses more than 30 megabytes, and our grammars are not finished yet. These figures come from two facts: first, the number of grammar rules and, second, the extensive use of generic or "don't care" symbols in these rules. We present here the means to directly use rules with these symbols, leading to a smaller memory usage and a faster implementation.
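The article does not give its data structures, but the core idea of applying rules with don't-care symbols directly to a tagged text, rather than compiling and intersecting the full Σ*G automaton, can be sketched. Everything below (the constraint dicts, the "*" wildcard marker, the sample tagging) is a hypothetical illustration, not the authors' implementation.

```python
# A rule is a sequence of constraints over (word, tag) pairs.
# A constraint may fix the word, the tag, or both; the hypothetical
# marker "*" (or an omitted key) is a "don't care" that matches anything.
def token_matches(constraint, token):
    word, tag = token
    return (constraint.get('word', '*') in ('*', word)
            and constraint.get('tag', '*') in ('*', tag))

def find_matches(rule, tagged_text):
    """Slide the rule over the tagged text; return matched spans."""
    hits = []
    for i in range(len(tagged_text) - len(rule) + 1):
        if all(token_matches(c, tagged_text[i + j]) for j, c in enumerate(rule)):
            hits.append((i, i + len(rule)))
    return hits

text = [("the", "DET"), ("quick", "ADJ"), ("fox", "NOUN"),
        ("jumps", "VERB"), ("the", "DET"), ("lazy", "ADJ"), ("dog", "NOUN")]
rule = [{'tag': 'DET'}, {'tag': '*'}, {'tag': 'NOUN'}]  # DET, anything, NOUN
print(find_matches(rule, text))  # → [(0, 3), (4, 7)]
```

The memory trade-off the paper describes is visible here: the rules stay in their compact wildcard form instead of being expanded over the whole tagset alphabet Σ, at the cost of matching work at query time.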
01 Jul 1997
TL;DR: The simulation of the behavior of physical systems at the coarsest level, namely restoring the logical structure of propositions about the system, is suggested, and a special class of finite automata, called normalized automata, is introduced.
Abstract: The simulation of the behavior of physical systems at the coarsest level, namely restoring the logical structure of propositions about the system, is suggested. To realize this simulation, a special class of finite automata, called normalized automata, is introduced. Relevant graph-theoretical techniques are considered.