
Showing papers on "Nested word published in 2015"


Proceedings Article
25 Jul 2015
TL;DR: This paper introduces UL* -- a learning algorithm for universal automata (the dual of non-deterministic automata) -- and AL* -- a learning algorithm for alternating automata, which generalize both universal and non-deterministic automata.
Abstract: Nearly all algorithms for learning an unknown regular language, in particular the popular L* algorithm, yield deterministic finite automata. It was recently shown that the ideas of L* can be extended to yield non-deterministic automata, and that the respective learning algorithm, NL*, outperforms L* on randomly generated regular expressions. We conjectured that this is due to the existential nature of regular expressions, and NL* might not outperform L* on languages with a universal nature. In this paper we introduce UL* -- a learning algorithm for universal automata (the dual of non-deterministic automata); and AL* -- a learning algorithm for alternating automata (which generalize both universal and non-deterministic automata). Our empirical results illustrate the advantages and trade-offs among L*, NL*, UL* and AL*.

41 citations
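
The L*-family algorithms compared above all rest on the same bookkeeping: an observation table filled in by membership queries and expanded until it is closed. As a rough illustration only (this is not UL* or AL* from the paper; the target language and oracle below are toy stand-ins), the core closedness loop of an L*-style learner can be sketched in Python:

# Minimal sketch of the observation-table loop behind L*-style learning.
# The membership oracle and target language are illustrative stand-ins.

ALPHABET = "ab"

def member(word):
    # Toy target language: words with an even number of 'a's.
    return word.count("a") % 2 == 0

def row(prefix, suffixes):
    return tuple(member(prefix + e) for e in suffixes)

def lstar_table():
    prefixes = {""}      # S: access strings
    suffixes = [""]      # E: distinguishing suffixes
    while True:
        rows = {p: row(p, suffixes) for p in prefixes}
        # Closedness: every one-letter extension must match some known row.
        unmatched = next((p + a for p in prefixes for a in ALPHABET
                          if row(p + a, suffixes) not in rows.values()), None)
        if unmatched is None:
            return prefixes, suffixes, rows
        prefixes.add(unmatched)              # new access string, then re-check

S, E, rows = lstar_table()
print("access strings:", sorted(S))
print("distinct rows :", len(set(rows.values())))   # = states of the hypothesis

Roughly speaking, L*, NL*, UL* and AL* differ in how the rows of such a table are turned into the states of a deterministic, non-deterministic, universal or alternating hypothesis automaton, not in this query loop.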


DOI
01 Jan 2015
TL;DR: One of the stated goals of this thesis is to change this situation, by giving a rigorously formal description of an approach to active automata learning that is independent of specific data structures or algorithmic realizations.
Abstract: The wealth of model-based techniques in software engineering—such as model checking or model-based testing—is starkly contrasted with a frequent lack of formal models in practical settings. Sophisticated static analysis techniques for obtaining models from a source or bytecode representation have matured to close this gap to a large extent, yet they might fall short on more complex systems: be it that no sufficiently robust decision procedures are available, or that the system performs calls to external, closed source libraries or even remote web services. Active automata learning has been proposed as a means of overcoming this problem: by executing test cases on a system, finite-state machine models reflecting a portion of the actual runtime behavior of the targeted system can be inferred. This positions active automata learning as an enabler technology, extending the range of application for a whole array of formal, model-based techniques. Its usefulness has been proven in many different subfields of formal methods, such as black-box model checking, test-case generation, interface synthesis, or compositional verification. In a much-noted case study, active automata learning played a key role in analyzing the internal structure of a botnet with the aim of devising countermeasures. One of the major obstacles to applying active automata learning in practice is, however, the fact that it is a rather costly technique: to gain sufficient information for inferring a model, a large number of test cases need to be executed, which is also referred to as “posing queries.” These test cases may be rather heavyweight, comprising high-latency operations such as interactions with hardware or remote network services, and learning systems of moderate size may take hours or days even when using algorithms with polynomial query complexities. The costliness of the technique calls for highly efficient algorithms that do not waste any information. The reality is surprisingly different from that ideal: many active automata learning algorithms that are being used in practice—including the well-known L∗ algorithm, which was the first one with a polynomial query complexity—frequently resort to heuristics to ensure certain properties, resulting in an increased overall query complexity. However, it has rarely been investigated why or even if these properties are necessary to ensure correctness, or what violating them entails. Related is the observation that descriptions of active automata learning algorithms are often less-than-formal, and merely focus on somehow arriving at a correctness proof instead of motivating and justifying the single steps. It is one of the stated goals of this thesis to change this situation, by giving a rigorously formal description of an approach to active automata learning that is independent of specific data structures or algorithmic realizations. This formal description allows the identification of a number of properties, some of which are necessary, while others are merely desirable. The connection between these properties, as well as possible reasons for their violation, is investigated. This leads to the observation that, while for each property there is an existing algorithm maintaining it, no algorithm manages to simultaneously maintain all desirable properties. Based on these observations, and exploiting further insights attained through the formalization, a novel active automata learning algorithm, called TTT, is developed. The distinguishing

38 citations


Journal ArticleDOI
01 Jan 2015
TL;DR: The equivalence between 2-limited automata and pushdown automata is investigated, and exponential upper and lower bounds are proved for the sizes of pushdown automata simulating 2-limited automata.
Abstract: Limited automata are one-tape Turing machines which are allowed to rewrite each tape cell only in the first d visits, for a given constant d. For each d ≥ 2, these devices characterize the class of context-free languages. We investigate the equivalence between 2-limited automata and pushdown automata, comparing the relative sizes of their descriptions. We prove exponential upper and lower bounds for the sizes of pushdown automata simulating 2-limited automata. In the case of the conversion of deterministic 2-limited automata into deterministic pushdown automata the upper bound is double exponential and we conjecture that it cannot be reduced. On the other hand, from pushdown automata we can obtain equivalent 2-limited automata of polynomial size, also preserving determinism. From our results, it follows that the class of languages accepted by deterministic 2-limited automata coincides with the class of deterministic context-free languages.

30 citations


Book ChapterDOI
16 Sep 2015
TL;DR: In this paper, it is shown that one-way bounded-error probabilistic automata can solve a family of unary promise problems with only two states, while the number of states required by two-way alternating or any simpler automata to solve these problems cannot be bounded by a constant.
Abstract: Promise problems were mainly studied in quantum automata theory. Here we focus on state complexity of classical automata for promise problems. First, it was known that there is a family of unary promise problems solvable by quantum automata by using a single qubit, but the number of states required by corresponding one-way deterministic automata cannot be bounded by a constant. For this family, we show that even two-way nondeterminism does not help to save a single state. By comparing this with the corresponding state complexity of alternating machines, we then get a tight exponential gap between two-way nondeterministic and one-way alternating automata solving unary promise problems. Second, despite the existing quadratic gap between Las Vegas realtime probabilistic automata and one-way deterministic automata for language recognition, we show that, by turning to promise problems, the tight gap becomes exponential. Last, we show that the situation is different for one-way probabilistic automata with two-sided bounded-error. We present a family of unary promise problems that is very easy for these machines; solvable with only two states, but the number of states in two-way alternating or any simpler automata is not limited by a constant. Moreover, we show that one-way bounded-error probabilistic automata can solve promise problems not solvable at all by any other classical model.

25 citations
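
The single-qubit quantum automata mentioned above are usually presented via a rotation construction in the style of Ambainis and Yakaryilmaz: the qubit is rotated by a fixed angle per input symbol, and the length promise makes the final measurement deterministic. The sketch below only checks that textbook construction numerically; the parameter k and the promise encoding are illustrative and may differ from the exact family used in the paper.

import math

def solve_promise(k, n):
    # Rotate by pi / 2**(k+1) per symbol of the unary input a^n.
    theta = n * math.pi / 2 ** (k + 1)
    prob_zero = math.cos(theta) ** 2       # probability of measuring |0>
    # Promise: n is a multiple of 2**(k+1) (yes) or off by 2**k (no),
    # so prob_zero is exactly 1 or exactly 0 (up to floating-point noise).
    return prob_zero > 0.5

k = 3
print(solve_promise(k, 2 ** (k + 1) * 5))            # True : 'yes' instance
print(solve_promise(k, 2 ** (k + 1) * 5 + 2 ** k))   # False: 'no' instance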


Proceedings ArticleDOI
06 Jul 2015
TL;DR: In this paper, it is shown that the model of dense-timed pushdown automata collapses: it is expressively equivalent to dense-timed pushdown automata with a timeless stack, and that the general model of first-order definable pushdown automata obtained in this way is Turing complete.
Abstract: This paper contains two results on timed extensions of pushdown automata (PDA). As our first result we prove that the model of dense-timed PDA of Abdulla et al. collapses: it is expressively equivalent to dense-timed PDA with timeless stack. Motivated by this result, we advocate the framework of first-order definable PDA, a specialization of PDA in sets with atoms, as the right setting to define and investigate timed extensions of PDA. The general model obtained in this way is Turing complete. As our second result we prove a NEXPTIME upper complexity bound for the non-emptiness problem for an expressive subclass. As a byproduct, we obtain a tight EXPTIME complexity bound for a more restrictive subclass of PDA with timeless stack, thus subsuming the complexity bound known for dense-timed PDA.

25 citations


01 Jan 2015
TL;DR: This thesis extends automata learning to infer automata models that capture both control flow and data flow, and defines a formalism for register automata, a model that extends finite automata by adding registers that can store data values and be used in guards and assignments on transitions.
Abstract: Formal models are often used to describe the behavior of a computer program or component. Behavioral models have many different usages, e.g., in model-based techniques for software development and verification, such as model checking and model-based testing. The application of model-based techniques is hampered by the current lack of adequate models for most actual systems, largely due to the significant manual effort typically needed to construct them. To remedy this, automata learning techniques (whereby models can be inferred by performing tests on a component) have been developed for finite automata that capture control flow. However, many usages require models also to capture data flow, i.e., how behavior is affected by data values in method invocations and commands. Unfortunately, techniques are less developed for models that combine control flow with data. In this thesis, we extend automata learning to infer automata models that capture both control flow and data flow. We base our technique on a corresponding extension of classical automata theory to capture data. We define a formalism for register automata, a model that extends finite automata by adding registers that can store data values and be used in guards and assignments on transitions. Our formalism is parameterized on a theory, i.e., a set of relations on a data domain. We present a Nerode congruence for the languages that our register automata can recognize, and provide a Myhill-Nerode theorem for constructing canonical register automata, thereby extending classical automata theory to register automata. We also present a learning algorithm for register automata: the new SL* algorithm, which extends the well-known L* algorithm for finite automata. The SL* algorithm is based on our new Nerode congruence, and uses a novel technique to infer symbolic data constraints on parameters. We evaluated our algorithm in a black-box scenario, inferring, e.g., the connection establishment phase of TCP and a priority queue, in addition to a number of smaller models. The SL* algorithm is implemented in a tool, which allows for use in more realistic settings, e.g., where models have both input and output, and for directly inferring Java classes.

24 citations
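
The register-automaton formalism summarized above (finite control plus registers, with guards and assignments on transitions, parameterized by a theory) can be made concrete with a toy example over the equality theory. This sketch is purely illustrative; it is not the SL* tool, and the state names and example language are made up here.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Transition:
    source: str
    guard: Callable[[int, Dict[str, int]], bool]             # over (input value, registers)
    assign: Callable[[int, Dict[str, int]], Dict[str, int]]  # new register valuation
    target: str

# Toy language: the first data value is stored in register x; accept once it reappears.
transitions: List[Transition] = [
    Transition("q0", lambda v, r: True,        lambda v, r: {"x": v}, "q1"),
    Transition("q1", lambda v, r: v == r["x"], lambda v, r: r,        "q_acc"),
    Transition("q1", lambda v, r: v != r["x"], lambda v, r: r,        "q1"),
]

def run(word: List[int]) -> bool:
    state, regs = "q0", {}
    for v in word:
        t = next((t for t in transitions if t.source == state and t.guard(v, regs)), None)
        if t is None:
            return False
        regs, state = t.assign(v, regs), t.target
    return state == "q_acc"

print(run([7, 3, 5, 7]))   # True: the first value 7 reappears
print(run([7, 3, 5]))      # False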


Journal ArticleDOI
TL;DR: It is shown that emptiness and several other problems are decidable and closure under Boolean operations is investigated, and the existence of an infinite and tight hierarchy depending on the number of turns is proved.

23 citations


Journal ArticleDOI
TL;DR: Results on upper and lower bounds on the conversion of finite automata to regular expressions and vice versa are summarized, and recent results on average-case descriptional complexity bounds are reported.
Abstract: The equivalence of finite automata and regular expressions dates back to the seminal paper of Kleene on events in nerve nets and finite automata from 1956. In the present paper we tour a fragment of the literature and summarize results on upper and lower bounds on the conversion of finite automata to regular expressions and vice versa. We also briefly recall the known bounds for the removal of spontaneous transitions (ε-transitions) on non-ε-free nondeterministic devices. Moreover, we report on recent results on the average case descriptional complexity bounds for the conversion of regular expressions to finite automata and brand new developments on the state elimination algorithm that converts finite automata to regular expressions.

23 citations
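
One of the conversions surveyed above, the state elimination algorithm from finite automata to regular expressions, is easy to sketch: repeatedly remove a state and reroute every in/out pair of edges through an expression that accounts for the removed state's self-loop. The toy Python sketch below performs no simplification of the intermediate expressions, which already hints at the size blow-up the survey discusses; it is illustrative only.

def union(a, b):
    if a is None:
        return b
    if b is None:
        return a
    return f"({a}|{b})"

def concat(*parts):
    parts = [p for p in parts if p is not None]
    if not parts:
        return None
    return "".join(p if len(p) <= 1 else f"({p})" for p in parts)

def eliminate(states, edges, start, finals):
    # Fresh initial state "I" and final state "F", connected by epsilon ("").
    E = dict(edges)
    E[("I", start)] = ""
    for f in finals:
        E[(f, "F")] = union(E.get((f, "F")), "")
    for s in states:                              # eliminate the original states one by one
        loop = E.pop((s, s), None)
        star = f"({loop})*" if loop not in (None, "") else ""
        ins  = [(p, r) for (p, q), r in E.items() if q == s and p != s]
        outs = [(q, r) for (p, q), r in E.items() if p == s and q != s]
        for p, r_in in ins:
            for q, r_out in outs:
                E[(p, q)] = union(E.get((p, q)), concat(r_in, star, r_out))
        for key in [k for k in E if s in k]:      # drop every edge touching s
            del E[key]
    return E.get(("I", "F"))

# Two-state DFA over {a, b} accepting the words with an even number of a's.
edges = {("0", "0"): "b", ("0", "1"): "a", ("1", "1"): "b", ("1", "0"): "a"}
print(eliminate(["0", "1"], edges, start="0", finals=["0"]))

Even for this two-state DFA the unsimplified output is heavily nested, which illustrates why the worst-case conversion is exponentially large.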


Journal ArticleDOI
TL;DR: It turns out that the early query answering algorithm is earliest in practice on most queries from the XPathMark benchmark, and tight upper complexity bounds on its time and space consumption are proved.

22 citations


Journal ArticleDOI
TL;DR: A fairly complete theory of operator precedence languages is provided, introducing a class of automata with the same recognizing power as the generative power of their grammars, and a characterization of their sentences in terms of monadic second-order logic.
Abstract: Operator precedence languages were introduced half a century ago by Robert Floyd to support deterministic and efficient parsing of context-free languages. Recently, we renewed our interest in this class of languages thanks to a few distinguishing properties that make them attractive for exploiting various modern technologies. Precisely, their local parsability enables parallel and incremental parsing, whereas their closure properties make them amenable to automatic verification techniques, including model checking. In this paper we provide a fairly complete theory of this class of languages: we introduce a class of automata with the same recognizing power as the generative power of their grammars; we provide a characterization of their sentences in terms of monadic second-order logic as has been done in previous literature for more restricted language classes such as regular, parenthesis, and input-driven ones; we investigate preserved and lost properties when extending the language sentences from finite ...

22 citations


Journal ArticleDOI
TL;DR: The translation from protocols to pushdown automata is implemented, yielding the first tool that decides equivalence of (some class of) protocols for an unbounded number of sessions, and it is shown that checking equivalence of protocols is actually equivalent to checking equivalence of generalized, real-time deterministic pushdown automata.
Abstract: Formal methods have been very successful in analyzing security protocols for reachability properties such as secrecy or authentication. In contrast, there are very few results for equivalence-based properties, crucial for studying, for example, privacy-like properties such as anonymity or vote secrecy. We study the problem of checking equivalence of security protocols for an unbounded number of sessions. Since replication leads very quickly to undecidability (even in the simple case of secrecy), we focus on a limited fragment of protocols (standard primitives but pairs, one variable per protocol’s rules) for which the secrecy preservation problem is known to be decidable. Surprisingly, this fragment turns out to be undecidable for equivalence. Then, restricting our attention to deterministic protocols, we propose the first decidability result for checking equivalence of protocols for an unbounded number of sessions. This result is obtained through a characterization of equivalence of protocols in terms of equality of languages of (generalized, real-time) deterministic pushdown automata. We further show that checking for equivalence of protocols is actually equivalent to checking for equivalence of generalized, real-time deterministic pushdown automata. Very recently, the algorithm for checking for equivalence of deterministic pushdown automata has been implemented. We have implemented our translation from protocols to pushdown automata, yielding the first tool that decides equivalence of (some class of) protocols, for an unbounded number of sessions. As an application, we have analyzed some protocols of the literature including a simplified version of the basic access control (BAC) protocol used in biometric passports.

Journal ArticleDOI
TL;DR: A generic model where transitions are labeled by elements of a finite partition of the alphabet is defined and Angluin's L* algorithm is extended for learning regular languages from examples for such automata.
Abstract: This work is concerned with regular languages defined over large alphabets, either infinite or just too large to be expressed enumeratively. We define a generic model where transitions are labeled by elements of a finite partition of the alphabet. We then extend Angluin's L* algorithm for learning regular languages from examples for such automata. We have implemented this algorithm and we demonstrate its behavior where the alphabet is a subset of the natural or real numbers. We sketch the extension of the algorithm to a class of languages over partially ordered alphabets.
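
The generic model described above, with transitions labeled by the blocks of a finite partition of a large or infinite alphabet, can be illustrated by a small symbolic acceptor over the integers. This is only a toy of the kind of automaton such a learner works with, not the paper's algorithm or tool; the states and partition below are invented for the example.

# Transitions carry predicates; in each state the predicates partition the alphabet,
# so exactly one transition applies to every letter.
transitions = {
    "q0": [(lambda x: x < 0,        "q0"),
           (lambda x: 0 <= x < 100, "q1"),
           (lambda x: x >= 100,     "q0")],
    "q1": [(lambda x: x < 0,        "q0"),
           (lambda x: x >= 0,       "q1")],
}

def accepts(word, start="q0", accepting=frozenset({"q1"})):
    state = start
    for letter in word:
        state = next(succ for pred, succ in transitions[state] if pred(letter))
    return state in accepting

print(accepts([3, 42, 7]))   # True : enters q1 on a value in [0, 100) and stays there
print(accepts([3, -1]))      # False: the negative value leads back to q0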

Proceedings ArticleDOI
06 Jul 2015
TL;DR: An almost complete decidability picture for the basic decision problems about nested weighted automata is established, and their applicability in several domains is illustrated, including average response time properties.
Abstract: Recently there has been a significant effort to handle quantitative properties in formal verification and synthesis. While weighted automata over finite and infinite words provide a natural and flexible framework to express quantitative properties, perhaps surprisingly, some basic system properties such as average response time cannot be expressed using weighted automata, nor in any other known decidable formalism. In this work, we introduce nested weighted automata as a natural extension of weighted automata which makes it possible to express important quantitative properties such as average response time. In nested weighted automata, a master automaton spins off and collects results from weighted slave automata, each of which computes a quantity along a finite portion of an infinite word. Nested weighted automata can be viewed as the quantitative analogue of monitor automata, which are used in run-time verification. We establish an almost complete decidability picture for the basic decision problems about nested weighted automata, and illustrate their applicability in several domains. In particular, nested weighted automata can be used to decide average response time properties.
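
The master/slave structure described above can be conveyed by a small finite-prefix sketch for average response time: on every request the master spins off a slave that counts steps until a grant, and the master aggregates the slave values. This is only an intuition aid over finite words (with an invented encoding: 'r' request, 'g' grant, '.' anything else); the paper's semantics is defined over infinite words and is far more general.

def average_response_time(word):
    slaves = []        # step counters of the currently running slave automata
    finished = []      # values returned by terminated slaves
    for letter in word:
        slaves = [c + 1 for c in slaves]   # every running slave counts this step
        if letter == "r":
            slaves.append(0)               # the master spins off a new slave
        elif letter == "g":
            finished.extend(slaves)        # all pending requests are answered
            slaves = []
    return sum(finished) / len(finished) if finished else 0.0

print(average_response_time("r..g.r.g"))   # requests answered after 3 and 2 steps -> 2.5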

Journal ArticleDOI
TL;DR: It is shown that it is impossible to define a right congruence in the context of lattices and that, consequently, no canonical minimal automaton exists; the minimization problem for deterministic lattice automata is proved to be NP-complete.
Abstract: Traditional automata accept or reject their input and are therefore Boolean. In contrast, weighted automata map each word to a value from a semiring over a large domain. The special case of lattice automata, in which the semiring is a finite lattice, has interesting theoretical properties as well as applications in formal methods. A minimal deterministic automaton captures the combinatorial nature and complexity of a formal language. Deterministic automata are used in runtime monitoring, pattern recognition, and modeling systems. Thus, the minimization problem for deterministic automata is of great interest, both theoretically and in practice. For deterministic traditional automata on finite words, a minimization algorithm, based on the Myhill-Nerode right congruence on the set of words, generates in polynomial time a canonical minimal deterministic automaton. A polynomial algorithm is known also for deterministic weighted automata over the tropical semiring. For general deterministic weighted automata, the problem of minimization is open. In this article, we study minimization of deterministic lattice automata. We show that it is impossible to define a right congruence in the context of lattices, and that no canonical minimal automaton exists. Consequently, the minimization problem is much more complicated, and we prove that it is NP-complete. As good news, we show that while right congruence fails already for finite lattices that are fully ordered, for this setting we are able to combine a finite number of right congruences and generate a minimal deterministic automaton in polynomial time.
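
For contrast with the lattice setting, the classical baseline the abstract refers to is the merging of states with equal Myhill-Nerode residuals, which yields the canonical minimal DFA in polynomial time. The toy Moore-style partition refinement below is that classical construction (illustrative example only); it is roughly the procedure whose lattice analogue the paper shows cannot exist.

def minimize(states, alphabet, delta, accepting):
    # Start from the accepting / non-accepting split and refine until stable.
    blocks = [b for b in (set(accepting), set(states) - set(accepting)) if b]
    changed = True
    while changed:
        changed = False
        new_blocks = []
        for block in blocks:
            # Two states stay together iff every letter sends them to the same block.
            def signature(q):
                return tuple(next(i for i, b in enumerate(blocks) if delta[q][a] in b)
                             for a in alphabet)
            groups = {}
            for q in block:
                groups.setdefault(signature(q), set()).add(q)
            new_blocks.extend(groups.values())
            changed |= len(groups) > 1
        blocks = new_blocks
    return blocks                      # each block is one state of the minimal DFA

# DFA over {a} with two equivalent states: 0 -a-> 1 -a-> 2 -a-> 1, accepting {1, 2}.
delta = {0: {"a": 1}, 1: {"a": 2}, 2: {"a": 1}}
print(minimize([0, 1, 2], "a", delta, accepting={1, 2}))   # [{1, 2}, {0}]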

Journal ArticleDOI
TL;DR: This paper demonstrates that the same worst-case 2 ?

Book ChapterDOI
02 Mar 2015
TL;DR: A new automata-based approach to program verification first constructs an automaton for the candidate proof and then checks its validity via automata inclusion.
Abstract: A new approach to program verification is based on automata. The notion of automaton depends on the verification problem at hand (nested word automata for recursion, Buchi automata for termination, a form of data automata for parametrized programs, etc.). The approach is to first construct an automaton for the candidate proof and then check its validity via automata inclusion. The originality of the approach lies in the construction of an automaton from a correctness proof of a given sequence of statements. A sequence of statements is at the same time a word over a finite alphabet and it is (a very simple case of) a program. Just as we ask whether a word has an accepting run, we can ask whether a sequence of statements has a correctness proof (of a certain form). The automaton accepts exactly the sequences that do.

Journal ArticleDOI
TL;DR: This paper analyzes the state complexity of deterministic Watson–Crick automata with respect to non-deterministic block automata and non-deterministic finite automata, and introduces new finite automata combining the properties of Watson–Crick automata and time varying automata.
Abstract: Watson–Crick automata are finite automata working on double strands. Extensive research work has already been done on non-deterministic Watson–Crick automata and on deterministic Watson–Crick automata. State complexity of Watson–Crick automata has also been discussed. In this paper we analyze the state complexity of deterministic Watson–Crick automata with respect to non-deterministic block automata and non-deterministic finite automata. We also introduce new finite automata combining the properties of Watson–Crick automata and time varying automata. These automata have the unique property that their 1-limited stateless variant has the power to count. We further discuss the state complexity of time varying automata and time varying Watson–Crick automata.

Book ChapterDOI
02 Mar 2015
TL;DR: This work identifies a restriction – which is called weakness – of CMA, and shows that they are equivalent to three existing forms of automata over data languages, and that in the deterministic case they are closed under all Boolean operations, and hence have an ExpSpace-complete equivalence problem.
Abstract: Automata over infinite alphabets have recently come to be studied extensively as potentially useful tools for solving problems in verification and database theory. One popular model of automata studied is the Class Memory Automata (CMA), for which the emptiness problem is equivalent to Petri Net Reachability. We identify a restriction – which we call weakness – of CMA, and show that they are equivalent to three existing forms of automata over data languages. Further, we show that in the deterministic case they are closed under all Boolean operations, and hence have an ExpSpace-complete equivalence problem. We also extend CMA to operate over multiple levels of nested data values, and show that while these have undecidable emptiness in general, adding the weakness constraint recovers decidability of emptiness, via reduction to coverability in well-structured transition systems. We also examine connections with existing automata over nested data.

Proceedings ArticleDOI
12 Oct 2015
TL;DR: The authors define a new class of pushdown systems where the pushdown is a tree instead of a word, and show that the resulting class enjoys a decidable reachability problem, based on a preservation-of-recognizability result for the backward reachability relation.
Abstract: We define a new class of pushdown systems where the pushdown is a tree instead of a word. We allow a limited form of lookahead on the pushdown conforming to a certain ordering restriction, and we show that the resulting class enjoys a decidable reachability problem. This follows from a preservation of recognizability result for the backward reachability relation of such systems. As an application, we show that our simple model can encode several formalisms generalizing pushdown systems, such as ordered multi-pushdown systems, annotated higher-order pushdown systems, the Krivine machine, and ordered annotated multi-pushdown systems. In each case, our procedure yields tight complexity.

Journal ArticleDOI
TL;DR: The ordered restarting automaton (processing strings) is introduced, and it is shown that its nondeterministic variant is very expressive, as it accepts some languages that are not even context-free, while deterministic ordered restarting automata accept just the regular languages.
Abstract: The ordered restarting automaton (processing strings) is introduced, and it is shown that its nondeterministic variant is very expressive, as it accepts some languages that are not even context-free, while the deterministic ordered restarting automata just accept the regular languages. Then three different extensions of the deterministic ordered restarting automaton to two-dimensional inputs are defined that differ in the way in which they can move their read/write windows. We compare the classes of picture languages that these types of automata accept to each other and to some well studied classes of picture languages from the literature, and we present some closure and non-closure properties for them.

Journal ArticleDOI
TL;DR: It is shown that symbolic tree automata are closed under Boolean operations, and that the operations are effectively uniform in the given alphabet theory, which generalizes the corresponding classical properties known for finite tree automata.

Journal ArticleDOI
TL;DR: This study introduces several problems related to finding reset words for deterministic finite automata, presents motivations for these problems in practical application areas such as robotics and bio-engineering, and investigates the complexity of some synchronizability problems for automata that are both monotonic and partially specified.
Abstract: In this study, we first introduce several problems related to finding reset words for deterministic finite automata, and present motivations for these problems for practical applications in areas such as robotics and bio-engineering. We then analyse computational complexities of these problems. Second, we consider monotonic and partially specified automata. Monotonicity is known to be a feature simplifying the synchronizability problems. On the other hand, for partially specified automata, synchronizability problems are known to be harder than for completely specified automata. We investigate the complexity of some synchronizability problems for automata that are both monotonic and partially specified. We show that checking the existence of, computing one, and computing a shortest reset word for a monotonic partially specified automaton are NP-hard. We also show that finding a reset word that synchronizes K states (or the maximum number of states) of a given monotonic non-synchronizable automaton to a given set of states is NP-hard.
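
For readers unfamiliar with the notion: a reset word drives every state of a DFA to one and the same state. For a small, completely specified DFA a shortest reset word can be found by breadth-first search over sets of states, as in the illustrative sketch below; the monotonic and partially specified variants whose hardness the paper establishes are not handled by this brute-force search, which is exponential in general.

from collections import deque

def shortest_reset_word(states, alphabet, delta):
    start = frozenset(states)
    queue, seen = deque([(start, "")]), {start}
    while queue:
        current, word = queue.popleft()
        if len(current) == 1:
            return word                       # all states collapsed: reset word found
        for a in alphabet:
            nxt = frozenset(delta[q][a] for q in current)
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, word + a))
    return None                               # the automaton is not synchronizable

# Cerny's four-state automaton; its shortest reset word has length (4-1)**2 = 9.
delta = {0: {"a": 1, "b": 0}, 1: {"a": 2, "b": 1},
         2: {"a": 3, "b": 2}, 3: {"a": 0, "b": 0}}
print(shortest_reset_word([0, 1, 2, 3], "ab", delta))   # a word of length 9, e.g. "baaabaaab"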

Book ChapterDOI
24 Aug 2015
TL;DR: A general model of weighted automata acting on graphs is introduced, forming a quantitative version of Thomas' unweighted model of graph acceptors, and a suitable weighted MSO logic is shown to be expressively equivalent to weighted graph automata.
Abstract: Weighted automata model quantitative features of the behavior of systems and have been investigated for various structures like words, trees, traces, pictures, and nested words. In this paper, we introduce a general model of weighted automata acting on graphs, which form a quantitative version of Thomas’ unweighted model of graph acceptors. We derive a Nivat theorem for weighted graph automata which shows that their behaviors are precisely those obtainable from very particular weighted graph automata and unweighted graph acceptors with a few simple operations. We also show that a suitable weighted MSO logic is expressively equivalent to weighted graph automata. As a consequence, we obtain corresponding Buchi-type equivalence results known from the recent literature for weighted automata and weighted logics on words, trees, pictures, and nested words. Establishing such a general result has been an open problem for weighted logic for some time.

Journal ArticleDOI
TL;DR: The notion of multipass automata as a generalization of pushdown automata is introduced and the classes of languages accepted by such machines are studied, including languages which are the intersection of finitely many context-free languages.

Book ChapterDOI
24 Jan 2015
TL;DR: This model can overcome the reversibility restriction by exploiting the garbage tape to store popped symbols, and it is shown that the proposed model can simulate any quantum pushdown automaton with a classical stack as well as any probabilistic pushdown automaton.
Abstract: Several kinds of quantum pushdown automaton models have been proposed, and their computational power is investigated intensively. However, for some quantum pushdown automaton models, it is not known whether quantum models are at least as powerful as classical counterparts or not. This is due to the reversibility restriction. In this paper, we introduce a new quantum pushdown automaton model that has a garbage tape. This model can overcome the reversibility restriction by exploiting the garbage tape to store popped symbols. We show that the proposed model can simulate any quantum pushdown automaton with a classical stack as well as any probabilistic pushdown automaton. We also show that our model can solve a certain promise problem exactly while deterministic pushdown automata cannot. These results imply that our model is strictly more powerful than classical counterparts in the setting of exact, one-sided error and non-deterministic computation.

Proceedings ArticleDOI
06 Jul 2015
TL;DR: In this paper, the complexity of the bisimilarity problem for classes of register and fresh-register automata is investigated, and the problem is shown to be EXPTIME-complete in the general case.
Abstract: Register automata are a basic model of computation over infinite alphabets. Fresh-register automata extend register automata with the capability to generate fresh symbols in order to model computational scenarios involving name creation. This paper investigates the complexity of the bisimilarity problem for classes of register and fresh-register automata. We examine all main disciplines that have appeared in the literature: general register assignments, assignments where duplicate register values are disallowed, and assignments without duplicates in which registers cannot be empty. In the general case, we show that the problem is EXPTIME-complete. However, the absence of duplicate values in registers enables us to identify inherent symmetries inside the associated bisimulation relations, which can be used to establish a polynomial bound on the depth of Attacker-winning strategies. Furthermore, they enable a highly succinct representation of the corresponding bisimulations. By exploiting results from group theory and computational group theory, we can then show solvability in PSPACE and NP respectively for the latter two register disciplines. In each case, we find that freshness does not affect the complexity class of the problem. The results allow us to close a complexity gap for language equivalence of deterministic register automata. We show that deterministic language inequivalence for the no-duplicates fragment is NP-complete, which disproves an old conjecture of Sakamoto. Finally, we discover that, unlike in the finite-alphabet case, the addition of a pushdown store makes bisimilarity undecidable, even in the case of visibly pushdown storage.

Journal ArticleDOI
TL;DR: This paper proves several structural results about sets accepted by such automata, and analyzes decidability as well as complexity of several classical questions about automata in the new framework.

Proceedings ArticleDOI
01 Jan 2015
TL;DR: In this article, the safe rewriting problem in active context-free games is studied, where the goal is to decide whether the first player, Juliet, has a winning strategy for a given game and (nested) word.
Abstract: The paper studies the rewriting mechanisms for intensional documents in the Active XML framework, abstracted in the form of active context-free games. The safe rewriting problem studied in this paper is to decide whether the first player, Juliet, has a winning strategy for a given game and (nested) word; this corresponds to a successful rewriting strategy for a given intensional document. The paper examines several extensions to active context-free games. The primary extension allows more expressive schemas (namely XML schemas and regular nested word languages) for both target and replacement languages and has the effect that games are played on nested words instead of (flat) words as in previous studies. Other extensions consider validation of input parameters of web services, and an alternative semantics based on insertion of service call results. In general, the complexity of the safe rewriting problem is highly intractable (doubly exponential time), but the paper identifies interesting tractable cases.

Book ChapterDOI
18 May 2015
TL;DR: It is shown that one-way deterministic reversal-bounded multicounter languages are closed under right quotient with languages from many different language families; even those defined by nondeterministic machines such as the context-free languages, or languages accepted by nondeterministic pushdown machines augmented by any number of reversal-bounded counters.
Abstract: Many different deletion operations are investigated, applied to languages accepted by one-way and two-way deterministic reversal-bounded multicounter machines as well as to finite automata. Operations studied include the prefix, suffix, infix and outfix operations, as well as left and right quotient with languages from different families. It is often expected that language families defined from deterministic machines will not be closed under deletion operations. However, here, it is shown that one-way deterministic reversal-bounded multicounter languages are closed under right quotient with languages from many different language families; even those defined by nondeterministic machines such as the context-free languages, or languages accepted by nondeterministic pushdown machines augmented by any number of reversal-bounded counters. Also, it is shown that when starting with one-way deterministic machines with one counter that makes only one reversal, taking the left quotient with languages from many different language families, again including those defined by nondeterministic machines such as the context-free languages, yields only one-way deterministic reversal-bounded multicounter languages (by increasing the number of counters). However, if there are even just two more reversals on the counter, or a second 1-reversal-bounded counter, taking the left quotient (or even just the suffix operation) yields languages that can neither be accepted by deterministic reversal-bounded multicounter machines, nor by 2-way nondeterministic machines with one reversal-bounded counter. A number of other results with deletion operations are also shown.

Journal ArticleDOI
TL;DR: It is shown that adding pebbles strictly increases the expressive power of two-way automata recognizing languages of tiles, but the hierarchy induced by the number of allowed pebbles collapses to level one.
Abstract: We consider classes of languages of overlapping tiles, i.e., subsets of the McAlister monoid: the class REG of languages definable by Kleene's regular expressions, the class MSO of languages definable by formulas of monadic second-order logic, and the class REC of languages definable by morphisms into finite monoids. By extending the semantics of finite-state two-way automata (possibly with pebbles) from languages of words to languages of tiles, we obtain a complete characterization of the classes REG and MSO. In particular, we show that adding pebbles strictly increases the expressive power of two-way automata recognizing languages of tiles, but the hierarchy induced by the number of allowed pebbles collapses to level one.