
Showing papers on "Formal language published in 2011"


Book
14 Oct 2011
TL;DR: A textbook introduction to the mathematical tools of linguistics: set theory, logic and formal systems, algebra, the treatment of English as a formal language, and the theory of languages, grammars, and automata.
Abstract: Preface. Part A. Set Theory. 1. Basic Concepts of Set Theory. 2. Relations and Functions. 3. Properties of Relations. 4. Infinities. Appendix AI. Part B. Logic and Formal Systems. 5. Basic Concepts of Logic. 6. Statement Logic. 7. Predicate Logic. 8. Formal Systems, Axiomatization, and Model Theory. Appendix BI. Appendix BII. Part C. Algebra. 9. Basic Concepts of Algebra. 10. Operational Structures. 11. Lattices. 12. Boolean and Heyting Algebras. Part D. English as a Formal Language. 13. Basic Concepts of Formal Languages. 14. Generalized Quantifiers. 15. Intensionality. Part E. Languages, Grammars, and Automata. 16. Basic Concepts of Languages, Grammars, and Automata. 17. Finite Automata, Regular Languages and Type 3 Grammars. 18. Pushdown Automata, Context-Free Grammars and Languages. 19. Turing Machines, Recursively Enumerable Languages, and Type 0 Grammars. 20. Linear Bounded Automata, Context-Sensitive Languages and Type 1 Grammars. 21. Languages Between Context-Free and Context-Sensitive. 22. Transformational Grammars. Appendix EI. Appendix EII. Review Problems. Index.

425 citations


Book
10 Dec 2011
TL;DR: This is a motivated presentation of recent results on tree transducers applied to studying the general properties of formal models and for providing semantics to context-free languages.
Abstract: From the Publisher: This is a motivated presentation of recent results on tree transducers, applied to studying the general properties of formal models and for providing semantics to context-free languages.

141 citations


Proceedings Article
19 Jun 2011
TL;DR: It is found that these languages contain the Strictly Local languages, are star-free, are incomparable with other known sub-star-free classes, and have other interesting properties.
Abstract: Beginning with Goldsmith (1976), the phonological tier has a long history in phonological theory to describe non-local phenomena. This paper defines a class of formal languages, the Tier-based Strictly Local languages, which begin to describe such phenomena. Then this class is located within the Subregular Hierarchy (McNaughton and Papert, 1971). It is found that these languages contain the Strictly Local languages, are star-free, are incomparable with other known sub-star-free classes, and have other interesting properties.
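The tier projection at the heart of this class is easy to illustrate. The sketch below is our own toy example, not the paper's formal apparatus (the function name, the tier, and the harmony constraint are invented for illustration): a Tier-based Strictly 2-Local check deletes every non-tier symbol and then tests banned adjacencies on what remains.

```python
# A toy Tier-based Strictly 2-Local membership check (the tier, the
# constraint and the function name are our invented illustration, not
# the paper's formal definitions): delete everything off the tier,
# then ban the listed adjacencies on what remains.
def tsl_ok(word, tier, forbidden_bigrams):
    """Project word onto the tier and test banned 2-factors,
    with '#' marking the word boundaries."""
    projected = [c for c in word if c in tier]
    padded = ["#"] + projected + ["#"]
    return not any((a, b) in forbidden_bigrams
                   for a, b in zip(padded, padded[1:]))

# Long-distance sibilant harmony: 's' and 'S' never co-occur, at any
# distance, because non-tier segments are invisible to the constraint.
tier = {"s", "S"}
forbidden = {("s", "S"), ("S", "s")}
print(tsl_ok("sokoso", tier, forbidden))  # True
print(tsl_ok("sokoSo", tier, forbidden))  # False
```

Because the check runs on the projected string, the two sibilants are banned from co-occurring at any distance, exactly the kind of non-local phenomenon a fixed-width strictly local grammar over the raw string cannot capture.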

96 citations


Journal ArticleDOI
19 Sep 2011
TL;DR: A functional approach to parsing unrestricted context-free grammars based on Brzozowski's derivative of regular expressions, equipping the functional programmer with two equational theories that, when combined, make for an abbreviated understanding and implementation of a system for parsing context-free languages.
Abstract: We present a functional approach to parsing unrestricted context-free grammars based on Brzozowski's derivative of regular expressions. If we consider context-free grammars as recursive regular expressions, Brzozowski's equational theory extends without modification to context-free grammars (and it generalizes to parser combinators). The supporting actors in this story are three concepts familiar to functional programmers - laziness, memoization and fixed points; these allow Brzozowski's original equations to be transliterated into purely functional code in about 30 lines spread over three functions. Yet, this almost impossibly brief implementation has a drawback: its performance is sour - in both theory and practice. The culprit? Each derivative can double the size of a grammar, and with it, the cost of the next derivative. Fortunately, much of the new structure inflicted by the derivative is either dead on arrival, or it dies after the very next derivative. To eliminate it, we once again exploit laziness and memoization to transliterate an equational theory that prunes such debris into working code. Thanks to this compaction, parsing times become reasonable in practice. We equip the functional programmer with two equational theories that, when combined, make for an abbreviated understanding and implementation of a system for parsing context-free languages.
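For readers unfamiliar with the starting point, here is a minimal sketch of Brzozowski's derivative for plain regular expressions (the class names are ours); the paper extends exactly this equational theory to context-free grammars via the laziness, memoization and fixed points discussed above.

```python
# Minimal Brzozowski derivative for regular expressions (class names
# are ours, not the paper's). deriv(r, c) is the language of words w
# such that c.w is accepted by r; matching is repeated derivation.
class Empty: pass          # the empty language
class Eps: pass            # accepts only the empty string
class Char:
    def __init__(self, c): self.c = c
class Alt:
    def __init__(self, l, r): self.l, self.r = l, r
class Cat:
    def __init__(self, l, r): self.l, self.r = l, r
class Star:
    def __init__(self, r): self.r = r

def nullable(r):
    """Does r accept the empty string?"""
    if isinstance(r, (Eps, Star)):
        return True
    if isinstance(r, Alt):
        return nullable(r.l) or nullable(r.r)
    if isinstance(r, Cat):
        return nullable(r.l) and nullable(r.r)
    return False

def deriv(r, c):
    if isinstance(r, Char):
        return Eps() if r.c == c else Empty()
    if isinstance(r, Alt):
        return Alt(deriv(r.l, c), deriv(r.r, c))
    if isinstance(r, Cat):
        head = Cat(deriv(r.l, c), r.r)
        return Alt(head, deriv(r.r, c)) if nullable(r.l) else head
    if isinstance(r, Star):
        return Cat(deriv(r.r, c), Star(r.r))
    return Empty()          # Empty and Eps derive to Empty

def matches(r, s):
    for c in s:
        r = deriv(r, c)
    return nullable(r)

ab_star = Star(Alt(Char('a'), Char('b')))   # (a|b)*
print(matches(ab_star, "abba"))  # True
print(matches(ab_star, "abc"))   # False
```

The size blow-up the abstract mentions is visible here: each Cat derivative can produce an Alt of two terms, so repeated derivation grows the expression unless equivalent subterms are pruned or shared.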

66 citations


Journal ArticleDOI
TL;DR: A way to use automata, as the CP regular constraint does, to express allowed patterns for the values taken by the constrained sequence of variables in a MIP model is suggested.
Abstract: This paper approaches the problem of modeling optimization problems containing substructures involving constraints on sequences of decision variables. Such constraints can be very complex to express with Mixed Integer Programming (MIP). We suggest an approach inspired by global constraints used in Constraint Programming (CP) to exploit formal languages for the modeling of such substructures with MIP. More precisely, we first suggest a way to use automata, as the CP regular constraint does, to express allowed patterns for the values taken by the constrained sequence of variables. Secondly, we present how context-free grammars can contribute to formulate constraints on sequences of variables in a MIP model. Experimental results on both approaches show that they facilitate the modeling, but also give models easier to solve by MIP solvers compared to compact assignment MIP formulations.
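The automaton-based idea can be pictured as unfolding the DFA over the positions of the sequence. The sketch below is our own simplification (the paper builds linear MIP constraints, which we omit): it computes, per position, the set of admissible (state, value) transitions, each of which would become a 0/1 variable in the MIP formulation.

```python
# Our own simplified sketch of the "regular constraint" unfolding:
# layer a DFA over the n positions of the sequence and keep only
# transitions that lie on some accepting path.
def unfold_regular(n, alphabet, delta, start, accepting):
    # forward[i]: states reachable after reading i symbols
    forward = [set() for _ in range(n + 1)]
    forward[0] = {start}
    for i in range(n):
        for q in forward[i]:
            for a in alphabet:
                if (q, a) in delta:
                    forward[i + 1].add(delta[(q, a)])
    # backward[i]: reachable states that can still reach acceptance
    backward = [set() for _ in range(n + 1)]
    backward[n] = set(accepting) & forward[n]
    for i in range(n - 1, -1, -1):
        backward[i] = {q for q in forward[i]
                       if any((q, a) in delta
                              and delta[(q, a)] in backward[i + 1]
                              for a in alphabet)}
    # admissible (state, value) transitions per position
    return [{(q, a) for q in backward[i] for a in alphabet
             if (q, a) in delta and delta[(q, a)] in backward[i + 1]}
            for i in range(n)]

# DFA forbidding two consecutive 1s: state 1 means "just read a 1"
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 0}
layers = unfold_regular(3, [0, 1], delta, 0, {0, 1})
print(layers[0])  # transitions allowed at the first position
```

In the actual MIP model, flow-conservation constraints over these layered transition variables then force the sequence of decision variables to spell out an accepted word.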

62 citations


Journal ArticleDOI
TL;DR: A very simple construction is presented that, given a context-free grammar, produces a finite automaton recognizing such a regular language.

62 citations


Book
14 Feb 2011
TL;DR: Written to address the fundamentals of formal languages, automata, and computability, An Introduction to Formal Languages and Automata provides an accessible, student-friendly presentation of all material essential to an introductory Theory of Computation course.
Abstract: Written to address the fundamentals of formal languages, automata, and computability, An Introduction to Formal Languages and Automata provides an accessible, student-friendly presentation of all material essential to an introductory Theory of Computation course. It is designed to familiarize students with the foundations and principles of computer science and to strengthen the students' ability to carry out formal and rigorous mathematical arguments. In the new Fifth Edition, Peter Linz continues to offer a straightforward, uncomplicated treatment of formal languages and automata and avoids excessive mathematical detail so that students may focus on and understand the underlying principles. In an effort to further the accessibility and comprehension of the text, the author has added new illustrative examples and exercises throughout. New and key features of the revised and updated Fifth Edition: Includes a new chapter within the appendices on finite-state transducers, including basic results on Mealy and Moore machines; this optional chapter can be used to prepare students for further related study. Provides an introduction to JFLAP, also within the appendices; many of the exercises in the text require creating structures that are complicated and that have to be tested for correctness, and JFLAP can greatly reduce students' time spent on testing as well as help them visualize abstract concepts. A CD-ROM accompanies every new copy of the text and contains the following: a summary description of JFLAP; numerous new exercises that illustrate the value and efficiency of JFLAP; and JFLAP implementations of most of the examples in the text that allow students and instructors to experiment dynamically with these examples.
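The finite-state transducers covered in the new appendix chapter are easy to demonstrate. Here is a minimal Mealy machine simulator; the machine and all names are our own illustration, not taken from the book.

```python
# A minimal Mealy machine simulator, in the spirit of the transducer
# chapter described above (machine and names are ours, not the
# book's): output is a function of the current state and input symbol.
def run_mealy(transitions, start, inputs):
    """transitions: (state, symbol) -> (next_state, output)."""
    state, out = start, []
    for sym in inputs:
        state, o = transitions[(state, sym)]
        out.append(o)
    return "".join(out)

# Single-state bit flipper: on each bit, emit its complement.
flipper = {("q0", "0"): ("q0", "1"), ("q0", "1"): ("q0", "0")}
print(run_mealy(flipper, "q0", "1011"))  # "0100"
```

Unlike a Moore machine, whose output depends on the state alone, the output here is attached to the transition, which is why a single state suffices for this example.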

61 citations


Journal ArticleDOI
TL;DR: This paper provides in a single accessible document an updated development of the foundations of nominal techniques using urelemente with properties that turn out to have been investigated by Fraenkel and Mostowski in the first half of the 20th century for a completely different purpose than modelling formal language.
Abstract: We are used to the idea that computers operate on numbers, yet another kind of data is equally important: the syntax of formal languages, with variables, binding, and alpha-equivalence. The original application of nominal techniques, and the one with greatest prominence in this paper, is to reasoning on formal syntax with variables and binding. Variables can be modelled in many ways: for instance as numbers (since we usually take countably many of them); as links (since they may `point' to a binding site in the term, where they are bound); or as functions (since they often, though not always, represent `an unknown'). None of these models is perfect. In every case for the models above, problems arise when trying to use them as a basis for a fully formal mechanical treatment of formal language. The problems are practical—but their underlying cause may be mathematical. The issue is not whether formal syntax exists, since clearly it does, so much as what kind of mathematical structure it is. To illustrate this point by a parody, logical derivations can be modelled using a Gödel encoding (i.e., injected into the natural numbers). It would be false to conclude from this that proof-theory is a branch of number theory and can be understood in terms of, say, Peano's axioms. Similarly, as it turns out, it is false to conclude from the fact that variables can be encoded e.g., as numbers, that the theory of syntax-with-binding can be understood in terms of the theory of syntax-without-binding, plus the theory of numbers (or, taking this to a logical extreme, purely in terms of the theory of numbers). It cannot; something else is going on. What that something else is, has not yet been fully understood. In nominal techniques, variables are an instance of names, and names are data. 
We model names using urelemente with properties that, pleasingly enough, turn out to have been investigated by Fraenkel and Mostowski in the first half of the 20th century for a completely different purpose than modelling formal language. What makes this model really interesting is that it gives names distinctive properties which can be related to useful logic and programming principles for formal syntax. Since the initial publications, advances in the mathematics and presentation have been introduced piecemeal in the literature. This paper provides in a single accessible document an updated development of the foundations of nominal techniques. This gives the reader easy access to updated results and new proofs which they would otherwise have to search across two or more papers to find, and full proofs that in other publications may have been elided. We also include some new material not appearing elsewhere.

59 citations


Journal ArticleDOI
TL;DR: The theory of timed automata enables FCMs to effectively deal with a double-layered temporal granularity, extending the standard idea of B-time that characterizes the iterative nature of a cognitive inference engine and offering model checking techniques to test the cognitive and dynamic comportment of the framework being designed.
Abstract: The theory of fuzzy cognitive maps (FCMs) is a powerful approach to modeling human knowledge that is based on causal reasoning. Taking advantage of fuzzy logic and cognitive map theories, FCMs enable system designers to model complex frameworks by defining degrees of causality between causal objects. They can be used to model and represent the behavior of simple and complex systems by capturing and emulating the human being to describe and present systems in terms of tolerance, imprecision, and granulation of information. However, FCMs lack the temporal concept that is crucial in many real-world applications, and they do not offer formal mechanisms to verify the behavior of systems being represented, which limit conventional FCMs in knowledge representation. In this paper, we present an extension to FCMs by exploiting a theory from formal languages, namely, the timed automata, which bridges the aforementioned inadequacies. Indeed, the theory of timed automata enables FCMs to effectively deal with a double-layered temporal granularity, extending the standard idea of B-time that characterizes the iterative nature of a cognitive inference engine and offering model checking techniques to test the cognitive and dynamic comportment of the framework being designed.
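For context, a conventional (untimed) FCM inference step squashes a weighted sum of causal inputs through a threshold function. The sketch below shows this standard iteration, not the paper's timed-automata extension; the weights and the sigmoid steepness are illustrative only.

```python
import math

# Standard (untimed) FCM inference step, shown for context; the
# paper's contribution layers timed-automata semantics on top of this
# iteration. Weights and sigmoid steepness below are illustrative.
def fcm_step(state, W, lam=1.0):
    """state[j]: activation of concept j; W[j][i]: causal weight j -> i."""
    n = len(state)
    return [1.0 / (1.0 + math.exp(-lam * sum(W[j][i] * state[j]
                                             for j in range(n))))
            for i in range(n)]

W = [[0.0, 0.7],    # concept 0 promotes concept 1
     [-0.4, 0.0]]   # concept 1 inhibits concept 0
state = [1.0, 0.0]
for _ in range(5):  # iterate the cognitive inference engine
    state = fcm_step(state, W)
print([round(x, 2) for x in state])
```

Each pass of this loop is one tick of the iterative inference engine mentioned in the abstract; the paper's double-layered temporal granularity distinguishes such ticks from the real-time clocks of the timed automata.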

53 citations


Journal ArticleDOI
TL;DR: This paper proposes a method to obtain, from a Grafcet specification, an equivalent Mealy machine, without semantics loss, which makes it possible to describe explicitly and formally all the states and transitions that are implicitly represented in a Grafcet model.

49 citations


Book
01 Jan 2011
TL;DR: The refereed proceedings of the 36th International Symposium on Mathematical Foundations of Computer Science (MFCS 2011), comprising 48 revised full papers and 6 invited talks selected from 129 submissions.
Abstract: This volume constitutes the refereed proceedings of the 36th International Symposium on Mathematical Foundations of Computer Science, MFCS 2011, held in Warsaw, Poland, in August 2011. The 48 revised full papers presented together with 6 invited talks were carefully reviewed and selected from 129 submissions. Topics covered include algorithmic game theory, algorithmic learning theory, algorithms and data structures, automata, grammars and formal languages, bioinformatics, complexity, computational geometry, computer-assisted reasoning, concurrency theory, cryptography and security, databases and knowledge-based systems, formal specifications and program development, foundations of computing, logic in computer science, mobile computing, models of computation, networks, parallel and distributed computing, quantum computing, semantics and verification of programs, and theoretical issues in artificial intelligence.

Book ChapterDOI
19 Sep 2011
TL;DR: An automated process based on model-to-model transformations is described to generate a Repairable Fault Trees model from the MARTE-DAM specification of the Radio Block Centre - a modern railway controller.
Abstract: Maintenance of real-world systems is a complex task involving several actors, procedures and technologies. Proper approaches are needed in order to evaluate the impact of different maintenance policies considering cost/benefit factors. To that aim, maintenance models may be used within availability, performability or safety models, the latter developed using formal languages according to the requirements of international standards. In this paper, a model-driven approach is described for the development of formal maintenance and reliability models for the availability evaluation of repairable systems. The approach facilitates the use of formal models which would be otherwise difficult to manage, and provides the basis for automated models construction. Starting from an extension to maintenance aspects of the MARTE-DAM profile for dependability analysis, an automated process based on model-to-model transformations is described. The process is applied to generate a Repairable Fault Trees model from the MARTE-DAM specification of the Radio Block Centre - a modern railway controller.

Journal ArticleDOI
TL;DR: It is proved that (deterministic) union-freeness of languages does not accelerate regular operations, except for the reversal in the nondeterministic case.
Abstract: We continue the investigation of union-free regular languages that are described by regular expressions without the union operation. We also define deterministic union-free languages as languages accepted by one-cycle-free-path deterministic finite automata, and show that they are properly included in the class of union-free languages. We prove that (deterministic) union-freeness of languages does not accelerate regular operations, except for the reversal in the nondeterministic case.

Journal ArticleDOI
TL;DR: A structured quantum programming theorem is presented, which provides a technique of translating quantum flowchart programs into programs written in a high-level language, namely, a quantum extension of the while-language.
Abstract: Several high-level quantum programming languages have been proposed in the previous research. In this paper, we define a low-level flowchart language for quantum programming, which can be used in implementation of high-level quantum languages and in design of quantum compilers. The formal semantics of the flowchart language is given, and the notion of correctness for programs written in this language is introduced. A structured quantum programming theorem is presented, which provides a technique of translating quantum flowchart programs into programs written in a high-level language, namely, a quantum extension of the while-language.

Journal ArticleDOI
TL;DR: The goal of the refinement patterns is to provide an easy way to represent and correctly refine timing constraints on abstract atomic events with more elaborate timing constraints on the refined events.
Abstract: Event-B is a formal language for systems modeling, based on set theory and predicate logic. It has the advantage of mechanized proof, and it is possible to model a system at several levels of abstraction by using refinement. Discrete timing properties are important in many critical systems. However, modeling of timing properties is not directly supported in Event-B. In this paper we identify three main categories of discrete timing properties for the trigger-response pattern: deadline, delay and expiry. We introduce language constructs for each of these timing properties that augment the Event-B language. We describe how these constructs can be mapped to standard Event-B constructs. To ease the process of using the timing constructs in a refinement-based development, we introduce patterns for refining the timing constructs that allow timing properties on abstract models to be replaced by timing properties on refined models. The language constructs and refinement patterns are illustrated through some generic examples. Event-B refinement allows atomic events at the abstract level to be broken down into sub-steps at the refined level. The goal of our refinement patterns is to provide an easy way to represent and correctly refine timing constraints on abstract atomic events with more elaborate timing constraints on the refined events. This paper presents an initial set of patterns.

Book ChapterDOI
06 Jun 2011
TL;DR: A transformation-rules-based approach to automate the process of SBVR-to-OCL transformation is presented, which also assists designers by simplifying the software design process.
Abstract: In the design of component-based applications, designers have to produce visual models, such as Unified Modeling Language (UML) models, and describe the software component interfaces. Business rules and constraints are the key components in the skeletons of software components. The Semantics of Business Vocabulary and Rules (SBVR) language is typically used to express constraints in natural language, and a software engineer then manually maps SBVR business rules to other formal languages such as UML and Object Constraint Language (OCL) expressions. However, OCL is the only medium used to write constraints for UML models, and manual translation of SBVR rules to OCL constraints is difficult, complex and time consuming. Moreover, the lack of tool support for automated creation of OCL constraints from SBVR makes this scenario more complex. As both SBVR and OCL are based on First-Order Logic (FOL), model transformation technology can be used to automate the transformation of SBVR to OCL. In this research paper, we present a transformation-rules-based approach to automate the process of SBVR to OCL transformation. The presented approach is implemented in the SBVR2OCL prototype tool as a proof of concept. The presented method simplifies the process of creating OCL constraints and also assists the designers by simplifying the software design process.

Journal ArticleDOI
01 Mar 2011-Noûs
TL;DR: In Frege's Begriffsschrift formal language, the logical rules can be stated in purely formal terms, without any reference to the contents expressed by the sentences of the formal language.
Abstract: Among the many innovations that mark Frege’s Begriffsschrift as a revolutionary work, perhaps the most important is its presentation of the first formal system of logic. Frege believed that the introduction of a new notation, especially for the expression of generality, was necessary if the logical relationships between contents were to be made apparent. This new notation made it possible, at least in an important range of cases, to establish that a given content could be inferred from certain other contents simply by examining the structure of the sentences that expressed those contents. In this regard, Frege’s great advance was that, in his system, the logical rules could be stated in purely formal terms, without any reference to the contents expressed by the sentences of his formal language, Begriffsschrift, the conceptual notation. But it is important to recognize that logic, according to Frege, is not ‘formal’ in any sense that would oppose form to content: The sentences of Begriffsschrift are not mere forms. Frege’s goal was “not. . . to present an abstract logic in formulas, but to express a content through written symbols in a more precise and perspicuous way than is possible with words” (AimCN, pp. 90–1). Frege’s formal system is intended to be one we can actually use in reasoning, that is, in inferring truths from other truths, which is to say that we can prove theorems in this system, where theorems are true contents. If so, the sentences of Begriffsschrift must express those contents. The point of presenting proofs in a formal system is thus not to empty mathematics of content (FTA, esp. opp. 97ff.).

Proceedings ArticleDOI
30 Aug 2011
TL;DR: A new language, HYDI, is proposed for modeling Hybrid systems with Discrete Interaction to apply state-of-the-art symbolic model checkers for infinite-state systems to the verification of complex embedded systems design.
Abstract: Complex embedded systems consist of software and hardware components that operate autonomous devices interacting with the physical environment. The complexity of such systems makes the design very challenging and demands advanced validation techniques. Hybrid automata are a clean and consolidated formal language for modeling embedded systems which include discrete and continuous dynamics. They are based on a finite-state automaton structure enriched with invariant and flow conditions to model the continuous dynamics. In this paper, we propose a new language, HYDI, for modeling Hybrid systems with Discrete Interaction. The purpose of the language is to apply state-of-the-art symbolic model checkers for infinite-state systems to the verification of complex embedded systems design. HYDI extends the standard symbolic language SMV with timing and synchronization aspects. The language distinguishes between discrete and continuous variables. Variables inside SMV modules evolve synchronously. Top-level modules represent the asynchronous components of a network and use explicit events to synchronize. The new language is automatically compiled into equivalent discrete-time infinite-state transition systems.

Proceedings ArticleDOI
09 May 2011
TL;DR: The Motion Grammar is described, some of the formal guarantees it can provide, and the entire game of human-robot chess through a single formal language is represented, including game-play, safe handling of human motion, uncertainty in piece positions, misplaced and collapsed pieces.
Abstract: We introduce the Motion Grammar, a powerful new representation for robot decision making, and validate its properties through the successful implementation of a physical human-robot game. The Motion Grammar is a formal tool for task decomposition and hybrid control in the presence of significant online uncertainty. In this paper, we describe the Motion Grammar, introduce some of the formal guarantees it can provide, and represent the entire game of human-robot chess through a single formal language. This language includes game-play, safe handling of human motion, uncertainty in piece positions, misplaced and collapsed pieces. We demonstrate the simple and effective language formulation through experiments on a 14-DOF manipulator interacting with 32 objects (chess pieces) and an unpredictable human adversary.

Journal ArticleDOI
TL;DR: In this article, the authors re-examine the Kuratowski theorem in the setting of formal languages, where by "closure" we mean either Kleene closure or positive closure, and classify languages according to the structure of the algebras they generate under iterations of complement and closure.
Abstract: A famous theorem of Kuratowski states that, in a topological space, at most 14 distinct sets can be produced by repeatedly applying the operations of closure and complement to a given set. We re-examine this theorem in the setting of formal languages, where by "closure" we mean either Kleene closure or positive closure. We classify languages according to the structure of the algebras they generate under iterations of complement and closure. There are precisely 9 such algebras in the case of positive closure, and 12 in the case of Kleene closure. We study how the properties of being open and closed are preserved under concatenation. We investigate analogues, in formal languages, of the separation axioms in topological spaces; one of our main results is that there is a clopen partition separating two words if and only if the words do not commute. We can decide in quadratic time if the language specified by a DFA is closed, but if the language is specified by an NFA, the problem is PSPACE-complete.
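The classical bound that this paper transplants to formal languages can be checked by brute force on a finite (Alexandrov) topological space, where the closure of a set is determined by a specialization preorder, here divisibility. The sketch below is our own illustration of Kuratowski's 14-set theorem itself, not of the paper's Kleene-closure variant.

```python
# Kuratowski's 14-set theorem, the classical result this paper
# transplants to formal languages, checked by brute force on a finite
# Alexandrov space, where closure(A) is the down-set of A.
def closure(A, below):
    """below[x] = the closure of the singleton {x}."""
    out = set()
    for x in A:
        out |= below[x]
    return frozenset(out)

def orbit(A, X, below):
    """All sets reachable from A by complement and closure."""
    seen, frontier = {A}, [A]
    while frontier:
        S = frontier.pop()
        for T in (frozenset(X - S), closure(S, below)):
            if T not in seen:
                seen.add(T)
                frontier.append(T)
    return seen

# Points 1..6 preordered by divisibility: closure({x}) = divisors of x.
X = frozenset(range(1, 7))
below = {x: frozenset(d for d in X if x % d == 0) for x in X}
sizes = [len(orbit(frozenset(A), X, below))
         for A in ({4}, {2, 3}, {5}, {4, 5, 6}, set())]
print(max(sizes))  # Kuratowski: never more than 14
```

The paper's question is what happens when "closure" means Kleene or positive closure on languages instead of topological closure; there the bound and the resulting algebras change, as the abstract describes.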

Book
Ernest Davis1
27 Aug 2011
TL;DR: A logical theory is presented that supports high-level reasoning about knowledge and perception and makes it possible in some cases to infer that an agent does not know about a particular event, because he has had no way to find out about it.
Abstract: This paper presents a logical theory that supports high-level reasoning about knowledge and perception. We construct a formal language in which perception can be described. Using this language, we state some fundamental axioms, and we show that these are sufficient to justify some elementary but interesting inferences about perception. In particular, our axioms make it possible in some cases to infer that an agent does not know about a particular event, because he has had no way to find out about it.

Journal ArticleDOI
TL;DR: All Chomsky classes and all standard complexity classes are closed under iterated bounded hairpin completion, and the paper addresses the question of whether the iterated hairpin completion of a singleton is always regular.

Book ChapterDOI
09 Feb 2011
TL;DR: This paper introduces a simple grammar system that encompasses many bio-molecular structures, including the above mentioned structures, and discusses how the ambiguity levels defined for insertion-deletion grammar systems can be realized in bio-molecular structures, so that the ambiguity issues in gene sequences can be studied in terms of grammar systems.
Abstract: Insertion and deletion are considered to be the basic operations in Biology, more specifically in DNA processing and RNA editing. Based on these evolutionary transformations, a computing model has been formulated in formal language theory known as insertion-deletion systems. Since the biological macromolecules can be viewed as symbols, the gene sequences can be represented as strings. This suggests that the molecular representations can be theoretically analyzed if a biologically inspired computing model recognizes various bio-molecular structures like pseudoknot, hairpin, stem and loop, cloverleaf and dumbbell. In this paper, we introduce a simple grammar system that encompasses many bio-molecular structures including the above mentioned structures. This new grammar system is based on insertion-deletion and matrix grammar systems and is called Matrix insertion-deletion grammars. Finally, we discuss how the ambiguity levels defined for insertion-deletion grammar systems can be realized in bio-molecular structures, thus the ambiguity issues in gene sequences can be studied in terms of grammar systems.
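To make the basic operations concrete, here is a toy contextual insertion/deletion step. This plain formulation with rules of the form (u, x, v) is our own illustration; the paper's matrix variant additionally applies such rules in prescribed sequences.

```python
# Toy contextual insertion and deletion, the primitive operations of
# insertion-deletion systems (our own plain formulation, not the
# paper's matrix grammars). A rule (u, x, v) inserts or deletes x
# when it is flanked by the contexts u and v.
def insertions(w, rule):
    """All words obtained by one insertion of x between u and v."""
    u, x, v = rule
    return {w[:i] + x + w[i:]
            for i in range(len(w) + 1)
            if w[:i].endswith(u) and w[i:].startswith(v)}

def deletions(w, rule):
    """All words obtained by one deletion of x between u and v."""
    u, x, v = rule
    return {w[:i] + w[i + len(x):]
            for i in range(len(w) - len(x) + 1)
            if w[i:i + len(x)] == x
            and w[:i].endswith(u) and w[i + len(x):].startswith(v)}

rule = ("a", "b", "c")
print(sorted(insertions("ac", rule)))   # ['abc']
print(sorted(deletions("abc", rule)))   # ['ac']
```

Iterating such rules from a set of axioms yields the generated language; the contexts u and v are what let these systems mimic the site-specific behaviour of DNA processing and RNA editing.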

Proceedings ArticleDOI
25 Sep 2011
TL;DR: Computer Science as a research discipline has always struggled with its identity and has inherited its research methods from the same disciplines: on the one hand, the mathematical approach with axioms, postulates and proofs; on the other hand the engineering approach with quantification, measurements and comparison.
Abstract: Computer Science as a research discipline has always struggled with its identity. On the one hand, it is a field deeply rooted in mathematics which resulted in strong theories. For example, there is computational complexity theory (Turing machines, the halting problem), database theory (the relational model, expressive power of query languages), formal language theory (the Chomsky hierarchy, well-formedness, formal semantics). On the other hand, it is a field deeply rooted in engineering which resulted in machines that have completely warped our society: the von Neumann architecture (the basis for digital computers), parallel processors (the new generation of multi-core machines), distributed computers (a prerequisite for the success of the internet and recent phenomena like grid computing). Consequently, computer science has inherited its research methods from the same disciplines: on the one hand, the mathematical approach with axioms, postulates and proofs; on the other hand, the engineering approach with quantification, measurements and comparison.

Journal ArticleDOI
TL;DR: A novel approach to building a model-checker for Z is discussed, which involves implementing a translation from Z into SAL, the input language for the Symbolic Analysis Laboratory, a toolset which includes a number of model-checkers and a simulator.
Abstract: Despite being widely known and accepted in industry, the Z formal specification language has not so far been well supported by automated verification tools, mostly because of the challenges in handling the abstraction of the language. In this paper we discuss a novel approach to building a model-checker for Z, which involves implementing a translation from Z into SAL, the input language for the Symbolic Analysis Laboratory, a toolset which includes a number of model-checkers and a simulator. The Z2SAL translation deals with a number of important issues, including: mapping unbounded, abstract specifications into bounded, finite models amenable to a BDD-based symbolic checker; converting a non-constructive and piecemeal style of functional specification into a deterministic, automaton-based style of specification; and supporting the rich set-based vocabulary of the Z mathematical toolkit. This paper discusses progress made towards implementing as complete and faithful a translation as possible, while highlighting certain assumptions, respecting certain limitations and making use of available optimisations. The translation is illustrated throughout with examples; and a complete working example is presented, together with performance data.

Book ChapterDOI
22 Aug 2011
TL;DR: It is shown in this paper that a central result from formal language theory--the Myhill-Nerode theorem--can be recreated using only regular expressions.
Abstract: There are numerous textbooks on regular languages. Nearly all of them introduce the subject by describing finite automata and only mentioning on the side a connection with regular expressions. Unfortunately, automata are difficult to formalise in HOL-based theorem provers. The reason is that they need to be represented as graphs, matrices or functions, none of which are inductive datatypes. Also, convenient operations for disjoint unions of graphs and functions are not easily formalisable in HOL. In contrast, regular expressions can be defined conveniently as a datatype and a corresponding reasoning infrastructure comes for free. We show in this paper that a central result from formal language theory--the Myhill-Nerode theorem--can be recreated using only regular expressions.

Book ChapterDOI
19 Jul 2011
TL;DR: This paper presents key contributions to the area, its state-of-the-art, and conjectures that suggest answers to some natural unsolved problems in formal language theory.
Abstract: The aim of this paper is to give a short survey of the area formed by the intersection of two popular lines of investigation in formal language theory. The first line, originated by Thue in 1906, concerns repetition-free words and languages. The second line is the study of growth functions for words and languages; it can be traced back to the classical papers by Morse and Hedlund on symbolic dynamics (1938, 1940). Growth functions of repetition-free languages have been investigated since the 1980s. Most of the results were obtained for power-free languages, but some ideas can be applied to languages avoiding patterns and to Abelian-power-free languages as well. In this paper, we present key contributions to the area, its state of the art, and conjectures that suggest answers to some natural unsolved problems. We also pay attention to the tools and techniques that made the progress in the area possible and suggest some technical results that would be useful for solving open problems.
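The flagship example of this line of work is the growth of the ternary square-free language studied by Thue. A brute-force count of its growth function is easy to sketch; the function names below are ours.

```python
# Brute-force growth function of Thue's ternary square-free language,
# the flagship example of the surveyed area (function names are ours).
def square_free(w):
    """True iff w contains no factor of the form xx with x nonempty."""
    n = len(w)
    return not any(w[i:i + L] == w[i + L:i + 2 * L]
                   for L in range(1, n // 2 + 1)
                   for i in range(n - 2 * L + 1))

def count(n, alphabet="abc"):
    """Number of square-free words of length n over the alphabet."""
    words = [""]
    for _ in range(n):
        # every prefix of a square-free word is square-free,
        # so we can extend letter by letter
        words = [w + c for w in words for c in alphabet
                 if square_free(w + c)]
    return len(words)

print([count(n) for n in range(1, 7)])  # [3, 6, 12, 18, 30, 42]
```

The survey's subject is precisely the asymptotics of sequences like this one: the ternary square-free language is known to grow exponentially, and pinning down such growth rates for power-free languages is where much of the cited work lies.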

Book ChapterDOI
19 Jul 2011
TL;DR: This work proves that a necessary condition for a regular language L to be a splicing language is that L must have a constant in Schützenberger's sense, based on properties of strongly connected components of the minimal deterministic finite state automaton for a regular splicing language.
Abstract: In spite of wide investigations of finite splicing systems in formal language theory, basic questions, such as their characterization, remain unsolved. In search for understanding the class of finite splicing systems, it has been conjectured that a necessary condition for a regular language L to be a splicing language is that L must have a constant in Schützenberger's sense. We prove this longstanding conjecture to be true. The result is based on properties of strongly connected components of the minimal deterministic finite state automaton for a regular splicing language.

Journal ArticleDOI
TL;DR: Ellul, Krawetz, Shallit and Wang prove an exponential lower bound on the size of any context-free grammar generating the language of all permutations over some alphabet, and obtain exponential lower bounds for many other languages.

Journal ArticleDOI
TL;DR: A formalism that combines features that have been developed for access control and conformance checking and integrates the axioms into a logic programming approach, which lets us use quantification in policies while preserving decidability of access control decisions.