
Showing papers on "Chomsky hierarchy published in 2012"


Journal ArticleDOI
TL;DR: It is argued that the left inferior frontal region is a generic on-line sequence processor that unifies information from various sources in an incremental and recursive manner, independent of whether there are any processing requirements related to syntactic movement or hierarchically nested structures.

174 citations


Journal ArticleDOI
TL;DR: The arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax are recapitulated.
Abstract: The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammar is sufficiently expressive to capture all phenomena in the natural language syntax. In the second part, two refinements of the Chomsky hierarchy are reviewed, which are both relevant to the extant research in cognitive science: the mildly context-sensitive languages (which are located between context-free and context-sensitive languages), and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).

122 citations
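The regular/context-free boundary this abstract turns on can be illustrated with the textbook language {aⁿbⁿ}. The sketch below (my own illustration, not from the paper; the function names are invented) contrasts a regular expression, which can only enumerate finitely many values of n, with a single counter, the simplest pushdown mechanism, which recognizes the language exactly:

```python
import re

def regex_attempt(s, max_n=3):
    """A regular expression can only approximate a^n b^n by
    enumerating finitely many n; it cannot capture all of them."""
    pattern = "|".join(f"a{{{n}}}b{{{n}}}" for n in range(1, max_n + 1))
    return re.fullmatch(pattern, s) is not None

def counter_accepts(s):
    """A single counter recognizes a^n b^n exactly, for every n >= 1."""
    count, i = 0, 0
    while i < len(s) and s[i] == "a":
        count += 1
        i += 1
    while i < len(s) and s[i] == "b":
        count -= 1
        i += 1
    return i == len(s) and count == 0 and "a" in s

print(counter_accepts("aaabbb"))   # True
print(regex_attempt("aaaabbbb"))   # False: n=4 exceeds the finite enumeration
```

Any fixed regular expression fails for some n, which is the pumping-lemma intuition behind the paper's argument that regular grammars are too weak.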


Book
15 Mar 2012
TL;DR: A book-length series of interviews with Noam Chomsky in two parts: the science of language and mind (language and its use, human concepts, representation and computation, universal grammar, simplicity) and human nature and its study (evolution, morality, epistemology, and the limits of scientific understanding).
Abstract: Introduction Part I. The Science of Language and Mind: 1. Language, function, communication: language and the use of language 2. On a formal theory of language and its accommodation to biology. The distinctive nature of human concepts 3. Representation and computation 4. More on human concepts 5. Reflections on the study of language 6. Parameters, canalization, innateness, universal grammar 7. Development, master/control genes, etc. 8. Perfection and design (interview 20 January 2009) 9. Universal grammar and simplicity 10. On some intellectual ailments of scientists 11. The place of language in the mind 12. Chomsky's intellectual contributions 13. Simplicity and its role in Chomsky's work 14. Chomsky and Nelson Goodman Part II. Human Nature and its Study: 15. Chomsky on human nature and human understanding 16. Human nature and evolution: thoughts on sociobiology and evolutionary psychology 17. Human nature again 18. Morality and universalization 19. Optimism and grounds for it 20. Language, agency, common sense, and science 21. Philosophers and their roles 22. Biophysical limitations on understanding 23. Epistemology and biological limits 24. Studies of mind and behavior and their limitations 25. Linguistics and politics.

99 citations


Book
08 Nov 2012
TL;DR: This book discusses the history, purposes and limitations of formal languages, how we reason with them, and the need for counterbalance in science.
Abstract: Formal languages are widely regarded as being above all mathematical objects and as producing a greater level of precision and technical complexity in logical investigations because of this. Yet defining formal languages exclusively in this way offers only a partial and limited explanation of the impact which their use (and the uses of formalisms more generally elsewhere) actually has. In this book, Catarina Dutilh Novaes adopts a much wider conception of formal languages so as to investigate more broadly what exactly is going on when theorists put these tools to use. She looks at the history and philosophy of formal languages and focuses on the cognitive impact of formal languages on human reasoning, drawing on their historical development, psychology, cognitive science and philosophy. Her wide-ranging study will be valuable for both students and researchers in philosophy, logic, psychology and cognitive and computer science.

73 citations


Journal ArticleDOI
TL;DR: The brain represents grammars in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing, and the acquisition of this ability is accounted for in an adaptive dynamical systems framework.
Abstract: The human capacity to acquire language is an outstanding scientific challenge to understand. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or more precisely, the parser/generator) in its connectivity, and its ability for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this ability is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, it is necessary to combine and constrain the interpretation of ALL results by theoretical models and empirical studies on natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty.

54 citations


01 Jan 2012
TL;DR: In this paper, a computational analysis of metathesis patterns is presented, distinguishing three categories of metathesis that differ in their computational complexity; the subsequential class is more restrictive than the regular class.
Abstract: This paper presents a computational analysis of metathesis patterns that distinguishes three categories of metathesis that differ in their computational complexity. These categories are local metathesis, bounded long distance metathesis, and unbounded long distance metathesis. Using the formalism of finite state automata, it is established that the first two categories are subsequential, while the third category is not (in fact, it is not even regular). These terms will be discussed in more detail below, but the overall distinction is one of complexity: the subsequential class is more restrictive than the regular class. Assigning a pattern to the subsequential class then identifies it as less complex than a pattern that is non-regular. Furthermore, the patterns identified as subsequential are robustly attested in the world’s languages, whereas the non-regular patterns are much less common and in fact only attested diachronically. Thus this result suggests an upper bound for how complex a synchronic phonological pattern can be. The outline of this paper is as follows. Section 2 presents the Chomsky Hierarchy of language patterns and discusses recent findings and hypotheses about where to classify phonological patterns on the hierarchy. Section 3 defines subsequential FSTs and demonstrates how they can be used to describe phonological patterns. Section 4 presents the computational analysis of metathesis patterns. Section 5 discusses the typological implications of the analysis, as well as the implications for learning. Section 6 indicates directions for future work and concludes. 2. The Chomsky Hierarchy and the Subregular Hypothesis Chomsky (1956) presents a means to classify languages and language patterns based on their degree of complexity or how expressive they are. This hierarchy is shown in Figure 1.

27 citations
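The subsequential transducers in this analysis are deterministic machines that read the input left to right and emit output with bounded delay. The toy rule below, swapping every "sk" cluster to "ks", is an invented example of a local metathesis pattern, not one of the paper's case studies; it shows why such patterns fit in a two-state subsequential machine:

```python
def metathesize_sk(word):
    """A two-state subsequential transducer that rewrites every
    'sk' as 'ks' (a toy local-metathesis pattern).

    State 0: no pending material.  State 1: an 's' is buffered,
    waiting to see whether a 'k' follows."""
    out = []
    state = 0
    for c in word:
        if state == 0:
            if c == "s":
                state = 1          # buffer the 's'
            else:
                out.append(c)
        else:                      # state 1: an 's' is pending
            if c == "k":
                out.append("ks")   # emit the swapped cluster
                state = 0
            elif c == "s":
                out.append("s")    # flush the old 's', buffer the new one
            else:
                out.append("s" + c)
                state = 0
    if state == 1:                 # end-of-word flush
        out.append("s")
    return "".join(out)

print(metathesize_sk("aska"))   # 'aksa'
print(metathesize_sk("skis"))   # 'ksis'
```

The bounded buffer (here, one symbol) is what makes the map subsequential; unbounded long-distance metathesis has no such bound, which is why it falls outside the class.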


Journal ArticleDOI
TL;DR: In this article, reaction automata, string acceptors that use multiset manipulation as their computing mechanism, are proposed, and it is shown that these automata are computationally Turing universal.

21 citations


Journal ArticleDOI
TL;DR: The position of the family of languages generated by (circular) splicing systems within the Chomsky hierarchy has been investigated in this paper, where it is shown that given a circular splicing language and a regular language, whether they are equal is decidable.

18 citations


Proceedings ArticleDOI
25 Jun 2012
TL;DR: It is shown that the complexity of the verbal and pattern languages (in terms of level on the Chomsky hierarchy) does not depend on the Thurston automatic representation and that verbal languages cannot be context-free (unless they are either the empty word or the full group).
Abstract: This paper investigates the complexity of verbal languages and pattern languages of Thurston automatic groups in terms of the Chomsky hierarchy. Here the language generated by a pattern is taken as the set of representatives of all strings obtained when choosing values for the various variables. For noncommutative free groups, it is shown that the complexity of the verbal and pattern languages (in terms of level on the Chomsky hierarchy) does not depend on the Thurston automatic representation and that verbal languages cannot be context-free (unless they are either the empty word or the full group). They can, however, be indexed languages. Furthermore, it is shown that in the general case, it might depend on the chosen Thurston automatic representation which level a verbal language takes in the Chomsky hierarchy. There are examples of groups where, in an appropriate representation, all pattern languages are regular or context-free, respectively.

15 citations


Journal ArticleDOI
TL;DR: In this article, an extended model of reaction automata that allows λ-moves in the accepting process is introduced, and the closure properties of the language classes accepted by linear-bounded reaction automata (LRAs) and exponentially bounded reaction automata (ERAs) are investigated.

14 citations


01 Jan 2012
TL;DR: In this article, the authors investigated the formal language theoretic properties of linear-bounded reaction automata (LRAs) and exponentially-bounded reaction automata (ERAs), and established new relationships between the languages accepted by LRAs and ERAs and the Chomsky hierarchy.
Abstract: Reaction automata are a formal model that has been introduced to investigate the computing powers of interactive behaviors of biochemical reactions [14]. Reaction automata are language acceptors with a multiset rewriting mechanism whose basic framework is based on the reaction systems introduced in [4]. In this paper we continue the investigation of reaction automata with a focus on the formal language theoretic properties of subclasses of reaction automata, called linear-bounded reaction automata (LRAs) and exponentially-bounded reaction automata (ERAs). Besides LRAs, we newly introduce an extended model (denoted by λ-LRAs) by allowing λ-moves in the accepting process of reaction, and investigate the closure properties of language classes accepted by both LRAs and λ-LRAs. Further, we establish new relationships of language classes accepted by LRAs and by ERAs with the Chomsky hierarchy. The main results include the following: (i) the class of languages accepted by λ-LRAs forms an AFL with additional closure properties, (ii) any recursively enumerable language can be expressed as a homomorphic image of a language accepted by an LRA, (iii) the class of languages accepted by ERAs coincides with the class of context-sensitive languages.
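The multiset flavor of these acceptors can be conveyed with a drastically simplified sketch (my own illustration, mirroring only the idea of consuming reactants from a multiset, not the paper's formal definition): an acceptor for {aⁿbⁿ} where reading 'a' adds a token to the multiset and reading 'b' fires a reaction that consumes one.

```python
from collections import Counter

def reaction_accepts(word):
    """Toy multiset acceptor for a^n b^n: 'a' adds a token A to the
    pool; 'b' fires a reaction consuming one A.  Accept iff every
    reaction could fire and the pool is empty at the end."""
    pool = Counter()
    seen_b = False
    for c in word:
        if c == "a":
            if seen_b:             # 'a' after 'b': reject
                return False
            pool["A"] += 1
        elif c == "b":
            seen_b = True
            if pool["A"] == 0:     # reaction cannot fire: reject
                return False
            pool["A"] -= 1
        else:
            return False
    return sum(pool.values()) == 0 and seen_b
```

The resource bounds studied in the paper (linear for LRAs, exponential for ERAs) are bounds on how large such a multiset may grow relative to the input length.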

Book ChapterDOI
29 Aug 2012
TL;DR: Formal language theory, introduced by Noam Chomsky in the 1950s as a tool for a description of natural languages, has also been widely involved in modeling and investigating phenomena appearing in computer science, artificial intelligence and other related fields because the symbolic representation of a modeled system in the form of strings makes its processes by information processing tools very easy.
Abstract: Formal language theory, introduced by Noam Chomsky in the 1950s as a tool for the description of natural languages [8–10], has also been widely used in modeling and investigating phenomena in computer science, artificial intelligence and other related fields, because the symbolic representation of a modeled system in the form of strings makes its processing by information-processing tools very easy: coding theory, cryptography, computation theory, computational linguistics, natural computing, and many other fields directly use sets of strings for the description and analysis of modeled systems. In formal language theory, a model for a phenomenon is usually constructed by representing it as a set of words, i.e., a language over a certain alphabet, and defining a generative mechanism, i.e., a grammar which identifies exactly the words of this set. With respect to the forms of their rules, grammars and their languages are divided into the four classes of the Chomsky hierarchy: recursively enumerable, context-sensitive, context-free and regular.



Posted Content
TL;DR: The main results include the following: any recursively enumerable language can be expressed as a homomorphic image of a language accepted by an LRA, and the class of languages accepted by ERAs coincides with the class of context-sensitive languages.
Abstract: Reaction automata are a formal model that has been introduced to investigate the computing powers of interactive behaviors of biochemical reactions [14]. Reaction automata are language acceptors with a multiset rewriting mechanism whose basic framework is based on the reaction systems introduced in [4]. In this paper we continue the investigation of reaction automata with a focus on the formal language theoretic properties of subclasses of reaction automata, called linear-bounded reaction automata (LRAs) and exponentially-bounded reaction automata (ERAs). Besides LRAs, we newly introduce an extended model (denoted by λ-LRAs) by allowing λ-moves in the accepting process of reaction, and investigate the closure properties of language classes accepted by both LRAs and λ-LRAs. Further, we establish new relationships of language classes accepted by LRAs and by ERAs with the Chomsky hierarchy. The main results include the following: (i) the class of languages accepted by λ-LRAs forms an AFL with additional closure properties, (ii) any recursively enumerable language can be expressed as a homomorphic image of a language accepted by an LRA, (iii) the class of languages accepted by ERAs coincides with the class of context-sensitive languages.

Proceedings ArticleDOI
29 May 2012
TL;DR: The theoretical foundations and an implemented method for constructing the partial order relations of attributes, from which the semantics of a concept hierarchy can be obtained through the intents of attribute concepts, are discussed; experiments show that the method is effective.
Abstract: The semantics of a concept hierarchy can express the mapping relations among attributes in a database system. Formal Concept Analysis (FCA) is a mathematical tool for analysis based on formal concepts and concept hierarchies, and it can express generalization and specialization relations among formal concepts. In this paper, the theory of FCA is applied to the study of the semantics of concept hierarchies in database systems, and the partial order relations of attributes are used as formal rules for generating the semantics of a concept hierarchy. The theoretical foundations, and an implemented method that constructs the structure of partial order relations of attributes and obtains the semantics of the concept hierarchy through the intents of attribute concepts, are discussed. Experiments show that this method is effective.
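The attribute ordering described here can be made concrete on a tiny formal context. In the sketch below the context and its names are invented for illustration; the ordering itself, attribute m below attribute n whenever m's extent (the objects having m) is contained in n's, is the standard FCA construction:

```python
# A tiny formal context: objects mapped to the attributes they have.
context = {
    "sparrow": {"animal", "bird", "flies"},
    "penguin": {"animal", "bird"},
    "dog":     {"animal"},
}

attributes = set().union(*context.values())

# Extent of an attribute: the set of objects that have it.
extent = {m: {g for g, attrs in context.items() if m in attrs}
          for m in attributes}

# Partial order on attributes: m <= n iff extent(m) is a subset of
# extent(n), i.e. m is at least as specific as n.
order = {(m, n) for m in attributes for n in attributes
         if extent[m] <= extent[n]}

print(("flies", "bird") in order)    # True: everything that flies is a bird here
print(("animal", "bird") in order)   # False: 'animal' is more general
```

Reading the order off attribute extents in this way is what yields the generalization/specialization hierarchy the paper builds its concept-hierarchy semantics on.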

Journal Article
TL;DR: In this article, it was shown that the classes of regular, context-free, context-sensitive, and computable languages are not reflexive, and that the class of enumerable languages is the only reflexive one in the Chomsky Hierarchy.
Abstract: The class of regular languages can be generated from the regular expressions. These regular expressions, however, do not themselves form a regular language, as can be seen using the pumping lemma. On the other hand, the class of enumerable languages can be enumerated by a universal language that is one of its elements. We say that the enumerable languages are reflexive. In this paper we investigate what other classes of the Chomsky Hierarchy are reflexive in this sense. To make this precise we require that the decoding function is itself specified by a member of the same class. Could it be that the regular languages are reflexive, by using a different collection of codes? It turns out that this is impossible: the collection of regular languages is not reflexive. Similarly the collections of the context-free, context-sensitive, and computable languages are not reflexive. Therefore the class of enumerable languages is the only reflexive one in the Chomsky Hierarchy.
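The paper's opening observation, that the set of syntactically valid regular expressions is itself not a regular language, comes down to balanced parentheses: checking them needs an unbounded counter, which the pumping lemma rules out for any finite automaton. A minimal sketch of that counting argument:

```python
def balanced(s):
    """Checks parenthesis balance with an unbounded counter, a
    context-free task no finite-state device can do for all depths."""
    depth = 0
    for c in s:
        if c == "(":
            depth += 1
        elif c == ")":
            depth -= 1
            if depth < 0:          # a ')' with no matching '('
                return False
    return depth == 0

# Regular expressions nest to arbitrary depth, so recognizing their
# syntax requires exactly this kind of counting.
print(balanced("((a|b)*c)"))   # True
print(balanced("((a|b)*c"))    # False
```

Since a DFA has only finitely many states, it must confuse two nesting depths beyond some bound, which is the pumping-lemma argument the abstract alludes to.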