Showing papers on "Chomsky hierarchy published in 2010"


01 Jan 2010
TL;DR: This note proposes a distributed architecture (based on cell-like P systems, with their skin membranes communicating through channels as in tissue-like P systems, according to specified rules of the antiport type), where parts of a problem can be introduced as inputs in various components and then processed in parallel.
Abstract: Although P systems are distributed parallel computing devices, no explicit way of handling the input in a distributed manner had been considered in this framework so far. This note proposes a distributed architecture (based on cell-like P systems, with their skin membranes communicating through channels as in tissue-like P systems, according to specified rules of the antiport type), where parts of a problem can be introduced as inputs in various components and then processed in parallel. The respective devices are called dP systems, with the string-accepting case called dP automata. The communication complexity can be evaluated in various ways: statically (counting the communication rules in a dP system which solves a given problem), or dynamically (counting the number of communication steps, of communication rules used in a computation, or the number of objects communicated). For each measure, two notions of "parallelizability" can be introduced. Besides (informal) definitions, some illustrations of these ideas are provided for dP automata: each regular language is "weakly parallelizable" (i.e., it can be recognized in this framework using a constant number of communication steps), and there are languages of various types with respect to the Chomsky hierarchy which are "efficiently parallelizable" (they are parallelizable and, moreover, are accepted faster by a dP automaton than by a single P automaton). Several suggestions for further research are made.

61 citations
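The claim that every regular language is "weakly parallelizable" rests on a standard fact: a DFA run splits into chunk-local transition summaries that compose with a constant number of communication steps. A minimal sketch of that fact (not the paper's dP construction; the even-number-of-a's DFA is an assumed toy example):

```python
def transition_map(delta, states, chunk):
    # Summarize a chunk: for every possible entry state, the exit state.
    f = {q: q for q in states}
    for c in chunk:
        f = {q: delta[(f[q], c)] for q in states}
    return f

def parallel_accepts(word, delta, states, q0, finals, k=3):
    n = len(word)
    chunks = [word[i * n // k:(i + 1) * n // k] for i in range(k)]
    # Each summary can be computed by a separate component, in parallel.
    maps = [transition_map(delta, states, ch) for ch in chunks]
    q = q0
    for f in maps:          # k - 1 composition ("communication") steps
        q = f[q]
    return q in finals

# Assumed toy DFA: words over {a, b} with an even number of a's.
delta = {(0, 'a'): 1, (1, 'a'): 0, (0, 'b'): 0, (1, 'b'): 1}
```

The number of composition steps depends only on the number of components, not on the input length, which is the sense of "constant number of communication steps" above.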



Book ChapterDOI
01 Jan 2010
TL;DR: The main research interest is in understanding the types of rules that structure acoustic signals and cognition in a wide variety of different vertebrate species, with the hope that this will provide a better basis for understanding the biology and evolution of the computational subsystems involved in human language.
Abstract: Introduction There has recently been a resurgence of scientific interest concerning the importance of various types of phrase structure in human language, and their potential presence in other species (Hauser, Chomsky, and Fitch 2002; Fitch and Hauser 2004; Everett 2005; Fitch, Hauser, and Chomsky 2005; Pinker and Jackendoff 2005; Gentner et al. 2006). Following Hauser, Chomsky, and Fitch (2002, HCF hereafter), many of these recent discussions have used the term "recursion," but the term has rarely been defined explicitly. There are several possible interpretations of this word, which is used somewhat differently in different disciplines, without there being one universally accepted scientific definition. With the recent advent of intense inter-disciplinary discussion of these issues, it has become clear that several different interpretations of the term are being used interchangeably. We seem to have reached a point where serious misunderstandings are in danger of propagating through the literature. The main purpose of this paper is to clarify these different meanings, and in the process to examine the implications of recent and ongoing experiments for different types of grammars in animals and humans. My main research interest is in understanding the types of rules that structure acoustic signals and cognition in a wide variety of different vertebrate species, with the hope that this will provide a better basis for understanding the biology and evolution of the computational subsystems involved in human language.

47 citations


Book ChapterDOI
31 Dec 2010

23 citations


Journal ArticleDOI
01 Jan 2010
TL;DR: The paper devises a time- and space-efficient incremental arc-consistency algorithm for context-free grammars, investigates when logic combinations of grammar constraints are tractable, and shows how to exploit non-constant-size grammars and reorderings of languages.
Abstract: With the introduction of the Regular Membership Constraint, a new line of research has opened where constraints are based on formal languages. This paper is taking the next step, namely to investigate constraints based on grammars higher up in the Chomsky hierarchy. We devise a time- and space-efficient incremental arc-consistency algorithm for context-free grammars, investigate when logic combinations of grammar constraints are tractable, show how to exploit non-constant size grammars and reorderings of languages, and study where the boundaries run between regular, context-free, and context-sensitive grammar filtering.

21 citations
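The context-free filtering described above builds on CYK-style dynamic programming over a grammar in Chomsky normal form. A minimal membership check (not the paper's incremental arc-consistency propagator), with an assumed toy grammar for {a^n b^n : n >= 1}:

```python
def cyk(word, start, unary, binary):
    # CYK membership test for a grammar in Chomsky normal form.
    n = len(word)
    if n == 0:
        return False
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, c in enumerate(word):
        table[i][i] = {A for A, t in unary if t == c}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for A, (B, C) in binary:
                    if B in table[i][k] and C in table[k + 1][j]:
                        table[i][j].add(A)
    return start in table[0][n - 1]

# Assumed toy CNF grammar for {a^n b^n : n >= 1}:
# S -> AB | AC, C -> SB, A -> a, B -> b
unary = [('A', 'a'), ('B', 'b')]
binary = [('S', ('A', 'B')), ('S', ('A', 'C')), ('C', ('S', 'B'))]
```

An incremental propagator maintains such a table under domain changes rather than recomputing it, which is where the time and space savings come from.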


Journal ArticleDOI
01 Jan 2010
TL;DR: In this paper, a restricted version of restarting automata, called clearing restarting automata, is proposed as a model for analysis by reduction, a linguistically motivated method for checking the correctness of a sentence.
Abstract: Restarting automata were introduced as a model for analysis by reduction, which is a linguistically motivated method for checking correctness of a sentence. We propose a new restricted version of restarting automata called clearing restarting automata with a very simple definition but simultaneously with interesting properties with respect to their possible applications. The new model can be learned very efficiently from positive examples and its stronger version can be used to learn effectively a large class of languages. We relate the class of languages recognized by clearing restarting automata to the Chomsky hierarchy.

18 citations
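A clearing restarting automaton repeatedly deletes ("clears") a factor that occurs between specified left and right contexts, accepting exactly the words that reduce to the empty word. A brute-force recognizer along these lines, with assumed toy instructions for {a^n b^n : n >= 1} (the model's bounds on context lengths are not modelled here):

```python
def accepts_clra(word, instructions):
    # An instruction (x, z, y) clears factor z when it occurs between
    # contexts x and y; '^' and '$' act as end sentinels. A word is
    # accepted iff some reduction sequence reaches the empty word.
    seen, stack = set(), ['^' + word + '$']
    while stack:
        u = stack.pop()
        if u == '^$':
            return True
        if u in seen:
            continue
        seen.add(u)
        for x, z, y in instructions:
            i = u.find(x + z + y)
            while i != -1:
                stack.append(u[:i + len(x)] + u[i + len(x) + len(z):])
                i = u.find(x + z + y, i + 1)
    return False

# Assumed toy instructions recognizing {a^n b^n : n >= 1}:
rules = [('a', 'ab', 'b'), ('^', 'ab', '$')]
```

Because a step only ever deletes symbols, positive examples directly expose usable reductions, which is what makes the model easy to learn from positive data.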


Book ChapterDOI
24 May 2010
TL;DR: This work defines context-free grammars whose non-terminals correspond to the syntactic congruence classes, and a residuated lattice structure built from the Galois connection between strings and contexts, which allows the definition of a class of languages that includes some non-context-free languages, many context-free languages, and all regular languages.
Abstract: Learnability is a vital property of formal grammars: representation classes should be defined in such a way that they are learnable. One way to build learnable representations is by making them objective or empiricist: the structure of the representation should be based on the structure of the language. Rather than defining a function from representation to language we should start by defining a function from the language to the representation: following this strategy gives classes of representations that are easy to learn. We illustrate this approach with three classes, defined in analogy to the lowest three levels of the Chomsky hierarchy. First, we recall the canonical deterministic finite automaton, where the states of the automaton correspond to the right congruence classes of the language. Secondly, we define context-free grammars where the non-terminals of the grammar correspond to the syntactic congruence classes, and where the productions are defined by the syntactic monoid; finally we define a residuated lattice structure from the Galois connection between strings and contexts, which we call the syntactic concept lattice, and base a representation on this, which allows us to define a class of languages that includes some non-context-free languages, many context-free languages and all regular languages. All three classes are efficiently learnable under suitable learning paradigms.

15 citations
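The first of the three constructions, the canonical DFA whose states are the right congruence (Myhill-Nerode) classes, can be sketched from a membership oracle by distinguishing prefixes with a finite set of test suffixes. The test set and the toy language here are assumptions of this sketch, not part of the formal construction:

```python
from itertools import product

def canonical_dfa(lang, alphabet, tests):
    # Identify a prefix u by its signature: the set of test suffixes t
    # with u + t in L. Equal signatures = same right congruence class.
    sig = lambda u: frozenset(t for t in tests if lang(u + t))
    reps = {sig(''): ''}            # signature -> shortest representative
    delta, queue = {}, ['']
    while queue:
        u = queue.pop(0)
        for c in alphabet:
            s = sig(u + c)
            if s not in reps:
                reps[s] = u + c
                queue.append(u + c)
            delta[(sig(u), c)] = s
    return reps, delta

# Toy regular language: even number of 'a's; test suffixes up to length 2.
is_in_L = lambda w: w.count('a') % 2 == 0
tests = [''.join(p) for n in range(3) for p in product('ab', repeat=n)]
reps, delta = canonical_dfa(is_in_L, 'ab', tests)
```

For a regular language and a sufficiently separating test set, the resulting automaton is exactly the minimal DFA, which is why the representation follows from the language rather than the other way around.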


Journal ArticleDOI
TL;DR: It is proved that twelve nonterminals are enough for cooperating distributed grammar systems working in the terminal derivation mode with two left-forbidding components (including erasing productions) to characterize the family of recursively enumerable languages.

13 citations


BookDOI
08 Apr 2010
TL;DR: This volume collects papers on the genesis of generative grammar and on current issues in language description, ranging from the historical role of genitives in the emergence of DP to movement and control in the English noun phrase and the word pairs in Chaucer's verse in comparison with those in his prose.
Abstract:
1. Preface (by Roberts, Ian)
2. Part 1: Genesis of generative grammar
3. Systems of syntactic analysis (by Chomsky, Noam)
4. Some methodological remarks on generative grammar (by Chomsky, Noam)
5. Knowledge of language: Its elements and origins (by Chomsky, Noam)
6. Part 2: Current issues in language descriptions
7. Germanic passive constructions (by Askedal, John Ole)
8. Prosodic constraints on Old English alliteration (by Fujiwara, Yasuaki)
9. The historical role of genitives in the emergence of DP (by Miyamae, Kazuyo)
10. The word pairs in Chaucer's verse in comparison with those in his prose (by Tani, Akinobu)
11. A short note on movement and control in the English noun phrase (by Hamamatsu, Junji)
12. Coordinating and subordinating conjunctions in spoken American English (by Iyeiri, Yoko)
13. Complement capacities in German: Three types of complements (by Hosaka, Yasuhito)
14. Index of names
15. Index of subjects
16. Editors & contributors

10 citations


Journal ArticleDOI
TL;DR: A result related to word composition is presented, showing that for every two arbitrary words there exist two other words such that their composition (two by two) is amiable.
Abstract: Restricting the Parikh matrix mapping to a language rather than to an alphabet raises a set of problems that seem interesting to us. Moreover, amiability (or M-equivalence, as it is named by other authors) is exploited in order to characterise certain types of languages. The paper also proposes a series of results establishing relations between classes of languages defined by Parikh matrix mappings and the Chomsky hierarchy. Finally, a result related to word composition concludes the paper, showing that for every two arbitrary words there exist two other words such that their composition (two by two) is amiable.

9 citations
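For a fixed ordering of the alphabet, the Parikh matrix mapping sends the q-th letter to the (k+1)x(k+1) identity matrix with an extra 1 in position (q, q+1), and a word to the product of its letters' matrices; two words are amiable (M-equivalent) exactly when their matrices coincide. A direct computation, with "abba"/"baab" as an assumed example pair:

```python
def parikh_matrix(word, alphabet):
    # Word -> product of elementary letter matrices, computed in place.
    k = len(alphabet)
    pos = {c: i for i, c in enumerate(alphabet)}
    M = [[int(i == j) for j in range(k + 1)] for i in range(k + 1)]
    for c in word:
        q = pos[c]
        for row in M:          # right-multiply by I + E_{q,q+1}:
            row[q + 1] += row[q]   # adds column q into column q+1
    return M
```

Over {a, b}, entry (0, 1) counts a's, entry (1, 2) counts b's, and entry (0, 2) counts occurrences of ab as a scattered subword, so "abba" and "baab" get the same matrix while "ab" and "ba" do not.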


Proceedings Article
01 Jan 2010
TL;DR: A variant of Gold-style learners that is not required to infer precise descriptions of the languages in a class, but that must find descriptive patterns, i.e., optimal generalisations within a class of pattern languages is introduced.
Abstract: In the present paper, we introduce a variant of Gold-style learners that is not required to infer precise descriptions of the languages in a class, but that must find descriptive patterns, i.e., optimal generalisations within a class of pattern languages. Our first main result characterises those indexed families of recursive languages that can be inferred by such learners, and we demonstrate that this characterisation shows enlightening connections to Angluin's corresponding result for exact inference. Furthermore, this result reveals that our model can be interpreted as an instance of a natural extension of Gold's model of language identification in the limit. Using a notion of descriptiveness that is restricted to the natural subclass of terminal-free E-pattern languages, we introduce a generic inference strategy, and our second main result characterises those classes of languages that can be generalised by this strategy. This characterisation demonstrates that there are major classes of languages that can be generalised in our model, but not be inferred by a normal Gold-style learner. Our corresponding technical considerations lead to insights of intrinsic interest into combinatorial and algorithmic properties of pattern languages.
Highlights:
- We study limit learners that output descriptive patterns instead of exact grammars.
- Inferrability of indexed families depends on concepts related to Angluin's telltales.
- Our model is shown to be an instance of a natural extension of Gold-style learning.
- Terminal-free E-descriptive patterns can be inferred for rich classes of languages.
- Our proofs make use of novel insights into combinatorial properties of such patterns.
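Since the paper works with E-pattern languages (variables may be substituted by any word, including the empty word), a brute-force membership test conveys the underlying notion; this is an illustrative sketch only, as membership for pattern languages is NP-complete in general:

```python
def in_e_pattern_language(word, pattern):
    # Uppercase letters are variables (erasable), lowercase are terminals.
    def match(p, w, subst):
        if not p:
            return not w
        c, rest = p[0], p[1:]
        if c.islower():                      # terminal: must match exactly
            return bool(w) and w[0] == c and match(rest, w[1:], subst)
        if c in subst:                       # bound variable: reuse binding
            v = subst[c]
            return w.startswith(v) and match(rest, w[len(v):], subst)
        return any(match(rest, w[i:], {**subst, c: w[:i]})
                   for i in range(len(w) + 1))  # try every binding, incl. empty
    return match(pattern, word, {})
```

For example, the terminal-free pattern XX generates exactly the squares (including the empty word), while XbX requires a repeated factor around a fixed b.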

Journal ArticleDOI
Kaoru Fujioka1
TL;DR: The Chomsky–Schützenberger representation theorem is improved and it is shown that each context-free language L can be represented in the form L = h(D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism.
Abstract: In this paper, we obtain refinements of representation theorems for context-free languages by using Dyck languages, insertion systems, strictly locally testable languages, and morphisms. For instance, we improve the Chomsky–Schützenberger representation theorem and show that each context-free language L can be represented in the form L = h(D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism. A similar representation for context-free languages can be obtained using insertion systems of weight (3, 0) and strictly 4-testable languages.
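The shape L = h(D ∩ R) can be checked on a toy scale: with the one-pair Dyck language D, the regular constraint R = [*]*, and the morphism h([) = a, h(]) = b, the image is {a^n b^n : n >= 0}. This instance is an assumed illustration; the R of the theorem is strictly 3-testable, while the sketch simply uses an ordinary regular expression:

```python
import itertools
import re

def is_dyck(w):
    # Balanced over the single bracket pair [ ].
    depth = 0
    for c in w:
        depth += 1 if c == '[' else -1
        if depth < 0:
            return False
    return depth == 0

h = {'[': 'a', ']': 'b'}        # the morphism, applied letter by letter
R = re.compile(r'\[*\]*\Z')     # regular constraint: [* ]*

def image(max_len):
    # Enumerate h(D ∩ R) up to a length bound on the Dyck words.
    out = set()
    for n in range(max_len + 1):
        for tup in itertools.product('[]', repeat=n):
            w = ''.join(tup)
            if is_dyck(w) and R.match(w):
                out.add(''.join(h[c] for c in w))
    return out
```

The intersection discards Dyck words such as [][], so only the nested ones survive and map onto a^n b^n.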

Journal ArticleDOI
TL;DR: In this paper, the authors highlight an individualist streak in both Davidson's and Chomsky's notions of language and compare them with a view of language as a social and historical reality that cannot be exhausted by any formal theory and cannot be reduced to properties of individual speakers.
Abstract: The aim of this paper is to highlight an individualist streak in both Davidson’s conception of language and Chomsky’s. In the first part of the paper, I argue that in Davidson’s case this individualist streak is a consequence of an excessively strong conception of what the compositional nature of linguistic meaning requires, and I offer a weaker conception of that requirement that can do justice to both the publicity and the compositionality of language. In the second part of the paper, I offer a comparison between Davidson’s position on the unreality of public languages, and Chomsky’s position regarding the epiphenomenal status of “externalized” languages. In Chomsky’s case, as in Davidson’s, languages are individuated in terms of the formal theories that serve to account for their systematic structure, and this assumption rests upon a similarly strong and similarly questionable understanding of what it is to employ finite means in pursuit of an infinite task. The alternative, at which I can only hint, is a view of language as a social and historical reality, i.e., a realm of social fact that cannot be exhausted by any formal theory and cannot be reduced to properties of individual speakers.

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the relationship between the language organ, universal grammar, and natural language, and present all the requirements proposed by Chomsky that should be fulfilled by linguistics if it is to be considered an empirical science.
Abstract: According to Chomsky, linguistics should be treated as an empirical science. The foundations of language, if it is considered as a discrete and infinite communication tool, are provided by the specifically human biological system described by Chomsky as the faculty of language in the narrow sense. The faculty of language in the broad sense, on the other hand, embraces all the mechanisms that take part in language production but are not exclusively human. The properties of language are a starting point for the discussion of the biological aspects of Chomsky's theory. Subsequently, I describe the theory of universal grammar. After that, issues of the language organ, the modular character of the mind, and the narrow and broad senses of the theory of the faculty of language are introduced. Next, I present all the requirements proposed by Chomsky that should be fulfilled by linguistics if it is to be considered an empirical science. Finally, I summarize the relationships between the language organ, universal grammar, and natural language.

Book ChapterDOI
07 Jun 2010
TL;DR: A new family of stateless (non-deterministic) pushdown automata is used to accept the languages of the Chomsky hierarchy; allowing a kind of λ-transition, the automata accept any recursively enumerable language.
Abstract: In this paper a new family of stateless (non-deterministic) pushdown automata is used to accept the languages of the Chomsky hierarchy. Having only a stack with at most 1 symbol, the regular languages can be recognized. The usual pushdown automata accept the context-free languages. The extended version, which uses additional half-translucent shadow symbols, accepts the context-sensitive languages. Finally, allowing a kind of λ-transition, the automata accept any recursively enumerable language.
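The stack-only discipline can be sketched at the context-free level: with no finite-state control, a move depends only on the input symbol and the top of the stack, and acceptance is by empty stack. The Greibach-style rules below for {a^n b^n : n >= 1} are an assumed toy instance (the half-translucent shadow symbols and λ-transitions of the paper are not modelled):

```python
def run_stateless_pda(word, rules, start='S'):
    # No finite-state control: a configuration is just the stack, kept as
    # a string with the top as the last character. Accept by empty stack.
    configs = {start}
    for c in word:
        nxt = set()
        for stack in configs:
            if not stack:
                continue
            for repl in rules.get((c, stack[-1]), []):
                nxt.add(stack[:-1] + repl)   # pop top, push replacement
        configs = nxt
    return '' in configs

# Greibach-style rules for {a^n b^n : n >= 1} (assumed toy instance):
rules = {('a', 'S'): ['BS', 'B'],   # S -> aSB | aB (written bottom..top)
         ('b', 'B'): ['']}          # B -> b
```

Restricting the stack to at most one symbol turns the stack symbol into a de facto state, which is why the regular languages fall out of the same machinery.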

01 Jan 2010
TL;DR: Results on growing context-sensitive grammars are reviewed and some open problems are proposed; various characterizations of the model, an efficient recognition algorithm, and the properties of its deterministic variant justify its practical value.
Abstract: Growing context-sensitive grammars were introduced in 1986 as a restricted variant of context-sensitive grammars in which all productions are length-increasing. Several interesting properties of these grammars have been shown since then, including polynomial time complexity of the membership problem and machine-model characterizations. Various characterizations of the model, an efficient recognition algorithm, and the properties of its deterministic variant (which possesses a characterization by string-rewriting systems) justify its practical value. Moreover, as pointed out by McNaughton in 1999, growing context-sensitive grammars complement the Chomsky hierarchy in a very natural way. This article reviews results on this topic and proposes some open problems.