
Chomsky hierarchy

About: Chomsky hierarchy is a research topic. Over the lifetime, 601 publications have been published within this topic receiving 31067 citations. The topic is also known as: Chomsky–Schützenberger hierarchy.
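For orientation, the hierarchy orders language classes as regular ⊂ context-free ⊂ context-sensitive ⊂ recursively enumerable. The following Python sketch (an illustration added here, not taken from any of the papers below) contrasts the first two levels: (ab)* is regular and accepted by a two-state DFA, while {a^n b^n} is context-free but provably not regular.

```python
def is_regular_ab_star(s: str) -> bool:
    """DFA for the regular language (ab)*: state 0 accepts, state 1 does not."""
    state = 0
    for ch in s:
        if state == 0 and ch == "a":
            state = 1
        elif state == 1 and ch == "b":
            state = 0
        else:
            return False          # any other transition is undefined: reject
    return state == 0

def is_anbn(s: str) -> bool:
    """Recognizer for {a^n b^n : n >= 0}, a context-free language that
    no finite automaton can accept (by the pumping lemma)."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * n + "b" * n
```

The unbounded counting needed for a^n b^n is exactly what a finite automaton lacks; the pumping lemma makes that separation precise.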


Papers
Journal ArticleDOI
01 Feb 1991
TL;DR: In this paper, it is shown that if a semantic mapping is monotonic, and formal systems consisting of at most n expressions that are reduced with respect to a finite set X can define only finitely many concepts for any X and n, then the class of concepts defined by such formal systems is inferable from positive data.
Abstract: A formal system is a finite set of expressions, such as a grammar or a Prolog program. A semantic mapping from formal systems to concepts is said to be monotonic if it maps larger formal systems to larger concepts. A formal system Γ is said to be reduced with respect to a finite set X if the concept defined by Γ contains X but the concepts defined by any proper subset Γ′ of Γ cannot contain some part of X. Assume a semantic mapping is monotonic and formal systems consisting of at most n expressions that are reduced with respect to X can define only finitely many concepts for any finite set X and any n. Then, the class of concepts defined by formal systems consisting of at most n expressions is shown to be inferable from positive data. As corollaries, the class of languages defined by length-bounded elementary formal systems consisting of at most n axioms, the class of languages generated by context-sensitive grammars consisting of at most n productions, and the class of minimal models of linear Prolog programs consisting of at most n definite clauses are all shown to be inferable from positive data.
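The learning setting in this abstract is Gold-style identification in the limit from positive data. The minimal Python sketch below (a hypothetical illustration, not the paper's algorithm; the hypothesis class H of finite languages is invented for the example) shows a learner that conjectures the first enumerated hypothesis whose language contains every example seen so far.

```python
def learner(hypotheses, examples):
    """hypotheses: list of (name, set_of_strings), in enumeration order.
    examples: iterable of positive data. Yields the learner's conjecture
    after each new example: the first hypothesis consistent with the data."""
    seen = set()
    for x in examples:
        seen.add(x)
        for name, lang in hypotheses:
            if seen <= lang:      # every observed string is in this language
                yield name
                break

# A tiny invented class of three finite languages; the learner converges
# to L2 once it sees a string that rules the smaller language out.
H = [("L1", {"a"}), ("L2", {"a", "ab"}), ("L3", {"a", "ab", "abb"})]
print(list(learner(H, ["a", "ab", "ab"])))   # conjectures: L1, L2, L2
```

Enumerating hypotheses from smaller to larger is what prevents overgeneralization from positive-only data, mirroring the role of the reducedness and finiteness conditions in the theorem.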

65 citations

Journal ArticleDOI
TL;DR: This paper gives a progression of automata and shows that it corresponds exactly to the language hierarchy defined by control grammars, whose first level is the class of context-free languages.

63 citations

BookDOI
01 Jan 1987
TL;DR: An Elementary Proof of the Peters-Ritchie Theorem and Computationally Relevant Properties of Natural Languages and Their Grammars are presented.
Abstract: Prologue.- What is Mathematical Linguistics?.- I. Early Nontransformational Grammar.- Introduction to Part I.- Formal Linguistics and Formal Logic.- An Elementary Proof of the Peters-Ritchie Theorem.- On Constraining the Class of Transformational Languages.- Generative Grammars without Transformation Rules - A Defense of Phrase Structure.- A Program for Syntax.- II. Modern Context-Free-Like Models.- Introduction to Part II.- Natural Languages and Context-Free Languages.- Unbounded Dependency and Coordinate Structure.- On Some Formal Properties of Metarules.- Some Generalizations of Categorial Grammars.- III. More than Context-Free and Less than Transformational Grammar.- Introduction to Part III.- Cross-serial Dependencies in Dutch.- Evidence Against the Context-Freeness of Natural Language.- English is not a Context-Free Language.- The Complexity of the Vocabulary of Bambara.- Context-Sensitive Grammar and Natural Language Syntax.- How Non-Context Free is Variable Binding?.- Epilogue.- Computationally Relevant Properties of Natural Languages and Their Grammars.- Index of Languages.- Name Index.

62 citations

01 Jan 2010
TL;DR: This note proposes a distributed architecture (based on cell-like P systems, with their skin membranes communicating through channels as in tissue-likeP systems, according to specified rules of the antiport type), where parts of a problem can be introduced as inputs in various components and then processed in parallel.
Abstract: Although P systems are distributed parallel computing devices, no explicit way of handling the input in a distributed manner has been considered in this framework so far. This note proposes a distributed architecture (based on cell-like P systems, with their skin membranes communicating through channels as in tissue-like P systems, according to specified rules of the antiport type), where parts of a problem can be introduced as inputs in various components and then processed in parallel. The respective devices are called dP systems, with the string-accepting case called dP automata. The communication complexity can be evaluated in various ways: statically (counting the communication rules in a dP system which solves a given problem), or dynamically (counting the number of communication steps, the number of communication rules used in a computation, or the number of objects communicated). For each measure, two notions of "parallelizability" can be introduced. Besides (informal) definitions, some illustrations of these ideas are provided for dP automata: each regular language is "weakly parallelizable" (i.e., it can be recognized in this framework using a constant number of communication steps), and there are languages at various levels of the Chomsky hierarchy which are "efficiently parallelizable" (they are parallelizable and, moreover, are accepted faster by a dP automaton than by a single P automaton). Several suggestions for further research are made.

61 citations

Journal ArticleDOI
TL;DR: It is argued that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for why all phonological patterns are regular while not all syntactic patterns are.
Abstract: An important distinction between phonology and syntax has been overlooked. All phonological patterns belong to the regular region of the Chomsky Hierarchy, but not all syntactic patterns do. We argue that the hypothesis that humans employ distinct learning mechanisms for phonology and syntax currently offers the best explanation for this difference.
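Since phonological patterns are claimed to lie in the regular region, a finite-state device suffices to check any one of them. The toy DFA below is my example, not the paper's; using "s" versus "S" to stand in for the sibilants [s] and [ʃ] is an assumed encoding. It enforces a sibilant-harmony constraint: all sibilants in a word must agree.

```python
def sibilant_harmony(word: str) -> bool:
    """Finite-state check: accept iff all sibilants in the word agree.
    state None = no sibilant seen yet; otherwise the required sibilant."""
    state = None
    for ch in word:
        if ch in "sS":
            if state is None:
                state = ch            # first sibilant fixes the harmony class
            elif ch != state:
                return False          # disagreeing sibilant: reject
    return True
```

Because the check needs only finitely many states (here effectively three), the pattern is regular, which is the kind of property the abstract attributes to phonology in general.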

61 citations


Network Information
Related Topics (5)
Rule-based machine translation: 8.8K papers, 240.5K citations, 72% related
Syntax: 16.7K papers, 518.6K citations, 71% related
Time complexity: 36K papers, 879.5K citations, 71% related
Type (model theory): 38.9K papers, 670.5K citations, 70% related
Semantics: 24.9K papers, 653K citations, 70% related
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2023  2
2022  3
2021  9
2020  8
2019  12
2018  10