scispace - formally typeset
Topic

Chomsky hierarchy

About: Chomsky hierarchy is a research topic. Over the lifetime, 601 publications have been published within this topic receiving 31067 citations. The topic is also known as: Chomsky–Schützenberger hierarchy.


Papers
Journal ArticleDOI
TL;DR: Uniquely parsable grammars (UPGs) are phrase structure grammars with a restricted type of rewriting rules that can be parsed without backtracking; four subclasses of UPGs form a deterministic counterpart of the classical Chomsky hierarchy.
Abstract: We introduce a new class of grammars called uniquely parsable grammars (UPGs). A UPG is a kind of phrase structure grammar having a restricted type of rewriting rules, where parsing can be performed without backtracking. We show that, in spite of this restriction on the rules, UPGs are universal in their generating ability. We then define three subclasses of UPGs. They are M-UPGs (monotonic UPGs), RC-UPGs (UPGs with right-terminating and context-free-like rules), and REG-UPGs (regular UPGs). It is proved that the generating abilities of the classes of M-UPGs, RC-UPGs, and REG-UPGs are exactly characterized by the classes of deterministic linear-bounded automata, deterministic pushdown automata, and deterministic finite automata, respectively. In particular, the class of RC-UPGs gives a very simple grammatical characterization of the class of deterministic context-free languages. Thus, these four classes form a deterministic counterpart of the classical Chomsky hierarchy.
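To make the lowest level of this deterministic correspondence concrete: the abstract states that REG-UPGs are exactly characterized by deterministic finite automata. The sketch below (my own illustration, not taken from the paper) runs a hypothetical DFA for the regular language (ab)* — at each step there is at most one applicable transition, so recognition proceeds without backtracking, mirroring the unique-parsability idea.

```python
# Illustrative DFA for the regular language (ab)* over the alphabet {a, b}.
# States: 0 = expecting 'a' (accepting), 1 = expecting 'b'.
def run_dfa(word):
    transitions = {(0, 'a'): 1, (1, 'b'): 0}
    state = 0
    for symbol in word:
        key = (state, symbol)
        if key not in transitions:
            return False  # no transition defined: reject deterministically
        state = transitions[key]
    return state == 0  # accept only in the accepting state

print(run_dfa("abab"))  # True
print(run_dfa("aab"))   # False
```

Because the transition function is a partial function (at most one move per state/symbol pair), membership is decided in a single left-to-right pass.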

15 citations

Book ChapterDOI
03 Sep 1979

15 citations

Journal ArticleDOI
01 May 2019-Synthese
TL;DR: This paper argues that a central misconstrual of the formal apparatus of recursive operations, such as the set-theoretic operation merge, has led to a mathematisation of the object of inquiry, producing a strong analogy with discrete mathematics and especially arithmetic.
Abstract: The concept of linguistic infinity has had a central role to play in foundational debates within theoretical linguistics since its more formal inception in the mid-twentieth century. The conceptualist tradition, marshalled in by Chomsky and others, holds that infinity is a core explanandum and a link to the formal sciences. Realism/Platonism takes this further to argue that linguistics is in fact a formal science with an abstract ontology. In this paper, I argue that a central misconstrual of formal apparatus of recursive operations such as the set-theoretic operation merge has led to a mathematisation of the object of inquiry, producing a strong analogy with discrete mathematics and especially arithmetic. The main product of this error has been the assumption that natural, like some formal, languages are discretely infinite. I will offer an alternative means of capturing the insights and observations related to this posit in terms of scientific modelling. My chief aim will be to draw from the larger philosophy of science literature in order to offer a position of grammars as models compatible with various foundational interpretations of linguistics while being informed by contemporary ideas on scientific modelling for the natural and social sciences.

15 citations

Journal ArticleDOI
17 Apr 2015-PLOS ONE
TL;DR: This study extends previous findings by demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm, and takes these results as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes.
Abstract: This study investigated whether formal complexity, as described by the Chomsky Hierarchy, corresponds to cognitive complexity during language learning. According to the Chomsky Hierarchy, nested dependencies (context-free) are less complex than cross-serial dependencies (mildly context-sensitive). In two artificial grammar learning (AGL) experiments participants were presented with a language containing either nested or cross-serial dependencies. A learning effect for both types of dependencies could be observed, but no difference between dependency types emerged. These behavioral findings do not seem to reflect complexity differences as described in the Chomsky Hierarchy. This study extends previous findings in demonstrating learning effects for nested and cross-serial dependencies with more natural stimulus materials in a classical AGL paradigm after only one hour of exposure. The current findings can be taken as a starting point for further exploring the degree to which the Chomsky Hierarchy reflects cognitive processes.
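The contrast the study tests can be made explicit with a small sketch (my own illustration; the experiments used different, more natural stimuli). Nested dependencies pair elements mirror-wise (a1 a2 b2 b1), which a context-free grammar can generate; cross-serial dependencies pair them in the same left-to-right order (a1 a2 b1 b2), which requires at least mildly context-sensitive power.

```python
# Generate schematic dependency strings of the two types compared in AGL work.
def nested(pairs):
    a_items = [f"a{i}" for i in range(1, pairs + 1)]
    b_items = [f"b{i}" for i in range(pairs, 0, -1)]  # reversed: mirror pairing
    return " ".join(a_items + b_items)

def cross_serial(pairs):
    a_items = [f"a{i}" for i in range(1, pairs + 1)]
    b_items = [f"b{i}" for i in range(1, pairs + 1)]  # same order: crossing pairing
    return " ".join(a_items + b_items)

print(nested(2))        # a1 a2 b2 b1
print(cross_serial(2))  # a1 a2 b1 b2
```

The formal difference is only in the order of the b-elements, which is why comparable learning effects for both types are notable.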

15 citations

Book ChapterDOI
24 May 2010
TL;DR: This work defines context-free grammars whose non-terminals correspond to syntactic congruence classes, and builds a residuated lattice structure from the Galois connection between strings and contexts, yielding a representation class that covers some non-context-free languages, many context-free languages, and all regular languages.
Abstract: Learnability is a vital property of formal grammars: representation classes should be defined in such a way that they are learnable. One way to build learnable representations is by making them objective or empiricist: the structure of the representation should be based on the structure of the language. Rather than defining a function from representation to language we should start by defining a function from the language to the representation: following this strategy gives classes of representations that are easy to learn. We illustrate this approach with three classes, defined in analogy to the lowest three levels of the Chomsky hierarchy. First, we recall the canonical deterministic finite automaton, where the states of the automaton correspond to the right congruence classes of the language. Secondly, we define context-free grammars where the non-terminals of the grammar correspond to the syntactic congruence classes, and where the productions are defined by the syntactic monoid; finally we define a residuated lattice structure from the Galois connection between strings and contexts, which we call the syntactic concept lattice, and base a representation on this, which allows us to define a class of languages that includes some non-context-free languages, many context-free languages and all regular languages. All three classes are efficiently learnable under suitable learning paradigms.
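The first construction mentioned in the abstract — states of the canonical DFA as right congruence (Myhill–Nerode) classes — can be approximated by brute force. The sketch below is my own rough illustration, not the paper's algorithm: it compares prefixes by their residuals over all extensions up to a bounded length, for the regular language of strings over {a, b} with an even number of a's.

```python
from itertools import product

# Membership test for the example language: even number of 'a's.
def in_language(word):
    return word.count('a') % 2 == 0

# Approximate the right-congruence class of a prefix by recording which
# bounded extensions w make (prefix + w) a member of the language.
def residual_signature(prefix, alphabet="ab", max_len=4):
    sig = []
    for n in range(max_len + 1):
        for w in product(alphabet, repeat=n):
            sig.append(in_language(prefix + "".join(w)))
    return tuple(sig)

# Group sample prefixes by signature; each group is (an approximation of)
# one state of the canonical DFA.
prefixes = ["", "a", "b", "aa", "ab", "ba", "aab"]
classes = {}
for p in prefixes:
    classes.setdefault(residual_signature(p), []).append(p)

print(len(classes))  # 2 — the canonical DFA for this language has two states
```

Bounding the extension length makes this only an approximation of the true congruence, but for this two-state language the bounded signatures already separate the classes correctly.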

15 citations


Network Information
Related Topics (5)
Rule-based machine translation
8.8K papers, 240.5K citations
72% related
Syntax
16.7K papers, 518.6K citations
71% related
Time complexity
36K papers, 879.5K citations
71% related
Type (model theory)
38.9K papers, 670.5K citations
70% related
Semantics
24.9K papers, 653K citations
70% related
Performance Metrics
No. of papers in the topic in previous years

Year  Papers
2023  2
2022  3
2021  9
2020  8
2019  12
2018  10