Topic

Chomsky hierarchy

About: Chomsky hierarchy is a research topic. Over the lifetime, 601 publications have been published within this topic receiving 31067 citations. The topic is also known as: Chomsky–Schützenberger hierarchy.


Papers
Proceedings ArticleDOI
05 May 1975
TL;DR: The main goal has been to describe languages which, for instance, are not context-free but are still context-sensitive, without using the powerful and complex concept of context-sensitive grammars.
Abstract: The so-called Chomsky hierarchy [5], consisting of regular, context-free, context-sensitive, and recursively enumerable languages, does not account for many “real world” classes of languages, e.g., programming languages and natural languages [4]. This is one of the reasons why many attempts have been made to “refine” the original Chomsky classification. The main goal has been to describe languages which, for instance, are not context-free but are still context-sensitive, without using the powerful and complex concept of context-sensitive grammars.

4 citations
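
The classic witness for the gap this abstract targets, a language that is context-sensitive but not context-free, is { a^n b^n c^n : n >= 1 }. The short Python sketch below enumerates the strings derivable from a textbook non-contracting grammar for that language; the grammar and the enumerator are illustrative background, not material from the paper.

# Textbook non-contracting grammar for { a^n b^n c^n : n >= 1 }, the standard
# example of a context-sensitive language that is not context-free. Uppercase
# letters are nonterminals, lowercase letters are terminals.
RULES = [
    ("S", "aSBC"), ("S", "aBC"),   # lay down a's with matching B and C markers
    ("CB", "BC"),                  # sort markers so all B's precede all C's
    ("aB", "ab"), ("bB", "bb"),    # rewrite B markers into b's, left to right
    ("bC", "bc"), ("cC", "cc"),    # rewrite C markers into c's, left to right
]

def generate(max_len=9):
    """Enumerate every terminal string of length <= max_len derivable from S."""
    results, frontier, seen = set(), ["S"], {"S"}
    while frontier:
        form = frontier.pop()
        if form.islower():                      # no nonterminals left
            results.add(form)
            continue
        for lhs, rhs in RULES:
            pos = form.find(lhs)
            while pos != -1:
                new = form[:pos] + rhs + form[pos + len(lhs):]
                # Non-contracting rules never shrink a form, so pruning is safe.
                if len(new) <= max_len and new not in seen:
                    seen.add(new)
                    frontier.append(new)
                pos = form.find(lhs, pos + 1)
    return sorted(results, key=len)

print(generate())   # ['abc', 'aabbcc', 'aaabbbccc']

Since monotone (non-contracting) grammars generate exactly the context-sensitive languages, this grammar shows the language is context-sensitive, while the pumping lemma for context-free languages shows it is not context-free.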

Proceedings ArticleDOI
21 May 2015
TL;DR: This paper investigates an alternative approach to inferring grammars via pattern languages and elementary formal system frameworks, summarizes inferability results for subclasses of both frameworks, and discusses how they map to the Chomsky hierarchy.
Abstract: Formal Language Theory for Security (LANGSEC) has proposed that formal language theory and grammars be used to define and secure protocols and parsers. The assumption is that by restricting languages to lower levels of the Chomsky hierarchy, it is easier to control and verify parser code. In this paper, we investigate an alternative approach to inferring grammars via pattern languages and elementary formal system frameworks. We summarize inferability results for subclasses of both frameworks and discuss how they map to the Chomsky hierarchy. Finally, we present initial results of pattern language learning on logged HTTP sessions and suggest future areas of research.

4 citations
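
As background on one of the two frameworks named in the abstract: in an Angluin-style non-erasing pattern language, a pattern is a string of constants and variables, and a word belongs to the pattern's language if the variables can be consistently replaced by non-empty strings. The backtracking membership test below is a generic illustration of that definition, not the inference method of the paper; the encoding and names are assumptions.

# Membership test for a non-erasing pattern language. A pattern is a list of
# ('const', symbol) and ('var', name) items; a word matches if every variable
# can be bound to a non-empty substring, consistently across its occurrences.
def matches(pattern, w, i=0, binding=None):
    """Return True if w[i:] can be generated by the remaining pattern items."""
    if binding is None:
        binding = {}
    if not pattern:
        return i == len(w)
    kind, value = pattern[0]
    if kind == 'const':
        if w.startswith(value, i):
            return matches(pattern[1:], w, i + len(value), binding)
        return False
    if value in binding:                      # variable already bound: reuse it
        sub = binding[value]
        if w.startswith(sub, i):
            return matches(pattern[1:], w, i + len(sub), binding)
        return False
    for j in range(i + 1, len(w) + 1):        # try every non-empty substitution
        binding[value] = w[i:j]
        if matches(pattern[1:], w, j, binding):
            return True
        del binding[value]
    return False

# Example: the pattern x a x (the same substring before and after the constant 'a').
p = [('var', 'x'), ('const', 'a'), ('var', 'x')]
print(matches(p, 'bcabc'))   # True:  x = 'bc'
print(matches(p, 'bcabd'))   # False: no consistent substitution exists

Membership for non-erasing pattern languages is NP-complete in general, so the backtracking above is exponential in the worst case; the inferability results summarized in the paper concern restricted subclasses of both frameworks.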

Book ChapterDOI
18 May 2017
TL;DR: The first experiments show that although composing large finite-state machines is extremely costly theoretically, the fact that linguistic resources in a typical NooJ cascade depend heavily on each other keeps the size of all intermediary machines manageable.
Abstract: NooJ is a linguistic development environment that allows linguists to construct large linguistic resources of the four types in the Chomsky hierarchy. NooJ uses a bottom-up, “cascade” approach to sequentially apply these linguistic resources: each parsing operation accesses a Text Annotation Structure and enriches it by adding or removing linguistic annotations. We discuss the drawbacks of this approach, and we present a new approach that requires that all NooJ linguistic resources be represented by a single type of finite-state machine. In order to do that, we must solve theoretical problems such as “how to handle Context-Sensitive Grammars with finite-state machines”, as well as some engineering problems such as “how to compose sets of large dictionaries and grammars into a single finite-state machine”. Our first experiments show that although composing large finite-state machines is extremely costly theoretically, the fact that linguistic resources in a typical NooJ cascade depend heavily on each other keeps the size of all intermediary machines manageable. Once the final resulting finite-state machine has been compiled and loaded in memory (e.g. on a webserver), it can be used to parse large texts in linear time.

4 citations
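
The engineering question of composing many resources into one finite-state machine rests, at its core, on the classical product construction for automata. The sketch below is generic textbook code, not NooJ's implementation (the DFA encoding and all names are assumptions): it intersects two small DFAs and only materializes reachable product states, which is the usual reason composed machines stay far smaller than the worst-case state product.

# Product construction: a DFA accepting exactly the strings accepted by both
# inputs. Each DFA is (start_state, accepting_states, transitions), where
# transitions maps (state, symbol) -> state.
def intersect(dfa1, dfa2):
    start1, acc1, delta1 = dfa1
    start2, acc2, delta2 = dfa2
    start = (start1, start2)
    transitions, accepting = {}, set()
    frontier, seen = [start], {start}
    while frontier:                    # explore only reachable product states
        q1, q2 = state = frontier.pop()
        if q1 in acc1 and q2 in acc2:
            accepting.add(state)
        for (p, symbol), r1 in delta1.items():
            if p != q1 or (q2, symbol) not in delta2:
                continue
            nxt = (r1, delta2[(q2, symbol)])
            transitions[(state, symbol)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return start, accepting, transitions

def accepts(dfa, word):
    """Run a DFA over the word; time is linear in the length of the input."""
    state, acc, delta = dfa
    for ch in word:
        if (state, ch) not in delta:
            return False
        state = delta[(state, ch)]
    return state in acc

# Example: strings over {a, b} with an even number of a's AND ending in b.
even_a = ('e', {'e'}, {('e','a'):'o', ('o','a'):'e', ('e','b'):'e', ('o','b'):'o'})
ends_b = ('n', {'y'}, {('n','a'):'n', ('n','b'):'y', ('y','a'):'n', ('y','b'):'y'})
both = intersect(even_a, ends_b)
print(accepts(both, 'abab'))   # True: two a's, ends in b
print(accepts(both, 'aba'))    # False: ends in a

A deterministic machine such as the composed result processes its input in time linear in the input length, which is consistent with the linear-time parsing claim at the end of the abstract.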


Network Information
Related Topics (5)
Rule-based machine translation: 8.8K papers, 240.5K citations, 72% related
Syntax: 16.7K papers, 518.6K citations, 71% related
Time complexity: 36K papers, 879.5K citations, 71% related
Type (model theory): 38.9K papers, 670.5K citations, 70% related
Semantics: 24.9K papers, 653K citations, 70% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    2
2022    3
2021    9
2020    8
2019    12
2018    10