Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have appeared on this topic, receiving 45,911 citations. The topic is also known as CSG.


Papers
Journal ArticleDOI
TL;DR: The closure properties of the language family generated by linear conjunctive grammars are investigated; the main result is its closure under complement, which, combined with the family's closure under union and intersection, implies that it is closed under all set-theoretic operations.
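For orientation, a short worked note (an addition here, not taken from the paper): assuming the family is already closed under union and intersection, as conjunctive grammar families are by construction, closure under complement yields the remaining Boolean operations via De Morgan-style identities.

```latex
% Set difference and symmetric difference expressed through union,
% intersection, and complement (the three closures assumed above):
\[
  L_1 \setminus L_2 = L_1 \cap \overline{L_2},
  \qquad
  L_1 \mathbin{\triangle} L_2 = (L_1 \setminus L_2) \cup (L_2 \setminus L_1).
\]
```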

27 citations

Journal ArticleDOI
TL;DR: A number of decision problems are resolved for these classes of grammars and the languages they generate, largely in the negative.
Abstract: Phrase-structure grammars were first introduced and studied by Chomsky as devices for generating the sentences of a language. By means of increasingly heavy restrictions on the productions (rewriting rules), four types of grammars were singled out by Chomsky: type 0 (unrestricted), type 1 (context-dependent), type 2 (context-free), and type 3 (finite state). In this paper, a number of decision problems are resolved for these classes of grammars and the languages they generate, largely in the negative. A table of decision problems for grammars of the four different types is presented. This table indicates the problems which have been found to be decidable or undecidable. The ambiguity problem for type 3 grammars and the emptiness and infiniteness problems for type 2 grammars are shown to be decidable. A known unsolvable problem, the Post correspondence problem, is the key to the undecidability proofs which are given. For type 2 grammars, the ambiguity and equivalence problems are proved undecidable; the emptiness and infiniteness problems for type 1 grammars are shown to be undecidable. There is no algorithm to decide whether a language of a given type can be generated by a grammar of a more restricted type. The results on type 2 grammars were first obtained by Bar-Hillel, Perles, and Shamir.
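To make one of the positive results concrete, here is a minimal sketch (not from the paper) of the standard marking algorithm behind the decidability of the emptiness problem for type 2 (context-free) grammars; the grammar encoding and names below are illustrative assumptions.

```python
def cfg_is_empty(productions, start):
    """Return True iff the grammar generates no terminal string from `start`.

    `productions` maps each nonterminal to a list of right-hand sides, where a
    right-hand side is a sequence of symbols; symbols that are not keys of the
    mapping are treated as terminals.
    """
    nonterminals = set(productions)
    productive = set()  # nonterminals known to derive at least one terminal string
    changed = True
    while changed:
        changed = False
        for lhs, rhss in productions.items():
            if lhs in productive:
                continue
            for rhs in rhss:
                # A rule applies once every right-hand-side symbol is a terminal
                # or an already-productive nonterminal.
                if all(sym not in nonterminals or sym in productive for sym in rhs):
                    productive.add(lhs)
                    changed = True
                    break
    return start not in productive


# Example: S -> a S b | a b generates a nonempty language, so the check says False.
grammar = {"S": [["a", "S", "b"], ["a", "b"]]}
print(cfg_is_empty(grammar, "S"))  # False
```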

27 citations

Patent
Mehryar Mohri
18 Sep 2007
TL;DR: The output rules are emitted in a specific format that specifies, for each rule, the left-hand non-terminal symbol, a single right-hand non-terminal symbol, and zero, one, or more terminal symbols.
Abstract: Context-free grammars generally comprise a large number of rules, where each rule defines how a string of symbols is generated from a different series of symbols. While techniques for creating finite-state automata from the rules of context-free grammars exist, these techniques require an input grammar to be strongly regular. Systems and methods that convert the rules of a context-free grammar into a strongly regular grammar include transforming each input rule into a set of output rules that approximate the input rule. The output rules are all right- or left-linear and are strongly regular. In various exemplary embodiments, the output rules are output in a specific format that specifies, for each rule, the left-hand non-terminal symbol, a single right-hand non-terminal symbol, and zero, one or more terminal symbols. If the input context-free grammar rule is weighted, the weight of that rule is distributed and assigned to the output rules.
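As a rough illustration (an assumption of this write-up, not the patent's own data structures), a right-linear output rule in the format described above could be represented like this:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class RightLinearRule:
    """One output rule: a left-hand non-terminal, zero or more terminal symbols,
    at most one right-hand non-terminal, and a weight distributed from the input
    CFG rule. Field names are illustrative, not the patent's notation."""
    lhs: str
    terminals: List[str] = field(default_factory=list)
    rhs_nonterminal: Optional[str] = None
    weight: float = 0.0

    def __str__(self) -> str:
        symbols = self.terminals + ([self.rhs_nonterminal] if self.rhs_nonterminal else [])
        return f"{self.lhs} -> {' '.join(symbols) or 'eps'} / {self.weight}"


# e.g. one right-linear rule approximating part of a weighted input rule
print(RightLinearRule("X", ["a", "b"], "Y", 0.5))  # X -> a b Y / 0.5
```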

27 citations

Journal ArticleDOI
TL;DR: This paper compares the generative power of colonies under two cooperation strategies and under several ways of selecting the alphabet of the common language.

27 citations

Journal ArticleDOI
TL;DR: The n-fold fuzzy grammars whose rules are of context-free form can be shown to generate context-sensitive languages by setting a threshold appropriately, whereas the fuzzy grammars of Lee and Zadeh with context-free rules cannot.
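A toy sketch of the threshold idea referenced above (the rules, degrees, and derivations below are illustrative assumptions, not the paper's construction): each rule carries a membership degree, a derivation's degree is the minimum over the rules it uses, and the threshold keeps only strings derivable with sufficiently high degree.

```python
rule_degree = {"S->aSb": 0.9, "S->ab": 0.6}  # hypothetical rule membership degrees


def derivation_degree(rules_used):
    """Degree of one derivation: the minimum degree among the rules it uses."""
    return min(rule_degree[r] for r in rules_used)


def in_threshold_language(derivations, t):
    """A string belongs to the threshold language if some derivation of it
    reaches degree at least t."""
    return any(derivation_degree(d) >= t for d in derivations)


# 'aabb' derived via S->aSb then S->ab has degree min(0.9, 0.6) = 0.6
print(in_threshold_language([["S->aSb", "S->ab"]], t=0.5))  # True
print(in_threshold_language([["S->aSb", "S->ab"]], t=0.7))  # False
```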

27 citations


Network Information
Related Topics (5)
- Graph (abstract data type): 69.9K papers, 1.2M citations (80% related)
- Time complexity: 36K papers, 879.5K citations (79% related)
- Concurrency: 13K papers, 347.1K citations (78% related)
- Model checking: 16.9K papers, 451.6K citations (77% related)
- Directed graph: 12.2K papers, 302.4K citations (77% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1