Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over the lifetime, 1938 publications have been published within this topic receiving 45911 citations. The topic is also known as: CSG.
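
As a concrete illustration of the topic (not drawn from any of the papers below), the classic noncontracting grammar for { a^n b^n c^n : n >= 1 } is the standard textbook example of a context-sensitive language; noncontracting (monotone) grammars are equivalent in generative power to context-sensitive grammars. The Python sketch below lists its productions and checks small strings by a brute-force derivation search.

from collections import deque

# Productions of a noncontracting grammar for { a^n b^n c^n : n >= 1 }.
RULES = [
    ("S", "aSBC"),
    ("S", "aBC"),
    ("CB", "BC"),
    ("aB", "ab"),
    ("bB", "bb"),
    ("bC", "bc"),
    ("cC", "cc"),
]

def derives(target, start="S"):
    # Breadth-first search over sentential forms; since no rule shrinks the
    # string, forms longer than the target can be pruned.
    seen = {start}
    queue = deque([start])
    while queue:
        form = queue.popleft()
        if form == target:
            return True
        for lhs, rhs in RULES:
            i = form.find(lhs)
            while i != -1:
                new = form[:i] + rhs + form[i + len(lhs):]
                if len(new) <= len(target) and new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(lhs, i + 1)
    return False

print(derives("abc"))      # True
print(derives("aabbcc"))   # True
print(derives("aabbc"))    # False: not of the form a^n b^n c^n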


Papers
Book Chapter
05 Mar 1990
TL;DR: An elementary introduction to the notion of an NLC graph grammar is given, and several of its extensions and variations are discussed in a systematic way.
Abstract: An elementary introduction to the notion of an NLC graph grammar is given, and several of its extensions and variations are discussed in a systematic way. Simple concepts are considered rather than technical details.

58 citations
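
The chapter above is introductory, but the core NLC rewriting step can be illustrated directly. The sketch below is a minimal Python rendering, assuming the standard NLC (node-label-controlled) definition: a production replaces a labelled node by a daughter graph, and a grammar-wide connection relation over node labels decides how the daughter is re-attached to the former neighbours of the replaced node. It is not code from the chapter, and all names in it are hypothetical.

def nlc_step(graph, labels, node, daughter, daughter_labels, connection):
    # graph: undirected adjacency {node: set(neighbours)}, labels: {node: label}.
    # daughter: symmetric adjacency of the daughter graph (fresh node names).
    # connection: set of (neighbour_label, daughter_label) pairs.
    neighbours = graph.pop(node)   # remove the mother node ...
    del labels[node]
    for m in neighbours:           # ... and every edge touching it
        graph[m].discard(node)
    for d, adj in daughter.items():        # insert the daughter graph
        graph[d] = set(adj)
        labels[d] = daughter_labels[d]
    for m in neighbours:                   # embedding: reconnect by labels only
        for d in daughter:
            if (labels[m], daughter_labels[d]) in connection:
                graph[m].add(d)
                graph[d].add(m)
    return graph, labels

# Hypothetical example: replace node 0 (label "X") by nodes u ("a") and v ("b");
# only "a"-labelled daughter nodes reconnect to former neighbours labelled "n".
g = {0: {1}, 1: {0}}
lab = {0: "X", 1: "n"}
g, lab = nlc_step(g, lab, 0,
                  daughter={"u": {"v"}, "v": {"u"}},
                  daughter_labels={"u": "a", "v": "b"},
                  connection={("n", "a")})
print(g)   # {1: {'u'}, 'u': {1, 'v'}, 'v': {'u'}}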

Proceedings Article
22 Aug 1988
TL;DR: Two algorithms which construct two different types of generators for lexical functional grammars (LFGs) are described: the first generates sentences from functional structures, the second from semantic structures.
Abstract: This paper describes two algorithms which construct two different types of generators for lexical functional grammars (LFGs). The first type generates sentences from functional structures and the second from semantic structures. The latter works on the basis of extended LFGs, which contain a mapping from f-structures into semantic structures. Both algorithms can be used on all grammars within the respective class of LFG-grammars. Thus sentences can be generated from input structures by means of LFG-grammars, and the same grammar formalism, although not necessarily the same grammar, can be used for both analysis and synthesis.

57 citations
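
The abstract above does not spell the algorithms out, so the following is only a toy illustration of the input/output relation involved in generating from a functional structure: a hypothetical flat f-structure is linearised with a fixed subject-verb-object template and a small lexicon. It stands in for neither of the paper's two generators.

def generate(fstruct, lexicon):
    # Linearise a flat f-structure {PRED, TENSE, SUBJ, OBJ} as subject-verb-object.
    subj = lexicon[fstruct["SUBJ"]["PRED"]]
    verb = lexicon[(fstruct["PRED"], fstruct.get("TENSE", "PRES"))]
    parts = [subj, verb]
    if "OBJ" in fstruct:
        parts.append(lexicon[fstruct["OBJ"]["PRED"]])
    return " ".join(parts)

lexicon = {
    "John": "John",
    "book": "the book",
    ("read", "PAST"): "read",
}
fs = {"PRED": "read", "TENSE": "PAST",
      "SUBJ": {"PRED": "John"},
      "OBJ": {"PRED": "book"}}
print(generate(fs, lexicon))   # John read the book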

Journal Article
TL;DR: This paper describes a logic grammar formalism, modifier structure grammars (MSGs), together with an interpreter written in Prolog, which can handle coordination (and other natural language constructions) in a reasonable and general way.
Abstract: Logic grammars are grammars expressible in predicate logic. Implemented in the programming language Prolog, logic grammar systems have proved to be a good basis for natural language processing. One of the most difficult constructions for natural language grammars to treat is coordination (construction with conjunctions like 'and'). This paper describes a logic grammar formalism, modifier structure grammars (MSGs), together with an interpreter written in Prolog, which can handle coordination (and other natural language constructions) in a reasonable and general way. The system produces both syntactic analyses and logical forms, and problems of scoping for coordination and quantifiers are dealt with. The MSG formalism seems of interest in its own right (perhaps even outside natural language processing) because the notions of syntactic structure and semantic interpretation are more constrained than in many previous systems (made more implicit in the formalism itself), so that less burden is put on the grammar writer.

57 citations
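
To make the coordination and scoping issue concrete, here is a small illustration (not the MSG interpreter itself) of why conjunctions complicate logical forms: a conjoined subject has to distribute over its predicate, so "John and Mary sleep" should yield and(sleep(John), sleep(Mary)) rather than a predicate applied to a conjoined individual.

def logical_form(subject, predicate):
    # subject is either an individual constant or a tuple ("and", left, right);
    # coordination distributes the predicate over both conjuncts.
    if isinstance(subject, tuple) and subject[0] == "and":
        _, left, right = subject
        return ("and",
                logical_form(left, predicate),
                logical_form(right, predicate))
    return (predicate, subject)

print(logical_form(("and", "John", "Mary"), "sleep"))
# ('and', ('sleep', 'John'), ('sleep', 'Mary'))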

Patent
10 Jul 1989
TL;DR: In this paper, a compiler for register vector grammars is presented; it accepts standard phrase structure rules and generates strongly equivalent grammars in register vector grammar form, which can then be used to parse strings and construct trees.
Abstract: A context-free parsing algorithm employing register vector grammars provides fast parsing of natural languages. A compiler for register vector grammars accepts input grammars as standard phrase structure rules and generates strongly equivalent grammars in register vector grammar form. By applying the context-free register vector grammar parsing algorithm to the resulting grammars, strings may be parsed and trees may be constructed in the same manner as with phrase structure grammars.

57 citations
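
The patent text above gives no data structures, so the sketch below is an assumption-laden illustration of the register-vector idea: each rule carries a condition over a fixed-length vector of boolean registers and a result that updates some of them, and a rule may fire only when the current vector satisfies its condition. The two-register example is entirely hypothetical.

def matches(registers, condition):
    # None in a condition means "don't care".
    return all(c is None or r == c for r, c in zip(registers, condition))

def apply_result(registers, result):
    # None in a result means "leave the register unchanged".
    return [r if x is None else x for r, x in zip(registers, result)]

# Hypothetical two-register grammar fragment:
# register 0 = "subject seen", register 1 = "verb seen".
rules = {
    "NP": ((None, False), (True, None)),   # an NP before the verb marks the subject as seen
    "V":  ((True, False), (None, True)),   # the verb requires a subject and marks itself as seen
}

registers = [False, False]
for category in ["NP", "V"]:               # scan the categories of "John sleeps"
    condition, result = rules[category]
    assert matches(registers, condition), category
    registers = apply_result(registers, result)
print(registers)   # [True, True]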

Journal Article
TL;DR: The class of output languages of 1V (or L) attribute grammars is the image of the class of IO macro tree languages under all deterministic top-down tree transductions.
Abstract: An attribute grammar is one-visit if the attributes can be evaluated by walking through the derivation tree in such a way that each subtree is visited at most once. One-visit (1V) attribute grammars are compared with one-pass left-to-right (L) attribute grammars and with attribute grammars having only one synthesized attribute (1S). Every 1S attribute grammar can be made one-visit. One-visit attribute grammars are simply permutations of L attribute grammars; thus the classes of output sets of 1V and L attribute grammars coincide, and similarly for 1S and L-1S attribute grammars. In case all attribute values are trees, the translation realized by a 1V attribute grammar is the composition of the translation realized by a 1S attribute grammar with a deterministic top-down tree transduction, and vice versa; thus, using a result of Duske et al., the class of output languages of 1V (or L) attribute grammars is the image of the class of IO macro tree languages under all deterministic top-down tree transductions.

57 citations
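
The claim that every 1S attribute grammar can be made one-visit has a simple intuition: with a single synthesized attribute, a node's attribute depends only on its children's attributes, so one post-order walk of the derivation tree suffices. The toy Python sketch below (not from the paper) evaluates such a grammar, synthesising the value of an arithmetic expression.

def evaluate(node):
    # node is ("num", n) or (op, left, right) with op in {"+", "*"};
    # the single synthesized attribute is the numeric value.
    tag = node[0]
    if tag == "num":
        return node[1]
    left = evaluate(node[1])    # the only visit of the left subtree
    right = evaluate(node[2])   # the only visit of the right subtree
    return left + right if tag == "+" else left * right

tree = ("+", ("num", 2), ("*", ("num", 3), ("num", 4)))
print(evaluate(tree))   # 14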


Network Information
Related Topics (5)
Topic | Papers | Citations | Related
Graph (abstract data type) | 69.9K | 1.2M | 80%
Time complexity | 36K | 879.5K | 79%
Concurrency | 13K | 347.1K | 78%
Model checking | 16.9K | 451.6K | 77%
Directed graph | 12.2K | 302.4K | 77%
Performance
Metrics
No. of papers in the topic in previous years
Year | Papers
2023 | 11
2022 | 12
2021 | 1
2020 | 4
2019 | 1
2018 | 1