Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have been published within this topic, receiving 45,911 citations. The topic is also known as: CSG.


Papers
01 Jan 1994
TL;DR: Dynamic grammars are shown to have the formal power of Turing machines; an experimental system that implements a non-ambiguous dynamic parser is sketched, and applications of this system to the resolution of some semantic analysis problems are shown.
Abstract: We define a dynamic grammar as a device which may generate an unbounded set of context-free grammars; each grammar is produced, while parsing a source text, by the recognition of some construct. It is shown that dynamic grammars have the formal power of Turing machines. For a given source text, a dynamic grammar, when non-ambiguous, may be seen as a sequence of usual context-free grammars specialized by this source text: an initial grammar is modified, little by little, while the program is parsed, and is used to continue the parsing process. An experimental system which implements a non-ambiguous dynamic parser is sketched, and applications of this system to the resolution of some semantic analysis problems are shown. Some of these examples are non-trivial (overloading resolution, derived types, polymorphism, ...) and indicate that this method may partly compete with other well-known techniques used in type-checking.

14 citations
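The core idea in the abstract above, a grammar that is extended while the source text is parsed, can be illustrated with a minimal Python sketch. The toy `decl`/`use` language and the representation of the growing grammar are assumptions for illustration, not the paper's system.

```python
# A toy illustration of the dynamic-grammar idea (not the paper's system):
# parsing "decl <name>" adds the production Id -> <name> to the grammar,
# so a later "use <name>" only parses if <name> was declared earlier.

def parse(tokens):
    grammar = {"Id": set()}   # the part of the grammar that grows during parsing
    pos = 0
    while pos < len(tokens):
        if tokens[pos] == "decl" and pos + 1 < len(tokens):
            grammar["Id"].add(tokens[pos + 1])   # grammar modified mid-parse
            pos += 2
        elif tokens[pos] == "use" and pos + 1 < len(tokens):
            if tokens[pos + 1] not in grammar["Id"]:
                return False                     # Id has no such production (yet)
            pos += 2
        else:
            return False
    return True

print(parse("decl x use x".split()))   # True
print(parse("use y decl y".split()))   # False: y used before its declaration
```

This is the sense in which the initial grammar is "specialized by the source text": later parsing decisions depend on productions introduced by earlier constructs.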

01 Jan 2008
TL;DR: Some results on the power of tree controlled grammars are presented for the case where the control languages are restricted to known subclasses of the family of regular languages.
Abstract: Tree controlled grammars are context-free grammars where the associated language only contains those terminal words which have a derivation where the word of any level of the corresponding derivation tree belongs to a given regular language. We present some results on the power of such grammars where we restrict the regular languages to some known subclasses of the family of regular languages.

14 citations
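The control condition described above can be made concrete with a small sketch. The tree representation and the use of a regular expression for the control language are assumptions for illustration, not the paper's notation.

```python
import re

# Sketch of the tree control condition: a derivation tree is (label, children),
# and the word formed by the labels of every level, read left to right,
# must belong to a given regular language.

def level_words(tree):
    """Yield the left-to-right label string of each level of the derivation tree."""
    frontier = [tree]
    while frontier:
        yield "".join(label for label, _ in frontier)
        frontier = [child for _, children in frontier for child in children]

def control_holds(tree, regex):
    return all(re.fullmatch(regex, word) for word in level_words(tree))

# Derivation tree for S => AB => aB => ab, controlled by the regular language [SABab]*
tree = ("S", [("A", [("a", [])]), ("B", [("b", [])])])
print(control_holds(tree, r"[SABab]*"))   # True
```

Restricting the control language to a smaller subclass of the regular languages simply narrows what `regex` may express, which is what the paper studies.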

Journal ArticleDOI
TL;DR: An example is given of a minimal linear language all of whose minimal linear grammars are ambiguous; this language is not ambiguous in the class of linear context-free languages.
Abstract: We give an example of a minimal linear language, all of whose minimal linear grammars are ambiguous; this language is not ambiguous in the class of linear context-free languages.

14 citations

Proceedings ArticleDOI
13 Oct 1971
TL;DR: Necessary and sufficient conditions for a grammar to be LR-regular are derived and then utilized for developing parser generation techniques for arbitrary grammars.
Abstract: LR-regular grammars are defined similarly to Knuth's LR(k) grammars, with the following exception. Arbitrarily long look-ahead is allowed before making a parsing decision during the bottom-up syntactical analysis; however, this look-ahead is restricted in that the essential "look-ahead information" can be represented by a finite number of regular sets, and thus can be computed by a finite-state machine. LR-regular grammars can be parsed deterministically in linear time by a rather simple two-scan algorithm. Efficient parsers are constructed for given LR-regular grammars. The family of LR-regular languages is studied; it properly includes the family of deterministic CF languages and has similar properties. Necessary and sufficient conditions for a grammar to be LR-regular are derived and then utilized for developing parser generation techniques for arbitrary grammars.

14 citations
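The two-scan idea can be sketched on a toy grammar where the reduction choice depends on a symbol arbitrarily far ahead, yet the needed look-ahead information falls into finitely many regular classes. The grammar and the code below are illustrative assumptions, not the paper's construction.

```python
# Toy grammar (illustrative):
#   S -> A c | B d     A -> a A | a     B -> a B | a
# Whether an 'a' belongs to A or B depends on the last input symbol, but the
# required look-ahead information is regular: "rest of input in a*c" vs "in a*d".

def classify_suffixes(tokens):
    """First scan, right to left: label each position with the regular class of
    the remaining input, computed by a finite-state pass."""
    classes = ["empty"] * (len(tokens) + 1)
    state = "empty"                       # states: empty, a*c, a*d, reject
    for i in range(len(tokens) - 1, -1, -1):
        t = tokens[i]
        if t == "c" and state == "empty":
            state = "a*c"
        elif t == "d" and state == "empty":
            state = "a*d"
        elif t == "a" and state in ("a*c", "a*d"):
            pass                           # leading a's do not change the class
        else:
            state = "reject"
        classes[i] = state
    return classes

def parse(tokens):
    """Second scan, left to right: parsing decisions consult the precomputed classes."""
    classes = classify_suffixes(tokens)
    for i, t in enumerate(tokens):
        if t == "a":
            side = "A" if classes[i] == "a*c" else "B"
            print(f"position {i}: 'a' will reduce via {side}")
    return classes[0] in ("a*c", "a*d")

print(parse(list("aaad")))   # every 'a' reduces via B; prints True
```

The first scan plays the role of the finite-state look-ahead machine; the second scan is the deterministic bottom-up parse, so the whole procedure stays linear-time.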

Journal ArticleDOI
Mark Johnson
TL;DR: This paper focuses on two widely used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem and the parsing problem.
Abstract: A grammar is a formal device which both identifies a certain set of utterances as well-formed and defines a transduction relation between these utterances and their linguistic representations. This paper focuses on two widely used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem (the problem of determining if an utterance is in the language generated by a grammar) and the parsing problem (the problem of finding the analyses assigned by a grammar to an utterance). Although both approaches are 'constraint-based', one of them is based on a logical consequence relation and the other is based on satisfiability. The main goal of this paper is to point out the different conceptual bases of these two ways of formalizing grammars and to discuss some of their properties.

14 citations
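The difference-list view behind Definite Clause Grammars, and the recognition problem it gives rise to, can be sketched with a small combinator encoding. The toy lexicon and rules below are assumptions for illustration only and are not taken from the paper.

```python
# Sketch of the DCG view of recognition: each nonterminal denotes a relation between
# a word sequence and its remainder (a "difference list"); an utterance is recognized
# iff the start symbol can consume the whole input, leaving an empty remainder.

def word(w):
    def parse(rest):
        if rest and rest[0] == w:
            yield rest[1:]
    return parse

def seq(*parsers):
    def parse(rest):
        if not parsers:
            yield rest
        else:
            head, tail = parsers[0], parsers[1:]
            for mid in head(rest):
                yield from seq(*tail)(mid)
    return parse

def alt(*parsers):
    def parse(rest):
        for p in parsers:
            yield from p(rest)
    return parse

# s --> np, vp.   np --> [the], [cat] ; [the], [dog].   vp --> [sleeps].
np = alt(seq(word("the"), word("cat")), seq(word("the"), word("dog")))
vp = word("sleeps")
s = seq(np, vp)

def recognises(utterance):
    return any(rest == () for rest in s(tuple(utterance)))

print(recognises(("the", "cat", "sleeps")))   # True
print(recognises(("cat", "the", "sleeps")))   # False
```

Parsing, as opposed to recognition, would additionally return the analyses (here, the derivation structure) rather than only the yes/no answer.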


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1