Topic

Context-sensitive grammar

About: A context-sensitive grammar, also known as a CSG, is a formal grammar whose productions may rewrite a symbol only within a specified context of surrounding symbols; the languages such grammars generate are the context-sensitive languages. As a research topic it spans 1,938 publications, which have received 45,911 citations over the topic's lifetime.
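A standard textbook illustration of the expressive power involved: the language { aⁿbⁿcⁿ : n ≥ 1 } is context-sensitive but not context-free, and it is generated, for instance, by the following noncontracting grammar (one of several equivalent formulations):

S → aSBC | aBC
CB → BC
aB → ab
bB → bb
bC → bc
cC → cc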


Papers
Book Chapter
01 Jan 1996
TL;DR: A language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar is described, together with an EM training method for estimating the probabilities, and a procedure for learning some simple lexicalized grammar structures.
Abstract: We describe a language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar, together with (1) an EM training method for estimating the probabilities, and (2) a procedure for learning some simple lexicalized grammar structures. The model in its simplest form is a generalization of n-gram models, but in its general form possesses context-free expressiveness. Unlike the original experiments on probabilistic link grammars, we assume that no hand-coded grammar is initially available (as with n-gram models). We employ untyped links to concentrate the learning on lexical dependencies, and our formulation uses the lexical identities of heads to influence the structure of the parse graph. After learning, the language model consists of grammatical rules in the form of a set of simple disjuncts for each word, plus several sets of probability parameters. The formulation extends cleanly toward learning more powerful context-free grammars. Several issues relating to generalization bias, linguistic constraints, and parameter smoothing are considered. Preliminary experimental results on small artificial corpora are supportive of our approach.

10 citations
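To make the estimation idea above concrete, here is a minimal Python sketch, assuming a toy corpus and illustrative names (link_prob, em_step); it is not the authors' implementation. Each word's attachment is fractionally distributed over the other words in its sentence (E-step) and the resulting counts are renormalized per head word (M-step), the kind of expectation/maximization loop the abstract describes for untyped lexical links.

# Minimal EM-style sketch for untyped lexical link probabilities.
# Assumption: an illustration only, not the paper's model or code.
from collections import defaultdict
from itertools import product

corpus = [
    "the dog barks".split(),
    "the cat sleeps".split(),
    "the dog sleeps".split(),
]

vocab = {w for sent in corpus for w in sent}
# Start from uniform (unnormalized) link weights over distinct word pairs.
link_prob = {(h, d): 1.0 for h, d in product(vocab, vocab) if h != d}

def em_step(corpus, link_prob):
    # E-step: spread each word's attachment fractionally over candidate heads.
    counts = defaultdict(float)
    for sent in corpus:
        for i, dep in enumerate(sent):
            heads = [w for j, w in enumerate(sent) if j != i and w != dep]
            z = sum(link_prob[(h, dep)] for h in heads)
            for h in heads:
                counts[(h, dep)] += link_prob[(h, dep)] / z
    # M-step: renormalize counts per head word.
    totals = defaultdict(float)
    for (h, d), c in counts.items():
        totals[h] += c
    return {(h, d): c / totals[h] for (h, d), c in counts.items()}

for _ in range(10):
    link_prob = em_step(corpus, link_prob)

print(sorted(link_prob.items(), key=lambda kv: -kv[1])[:5])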

Journal Article
TL;DR: An algorithm is presented which, given an arbitrary λ-free context-free grammar, produces an equivalent context-free grammar in 2 Greibach normal form; the upper bound on the size of the resulting grammar is no larger than the bounds known for other algorithms that convert context-free grammars into equivalent context-free grammars in Greibach normal form.
Abstract: We present an algorithm which, given an arbitrary λ-free context-free grammar, produces an equivalent context-free grammar in 2 Greibach normal form. An upper bound on the size of the resulting grammar in terms of the size of the initially given grammar is given. Our algorithm consists of an elementary construction, while the upper bound on the size of the resulting grammar is no larger than the bounds known for other algorithms for converting context-free grammars into equivalent context-free grammars in Greibach normal form.

10 citations
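As a small companion to the abstract above, the Python sketch below only checks whether a grammar is already in 2 Greibach normal form (every right-hand side is a single terminal followed by at most two nonterminals); it does not reproduce the paper's conversion algorithm, and the dictionary-of-productions representation is an assumption for illustration.

# Assumption: grammar maps each nonterminal to a list of right-hand sides,
# where each right-hand side is a list of symbols.
def is_2_gnf(grammar, nonterminals):
    for lhs, rhss in grammar.items():
        for rhs in rhss:
            if not rhs or rhs[0] in nonterminals:
                return False            # must start with a terminal
            if len(rhs) > 3:
                return False            # at most two trailing nonterminals
            if any(sym not in nonterminals for sym in rhs[1:]):
                return False            # everything after the terminal is a nonterminal
    return True

# Example: S -> aSB | b and B -> b is already in 2 Greibach normal form.
g = {"S": [["a", "S", "B"], ["b"]], "B": [["b"]]}
print(is_2_gnf(g, nonterminals={"S", "B"}))   # True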

Proceedings Article
23 Aug 2004
TL;DR: It is shown that GIDLP grammars avoid the explosion in the number of rules required under a traditional phrase-structure analysis of free constituent order, and as a result support more modular and compact grammar encodings and require fewer edges in parsing.
Abstract: Linearization-based HPSG theories are widely used for analyzing languages with relatively free constituent order. This paper introduces the Generalized ID/LP (GIDLP) grammar format, which supports a direct encoding of such theories, and discusses key aspects of a parser that makes use of the dominance, precedence, and linearization domain information explicitly encoded in this grammar format. We show that GIDLP grammars avoid the explosion in the number of rules required under a traditional phrase structure analysis of free constituent order. As a result, GIDLP grammars support more modular and compact grammar encodings and require fewer edges in parsing.

10 citations
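To see why separating dominance from precedence avoids the rule explosion mentioned above, here is a hedged toy sketch (not the GIDLP format or parser): one immediate-dominance rule licenses an unordered daughter set, and linear-precedence constraints filter the admissible surface orders, so a single rule covers what would otherwise require one phrase-structure rule per permutation. The categories and names below are illustrative assumptions.

# Toy ID/LP illustration; rule, constraint, and function names are assumptions.
from itertools import permutations

# Immediate dominance: S dominates {NP, VP, ADV}, with no inherent order.
id_rule = ("S", {"NP", "VP", "ADV"})

# Linear precedence: NP must precede VP.
lp_constraints = [("NP", "VP")]

def satisfies_lp(order, lp_constraints):
    # True if every (A, B) constraint places A before B in the order.
    pos = {cat: i for i, cat in enumerate(order)}
    return all(pos[a] < pos[b] for a, b in lp_constraints)

licensed = [order for order in permutations(id_rule[1])
            if satisfies_lp(order, lp_constraints)]
print(licensed)   # the three orders in which NP precedes VP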

Proceedings Article
01 Aug 2014
TL;DR: This work formalizes and generalizes some existing mechanisms for dealing with discontinuous phrase structures and non-projective dependency structures, and introduces the concept of hybrid grammars, which are extensions of synchronous grammars obtained by coupling of lexical elements.
Abstract: We introduce the concept of hybrid grammars, which are extensions of synchronous grammars, obtained by coupling of lexical elements. One part of a hybrid grammar generates linear structures, another generates hierarchical structures, and together they generate discontinuous structures. This formalizes and generalizes some existing mechanisms for dealing with discontinuous phrase structures and non-projective dependency structures. Moreover, it allows us to separate the degree of discontinuity from the time complexity of parsing.

10 citations
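On the dependency side, the discontinuity the abstract refers to corresponds to non-projective (crossing) arcs. The Python sketch below is a simplified projectivity check over a head array, not part of the hybrid-grammar formalism; the toy trees and the decision to ignore an artificial root arc are assumptions.

def is_projective(heads):
    # heads[i] is the index of the head of token i, or -1 for the root.
    # Simplified check: a structure counts as projective here if no two
    # head-dependent arcs cross (the artificial root arc is ignored).
    arcs = [(min(i, h), max(i, h)) for i, h in enumerate(heads) if h >= 0]
    for a, b in arcs:
        for c, d in arcs:
            if a < c < b < d:     # arcs (a, b) and (c, d) cross
                return False
    return True

print(is_projective([1, -1, 1]))       # True: a contiguous, projective tree
print(is_projective([2, 3, -1, 2]))    # False: arcs (0, 2) and (1, 3) cross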


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  11
2022  12
2021  1
2020  4
2019  1
2018  1