Topic
Context-sensitive grammar
About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have appeared within this topic, receiving 45,911 citations. The topic is also known as: CSG.
Papers published on a yearly basis
Papers
01 Jan 1996
TL;DR: A language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar is described, together with an EM training method for estimating the probabilities, and a procedure for learning some simple lexicalized grammar structures.
Abstract: We describe a language model employing a new headed-disjuncts formulation of Lafferty et al.'s (1992) probabilistic link grammar, together with (1) an EM training method for estimating the probabilities, and (2) a procedure for learning some simple lexicalized grammar structures. The model in its simplest form is a generalization of n-gram models, but in its general form possesses context-free expressiveness. Unlike the original experiments on probabilistic link grammars, we assume that no hand-coded grammar is initially available (as with n-gram models). We employ untyped links to concentrate the learning on lexical dependencies, and our formulation uses the lexical identities of heads to influence the structure of the parse graph. After learning, the language model consists of grammatical rules in the form of a set of simple disjuncts for each word, plus several sets of probability parameters. The formulation extends cleanly toward learning more powerful context-free grammars. Several issues relating to generalization bias, linguistic constraints, and parameter smoothing are considered. Preliminary experimental results on small artificial corpora are supportive of our approach.
10 citations
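In its simplest form, the model above is a generalization of n-gram models. As a point of reference only, here is a minimal maximum-likelihood bigram estimator; it is not the paper's headed-disjuncts model (whose disjunct probabilities are learned with EM), and the `bigram_mle` name and toy corpus are illustrative assumptions:

```python
from collections import Counter

def bigram_mle(corpus):
    """Estimate P(w2 | w1) by relative frequency over a list of tokens."""
    pairs = Counter(zip(corpus, corpus[1:]))   # counts of adjacent word pairs
    unigrams = Counter(corpus[:-1])            # counts of left-context words
    return {(w1, w2): c / unigrams[w1] for (w1, w2), c in pairs.items()}

tokens = "the cat sat on the mat".split()
probs = bigram_mle(tokens)
print(probs[("the", "cat")])  # 0.5: "the" occurs twice, followed by "cat" once
```

The paper's full model replaces these directly observable pair counts with latent link-grammar structure, which is why EM is needed there.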
TL;DR: An algorithm is presented which, given an arbitrary λ-free context-free grammar, produces an equivalent context-free grammar in 2-Greibach normal form; the upper bound on the size of the resulting grammar is no bigger than the bounds known for other algorithms for converting context-free grammars into equivalent context-free grammars in Greibach normal form.
Abstract: We present an algorithm which, given an arbitrary λ-free context-free grammar, produces an equivalent context-free grammar in 2-Greibach normal form. The upper bound on the size of the resulting grammar in terms of the size of the initially given grammar is given. Our algorithm consists of an elementary construction, while the upper bound on the size of the resulting grammar is not bigger than the bounds known for other algorithms for converting context-free grammars into equivalent context-free grammars in Greibach normal form.
10 citations
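A hedged sketch of the target form: a grammar is in Greibach normal form when every production rewrites to a single terminal followed by zero or more nonterminals (2-Greibach normal form further constrains the tail, which is not checked here). The `is_gnf` helper and the toy grammars are illustrative assumptions, not the paper's conversion algorithm:

```python
def is_gnf(grammar, nonterminals):
    """True iff every production is one terminal followed by zero or more nonterminals."""
    for rhss in grammar.values():
        for rhs in rhss:
            if not rhs or rhs[0] in nonterminals:
                return False              # empty, or starts with a nonterminal
            if any(sym not in nonterminals for sym in rhs[1:]):
                return False              # a terminal appears after the first symbol
    return True

# Toy grammar in GNF: S -> a S B | b,  B -> b
print(is_gnf({"S": [["a", "S", "B"], ["b"]], "B": [["b"]]}, {"S", "B"}))  # True
# Left-recursive production S -> S a starts with a nonterminal
print(is_gnf({"S": [["S", "a"]]}, {"S"}))  # False
```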
23 Aug 2004
TL;DR: It is shown that GIDLP grammars avoid the explosion in the number of rules required under a traditional phrase structure analysis of free constituent order, and support more modular and compact grammar encodings and require fewer edges in parsing.
Abstract: Linearization-based HPSG theories are widely used for analyzing languages with relatively free constituent order. This paper introduces the Generalized ID/LP (GIDLP) grammar format, which supports a direct encoding of such theories, and discusses key aspects of a parser that makes use of the dominance, precedence, and linearization domain information explicitly encoded in this grammar format. We show that GIDLP grammars avoid the explosion in the number of rules required under a traditional phrase structure analysis of free constituent order. As a result, GIDLP grammars support more modular and compact grammar encodings and require fewer edges in parsing.
10 citations
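The GIDLP format builds on the ID/LP idea of separating immediate dominance (which daughters a rule licenses) from linear precedence (how they may be ordered). As an illustrative sketch only, not the paper's parser, the hypothetical `licensed_orders` helper below enumerates the daughter orderings that satisfy a set of LP constraints:

```python
from itertools import permutations

def licensed_orders(daughters, lp_constraints):
    """All orderings of the daughters in which every LP pair (a, b) has a before b."""
    def ok(order):
        pos = {d: i for i, d in enumerate(order)}
        return all(pos[a] < pos[b] for a, b in lp_constraints
                   if a in pos and b in pos)
    return [order for order in permutations(daughters) if ok(order)]

# Hypothetical rule S -> {V, NP, AdvP} with a single LP constraint: NP precedes V
orders = licensed_orders(("V", "NP", "AdvP"), [("NP", "V")])
print(len(orders))  # 3 of the 6 permutations satisfy NP < V
```

Listing one phrase-structure rule per permutation would need three rules here; with free order over n daughters it grows factorially, which is the rule explosion the paper's format avoids.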
01 Aug 2014
TL;DR: This work formalizes and generalizes some existing mechanisms for dealing with discontinuous phrase structures and non-projective dependency structures and introduces the concept of hybrid grammars, which are extensions of synchronous grammars, obtained by coupling of lexical elements.
Abstract: We introduce the concept of hybrid grammars, which are extensions of synchronous grammars, obtained by coupling of lexical elements. One part of a hybrid grammar generates linear structures, another generates hierarchical structures, and together they generate discontinuous structures. This formalizes and generalizes some existing mechanisms for dealing with discontinuous phrase structures and non-projective dependency structures. Moreover, it allows us to separate the degree of discontinuity from the time complexity of parsing.
10 citations
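One phenomenon hybrid grammars target is non-projectivity in dependency structures. As a rough illustration, not part of the paper, the following checks the standard interval condition for projectivity: an edge is violated when a word strictly inside its span has its head outside that span. Here `heads[i]` is an assumed encoding giving the head position of word `i`, with `-1` for the root:

```python
def is_projective(heads):
    """True iff no dependency edge covers a word whose head lies outside the edge's span."""
    for dep, head in enumerate(heads):
        if head < 0:                      # skip the artificial root attachment
            continue
        lo, hi = sorted((dep, head))
        for other in range(lo + 1, hi):   # words strictly inside the edge's span
            if not (lo <= heads[other] <= hi):
                return False
    return True

print(is_projective([-1, 0, 3, 0]))  # True: no edges cross
print(is_projective([-1, 3, 0, 0]))  # False: edge 1-3 crosses edge 0-2
```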