
Context-sensitive grammar

About: Context-sensitive grammar (also known as CSG) is a research topic. Over its lifetime, 1,938 publications on this topic have received 45,911 citations.
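
To make the formalism concrete, here is a minimal sketch (my own, not drawn from any paper below) that decides membership for a small noncontracting (monotone) grammar generating { a^n b^n c^n : n >= 1 }; monotone grammars are weakly equivalent to context-sensitive grammars, and the noncontracting property is what lets the exhaustive search below terminate.

```python
# A minimal sketch: membership testing for a small noncontracting (monotone)
# grammar, a formalism weakly equivalent to context-sensitive grammars.
# The grammar generates { a^n b^n c^n : n >= 1 }.
RULES = [
    ("S", "aSBC"), ("S", "abC"),                 # lay down a's with pending B, C
    ("CB", "BC"),                                # move B's left of C's
    ("bB", "bb"), ("bC", "bc"), ("cC", "cc"),    # realize B, C as b, c
]

def derives(target: str) -> bool:
    # No rule shortens the sentential form, so anything longer than the
    # target can be pruned; this keeps the search space finite.
    seen, frontier = {"S"}, ["S"]
    while frontier:
        form = frontier.pop()
        if form == target:
            return True
        for lhs, rhs in RULES:
            start = form.find(lhs)
            while start != -1:
                nxt = form[:start] + rhs + form[start + len(lhs):]
                if len(nxt) <= len(target) and nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
                start = form.find(lhs, start + 1)
    return False

print(derives("aabbcc"))    # True
print(derives("aabbcccc"))  # False: not of the form a^n b^n c^n
```

The length bound is exactly what makes membership for context-sensitive languages decidable, although in general the problem is PSPACE-complete rather than efficiently solvable.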


Papers
Proceedings ArticleDOI
15 Dec 2004
TL;DR: Although the composite directed MRF model potentially has an exponential number of loops and becomes a context-sensitive grammar, its parameters can nevertheless be estimated in cubic time using an efficient modified EM method, the generalized inside-outside algorithm, which extends the inside-outside algorithm to incorporate the effects of the n-gram and PLSA language models.
Abstract: We present a directed Markov random field (MRF) model that combines n-gram models, probabilistic context-free grammars (PCFGs) and probabilistic latent semantic analysis (PLSA) for the purpose of statistical language modeling. Even though the composite directed MRF model potentially has an exponential number of loops and becomes a context-sensitive grammar, we are nevertheless able to estimate its parameters in cubic time using an efficient modified EM method, the generalized inside-outside algorithm, which extends the inside-outside algorithm to incorporate the effects of the n-gram and PLSA language models. We generalize various smoothing techniques to alleviate the sparseness of n-gram counts in cases where there are hidden variables. We also derive an analogous algorithm to calculate the probability of an initial subsequence of a sentence generated by the composite language model. Our experimental results on the Wall Street Journal corpus show that we obtain significant reductions in perplexity compared to the state-of-the-art baseline trigram model with Good-Turing and Kneser-Ney smoothing.

16 citations
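
As background for the generalized inside-outside algorithm mentioned above, the following sketch (my own, over a hypothetical toy grammar rather than the paper's composite model) shows the standard inside pass for a PCFG in Chomsky normal form: it computes the probability of a sentence in O(n^3) time, the cubic-time building block that the paper extends with n-gram and PLSA components.

```python
# A minimal sketch of the inside pass for a PCFG in Chomsky normal form.
# P maps (nonterminal, right-hand side) to a probability; the right-hand
# side is either a pair of nonterminals or a terminal word.
from collections import defaultdict

P = {                                   # hypothetical toy grammar
    ("S", ("NP", "VP")): 1.0,
    ("NP", "time"): 0.5, ("NP", "flies"): 0.5,
    ("VP", "flies"): 1.0,
}

def inside_prob(words, start="S"):
    n = len(words)
    beta = defaultdict(float)           # beta[(i, j, A)] = P(A derives words[i:j])
    for i, w in enumerate(words):       # width-1 spans: lexical rules
        for (A, rhs), p in P.items():
            if rhs == w:
                beta[(i, i + 1, A)] += p
    for width in range(2, n + 1):       # wider spans: binary rules
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):   # split point
                for (A, rhs), p in P.items():
                    if isinstance(rhs, tuple):
                        B, C = rhs
                        beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k, j, C)]
    return beta[(0, n, start)]

print(inside_prob(["time", "flies"]))   # 0.5
```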

Journal ArticleDOI
TL;DR: The article presents proofs of the context-freeness of a family of type-logical grammars, namely those based on a uni- or multimodal logic of pure residuation, possibly enriched with the structural rules of Permutation and Expansion for binary modes.
Abstract: The article presents proofs of the context-freeness of a family of type-logical grammars, namely all grammars that are based on a uni- or multimodal logic of pure residuation, possibly enriched with the structural rules of Permutation and Expansion for binary modes.

16 citations
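
For readers unfamiliar with type-logical grammars, the sketch below illustrates a much simpler relative: a classical categorial (AB) grammar recognized CYK-style with only forward and backward application. It is meant only to show type-driven derivation with a made-up toy lexicon; it is not the article's proof and does not cover the residuation logics the article treats.

```python
# A minimal sketch of recognition with a classical categorial (AB) grammar.
# Categories: "np" is atomic; ("/", A, B) stands for A/B; ("\\", B, A) for B\A.
# Rules: A/B , B => A (forward application); B , B\A => A (backward application).
NP, S = "np", "s"
IV = ("\\", NP, S)                # intransitive verb: np\s
ADV = ("\\", IV, IV)              # VP modifier: (np\s)\(np\s)

LEXICON = {"Mary": {NP}, "talks": {IV}, "softly": {ADV}}   # hypothetical toy lexicon

def combine(left, right):
    """Categories derivable from two adjacent constituents."""
    out = set()
    for x in left:
        for y in right:
            if isinstance(x, tuple) and x[0] == "/" and x[2] == y:
                out.add(x[1])     # forward application
            if isinstance(y, tuple) and y[0] == "\\" and y[1] == x:
                out.add(y[2])     # backward application
    return out

def recognizes(words, goal=S):
    n = len(words)
    chart = {(i, i + 1): set(LEXICON[w]) for i, w in enumerate(words)}
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            chart[(i, j)] = set()
            for k in range(i + 1, j):
                chart[(i, j)] |= combine(chart[(i, k)], chart[(k, j)])
    return goal in chart[(0, n)]

print(recognizes(["Mary", "talks", "softly"]))   # True
```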

Book ChapterDOI
23 Sep 1996
TL;DR: Recent approaches to weighted constraint logic grammars address the need for graded distinctions by adding numerical calculation schemata to the deduction scheme of the underlying CLP framework.
Abstract: Constraint logic grammars provide a powerful formalism for expressing complex logical descriptions of natural language phenomena in exact terms. Describing some of these phenomena may, however, require some form of graded distinctions which are not provided by such grammars. Recent approaches to weighted constraint logic grammars attempt to address this issue by adding numerical calculation schemata to the deduction scheme of the underlying CLP framework.

16 citations
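
The following sketch (my own, not the chapter's CLP formalism) shows the general flavor of adding numerical calculation schemata to a deduction scheme: clauses of a toy "grammar as logic program" carry weights, and the weight of a derived item is computed alongside the symbolic deduction, which is what yields graded distinctions between analyses.

```python
# A minimal sketch of weighted bottom-up deduction over ground Horn clauses.
# Each clause is (head, body, weight); a derived fact's score is the clause
# weight times the scores of its body facts, keeping the best score per fact.
CLAUSES = [
    ("np(0,1)", (), 1.0),                          # hypothetical chart items
    ("v(1,2)",  (), 1.0),
    ("np(2,3)", (), 0.8),
    ("vp(1,3)", ("v(1,2)", "np(2,3)"), 0.9),
    ("s(0,3)",  ("np(0,1)", "vp(1,3)"), 1.0),
]

def saturate(clauses):
    best = {}                                      # fact -> best score so far
    changed = True
    while changed:                                 # iterate to a fixed point
        changed = False
        for head, body, w in clauses:
            if all(b in best for b in body):
                score = w
                for b in body:
                    score *= best[b]
                if score > best.get(head, 0.0):
                    best[head] = score
                    changed = True
    return best

print(saturate(CLAUSES)["s(0,3)"])   # 0.72
```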

Journal ArticleDOI
TL;DR: The paper systematically studies all possibilities of defining leftmost derivation in matrix grammars and finds a characterization of the recursively enumerable languages for matrix grammars with the leftmost restriction defined on the classes of a given partition of the nonterminal alphabet.
Abstract: Matrix grammars are one of the classical topics of formal languages, more specifically, regulated rewriting. Although this type of control on the work of context-free grammars is one of the earliest, matrix grammars still raise interesting questions (not to speak of old open problems in this area). One such class of problems concerns leftmost derivation (in grammars without appearance checking). The main point of this paper is the systematic study of all possibilities of defining leftmost derivation in matrix grammars. Twelve types of such a restriction are defined, only four of which have been discussed in the literature. For seven of them, we prove a characterization of the recursively enumerable languages (by matrix grammars with arbitrary context-free rules but without appearance checking). Three other cases characterize the recursively enumerable languages modulo a morphism and an intersection with a regular language. In this way, we solve nearly all problems listed as open on page 67 of the monograph [7], which can be seen as the main contribution of this paper. Moreover, we find a characterization of the recursively enumerable languages for matrix grammars with the leftmost restriction defined on the classes of a given partition of the nonterminal alphabet.

16 citations
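
To illustrate the objects under study, the sketch below (mine, not one of the paper's constructions) simulates a matrix grammar: each derivation step applies a whole matrix, i.e. a fixed sequence of context-free rules, in order, here rewriting the leftmost occurrence of each rule's nonterminal. This is only one natural reading of a leftmost restriction; the paper classifies twelve.

```python
# A minimal sketch of a matrix grammar derivation. The three matrices below
# generate { a^n b^n c^n : n >= 1 }, a language no context-free grammar can
# generate, which is what the matrix control mechanism buys.
MATRICES = [
    [("S", "ABC")],
    [("A", "aA"), ("B", "bB"), ("C", "cC")],   # grow all three blocks in lockstep
    [("A", "a"), ("B", "b"), ("C", "c")],      # terminate the derivation
]

def apply_matrix(form, matrix):
    """Apply every rule of the matrix in order; None if some rule cannot fire."""
    for lhs, rhs in matrix:
        i = form.find(lhs)                     # leftmost occurrence of the nonterminal
        if i == -1:
            return None                        # no appearance checking: matrix blocked
        form = form[:i] + rhs + form[i + len(lhs):]
    return form

def derive(n):
    """Derive a^n b^n c^n by applying the matrices in a fixed order."""
    form = apply_matrix("S", MATRICES[0])
    for _ in range(n - 1):
        form = apply_matrix(form, MATRICES[1])
    return apply_matrix(form, MATRICES[2])

print(derive(3))   # aaabbbccc
```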


Network Information
Related Topics (5)

Topic                         Papers    Citations    Relatedness
Graph (abstract data type)    69.9K     1.2M         80%
Time complexity               36K       879.5K       79%
Concurrency                   13K       347.1K       78%
Model checking                16.9K     451.6K       77%
Directed graph                12.2K     302.4K       77%
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1