Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1938 publications have been published within this topic, receiving 45911 citations. The topic is also known as: CSG.
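For orientation, a standard textbook illustration not drawn from any of the papers listed below: a context-sensitive grammar allows a nonterminal to be rewritten only within a stated surrounding context, which gives strictly more generative power than context-free grammars. The classic example is the non-context-free language { a^n b^n c^n : n >= 1 }, generated by the following noncontracting grammar (equivalent in power to a grammar in the strict context-sensitive normal form):

```latex
% Classic noncontracting grammar for { a^n b^n c^n : n >= 1 }.
\begin{align*}
  S  &\rightarrow a\,S\,B\,C \mid a\,B\,C \\
  CB &\rightarrow BC \\
  aB &\rightarrow ab \\
  bB &\rightarrow bb \\
  bC &\rightarrow bc \\
  cC &\rightarrow cc
\end{align*}
% Sample derivation (n = 2):
%   S => aSBC => aaBCBC => aaBBCC => aabBCC => aabbCC => aabbcC => aabbcc
```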


Papers
Journal Article
TL;DR: It is shown that derivation trees in the generalized Lambek calculus can be transformed to a normal form, and this fact is employed in the proof of the inclusion of the class of phrase languages generated by categorial grammars based on the generalized Lambek calculus in the class of phrase languages generated by categorial grammars based on the generalized Ajdukiewicz calculus.
Abstract: We show that derivation trees in the generalized Lambek calculus can be transformed to a normal form. This fact is employed in the proof of the inclusion of the class of phrase languages generated by categorial grammars based on the generalized Lambek calculus in the class of phrase languages generated by categorial grammars based on the generalized Ajdukiewicz calculus.

11 citations

Posted Content
TL;DR: A way of rewriting Minimalist Grammars as Linear Context-Free Rewriting Systems, which makes it easy to create a top-down parser, and a method of refining the probabilistic field by using algorithms from data compression.
Abstract: This paper describes a probabilistic top-down parser for minimalist grammars. Top-down parsers have the great advantage of a certain predictive power during parsing, which proceeds in a left-to-right reading of the sentence. Such parsers have been well implemented and studied in the case of Context-Free Grammars, which already generate sentences top-down, but these are difficult to adapt to Minimalist Grammars, which generate sentences bottom-up. I propose here a way of rewriting Minimalist Grammars as Linear Context-Free Rewriting Systems, which makes it easy to create a top-down parser. This rewriting also makes it possible to place a probabilistic field on these grammars, which can be used to accelerate the parser. Finally, I propose a method of refining the probabilistic field by using algorithms from data compression.
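As a purely illustrative aside on the predictive behaviour the abstract attributes to top-down parsing: the sketch below runs a top-down, left-to-right expansion over a toy probabilistic context-free grammar, predicting a rule for the leftmost nonterminal and scanning terminals against the input. The grammar, its probabilities, and the function name best_parse are invented for this sketch; it is not the paper's Minimalist-Grammar-to-LCFRS construction.

```python
# Toy probabilistic CFG (made-up rules and probabilities, for illustration only).
# Each nonterminal maps to a list of (probability, right-hand side) pairs.
PCFG = {
    "S":  [(0.8, ["NP", "VP"]), (0.2, ["VP"])],
    "NP": [(1.0, ["det", "noun"])],
    "VP": [(0.6, ["verb", "NP"]), (0.4, ["verb"])],
}

def best_parse(symbols, words, prob=1.0):
    """Top-down, left-to-right expansion: predict a rule for the leftmost
    nonterminal, scan terminals against the input, and return the probability
    of the best complete derivation (0.0 if the sentence is rejected)."""
    if not symbols:
        return prob if not words else 0.0
    head, rest = symbols[0], symbols[1:]
    if head in PCFG:                       # nonterminal: try every rule
        return max(best_parse(body + rest, words, prob * p) for p, body in PCFG[head])
    if words and words[0] == head:         # terminal: scan one input token
        return best_parse(rest, words[1:], prob)
    return 0.0                             # mismatch: this prediction fails

if __name__ == "__main__":
    # 0.8 * 1.0 * 0.6 * 1.0 = 0.48 (up to float rounding)
    print(best_parse(["S"], ["det", "noun", "verb", "det", "noun"]))
```

Because a prediction is abandoned as soon as a terminal fails to match the next input token, most of the search is pruned while the sentence is read left to right, which is the predictive advantage referred to above.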

11 citations

Journal Article
Peter Fletcher
TL;DR: It is argued that the connectionist style of computation is, in some ways, better suited than sequential computation to the task of representing and manipulating recursive structures.
Abstract: This paper presents a new connectionist approach to grammatical inference. Using only positive examples, the algorithm learns regular graph grammars, representing two-dimensional iterative structures drawn on a discrete Cartesian grid. This work is intended as a case study in connectionist symbol processing and geometric concept formation. A grammar is represented by a self-configuring connectionist network that is analogous to a transition diagram, except that it can deal with graph grammars as easily as string grammars. Learning starts with a trivial grammar, expressing no grammatical knowledge, which is then refined, by a process of successive node splitting and merging, into a grammar adequate to describe the population of input patterns. In conclusion, I argue that the connectionist style of computation is, in some ways, better suited than sequential computation to the task of representing and manipulating recursive structures.
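A much-simplified, hypothetical illustration of the merging step described above: the paper works with graph grammars and a self-configuring connectionist network, whereas the sketch below only groups equivalent states of a prefix-tree acceptor built from positive string examples; the helper names and the two-word sample are invented here and are not the paper's method.

```python
from collections import defaultdict

def prefix_tree(samples):
    """Prefix-tree acceptor over positive examples: states are prefixes."""
    delta = defaultdict(dict)                  # state -> {symbol: next state}
    for word in samples:
        for i, ch in enumerate(word):
            delta[word[:i]][ch] = word[:i + 1]
    return delta, set(samples)                 # transitions, accepting states

def signature(state, delta, accepting):
    """Canonical description of a state's residual behaviour; the acceptor is
    acyclic, so the recursion terminates."""
    return (state in accepting,
            tuple(sorted((ch, signature(nxt, delta, accepting))
                         for ch, nxt in delta[state].items())))

def merged_state_classes(samples):
    """Group prefix-tree states that behave identically (the 'merging' move)."""
    delta, accepting = prefix_tree(samples)
    states = {w[:i] for w in samples for i in range(len(w) + 1)}
    classes = defaultdict(set)
    for s in states:
        classes[signature(s, delta, accepting)].add(s)
    return list(classes.values())

if __name__ == "__main__":
    # 'a' and 'c' lead to the same residual behaviour, so they merge,
    # as do the two accepting states 'ab' and 'cb'.
    for cls in merged_state_classes(["ab", "cb"]):
        print(sorted(cls))
```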

11 citations

Proceedings Article
25 Aug 1986
TL;DR: The weak generative capacity of a class of parenthesis-free categorial grammars derived from those of Ades and Steedman is studied by varying the set of reduction rules, and a context-sensitive language is obtained.
Abstract: We study the weak generative capacity of a class of parenthesis-free categorial grammars derived from those of Ades and Steedman by varying the set of reduction rules. With forward cancellation as the only rule, the grammars are weakly equivalent to context-free grammars. When a backward combination rule is added, it is no longer possible to obtain all the context-free languages. With a suitable restriction of the forward partial rule, the languages are still context-free and a push-down automaton can be used for recognition. Using the unrestricted rule of forward partial combination, a context-sensitive language is obtained.
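For readers unfamiliar with the rule names, the kinds of reduction rules being varied are, in standard categorial-grammar terms, of the following shapes (shown in Steedman-style notation with the result category written leftmost; this illustration is not taken from the paper itself):

```latex
% Standard reduction rules of parenthesis-free categorial grammar,
% Steedman-style notation (X/Y seeks Y to its right, X\Y seeks Y to its left).
\begin{align*}
  \text{forward cancellation:}        &\quad X/Y,\; Y \;\Rightarrow\; X \\
  \text{backward combination:}        &\quad Y,\; X\backslash Y \;\Rightarrow\; X \\
  \text{forward partial combination:} &\quad X/Y,\; Y/Z \;\Rightarrow\; X/Z
\end{align*}
% Example with Mary := NP, likes := (S\NP)/NP, John := NP:
%   (S\NP)/NP, NP  =>  S\NP   (forward cancellation)
%   NP, S\NP       =>  S      (backward combination)
```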

11 citations

Book Chapter
27 Jun 2001
TL;DR: Dependency tree grammars are proposed in which unbounded discontinuity is resolved through the first available valency saturation; when the number of non-saturated valencies is bounded, they are weakly equivalent to context-free grammars, parsable in cubic time, and stronger than non-projective dependency grammars without long dependencies.
Abstract: Dependency tree grammars are proposed in which unbounded discontinuity is resolved through the first available valency saturation. In general, they are expressive enough to generate non-semilinear context-sensitive languages, but in the practical situation where the number of non-saturated valencies is bounded by a constant, they are weakly equivalent to context-free grammars, are parsable in cubic time, and are stronger than non-projective dependency grammars without long dependencies.

11 citations


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1