
Context-sensitive grammar

About: Context-sensitive grammar is a research topic, also known as CSG. Over its lifetime, 1,938 publications on this topic have received 45,911 citations.
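As a quick illustration of the topic (not taken from the papers below), the textbook example of a context-sensitive language is { aⁿbⁿcⁿ : n ≥ 1 }, which no context-free grammar can generate. A standard noncontracting grammar for it, equivalent in generative power to a context-sensitive grammar, is:

S → aSBC | aBC
CB → BC
aB → ab
bB → bb
bC → bc
cC → cc

For example, aabbcc derives as S ⇒ aSBC ⇒ aaBCBC ⇒ aaBBCC ⇒ aabBCC ⇒ aabbCC ⇒ aabbcC ⇒ aabbcc.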


Papers
Journal ArticleDOI
TL;DR: A new approach to model transformation development is proposed that simplifies the developed transformations and improves their quality by exploiting the languages' structures. Such transformations are shown to terminate and to be sound, complete, and deterministic.

9 citations

Book ChapterDOI
01 Jan 2004
TL;DR: A simple and intuitive approximation method for turning unification-based grammars into context-free grammars is presented, along with a novel disambiguation method based on probabilistic context-free approximations.
Abstract: We present a simple and intuitive approximation method for turning unification-based grammars into context-free grammars. We apply our method to several grammars and report on the quality of the approximation. We also present several methods that speed up the approximation process and that might be interesting to other areas of unification-based processing. Finally, we introduce a novel disambiguation method for unification grammars which is based on probabilistic context-free approximations.

9 citations
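The disambiguation idea in the abstract above lends itself to a small sketch: once a probabilistic context-free approximation of the unification grammar exists, candidate parses can be ranked by their PCFG score. The grammar, rule probabilities, and tree encoding below are illustrative assumptions, not the paper's implementation:

from math import log

# Illustrative PCFG approximation: {(lhs, rhs): probability} (made-up numbers)
pcfg = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("det", "noun")): 0.6,
    ("NP", ("noun",)): 0.4,
    ("VP", ("verb", "NP")): 0.7,
    ("VP", ("verb",)): 0.3,
}

def score(tree):
    # Log-probability of a parse tree: a leaf is a terminal string,
    # an internal node is a tuple (label, child, child, ...).
    if isinstance(tree, str):
        return 0.0
    label, *children = tree
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    return log(pcfg[(label, rhs)]) + sum(score(c) for c in children)

def disambiguate(candidates):
    # Rank the unification grammar's candidate parses by the approximation.
    return max(candidates, key=score)

# Two competing parses; the second scores higher under these numbers.
t1 = ("S", ("NP", "noun"), ("VP", "verb"))
t2 = ("S", ("NP", "det", "noun"), ("VP", "verb", ("NP", "noun")))
print(disambiguate([t1, t2]))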

Journal ArticleDOI
TL;DR: This paper focuses on parallel communicating grammar systems (PCGSs) with context-free components and proves that the class of Szilard languages of centralized (returning or non-returning) PCGSs is included in NC¹.

9 citations
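For readers unfamiliar with the term in the TL;DR above: the Szilard language of a grammar is the set of production-label sequences that record its successful derivations. A minimal sketch for an ordinary context-free grammar (PCGSs add communication between component grammars, which this toy deliberately omits; the function name and grammar are illustrative):

def szilard_words(rules, start="S", max_steps=4):
    # Enumerate the label sequences of terminating leftmost derivations.
    # rules: {label: (nonterminal, replacement)}; nonterminals are uppercase.
    results = []
    def derive(form, labels):
        for i, sym in enumerate(form):
            if sym.isupper():  # leftmost nonterminal found: expand it
                if len(labels) < max_steps:
                    for label, (lhs, rhs) in rules.items():
                        if lhs == sym:
                            derive(form[:i] + rhs + form[i + 1:], labels + [label])
                return
        results.append(" ".join(labels))  # all-terminal form: a Szilard word
    derive(start, [])
    return results

# Toy grammar S -> ab | aSb; its Szilard words are r2^n r1 for n >= 0.
rules = {"r1": ("S", "ab"), "r2": ("S", "aSb")}
print(szilard_words(rules))  # ['r1', 'r2 r1', 'r2 r2 r1', 'r2 r2 r2 r1']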

Book ChapterDOI
14 Dec 1998
TL;DR: This paper shows that a formalism better suited to linguistic structures can be obtained by using a sequence of pushdowns instead of a single pushdown for storing the indices in a derivation, and argues that the corresponding restriction on writing is more natural from a linguistic point of view.
Abstract: Linear indexed grammars (LIGs) can be used to describe nonlocal dependencies. The indexing mechanism, however, can only account for dependencies that are nested. In natural languages one can easily find examples to which this simple model cannot be applied straightforwardly. In this paper I will show that a formalism better suited to linguistic structures can be obtained by using a sequence of pushdowns instead of one pushdown for the storage of the indices in a derivation. Crucially, we have to avoid unwanted interactions between the pushdowns that would make it possible to simulate a Turing machine. [1] solves this problem for multi-pushdown automata by restricting reading to the first nonempty pushdown. I will argue that the corresponding restriction on writing is more natural from a linguistic point of view. I will show that, under either restriction, grammars with a sequence of n pushdowns give rise to a subclass of the nth member of the hierarchy defined by [15,16], and therefore are mildly context sensitive.

9 citations
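The storage discipline from [1] that the abstract refers to is easy to picture in code. A minimal sketch, assuming a plain list-of-stacks representation (the class and method names are illustrative, not from the paper):

class MultiPushdown:
    # A sequence of pushdowns; unrestricted use would allow simulating a
    # Turing machine, so reading is confined to the first nonempty pushdown.
    def __init__(self, n):
        self.stacks = [[] for _ in range(n)]

    def push(self, i, symbol):
        self.stacks[i].append(symbol)

    def pop(self):
        # The restriction of [1]: only the first nonempty pushdown may be read.
        for stack in self.stacks:
            if stack:
                return stack.pop()
        raise IndexError("all pushdowns empty")

# With one stack, dep2 would have to come out before dep1 (nested order);
# two stacks release them in the order pushed, as in cross-serial dependencies.
store = MultiPushdown(2)
store.push(0, "dep1")
store.push(1, "dep2")
print(store.pop(), store.pop())  # dep1 dep2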

Journal ArticleDOI
TL;DR: It is shown that picture grammars using forbidding context only are strictly weaker than random context picture grammars; context-free picture grammars were previously shown to be strictly weaker than both random permitting and random forbidding context picture grammars, and random permitting context strictly weaker than random context.
Abstract: We use random context picture grammars to generate pictures through successive refinement. The productions of such a grammar are context-free, but their application is regulated — "permitted" or "forbidden" — by context randomly distributed in the developing picture. Grammars using this relatively weak context often succeed where context-free grammars fail, e.g. in generating the Sierpiński carpets. On the other hand, it proved possible to develop iteration theorems for three subclasses of these grammars, namely a pumping–shrinking, a pumping and a shrinking lemma for context-free, random permitting and random forbidding context picture grammars, respectively. Finding necessary conditions is problematic for most models of context-free grammars with context-sensing ability, since they consider a variable and its context as a finite connected array. We have already shown that context-free picture grammars are strictly weaker than both random permitting and random forbidding context picture grammars, and also that random permitting context is strictly weaker than random context. We now show that grammars which use forbidding context only are strictly weaker than random context picture grammars.

9 citations
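The regulation mechanism described in the abstract above (context-free rules whose application is "permitted" or "forbidden" by context elsewhere in the developing form) can be sketched compactly. The sketch below works on strings rather than pictures, purely to show the permitting/forbidding check; all names and the toy production are illustrative:

def applicable(permitting, forbidding, context):
    # A rule may fire only if every permitting symbol occurs in the
    # context and no forbidding symbol does.
    return permitting <= context and not (forbidding & context)

def rewrite_step(form, productions):
    # productions: list of (lhs, rhs, permitting_set, forbidding_set).
    for i, sym in enumerate(form):
        for lhs, rhs, permitting, forbidding in productions:
            # Context = the rest of the form, excluding the rewritten occurrence.
            if sym == lhs and applicable(permitting, forbidding,
                                         set(form[:i] + form[i + 1:])):
                return form[:i] + rhs + form[i + 1:]
    return None  # no rule applicable: the derivation is blocked

# "A" may only be rewritten while a "B" is present and no "c" has appeared yet.
productions = [("A", "aAc", {"B"}, {"c"})]
print(rewrite_step("AB", productions))    # aAcB
print(rewrite_step("aAcB", productions))  # None, blocked by the forbidden 'c'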


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance Metrics
Number of papers in the topic in previous years:

Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1