
Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have been published within this topic, receiving 45,911 citations. The topic is also known as: CSG.


Papers
Book Chapter
Ryo Yoshinaka
05 Mar 2012
TL;DR: This paper shows how the two opposite approaches to distributional learning, primal and dual, are integrated into single learning algorithms that learn quite rich classes of context-free grammars.
Abstract: Recently several "distributional learning algorithms" have been proposed and have had great success in learning different subclasses of context-free grammars. Distributional learning models and exploits the relation between strings and contexts that form grammatical sentences in the language of the learning target. There are two main approaches. One, which we call primal, constructs nonterminals whose language is supposed to be characterized by strings. The other, which we call dual, uses contexts to characterize the language of each nonterminal of the conjecture grammar. This paper shows how those opposite approaches are integrated into single learning algorithms that learn quite rich classes of context-free grammars.

17 citations
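
As a rough illustration of the string/context relation that both the primal and dual approaches build on, here is a minimal Python sketch that tabulates, for a small positive sample, which substrings occur in which contexts. It is not Yoshinaka's algorithm, only the raw data such learners start from; the sample and all names are invented for illustration.

from collections import defaultdict

def context_substring_relation(sample):
    """For each sentence w and each split w = l + u + r, record that the
    substring u occurs in the context (l, r)."""
    contexts_of = defaultdict(set)    # substring -> contexts (primal view)
    substrings_of = defaultdict(set)  # context  -> substrings (dual view)
    for w in sample:
        n = len(w)
        for i in range(n + 1):
            for j in range(i, n + 1):
                l, u, r = w[:i], w[i:j], w[j:]
                contexts_of[u].add((l, r))
                substrings_of[(l, r)].add(u)
    return contexts_of, substrings_of

# Hypothetical positive data for the language {a^n b^n : n >= 1}.
sample = ["ab", "aabb", "aaabbb"]
contexts_of, substrings_of = context_substring_relation(sample)
print(contexts_of["ab"])          # contexts in which "ab" occurs
print(substrings_of[("a", "b")])  # substrings seen between "a" and "b"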

Proceedings Article
01 Jan 2011
TL;DR: This work presents an algorithm that uses indexed linear tree grammars (ILTGs) both to describe the input set and to compute the set that approximates the collecting semantics, thus enabling a more precise binding analysis than afforded by regular grammars.
Abstract: The collecting semantics of a program defines the strongest static property of interest. We study the analysis of the collecting semantics of higher-order functional programs, cast as left-linear term rewriting systems. The analysis generalises functional flow analysis and the reachability problem for term rewriting systems, which are both undecidable. We present an algorithm that uses indexed linear tree grammars (ILTGs) both to describe the input set and compute the set that approximates the collecting semantics. ILTGs are equi-expressive with pushdown tree automata, and so, strictly more expressive than regular tree grammars. Our result can be seen as a refinement of Jones and Andersen's procedure, which uses regular tree grammars. The main technical innovation of our algorithm is the use of indices to capture (sets of) substitutions, thus enabling a more precise binding analysis than afforded by regular grammars. We give a simple proof of termination and soundness, and demonstrate that our method is more accurate than other approaches to functional flow and reachability analyses in the literature.

17 citations
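
For orientation, the sketch below naively enumerates the terms reachable in a left-linear term rewriting system up to a step bound. This is the (in general undecidable) relation that the paper's ILTG-based analysis over-approximates; it is not the paper's construction, and the term encoding and example rules are invented.

# Terms are tuples ("f", sub1, ...); bare strings in rule patterns are variables.

def match(pattern, term, env):
    """Match a left-linear pattern against a ground term, binding variables in env."""
    if isinstance(pattern, str):                 # a variable: bind it
        env[pattern] = term
        return True
    if (not isinstance(term, tuple) or pattern[0] != term[0]
            or len(pattern) != len(term)):
        return False
    return all(match(p, t, env) for p, t in zip(pattern[1:], term[1:]))

def substitute(pattern, env):
    if isinstance(pattern, str):
        return env[pattern]
    return (pattern[0],) + tuple(substitute(p, env) for p in pattern[1:])

def rewrites(term, rules):
    """All terms reachable from term in one rewrite step, at any position."""
    out = []
    for lhs, rhs in rules:                       # rewrite at the root
        env = {}
        if match(lhs, term, env):
            out.append(substitute(rhs, env))
    for i in range(1, len(term)):                # rewrite inside a subterm
        for t in rewrites(term[i], rules):
            out.append(term[:i] + (t,) + term[i + 1:])
    return out

def reachable(start, rules, steps):
    seen, frontier = {start}, {start}
    for _ in range(steps):
        frontier = {t for s in frontier for t in rewrites(s, rules)} - seen
        seen |= frontier
    return seen

# Invented rules: double(s(x)) -> s(s(double(x))), double(0) -> 0.
rules = [(("double", ("s", "x")), ("s", ("s", ("double", "x")))),
         (("double", ("0",)), ("0",))]
print(reachable(("double", ("s", ("s", ("0",)))), rules, steps=5))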

Book Chapter
07 Apr 2008
TL;DR: A new model of grammatical picture generation, called Pure 2D context-free grammar (Pure 2D CFG), generates rectangular picture arrays of symbols; its generative power is examined in comparison to certain other related models.
Abstract: In this note a new model of grammatical picture generation is introduced. The model is based on the notion of pure context-free grammars of formal string language theory. The resulting model, called Pure 2D context-free grammar (Pure 2D CFG), generates rectangular picture arrays of symbols. The generative power of this model in comparison to certain other related models is examined. We also associate a regular control language with a Pure 2D CFG and observe that the generative power increases. Certain closure properties are obtained.

17 citations
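
The sketch below illustrates one plausible reading of this rewriting style, assuming that a column (row) table rewrites every symbol of a chosen column (row) and that all right-hand sides in a table have equal length, so the array stays rectangular. The paper's formal definition may differ in detail, and the tables and picture used here are invented.

def apply_column_table(picture, table, col):
    """Rewrite every symbol in column `col`; each right-hand side is a
    horizontal word, so one column becomes several columns."""
    widths = {len(w) for w in table.values()}
    assert len(widths) == 1, "all right-hand sides must have equal length"
    return [row[:col] + list(table[row[col]]) + row[col + 1:] for row in picture]

def apply_row_table(picture, table, row):
    """Rewrite every symbol in row `row`; each right-hand side is a
    vertical word, so one row becomes several rows."""
    heights = {len(w) for w in table.values()}
    assert len(heights) == 1, "all right-hand sides must have equal length"
    k = heights.pop()
    new_rows = [[table[sym][j] for sym in picture[row]] for j in range(k)]
    return picture[:row] + new_rows + picture[row + 1:]

# Invented example: start from a 2 x 2 array and grow it in both directions.
pic = [list("ab"), list("ab")]
pic = apply_column_table(pic, {"a": "aa", "b": "bb"}, col=0)   # now 2 x 3
pic = apply_row_table(pic, {"a": "ax", "b": "bx"}, row=1)      # now 3 x 3
for r in pic:
    print("".join(r))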

Journal Article
TL;DR: The investigated topics are: closure properties, the efficiency of generating a (linear) language by such a system compared with usual grammars, hierarchies, and so on.
Abstract: We continue the study of parallel communicating grammar systems introduced in Păun and Sântean [7] as a grammatical model of parallel computing. The investigated topics are: closure properties, the efficiency of generating a (linear) language by such a system compared with usual grammars, hierarchies.

17 citations
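
As a hedged illustration of the parallel communicating mechanism in returning mode, the Python sketch below lets component grammars rewrite their sentential forms in lockstep and satisfy query symbols by copying another component's current form. It simplifies the formal model (leftmost rewriting, digits as query symbols, no blocking on stuck components), and the example system is invented.

import random

def pc_derive(components, max_steps=100, seed=2):
    """components[i] = (axiom, rules); rules map an uppercase nonterminal to a
    list of right-hand sides. A digit i in a sentential form is a query symbol
    asking for component i's current form."""
    random.seed(seed)
    forms = [axiom for axiom, _ in components]
    for _ in range(max_steps):
        if any(c.isdigit() for f in forms for c in f):
            # Communication step: splice in the queried forms, then reset the
            # queried components to their axioms (returning mode).
            queried = {int(c) for f in forms for c in f if c.isdigit()}
            forms = ["".join(forms[int(c)] if c.isdigit() else c for c in f)
                     for f in forms]
            forms = [components[i][0] if i in queried else f
                     for i, f in enumerate(forms)]
        else:
            # Rewriting step: every component rewrites its leftmost nonterminal,
            # if it has one, using a randomly chosen rule.
            new_forms = []
            for f, (_, rules) in zip(forms, components):
                pos = next((i for i, c in enumerate(f) if c in rules), None)
                if pos is not None:
                    f = f[:pos] + random.choice(rules[f[pos]]) + f[pos + 1:]
                new_forms.append(f)
            forms = new_forms
        if all(c.islower() for c in forms[0]):   # master form is all terminals
            return forms[0]
    return forms[0]

# Invented system: the master counts a's while component 1 counts b's in
# lockstep; the query symbol "1" then splices component 1's form into the
# master, so the output is a string of the form a^n b^n.
components = [("S", {"S": ["aS", "a1"], "B": [""]}),
              ("B", {"B": ["bB"]})]
print(pc_derive(components))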

Proceedings Article
16 Oct 2005
TL;DR: This work uses the genetic programming approach for grammatical inference and proposes the use of frequent sequences, syntax graphs and incremental construction of grammars in order to infer a more comprehensive set of context-free grammars.
Abstract: We propose a new application area for grammar inference which intends to make domain-specific language development easier and finds a second application in renovation tools for legacy systems. We use the genetic programming approach for grammatical inference and propose the use of frequent sequences, syntax graphs and incremental construction of grammars in order to infer a more comprehensive set of context-free grammars.

17 citations
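
A toy sketch of the genetic-programming flavour of grammar inference described above, not the authors' system (which additionally uses frequent sequences, syntax graphs and incremental grammar construction): individuals are small context-free grammars in Chomsky normal form, and fitness counts how many positive samples a grammar parses minus how many negative samples it accepts. Alphabets, parameters and samples are invented.

import random

NONTERMINALS = "SABC"     # invented nonterminal alphabet, S is the start symbol
TERMINALS = "ab"          # invented terminal alphabet

def random_rule():
    lhs = random.choice(NONTERMINALS)
    rhs = (random.choice(TERMINALS) if random.random() < 0.5
           else random.choice(NONTERMINALS) + random.choice(NONTERMINALS))
    return (lhs, rhs)

def random_grammar(size=8):
    return [random_rule() for _ in range(size)]

def cyk_accepts(rules, w):
    """CYK membership test for a CNF grammar given as (lhs, rhs) pairs."""
    n = len(w)
    if n == 0:
        return False
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, c in enumerate(w):
        table[0][i] = {lhs for lhs, rhs in rules if rhs == c}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for split in range(1, span):
                for lhs, rhs in rules:
                    if (len(rhs) == 2 and rhs[0] in table[split - 1][i]
                            and rhs[1] in table[span - split - 1][i + split]):
                        table[span - 1][i].add(lhs)
    return "S" in table[n - 1][0]

def fitness(rules, positives, negatives):
    return (sum(cyk_accepts(rules, w) for w in positives)
            - sum(cyk_accepts(rules, w) for w in negatives))

def evolve(positives, negatives, pop_size=40, generations=30):
    population = [random_grammar() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda g: fitness(g, positives, negatives), reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = random.sample(a, len(a) // 2) + random.sample(b, len(b) // 2)
            if random.random() < 0.3:            # mutation: replace one rule
                child[random.randrange(len(child))] = random_rule()
            children.append(child)
        population = survivors + children
    return max(population, key=lambda g: fitness(g, positives, negatives))

random.seed(1)
best = evolve(positives=["ab", "aabb", "abab"], negatives=["a", "b", "ba", "aab"])
print(best)

With such a small population and so few generations the run may or may not recover a grammar consistent with the sample; the point is only the shape of the evolutionary loop.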


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1