Topic
Context-sensitive grammar
About: Context-sensitive grammar is a research topic. Over the lifetime, 1938 publications have been published within this topic receiving 45911 citations. The topic is also known as: CSG.
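As a concrete anchor for the topic: the language of strings a^n b^n c^n is the standard example of a language that is context-sensitive but not context-free. The sketch below is illustrative only; the productions are the usual textbook grammar for this language, given in noncontracting (length-nondecreasing) form, which generates the same language class as strict context-sensitive form, paired with a direct membership test.

```python
# A standard noncontracting grammar for { a^n b^n c^n : n >= 1 }:
#
#   S  -> aSBC | aBC
#   CB -> BC          (reorders symbols using surrounding context)
#   aB -> ab
#   bB -> bb
#   bC -> bc
#   cC -> cc

def in_anbncn(s: str) -> bool:
    """Decide membership in { a^n b^n c^n : n >= 1 } directly."""
    n = len(s) // 3
    return n >= 1 and len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

print(in_anbncn("aabbcc"))  # a^2 b^2 c^2 is in the language
print(in_anbncn("aabbc"))   # unbalanced counts are not
```

No context-free grammar can generate this language (by the pumping lemma for context-free languages), which is why it is the usual separating example for the CSG class.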
Papers published on a yearly basis
Papers
14 Jan 2002. TL;DR: To invert programs, a new type of grammar is invented that extends tree grammars by permitting a notion of sharing in the productions; these dag grammars seem to be of independent interest. As an example, the paper demonstrates how to derive type inference from type checking.
Abstract: Abramov and Gluck have recently introduced a technique called URA for inverting first-order functional programs. Given some desired output value, URA computes a potentially infinite sequence of substitutions/restrictions corresponding to the relevant input values. In some cases this process does not terminate. In the present paper, we propose a new program analysis for inverting programs. The technique works by computing a finite grammar describing the set of all inputs that relate to a given output. During the production of the grammar, the original program is implicitly transformed using so-called driving steps. Whereas URA is sound and complete, but sometimes fails to terminate, our technique always terminates and is complete, but not sound. As an example, we demonstrate how to derive type inference from type checking. The idea of approximating functional programs by grammars is not new. For instance, the second author has developed a technique using tree grammars to approximate the termination behaviour of deforestation. However, for the present purposes it has been necessary to invent a new type of grammar that extends tree grammars by permitting a notion of sharing in the productions. These dag grammars seem to be of independent interest.
15 citations
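The tree-grammar baseline that the abstract above builds on can be made concrete with a toy regular tree grammar. The representation and the `generate` helper below are invented for illustration, not the paper's actual formalism; dag grammars additionally permit sharing between subterms, which this sketch does not model.

```python
from itertools import product

# A regular tree grammar as data: each nonterminal maps to a list of
# alternatives, where an alternative is (constructor, child_nonterminals).
GRAMMAR = {
    "Nat": [("Z", []), ("S", ["Nat"])],            # Peano naturals
    "List": [("Nil", []), ("Cons", ["Nat", "List"])],  # lists of naturals
}

def generate(nt, depth, grammar=GRAMMAR):
    """Enumerate all terms derivable from nonterminal `nt` in at most
    `depth` derivation steps, as nested tuples (constructor, children...)."""
    if depth == 0:
        return []
    terms = []
    for ctor, kids in grammar[nt]:
        if not kids:
            terms.append((ctor,))
        else:
            child_sets = [generate(k, depth - 1, grammar) for k in kids]
            for combo in product(*child_sets):
                terms.append((ctor,) + combo)
    return terms

print(generate("Nat", 3))  # Z, S(Z), S(S(Z))
```

A grammar like this finitely describes a potentially infinite set of terms, which is what makes it usable as an always-terminating (if unsound) approximation of a program's input set.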
TL;DR: The algorithm computes a canonical representation of a simple language by converting an arbitrary simple grammar for it into prime normal form (PNF); a simple grammar is in PNF if all of its nonterminals define primes.
15 citations
01 Sep 2007. TL;DR: This paper describes an approach to learning node replacement graph grammars based on previous research in frequent isomorphic subgraph discovery, and reports results on several real-world tasks from chemical mining to XML schema induction.
Abstract: Graph grammars combine the relational aspect of graphs with the iterative and recursive aspects of string grammars, and thus represent an important next step in our ability to discover knowledge from data. In this paper we describe an approach to learning node replacement graph grammars. This approach is based on previous research in frequent isomorphic subgraph discovery. We extend the search for frequent subgraphs by checking for overlap among the instances of the subgraphs in the input graph. If subgraphs overlap by one node, we propose a node replacement grammar production. We can also infer a hierarchy of productions by compressing portions of a graph described by a production and then inferring new productions on the compressed graph. We validate this approach in experiments where we generate graphs from known grammars and measure how well our system infers the original grammar from the generated graph. We also describe results on several real-world tasks, from chemical mining to XML schema induction. We briefly discuss other grammar inference systems, indicating that our study extends the classes of learnable graph grammars.
15 citations
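The compression step described above can be sketched at a toy scale: find instances of a frequent pattern (here just a two-node labeled edge), replace each disjoint instance with a nonterminal node "S", record that as a production, and rewire incident edges. All names and the graph representation are invented for illustration; the actual system handles general subgraphs and single-node overlap between instances, which this toy deliberately omits.

```python
def compress_pattern(nodes, edges, a, b):
    """Compress every disjoint edge whose endpoints carry labels {a, b}
    into a fresh nonterminal node 'S' (the production S -> a--b).

    nodes: dict {int node id: str label}, mutated in place.
    edges: set of (u, v) tuples over node ids.
    Returns (nodes, edges, number of instances compressed)."""
    used, new_id, count = set(), max(nodes) + 1, 0
    for (u, v) in sorted(edges):          # snapshot of the original edges
        if u in used or v in used:
            continue                      # instances must be disjoint here
        if {nodes[u], nodes[v]} == {a, b}:
            used.update((u, v))
            nodes[new_id] = "S"
            # drop the internal edge, redirect external edges to the new node
            edges = {((new_id if x in (u, v) else x),
                      (new_id if y in (u, v) else y))
                     for (x, y) in edges if {x, y} != {u, v}}
            new_id += 1
            count += 1
    for n in used:                        # remove the compressed originals
        del nodes[n]
    return nodes, edges, count

# A chain a - b - a - b compresses to two S nodes joined by one edge.
nodes, edges, n = compress_pattern(
    {1: "a", 2: "b", 3: "a", 4: "b"}, {(1, 2), (2, 3), (3, 4)}, "a", "b")
print(nodes, edges, n)
```

Repeating this step on the compressed graph is what yields a hierarchy of productions, since later patterns can contain the nonterminal nodes introduced earlier.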
TL;DR: It is proved that every recursively enumerable language can be generated by a scattered context grammar with no more than two context-sensitive productions.
15 citations
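A scattered context production (A1, ..., An) -> (x1, ..., xn) rewrites n nonterminals in a single step, provided they occur in the sentential form in that order at distinct positions. The helper below is a hypothetical illustration of one application step; taking the leftmost matching occurrences, as it does, is just one possible strategy.

```python
def apply_scattered(form, production):
    """Apply one scattered context production to a sentential form.

    form: list of symbols.
    production: (lhs, rhs) with lhs = (A1, ..., An) and rhs = (x1, ..., xn),
    each xi a list of symbols. The Ai are matched left to right at the
    leftmost available positions; every Ai is replaced by xi simultaneously.
    Returns the rewritten form, or None if the production is not applicable."""
    lhs, rhs = production
    out, pos = [], 0
    for k, A in enumerate(lhs):
        try:
            j = form.index(A, pos)        # next occurrence of A after pos
        except ValueError:
            return None                   # the Ai do not occur in order
        out.extend(form[pos:j])           # copy the untouched context
        out.extend(rhs[k])                # substitute xi for Ai
        pos = j + 1
    out.extend(form[pos:])
    return out

# (A, B) -> (a, b) rewrites both nonterminals across the intervening 'c'.
print(apply_scattered(["A", "c", "B"], (("A", "B"), (["a"], ["b"]))))
```

The components of one production act at arbitrary distances from each other, which is exactly the mechanism that lets a small number of context-sensitive productions coordinate the whole sentential form.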
28 Apr 2005. TL;DR: Dependency Structure Grammars (DSG) are rewriting-rule grammars that generate sentences together with their dependency structures; they are more expressive than CF-grammars and non-equivalent to mildly context-sensitive grammars.
Abstract: In this paper, we define Dependency Structure Grammars (DSG), rewriting-rule grammars that generate sentences together with their dependency structures. DSG are more expressive than CF-grammars and non-equivalent to mildly context-sensitive grammars.
We show that DSG are weakly equivalent to Categorial Dependency Grammars (CDG), recently introduced in [6,3]. In particular, these dependency grammars naturally express long-distance dependencies and enjoy good mathematical properties.
15 citations