scispace - formally typeset
Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over the lifetime, 1,938 publications have been published within this topic, receiving 45,911 citations. The topic is also known as: CSG.
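As an illustration of the topic (not taken from this page), the language {aⁿbⁿcⁿ : n ≥ 1} is the textbook example of a context-sensitive language. The sketch below uses the standard noncontracting grammar for it (the rule CB → BC is length-preserving rather than strictly of the form αAβ → αγβ, but noncontracting grammars generate exactly the context-sensitive languages) and enumerates derivable terminal strings by brute-force search; the rule set and length bound are illustrative choices.

```python
from collections import deque

# Classic noncontracting grammar for { a^n b^n c^n : n >= 1 }.
# Uppercase letters are nonterminals, lowercase letters are terminals.
RULES = [
    ("S", "aSBC"), ("S", "aBC"),   # build up a^n (BC)^n
    ("CB", "BC"),                  # sort B's in front of C's
    ("aB", "ab"), ("bB", "bb"),    # rewrite B's to b's left to right
    ("bC", "bc"), ("cC", "cc"),    # rewrite C's to c's left to right
]

def generate(rules, start="S", max_len=9):
    """Breadth-first search over sentential forms, collecting every
    terminal string derivable within the given length bound."""
    seen, terminal = {start}, set()
    queue = deque([start])
    while queue:
        form = queue.popleft()
        if form.islower():          # no nonterminals left
            terminal.add(form)
            continue
        for lhs, rhs in rules:
            i = form.find(lhs)
            while i != -1:
                new = form[:i] + rhs + form[i + len(lhs):]
                if len(new) <= max_len and new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(lhs, i + 1)
    return terminal

print(sorted(generate(RULES)))   # the strings a^n b^n c^n up to the bound
```

Because every rule is noncontracting, sentential forms never shrink, so the bounded search is exhaustive for strings up to `max_len`.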


Papers
Book Chapter
Gheorghe Paun
10 Jul 1995
TL;DR: This book investigates two major systems, cooperating distributed grammar systems and parallel communicating grammar systems. The investigation concerns hierarchies with respect to different variants of cooperation, relations with classical formal language theory, syntactic parameters such as the number of components and their size, the power of synchronization, and general notions drawn from artificial intelligence.
Abstract: From the Publisher: This book investigates two major systems: first, cooperating distributed grammar systems, where the grammars work on one common sentential form and cooperation is realized by controlling the sequence of active grammars; second, parallel communicating grammar systems, where each grammar works on its own sentential form and cooperation is achieved by communication between the grammars. The investigation concerns hierarchies with respect to different variants of cooperation, relations with classical formal language theory, syntactic parameters such as the number of components and their size, the power of synchronization, and general notions drawn from artificial intelligence.

395 citations

Journal Article
TL;DR: This paper studies certain sets, functions, and relations on trees via natural generalizations of ordinary automata theory, using Thatcher and Wright's algebraic formalism to give succinct descriptions of linguistic constructions in the tree case.
Abstract: Recent developments in the theory of automata have pointed to an extension of the domain of definition of automata from strings to trees. Here we study certain sets, functions, and relations on trees using natural generalizations of ordinary automata theory. Why pursue such a generalization? First, because enlarging the domain of automata theory may strengthen and simplify the subject in the same way that emphasizing strings rather than natural numbers already has done. Second, because parts of mathematical linguistics can be formalized easily in a tree-automaton setting. The theories of transformational grammars and of syntax-directed compilation are two examples. A two-dimensional automata theory seems better suited to handle concepts arising in these areas than does the conventional theory.

The algebraic properties of finite automata on trees have been extensively studied; see Brainerd [5], Doner [8], Mezei and Wright [12], Thatcher [15], Thatcher and Wright [17], and Arbib and Give'on [4]. The notion of recognizable set is central to these papers. A finite checking scheme (automaton) is used on an input tree. The scheme analyzes a tree from the bottom (leaves) up to the top (root), classifying the tree as acceptable or not. The recognizable set associated with the automaton is the set of all acceptable trees.

Here we will define sets of trees produced by finite-state generative schemes. In this respect, making automata work from the top down instead of the bottom up is convenient. Rabin [13] was the first to use this idea; his purpose was to define recognizable sets of infinite trees. We do not consider such trees here; our emphasis is on generation, but the top-down concept is important for all our definitions. We use Thatcher and Wright's algebraic formalism to give succinct descriptions of linguistic constructions in the tree case. Using these constructions, we investigate decision problems and closure properties. Our results should clarify

369 citations
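The bottom-up recognition scheme the abstract describes can be sketched in a few lines; the example below is illustrative (the alphabet, state names, and transition table are assumptions, not from the paper). A deterministic bottom-up tree automaton folds a tree from the leaves to the root through a finite transition table, and the tree belongs to the recognizable set exactly when the root state is final.

```python
# Deterministic bottom-up tree automaton over boolean-expression trees.
# Trees are nested tuples ("and", left, right) / ("or", left, right),
# or the leaf symbols "0" and "1". States track the value of a subtree.

DELTA = {
    ("0", ()): "q0", ("1", ()): "q1",
    ("and", ("q0", "q0")): "q0", ("and", ("q0", "q1")): "q0",
    ("and", ("q1", "q0")): "q0", ("and", ("q1", "q1")): "q1",
    ("or",  ("q0", "q0")): "q0", ("or",  ("q0", "q1")): "q1",
    ("or",  ("q1", "q0")): "q1", ("or",  ("q1", "q1")): "q1",
}
FINAL = {"q1"}   # recognizable set: trees that evaluate to true

def run(tree):
    """Fold the tree bottom-up through the transition table."""
    if isinstance(tree, str):                         # leaf
        return DELTA[(tree, ())]
    sym, *children = tree
    return DELTA[(sym, tuple(run(c) for c in children))]

def accepts(tree):
    return run(tree) in FINAL

print(accepts(("and", ("or", "0", "1"), "1")))   # True: (0 or 1) and 1
```

The top-down variant the paper emphasizes runs the same table in reverse: starting from a root state, it guesses child states consistent with each transition, which is what makes it natural for generation.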

Journal Article
TL;DR: This paper introduces handle-rewriting hypergraph grammars (HH grammars), based on the replacement of handles, i.e., subhypergraphs consisting of one hyperedge together with its incident vertices; the languages they generate can be characterized as the least solutions of certain systems of equations.

360 citations

Proceedings Article
07 Jul 2007
TL;DR: This tutorial gives a brief introduction to Backus-Naur Form grammars and a background on the use of grammars with Genetic Programming, before describing the inner workings of Grammatical Evolution and some of the more commonly used extensions.
Abstract: Grammatical Evolution is an automatic programming system, a form of Genetic Programming that uses grammars to evolve structures. These structures can be in any form that can be specified using a grammar, including computer languages, graphs, and neural networks. When evolving computer languages, multiple types can be handled in a completely transparent manner. This tutorial gives a brief introduction to Backus-Naur Form grammars and a background on the use of grammars with Genetic Programming, before describing the inner workings of Grammatical Evolution and some of the more commonly used extensions.

344 citations
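The genotype-to-phenotype mapping at the core of Grammatical Evolution can be sketched as follows; the toy grammar, function names, and step limit below are illustrative assumptions, not taken from the tutorial. Each integer codon, taken modulo the number of productions available for the leftmost nonterminal, selects the next derivation step, and codons wrap around when the genome is exhausted.

```python
# Toy BNF grammar: keys are nonterminals, values are production lists.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["<var>"]],
    "<op>":   [["+"], ["*"]],
    "<var>":  [["x"], ["1"]],
}

def ge_map(genome, start="<expr>", max_steps=100):
    """Map a list of integer codons to a sentence of the grammar by
    repeatedly expanding the leftmost nonterminal."""
    form, codon = [start], 0
    while any(sym in GRAMMAR for sym in form):
        if codon >= max_steps:
            return None                       # mapping failed to terminate
        i = next(j for j, sym in enumerate(form) if sym in GRAMMAR)
        productions = GRAMMAR[form[i]]
        choice = genome[codon % len(genome)] % len(productions)
        form[i:i + 1] = productions[choice]
        codon += 1
    return "".join(form)

print(ge_map([0, 1, 0, 0, 1, 1]))   # "x+1"
print(ge_map([1, 1]))               # "1"
```

Because a recursive production can be chosen forever (e.g. a genome of all zeros here), practical systems bound the number of genome wraps and report such individuals as invalid, which the `max_steps` guard stands in for.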


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
Time complexity: 36K papers, 879.5K citations, 79% related
Concurrency: 13K papers, 347.1K citations, 78% related
Model checking: 16.9K papers, 451.6K citations, 77% related
Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1