
Showing papers on "Context-sensitive grammar published in 1983"




Journal Article
TL;DR: This paper describes a logic grammar formalism, modifier structure grammars (MSGs), together with an interpreter written in Prolog, which can handle coordination (and other natural language constructions) in a reasonable and general way.
Abstract: Logic grammars are grammars expressible in predicate logic. Implemented in the programming language Prolog, logic grammar systems have proved to be a good basis for natural language processing. One of the most difficult constructions for natural language grammars to treat is coordination (construction with conjunctions like 'and'). This paper describes a logic grammar formalism, modifier structure grammars (MSGs), together with an interpreter written in Prolog, which can handle coordination (and other natural language constructions) in a reasonable and general way. The system produces both syntactic analyses and logical forms, and problems of scoping for coordination and quantifiers are dealt with. The MSG formalism seems of interest in its own right (perhaps even outside natural language processing) because the notions of syntactic structure and semantic interpretation are more constrained than in many previous systems (made more implicit in the formalism itself), so that less burden is put on the grammar writer.

57 citations
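The MSG interpreter described above is written in Prolog; since this listing carries no code, the short Python sketch below only illustrates the general flavour of a grammar rule set that treats coordination ("X and Y") recursively. The toy lexicon, rule names, and returned structures are invented for illustration and are not the MSG formalism of the paper.

def parse_noun(tokens, i):
    # noun --> "cats" | "dogs" | "mice"   (toy lexicon, an assumption)
    if i < len(tokens) and tokens[i] in {"cats", "dogs", "mice"}:
        return ("noun", tokens[i]), i + 1
    return None

def parse_np(tokens, i):
    # np --> noun ("and" np)?   coordination is handled by a recursive rule
    parsed = parse_noun(tokens, i)
    if parsed is None:
        return None
    left, i = parsed
    if i < len(tokens) and tokens[i] == "and":
        rest = parse_np(tokens, i + 1)
        if rest is not None:
            right, i = rest
            return ("coord", "and", left, right), i
    return left, i

tree, _ = parse_np("cats and dogs and mice".split(), 0)
print(tree)  # ('coord', 'and', ('noun', 'cats'), ('coord', 'and', ('noun', 'dogs'), ('noun', 'mice')))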




Journal ArticleDOI
TL;DR: Greibach normal form grammars and position-restricted grammars are investigated from the point of view of the descriptional complexity of context-free languages.

25 citations


Journal ArticleDOI
TL;DR: This paper presents a new attribute evaluator applicable to any attribute grammar in canonical form, noncircular or circular, and simulates a modified depth-first traversal of the graph by reversing the arrows in the compound dependency graph associated with a pars e tree, but without actually constructing either of the graphs.
Abstract: Attribute grammars were proposed by Knuth [10] for specifying the semantics of languages defined by context-free grammars. Each grammar symbol has associated with it a set of attributes. The attributes are defined in terms of other attributes via semantic rules associated with the productions. The meaning of an input string is the value of the synthesized attribute instances of the start symbol in the parse tree for that input. This paper presents a new attribute evaluator [5] applicable to any attribute grammar in canonical form [6], noncircular or circular. The algorithm simulates a modified depth-first traversal [4] of the graph obtained by reversing the arrows in the compound dependency graph associated with a parse tree, but without actually constructing either of the graphs. The order in which the attribute instances are visited is determined dynamically for every parse tree. Only those attribute instances needed for static evaluation of a particular attribute instance are visited. An attribute instance is treated as a leaf if it has a value assigned to it. The evaluation algorithm is repeated for every synthesized attribute of the root and, each time, it is applied to a possibly smaller graph since newly evaluated attribute instances become leaves.

22 citations
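The evaluator sketched in this abstract visits only the attribute instances needed for the value being computed and treats already-evaluated instances as leaves. The short Python sketch below illustrates that demand-driven idea on an explicit dependency graph; the graph encoding and memoisation are assumptions for illustration, it handles only noncircular instances, and it is not the paper's algorithm.

def evaluate(attr, rules, deps, cache):
    # Evaluate one attribute instance, visiting only the instances it needs.
    # An instance already in the cache behaves like a leaf and is not revisited.
    if attr in cache:
        return cache[attr]
    args = [evaluate(d, rules, deps, cache) for d in deps.get(attr, [])]
    cache[attr] = rules[attr](*args)
    return cache[attr]

# A tiny noncircular instance: root.s depends on child.s, which depends on child.i.
deps = {"root.s": ["child.s"], "child.s": ["child.i"], "child.i": []}
rules = {
    "child.i": lambda: 2,
    "child.s": lambda i: i * 10,
    "root.s": lambda s: s + 1,
}
print(evaluate("root.s", rules, deps, {}))  # 21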


Journal ArticleDOI
TL;DR: This paper deals with the problem of computing relations from their abstract non-algorithmic specifications; a notion of the relation specified by a two-level grammar is introduced, and the computability of such relations is discussed.

18 citations


Journal ArticleDOI
TL;DR: For every triple (k, l, m) of nonnegative integers, every context-free grammar G can be transformed into a normal form where (1) each nonterminating production is of the type A → w_k B w_l C w_m with |w_k| = k, |w_l| = l, and |w_m| = m, and (2) each terminating production A → w has the property that |w| appears in the length set of L(G).
Abstract: For every triple (k, l, m) of nonnegative integers, every context-free grammar G can be transformed into a normal form where (1) each nonterminating production is of the type A → w_k B w_l C w_m with |w_k| = k, |w_l| = l, and |w_m| = m, and (2) each terminating production A → w has the property that |w| appears in the length set of L(G). Applications and generalizations of this result are discussed.

15 citations
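As a concrete reading of the normal form above: for (k, l, m) = (1, 0, 1) every nonterminating production looks like A → a B C b, i.e. exactly one terminal before the first nonterminal, none between the two nonterminals, and one after the second. The Python sketch below checks that shape for a grammar given as a list of productions; the grammar encoding is an assumption for illustration.

def in_klm_form(productions, nonterminals, k, l, m):
    # A production is "nonterminating" here if its right-hand side contains
    # nonterminals; it must then be w_k B w_l C w_m with the prescribed lengths.
    for lhs, rhs in productions:
        nt_positions = [i for i, s in enumerate(rhs) if s in nonterminals]
        if not nt_positions:
            continue  # terminating production; its length condition is not checked here
        if len(nt_positions) != 2:
            return False
        b, c = nt_positions
        wk, wl, wm = rhs[:b], rhs[b + 1:c], rhs[c + 1:]
        if (len(wk), len(wl), len(wm)) != (k, l, m):
            return False
    return True

# (k, l, m) = (1, 0, 1): S -> a S S b has the required shape, S -> a b is terminating.
productions = [("S", ["a", "S", "S", "b"]), ("S", ["a", "b"])]
print(in_klm_form(productions, {"S"}, 1, 0, 1))  # True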


Journal ArticleDOI
TL;DR: A formal model for defining transformations of languages of designs in terms of the grammars which generate them is described in detail and those transformations which preserve the recursive structure of the original grammar are identified.
Abstract: A formal model for defining transformations of languages of designs in terms of the grammars which generate them is described in detail. First, a normal form for grammars is presented which distinguishes two basic determinants of the compositional structure of designs in a language: spatial relations and the order in which they are employed to generate designs. These two constructive mechanisms are used to specify each rule in a normal form grammar. An internal formal property of a normal form grammar called its recursive structure is also characterized. Rules of transformation are then defined which map the rules of a given normal form grammar onto rules of new grammars by changing independently the two components of rules in the original grammar. The new grammars produced specify new languages of designs. Of particular interest are those transformations which preserve the recursive structure of the original grammar.

15 citations


Journal ArticleDOI
TL;DR: It is shown first that there is no recursive function bounding the succinctness gained by using parsable context-free grammars instead of parsers, and that there exists an infinite family of LL(2) grammars such that the size of every left or right parser for these grammars must be ⩾ 2^cm.

14 citations


Journal ArticleDOI
TL;DR: This is the first part of a two-part paper investigating how various language-theoretical properties of the selector family influence the closure properties of the class of languages generated by selective substitution grammars; basic techniques for manipulating selectors are established.
Abstract: Consider the class of languages generated by selective substitution grammars in which: 1) arbitrary productions of the form b → w are allowed, where b is a letter and w is a word, and 2) the selectors used are drawn from a given family of selector languages. This is the first part of a two-part paper investigating how various language-theoretical properties of that selector family influence the closure properties of the generated class. In this part, basic techniques for manipulating selectors of selective substitution grammars are established. We then investigate how properties of the selector family influence closure of the generated class under union and concatenation.

Proceedings Article
08 Aug 1983
TL;DR: The aim of this paper is to show how LFG can be translated into DCG and that the procedural semantics of PROLOG provides an efficient tool for LFG-implementations in that it allows the construction of function structures directly during the parsing process.
Abstract: Lexical functional grammar (LFG) is an attempt to solve problems that arise in transformational grammar and ATN formalisms (Bresnan, 1982). Another powerful formalism for describing natural languages follows from a method for expressing grammars in logic, due to Colmerauer (1970) and Kowalski (1974), called definite clause grammars (DCG) (Warren, Pereira, 1980). Both formalisms are natural extensions of context-free grammars (CFG). The aim of this paper is to show (1) how LFG can be translated into DCG, and (2) that the procedural semantics of PROLOG provides an efficient tool for LFG implementations in that it allows the construction of functional structures (f-structures) directly during the parsing process, i.e. it is not necessary to have a separate component which first derives a set of functional equations from the parse tree and then generates an f-structure by solving these equations.
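The central point is that f-structures can be built while parsing rather than by solving a separately derived set of functional equations afterwards. The Python fragment below sketches that idea for a single toy rule S → NP V with the annotations (↑ SUBJ) = ↓ on the NP and ↑ = ↓ on the V; the lexicon, the absence of real unification, and the dictionary representation of f-structures are simplifying assumptions, not the paper's translation scheme.

LEXICON = {
    "john":   ("NP", {"PRED": "John", "NUM": "sg"}),
    "sleeps": ("V",  {"PRED": "sleep<SUBJ>", "TENSE": "pres"}),
}

def parse_sentence(tokens):
    # S -> NP V: build the sentence f-structure during the parse itself.
    if len(tokens) != 2:
        return None
    cat1, f_np = LEXICON.get(tokens[0], (None, None))
    cat2, f_v = LEXICON.get(tokens[1], (None, None))
    if cat1 != "NP" or cat2 != "V":
        return None
    f_s = dict(f_v)        # up = down on the verb
    f_s["SUBJ"] = f_np     # (up SUBJ) = down on the noun phrase
    return f_s

print(parse_sentence("john sleeps".split()))
# {'PRED': 'sleep<SUBJ>', 'TENSE': 'pres', 'SUBJ': {'PRED': 'John', 'NUM': 'sg'}}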

Journal ArticleDOI
TL;DR: By requiring that each derivation tree has a computation sequence with a certain property, it is possible to give simple characterizations of well-known subclasses of attribute grammars.
Abstract: A computation sequence for a derivation tree specifies a way of walking through the tree evaluating all the attributes of all nodes. By requiring that each derivation tree has a computation sequence with a certain property, it is possible to give simple characterizations of well-known subclasses of attribute grammars. In particular, the absolutely noncircular attribute grammars are considered.


Journal ArticleDOI
TL;DR: An examination of a one‐pass grammar for the programming language Euclid shows that the present definition of one‐pass grammars is too general: the space behaviour of the produced compilers differs from that found in conventional hand‐written compilers.
Abstract: Automatic production of one-pass compilers from attribute grammars is considered. An examination of a one-pass grammar for the programming language Euclid shows that the present definition of one-pass grammars is too general: the space behaviour of the produced compilers differs from that found in conventional hand-written compilers. A new class of attribute grammars is defined. The class models naturally the use of space in a hand-written compiler. This implies that the compiler produced automatically on the basis of the grammar uses space in the same way as a practical hand-written recursive descent compiler. Furthermore, a graphical notation is introduced as a design tool for obtaining grammars in the proposed class.


Journal ArticleDOI
TL;DR: It is shown that context-sensitive control grammars with leftmost derivations are no more powerful than context-free ones, and, using this, resolve two open problems of Ginsburg and Spanier.
Abstract: We present a formal model for stratificational linguistics, and examine its properties such as generative power, complexity of recognition and descriptional complexity. By relating stratificational grammars to control grammars and Szilard languages, we obtain a table of language families generated by stratificational grammars under several restrictions of linguistic interest. In the process, we show that context-sensitive control grammars with leftmost derivations are no more powerful than context-free ones, and, using this, resolve two open problems of Ginsburg and Spanier. Throughout the paper, formal results are interpreted in terms of their significance for linguistic theory and practice.


Journal ArticleDOI
TL;DR: The basic idea comes from Korenjak and Hopcroft's branching algorithm for simple deterministic grammars, but the algorithm is distinguished in that it is entirely free from mixing the nonterminals of the respective grammars in question, and is therefore very simple.

Book ChapterDOI
18 Jul 1983
TL;DR: In this article, the translational mechanism of attribute grammars is investigated using tree automata, and a pushdown tree-to-string transducer with a certain synchronization facility is proposed as a model for realizing transformations by attribute grammars.
Abstract: The translational mechanism of attribute grammars is investigated using tree automata. A pushdown tree-to-string transducer with a certain synchronization facility is proposed as a model for realizing transformations by attribute grammars, and its basic properties are studied using tree-walking finite state automata. To demonstrate the utility of this model, it is shown that noncircular attribute grammars are as powerful as arbitrary attribute grammars, and a method is provided for showing that certain types of transformations are impossible for attribute grammars.

Proceedings ArticleDOI
C. M. R. Kintala
21 Mar 1983
TL;DR: This work specifies the formal syntax and semantics of two working database translators using attributed grammars, and opens the possibility of applying the emerging technology of semantics-directed compiler construction to build query language translator-generators and to prove those translators correct.
Abstract: Systems which translate queries written using a high-level conceptual model of a database into sequences of commands based on another model of the database are studied here. We take the view that these translators are similar to, albeit simpler than, the compilers for programming languages. Motivated by the recent interest in describing all the aspects of a compiler by an attributed grammar, we specify the formal syntax and semantics of two working database translators using attributed grammars. All the precise details about the parsing, the code optimization and the rules for preserving the query semantics are captured by those grammars. It is hoped that this approach brings the understanding of query languages closer to that of programming languages, and opens the possibility of applying the emerging technology of semantics-directed compiler construction to build query language translator-generators and to prove those translators correct.
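In the attributed-grammar view described above, each node of the query parse tree carries synthesized attributes (here, a command sequence) computed by semantic rules from its children's attributes. The Python sketch below shows that pattern for a toy query shape; the node kinds and target command names are invented for illustration and do not correspond to the two translators specified in the paper.

def translate(node):
    # Synthesize a "code" attribute bottom-up from the children's attributes.
    kind = node[0]
    if kind == "select":                # ("select", fields, table, condition)
        _, fields, table, cond = node
        code = ["OPEN " + table]
        code += translate(cond)         # synthesized attribute of the condition subtree
        code.append("PROJECT " + ",".join(fields))
        code.append("CLOSE " + table)
        return code
    if kind == "eq":                    # ("eq", field, value)
        _, field, value = node
        return ["FILTER {} = {}".format(field, value)]
    raise ValueError("unknown node kind: " + kind)

query = ("select", ["name"], "employees", ("eq", "dept", "'sales'"))
print("\n".join(translate(query)))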

Proceedings ArticleDOI
15 Jun 1983
TL;DR: Findings regarding the decidability, generative capacity, and recognition complexity of several syntactic theories are surveyed and the implications of these results with respect to linguistic theory are discussed.
Abstract: Meta-theoretical results on the decidability, generative capacity, and recognition complexity of several syntactic theories are surveyed. These include context-free grammars, transformational grammars, lexical functional grammars, generalized phrase structure grammars, and tree adjunct grammars.

Journal ArticleDOI
TL;DR: It is proved that for a probabilistic context-free language L(G), the population density of a character (terminal symbol) is equal to its relative density in the words of a sample S from L( G) whenever the production probabilities of the grammar G are estimated by the relative frequencies of the corresponding productions in the sample.
Abstract: It is proved that for a probabilistic context-free language L(G), the population density of a character (terminal symbol) is equal to its relative density in the words of a sample S from L(G) whenever the production probabilities of the grammar G are estimated by the relative frequencies of the corresponding productions in the sample.
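A small numerical check of the stated result, for the toy probabilistic grammar S → aS | b (the grammar and the sample are assumptions for illustration): derivations are unique, so production counts can be read directly off the sampled words, and the population density of 'a' under the estimated probabilities coincides exactly with its relative density in the sample.

sample = ["aab", "b", "aaab", "ab"]

uses_aS = sum(w.count("a") for w in sample)   # uses of S -> a S in the sample's derivations
uses_b = len(sample)                          # uses of S -> b (once per word)
p_hat = uses_aS / (uses_aS + uses_b)          # production probability estimated by relative frequency

# Relative density of the terminal 'a' in the sample:
sample_density = sum(w.count("a") for w in sample) / sum(len(w) for w in sample)

# Population density of 'a' under p_hat: expected number of a's divided by
# expected word length; for this grammar E[#a] = p/(1-p) and E[length] = 1/(1-p).
population_density = (p_hat / (1 - p_hat)) / (1 / (1 - p_hat))

print(p_hat, sample_density, population_density)  # all three equal 0.6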

Proceedings Article
08 Aug 1983
TL;DR: In this paper, a question-answering system based on graph grammars has been implemented, in which each word is associated with a syntactico-semantic constituent type represented by a transition-network-like graph whose transitions correspond to transformations in the derivation graph.
Abstract: String grammars have been found in many ways inadequate for parsing inflectional languages with "free" word order. To overcome these problems we have replaced linear string grammars and tree transformations by their multidimensional generalization, graph grammars. In our approach parsing is seen as a transformation between two graph languages, namely the sets of morphological and semantic representations of natural language sentences. An experimental Finnish question-answering system, SUVI, based on graph grammars has been implemented. In SUVI the role of individual words is active. Each word is associated with a syntactico-semantic constituent type that is represented by a transition-network-like graph whose transitions correspond to transformations in the derivation graph. Parsing is performed by interpreting the constituent type graphs corresponding to the words of the current sentence.

Book ChapterDOI
TL;DR: The problem of finding classes of context-free grammars which can be transformed into LL(k) or LR(0) form has been extensively studied.