Showing papers on "Context-sensitive grammar published in 1988"


Book
01 Jan 1988
TL;DR: A tutorial introduction to the algebraic approach of graph grammars can be found in this book, which also covers hyperedge replacement, parallel map generating systems, and many other graph-grammar topics.
Abstract: Contents:
- Tutorial introduction to the algebraic approach of graph grammars
- May we introduce to you: Hyperedge replacement
- An introduction to parallel map generating systems
- Set theoretic approaches to graph grammars
- An introduction to the NLC way of rewriting graphs
- Array grammars
- Graph grammar based specification of interconnection structures for massively parallel computation
- Towards distributed graph grammars
- On partially ordered graph grammars
- A representation of graphs by algebraic expressions and its use for graph rewriting systems
- On context-free sets of graphs and their monadic second-order theory
- Restricting the complexity of regular DNLC languages
- Apex graph grammars
- Graph grammar engineering: A software specification method
- A linguistic formalism for engineering solid modeling
- Graph grammars and diagram editing
- Graphics and their grammars
- On network algebras and recursive equations
- Ada-concurrency specified by graph grammars
- Basic notions of actor grammars
- Embedding rule independent theory of graph grammars
- Supporting the software development process with attributed NLC graph grammars
- Practical applications of precedence graph grammars
- Is parallelism already concurrency? Part 1: Derivations in graph grammars
- Is parallelism already concurrency? Part 2: Non-sequential processes in graph grammars
- Map OL-systems with edge label control: Comparison of marker and cyclic systems
- From 0L and 1L map systems to indeterminate and determinate growth in plant morphogenesis
- Fundamentals of edge-label controlled graph grammars
- Parallelism analysis in rule-based systems using graph grammars
- An efficient algorithm for the solution of hierarchical networks of constraints
- A software development environment based on graph technology
- Map 0L systems with markers
- Graph rewriting with unification and composition
- Complexity of pattern generation via planar parallel binary fission/fusion grammars
- Applications of L-systems to computer imagery
- Advances in array languages
- Rosenfeld's cycle grammars and kolam
- Application of graph grammars in music composing systems
- Boundary NLC and partition controlled graph grammars

401 citations


01 Jan 1988
TL;DR: A class of formalisms called Linear Context-Free Rewriting Systems (LCFRS's) is characterized, which includes a wide range of grammatical formalisms with restricted power, and it is proved that all members of this family generate only semilinear languages that can be recognized in polynomial time.
Abstract: This thesis involves the study of formal properties of grammatical formalisms that are relevant to computational linguists. The formalisms which will receive the most attention share the property that they are highly restricted in their generative power. Recent research suggests that Context-Free Grammars (CFG's) lack the necessary expressive power on which to base a linguistic theory. This has led computational linguists to consider grammatical formalisms whose generative power exceeds CFG's, but only to a limited extent. We compare a number of formalisms on the basis of their weak generative capacity, as well as suggesting ways in which they can be compared on the basis of their strong generative capacity. In particular, we consider properties of their structural descriptions (or tree sets), and the types of dependencies (nested, crossed, etc.) that can be exhibited by each formalism. Several formalisms that are notationally quite different (Tree Adjoining Grammars, Head Grammars, and Linear Indexed Grammars) have been shown to be weakly equivalent. We show that Combinatory Categorial Grammars are weakly equivalent to these formalisms. The class of languages generated by these formalisms can be thought of as one step up from CFG's, and we describe a number of progressions that illustrate this. The string languages generated by TAL's, HL's, CCL's and LIL's exhibit limited crossed-serial dependencies in addition to those produced by Context-Free Grammars (nested and serial dependencies). By formalizing these crossed-serial dependencies and their relationship with the nested dependencies produced by CFG's we define an infinite progression of formalisms. Our work on structural descriptions leads us to characterize a class of formalisms called Linear Context-Free Rewriting Systems (LCFRS's), which includes a wide range of grammatical formalisms with restricted power. The systems in this class have context-free derivations, and simple composition operations that are linear and nonerasing. We prove that all members of this family generate only semilinear languages that can be recognized in polynomial time.
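To make the "one step up from CFG's" claim concrete: a language such as {a^n b^n c^n d^n}, which exhibits the crossed-serial dependencies mentioned above and is not context-free, is generated by TAGs and LCFRS's and remains semilinear and recognizable in polynomial time. The recognizer below is only an illustrative sketch of that particular language, not code from the thesis:

```python
def is_anbncndn(s: str) -> bool:
    """Recognize {a^n b^n c^n d^n : n >= 0}: not context-free, but within
    the mildly context-sensitive class generated by TAGs / LCFRS's; like
    every language in this class it is semilinear and easy to recognize."""
    blocks = []                      # maximal runs of equal letters
    for ch in s:
        if blocks and blocks[-1][0] == ch:
            blocks[-1][1] += 1
        else:
            blocks.append([ch, 1])
    if not blocks:
        return True                  # the empty string (n = 0)
    return ([b[0] for b in blocks] == ["a", "b", "c", "d"]
            and len({b[1] for b in blocks}) == 1)

assert is_anbncndn("aabbccdd") and not is_anbncndn("aabbcdd")
```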

314 citations


BookDOI
TL;DR: Topics covered include categorial grammars as theories of language, the Lambek calculus, the generative power of categorial grammars, semantic categories and the development of categorial grammars, and categorial grammar and phrase structure grammar as an excursion on the syntax-semantics frontier, as discussed by the authors.
Abstract: Contents:
- Categorial Grammars as Theories of Language
- The Lambek Calculus
- Generative Power of Categorial Grammars
- Semantic Categories and the Development of Categorial Grammars
- Aspects of a Categorial Theory of Binding
- Type Raising, Functional Composition, and Non-Constituent Conjunction
- Implications of Process-Morphology for Categorial Grammar
- Phrasal Verbs and the Categories of Postponement
- Natural Language Motivations for Extending Categorial Grammar
- Categorial and Categorical Grammars
- Mixed Composition and Discontinuous Dependencies
- Multi-Dimensional Compositional Functions as a Basis for Grammatical Analysis
- Categorial Grammar and Phrase Structure Grammar: An Excursion on the Syntax-Semantics Frontier
- Combinators and Grammars
- A Typology of Functors and Categories
- Consequences of Some Categorially-Motivated Phonological Assumptions
- Index of Names
- Index of Subjects
- Index of Categories and Functors

264 citations


Book
01 Jan 1988
TL;DR: Reference entry for the book Attribute Grammars: Definitions, Systems and Bibliography.

181 citations


Proceedings ArticleDOI
22 Aug 1988
TL;DR: Two algorithms are described which construct two different types of generators for lexical functional grammars (LFGs): the first generates sentences from functional structures and the second from semantic structures.
Abstract: This paper describes two algorithms which construct two different types of generators for lexical functional grammars (LFGs). The first type generates sentences from functional structures and the second from semantic structures. The latter works on the basis of extended LFGs, which contain a mapping from f-structures into semantic structures. Both algorithms can be used on all grammars within the respective class of LFG-grammars. Thus sentences can be generated from input structures by means of LFG-grammars, and the same grammar formalism, although not necessarily the same grammar, can be used for both analysis and synthesis.

57 citations


Book ChapterDOI
01 Jan 1988
TL;DR: The author surveys results on the strong and weak generative capacity of various kinds of categorial grammars.
Abstract: This paper surveys the author’s results in strong and weak generative capacity of various kinds of categorial grammars. The growing interest of linguists, computer scientists, and linguistically-minded logicians in the domain of categorial grammar calls for a new, more advanced and profound elaboration of its internal mathematics, in which the problems to be discussed here have traditionally been recognized to play a quite fundamental role.

56 citations


Book ChapterDOI
11 Feb 1988
TL;DR: An abstract notion of context-free grammar is introduced that deals with abstract objects that can be words, trees, graphs or other combinatorial objects and is applied to NLC graph grammars introduced by Rozenberg and Janssens.
Abstract: An abstract notion of context-free grammar is introduced. It deals with abstract objects that can be words, trees, graphs or other combinatorial objects. It is applied to NLC graph grammars introduced by Rozenberg and Janssens. The monadic second-order theory of a context-free NLC set of graphs is decidable.

55 citations


Proceedings ArticleDOI
07 Jun 1988
TL;DR: The structural descriptions produced by Combinatory Categorial Grammars are discussed and compared to those of grammar formalisms in the class of Linear Context-Free Rewriting Systems.
Abstract: Recent results have established that there is a family of languages that is exactly the class of languages generated by three independently developed grammar formalisms: Tree Adjoining Grammars, Head Grammars, and Linear Indexed Grammars. In this paper we show that Combinatory Categorial Grammars also generate the same class of languages. We discuss the structural descriptions produced by Combinatory Categorial Grammars and compare them to those of grammar formalisms in the class of Linear Context-Free Rewriting Systems. We also discuss certain extensions of Combinatory Categorial Grammars and their effect on the weak generative capacity.
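As background for readers unfamiliar with the combinators involved, here is a toy sketch (my own illustration, not code from the paper) of forward application and forward composition over slash categories; composition is the operation that lets CCG build "non-constituent" fragments and crossed dependencies beyond plain categorial grammar:

```python
from typing import Optional, Tuple, Union

# A category is an atomic string ("S", "NP") or a triple
# (slash, result, argument): "/" seeks its argument to the right,
# "\\" seeks it to the left.
Cat = Union[str, Tuple[str, "Cat", "Cat"]]

def forward_apply(x: Cat, y: Cat) -> Optional[Cat]:
    """X/Y  Y  =>  X   (function application)."""
    if isinstance(x, tuple) and x[0] == "/" and x[2] == y:
        return x[1]
    return None

def forward_compose(x: Cat, y: Cat) -> Optional[Cat]:
    """X/Y  Y/Z  =>  X/Z   (function composition)."""
    if (isinstance(x, tuple) and x[0] == "/"
            and isinstance(y, tuple) and y[0] == "/" and x[2] == y[1]):
        return ("/", x[1], y[2])
    return None

VP = ("\\", "S", "NP")        # a verb phrase has category S\NP
might = ("/", VP, VP)          # (S\NP)/(S\NP)
eat = ("/", VP, "NP")          # (S\NP)/NP
# "might eat" composes into one category still seeking its object NP:
assert forward_compose(might, eat) == ("/", VP, "NP")
assert forward_apply(eat, "NP") == VP
```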

53 citations


Proceedings ArticleDOI
22 Aug 1988
TL;DR: This work presents the formal basis of UCG, with independent definitions of well-formedness for syntactic and semantic dimensions, and focuses on the concept of modifier within the theory.
Abstract: Unification Categorial Grammar (UCG) combines the syntactic insights of Categorial Grammar with the semantic insights of Discourse Representation Theory. The addition of unification to these two frameworks allows a simple account of interaction between different linguistic levels within a constraining, monostratal theory. The resulting, computationally efficient, system provides an explicit formal framework for linguistic description, within which large fragments of grammars for French and English have already been developed. We present the formal basis of UCG, with independent definitions of well-formedness for syntactic and semantic dimensions. We will also focus on the concept of modifier within the theory.

47 citations



Book ChapterDOI
29 Aug 1988
TL;DR: Each phrase-structure grammar can be replaced by an equivalent grammar in which all rules are context-free, of the form S→v with S the initial symbol, apart from at most two extra erasing rules.
Abstract: Some new normal forms for the phrase-structure grammars are presented. Each phrase-structure grammar can be replaced by an equivalent grammar with all of the rules context-free, of the form S→v, where S is the initial symbol, and either two extra rules AB→ ɛ, CD→ ɛ, or two extra rules AB→ ɛ, CC→ ɛ, or two extra rules AA→ ɛ, BBB→ ɛ, or even a single extra rule ABBBA→ ɛ, or a single extra rule ABC→ ɛ.

Book ChapterDOI
01 Jan 1988
TL;DR: This paper will take as a general framework the program and set of assumptions that have been called ‘extended Montague grammar’ and in particular a slightly modified version of Montague’s ‘Universal Grammar’ (UG: Paper 7 in Montague, 1974).
Abstract: In recent years, there has been a growing interest in categorial grammar as a framework for formulating empirical theories about natural language. This conference bears witness to that revival of interest. How well does this framework fare when used in this way? And how well do particular theories in what we might call the family of categorial theories fare when they are put up against the test of natural language description and explanation? I say ‘family’ of theories, for there have been a number of different developments, all of which take off from the fundamental idea of a categorial grammar as it was first introduced by Ajdukiewicz and later modified and studied by Bar-Hillel, Curry, and Lambek. In this paper I would like to discuss these questions, considering a number of different hypotheses that have been put forward within the broad framework that we may call ‘extended categorial grammar’ and making a few comparisons with other theories. In my remarks, I will take as a general framework the program and set of assumptions that have been called ‘extended Montague grammar’ and in particular a slightly modified version of Montague’s ‘Universal Grammar’ (UG: Paper 7 in Montague, 1974). From this point of view, the syntax of a language is looked at as a kind of algebra. Then, the empirical problem of categorial grammar can be seen as part of a general program that tries to answer these questions: (A) What is the set of primitive and derived categories that we need to describe and explain natural languages in their syntax and semantics (and phonology, etc.)? (B) What are the operations that we need to describe and explain natural languages (in the syntax, semantics, phonology, morphology, etc.)? (C) What are the relations that we need in order to hook up with each other the various categories and operations mentioned or alluded to in (A) and (B)?

Journal ArticleDOI
TL;DR: The notions introduced in the paper are useful for research on less restricted edNLC graph grammars, for example grammars analogous to context-free string grammars.

Journal ArticleDOI
TL;DR: A constructive method for the inference of even linear grammars from positive samples is proposed, and it is shown that the method can be used in a hierarchical manner to infer grammars for more complex pictures.
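For context, an even linear grammar restricts every rule to the form A → u B v with |u| = |v|, or a terminating rule A → w; this restriction is what makes inference from positive samples tractable. The sketch below uses a hypothetical example grammar and is not the paper's inference method:

```python
# Even linear rules: A -> u B v with |u| == |v|, or a terminating rule A -> w.
# Hypothetical example grammar for {a^n b^n}: S -> a S b | "".
RULES = {"S": [("a", "S", "b"), ("",)]}

def derives(nonterminal: str, s: str) -> bool:
    """Does `nonterminal` derive the string `s` in the grammar above?"""
    for rule in RULES[nonterminal]:
        if len(rule) == 1:                                   # A -> w
            if s == rule[0]:
                return True
        else:                                                # A -> u B v
            u, b, v = rule
            if (len(s) >= len(u) + len(v) and s.startswith(u)
                    and s.endswith(v)
                    and derives(b, s[len(u):len(s) - len(v)])):
                return True
    return False

assert derives("S", "aaabbb") and not derives("S", "aab")
```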

Proceedings ArticleDOI
22 Aug 1988
TL;DR: A parser is constructed by compiling systemic grammars into the notation of Functional Unification Grammar; testing with a large grammar of English provides the basis for some observations about the bidirectional use of a grammar.
Abstract: We describe a general parsing method for systemic grammars. Systemic grammars contain a paradigmatic analysis of language in addition to structural information, so a parser must assign a set of grammatical features and functions to each constituent in addition to producing a constituent structure. Our method constructs a parser by compiling systemic grammars into the notation of Functional Unification Grammar. The existing methods for parsing with unification grammars have been extended to handle a fuller range of paradigmatic descriptions. In particular, the PATR-II system has been extended by using disjunctive and conditional information in functional descriptions that are attached to phrase structure rules. The method has been tested with a large grammar of English which was originally developed for text generation. This testing is the basis for some observations about the bidirectional use of a grammar.
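The parsing method rests on unification of feature structures. The following is a minimal sketch of that core operation under my own simplified encoding (nested dicts); the disjunctive and conditional descriptions that the paper's extension of PATR-II handles are not modelled here:

```python
def unify(f, g):
    """Unify two feature structures represented as nested dicts;
    atomic values must match exactly. Returns the merged structure
    or None on failure."""
    if not isinstance(f, dict) or not isinstance(g, dict):
        return f if f == g else None
    out = dict(f)
    for key, g_value in g.items():
        if key in out:
            merged = unify(out[key], g_value)
            if merged is None:
                return None          # clash, e.g. num: sg vs num: pl
            out[key] = merged
        else:
            out[key] = g_value
    return out

np = {"cat": "NP", "agr": {"num": "sg"}}
vp = {"agr": {"num": "sg", "per": "3"}}
assert unify(np, vp) == {"cat": "NP", "agr": {"num": "sg", "per": "3"}}
assert unify(np, {"agr": {"num": "pl"}}) is None
```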

Proceedings ArticleDOI
01 Dec 1988
TL;DR: An efficient algorithm is presented for learning context-free grammars using two types of queries, structural equivalence queries and structural membership queries, and it is shown that a grammar learned by the algorithm is not only equivalent to the unknown grammar but also structurally equivalent to it.
Abstract: We consider the problem of learning a context-free grammar from its structural descriptions. Structural descriptions of a context-free grammar are unlabelled derivation trees of the grammar. We present an efficient algorithm for learning context-free grammars using two types of queries: structural equivalence queries and structural membership queries. The learning protocol is based on what is called a “minimally adequate teacher”, and it is shown that a grammar learned by the algorithm is not only a correct grammar, i.e. equivalent to the unknown grammar, but also structurally equivalent to it. Furthermore, the algorithm runs in time polynomial in the number of states of the minimum frontier-to-root tree automaton for the set of structural descriptions of the unknown grammar and the maximum size of any counter-example returned by a structural equivalence query.
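A structural description is an unlabelled derivation tree: the internal node labels are erased but the bracketing and the leaf word remain. The sketch below, using a hypothetical grammar and not the learning algorithm itself, shows what answering a structural membership query amounts to when the grammar is known:

```python
# A hypothetical context-free grammar; right-hand sides are tuples of symbols.
PRODS = {
    "S":  [("NP", "VP")],
    "NP": [("she",), ("NP", "PP")],
    "VP": [("runs",), ("VP", "PP")],
    "PP": [("with", "NP")],
}

def possible_labels(tree):
    """Symbols that could label this node of an unlabelled derivation tree:
    the word itself for a leaf, otherwise any nonterminal whose production
    matches the children's possible labels."""
    if isinstance(tree, str):
        return {tree}
    child_opts = [possible_labels(child) for child in tree]
    labels = set()
    for lhs, rhss in PRODS.items():
        for rhs in rhss:
            if len(rhs) == len(child_opts) and all(
                    sym in opts for sym, opts in zip(rhs, child_opts)):
                labels.add(lhs)
    return labels

def structural_member(tree) -> bool:
    """A 'structural membership query' answered against the known grammar."""
    return "S" in possible_labels(tree)

# The skeleton of "she runs with she": ((she) ((runs) (with (she))))
assert structural_member((("she",), (("runs",), ("with", ("she",)))))
```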

Book
Daniel M. Yellin1
15 Apr 1988
TL;DR: Grammar-based translation methodologies built on RIFs, including the inversion and generalization of RIF grammars and the INVERT system, are used to translate between programming languages.
Abstract: Grammar based translation methodologies and RIFs- The inversion of RIF grammars- Generalizing RIFs- The INVERT system- Translating between programming languages- Conclusions and future directions

Journal ArticleDOI
01 Nov 1988
TL;DR: A grammar that formally constructs the types of diagrams used in system dynamics is fully described and a method to achieve a formal program for interactive modeling upon any diagram, starting from the formal specification of the diagram, is presented.
Abstract: A grammar that formally constructs the types of diagrams used in system dynamics is fully described. It belongs to the special kinds of grammars (attributed programmed graph grammars) that are applied to the construction of graphs and geometric figures. The flow diagrams used in system dynamics have been defined by 'attributed graphs' so that the approach could be applied. The grammar manipulates the graphs according to the requirements of the methodology. Basically, a method to achieve a formal program for interactive modeling upon any diagram, starting from the formal specification of the diagram, is presented.


Proceedings ArticleDOI
10 Oct 1988
TL;DR: A model for the specification of icon systems is proposed, and a general-purpose icon interpreter based on attribute grammars is presented; the underlying picture grammar expresses the syntactic aspects of icon systems, and the meaning of an icon sentence is expressed in terms of conceptual tree graphs.
Abstract: A model for the specification of icon systems is proposed, and a general-purpose icon interpreter is presented, based on attribute grammars. In the model, the underlying context-free grammar is a picture grammar that expresses the syntactic aspects of the icon systems. An attribute evaluator computes the meaning of a given icon sentence by evaluating the designated s-attribute of the nonterminal on the root of the parse tree. As the semantic rules are actually domain-independent rule schemata, a domain-specific knowledge base is consulted during attribute evaluation. The meaning of the iconic sentence is expressed in terms of conceptual tree graphs, which are well suited for later execution. The design of the model is based on the theory of generalized icons. The system diagram of the icon interpreter is presented. The icon dictionary, the domain-specific knowledge base, and the attribute grammar are described.
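The attribute-evaluation step works as in any s-attributed grammar: each production carries a semantic rule that computes a synthesized attribute of the parent from its children's attributes, and the designated attribute at the root is the meaning. A generic sketch with a hypothetical arithmetic grammar rather than the paper's icon grammar:

```python
# Each parse-tree node records the production that built it plus its children;
# leaves are raw token strings. Semantic rules compute the synthesized
# attribute of a node from its children's attributes (an s-attributed grammar).
SEMANTIC_RULES = {
    "Expr -> Expr + Term": lambda e, plus, t: e + t,
    "Expr -> Term":        lambda t: t,
    "Term -> num":         lambda token: int(token),
}

def evaluate(node):
    """Bottom-up evaluation; the designated attribute at the root is the meaning."""
    if isinstance(node, str):
        return node                      # a token's attribute is its lexeme
    production, children = node
    return SEMANTIC_RULES[production](*[evaluate(c) for c in children])

tree = ("Expr -> Expr + Term",
        [("Expr -> Term", [("Term -> num", ["2"])]),
         "+",
         ("Term -> num", ["3"])])
assert evaluate(tree) == 5
```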

15 Sep 1988
TL;DR: P-PATR as mentioned in this paper is a compiler for unification-based grammars that is written in Quintus Prolog running on a Sun 2 workstation and is based on the PATR-II formalism developed at SRI International.
Abstract: P-PATR is a compiler for unification-based grammars that is written in Quintus Prolog running on a Sun 2 workstation. P-PATR is based on the PATR-II formalism [14] developed at SRI International. PATR is a simple, unification-based formalism capable of encoding a wide variety of grammars. As a result of this versatility, several parsing systems and development environments based on this formalism have been implemented [18,5]. P-PATR is one such system, designed in response to the slow parse times of most of the other PATR implementations. Most of the currently running PATR systems operate by interpreting a PATR grammar. P-PATR differs from these systems by compiling the grammar into a Prolog definite clause grammar (DCG) [8]. The compilation is done only once for a given grammar; the resulting DCG contains all the information in the original PATR grammar in a form readily conducive to parsing. The advantage of compilation is that less work needs to be done during parsing, as some of the necessary computations have already been performed in the compilation phase.


Journal ArticleDOI
TL;DR: A new proof of this theorem is given which relies on the algebra of phrase structures and exhibits a possibility to justify the key construction used in Gaifman's proof by means of the Lambek calculus of syntactic types.
Abstract: The equivalence of (classical) categorial grammars and context-free grammars, proved by Gaifman [4], is a very basic result of the theory of formal grammars (an essentially equivalent result is known as the Greibach normal form theorem [1], [14]). We analyse the contents of Gaifman's theorem within the framework of structure and type transformations. We give a new proof of this theorem which relies on the algebra of phrase structures and exhibit a possibility to justify the key construction used in Gaifman's proof by means of the Lambek calculus of syntactic types [15].
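For orientation, in a classical (AB) categorial grammar each word is assigned a type, and a string is grammatical when its type sequence cancels to the distinguished type S under the rules a/b, b ⇒ a and b, a\b ⇒ a. Below is a CYK-style sketch over a tiny hypothetical lexicon, an illustration of the classical setting rather than of the new proof:

```python
# Types: atoms are strings; ("/", a, b) is a/b, ("\\", a, b) is a\b.
LEXICON = {
    "the": ("/", "NP", "N"),
    "cat": "N",
    "sleeps": ("\\", "S", "NP"),
}

def cancel(left, right):
    """The two classical cancellation rules of an AB-grammar."""
    out = set()
    if isinstance(left, tuple) and left[0] == "/" and left[2] == right:
        out.add(left[1])                  # a/b , b  =>  a
    if isinstance(right, tuple) and right[0] == "\\" and right[2] == left:
        out.add(right[1])                 # b , a\b  =>  a
    return out

def derivable(types):
    """All types derivable from a sequence of types (CYK-style search)."""
    if len(types) == 1:
        return {types[0]}
    found = set()
    for i in range(1, len(types)):
        for left_type in derivable(types[:i]):
            for right_type in derivable(types[i:]):
                found |= cancel(left_type, right_type)
    return found

sentence = tuple(LEXICON[w] for w in "the cat sleeps".split())
assert "S" in derivable(sentence)
```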


Journal ArticleDOI
TL;DR: A software tool called attribute grammar based theorem prover (AGBTP) is proposed, which can be used both as a processor of attribute grammars and as a theorem prover, and can combine procedural and declarative characteristics using a very high level language, i.e. the attribute grammars' language, together with user-defined semantic functions in the host language.
Abstract: In this paper a software tool called attribute grammar based theorem prover (AGBTP) is proposed, which can be used both as a processor of attribute grammars and as a theorem prover. Hence, attribute grammars' applications from the area of software engineering as well as theorem proving applications from the area of knowledge engineering can be faced using the same tool. The main advantages of the proposed tool are that it can combine procedural and declarative characteristics using a very high level language i.e. the attribute grammars' language and user defined semantic functions in the host language. Second, full theorem proving capabilities are obtained through an extended parser, which implements the model elimination procedure.

Proceedings ArticleDOI
T. Rus1, J.P. Le Peau1
01 Jan 1988
TL;DR: Two classes of algorithms for language parsing based on multi-axiom grammars are developed: an algorithm obtained by generalizing context-free LR-parsers to multi-axiom grammars, and a pattern-matching algorithm that results from the ability to layer a multi-axiom language into levels such that each sublanguage is independent of the language that contains it.
Abstract: Multi-axiom grammars and languages, presented as generalizations of context-free grammars and languages, are defined and used as a mechanism for programming language specification and implementation. It is shown how to divide such a grammar into a sequence of subgrammars that generate inductively the language specified by the original grammar. Furthermore, it is shown how to use this sequence of subgrammars for inductive language recognition by a process of tokenizing. Two classes of algorithms for language parsing based on multi-axiom grammars are developed: an algorithm obtained by generalizing context-free LR-parsers to multi-axiom grammars, and a pattern-matching algorithm that results from the ability to layer a multi-axiom language into levels such that each sublanguage is independent of the language that contains it. The implications of multi-axiom grammars for compiler code generation are briefly discussed.

01 Jan 1988
TL;DR: This article investigates context-free grammars whose rules can be used in a productive and in a reductive fashion, while the application of these rules is controlled by a regular language.
Abstract: We investigate context-free grammars the rules of which can be used in a productive and in a reductive fashion, while the application of these rules is controlled by a regular language. We distinguish several modes of derivation for this kind of grammar. The resulting language families (properly) extend the family of context-free languages. We establish some closure properties of these language families and some grammatical transformations which yield a few normal forms for this type of grammar. Finally, we consider some special cases (viz. the context-free grammar is linear or left-linear), and generalizations, in particular, the use of arbitrary rather than regular control languages.
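To make the control mechanism concrete, here is a toy sketch under my own encoding (single-character symbols, leftmost rewriting), not the paper's formalism: each rule is labelled, may be applied productively or reductively, and a derivation is accepted only if its label word lies in the regular control language.

```python
import re

# Labelled context-free rules over single-character symbols.
RULES = {"p1": ("S", "aSb"), "p2": ("S", "c")}
# Regular control language over rule labels: p1* p2.
CONTROL = re.compile(r"(p1)*p2")

def step(form: str, label: str, productive: bool = True) -> str:
    """One rewriting step on the leftmost occurrence; a reductive step
    applies the rule from right to left."""
    lhs, rhs = RULES[label]
    old, new = (lhs, rhs) if productive else (rhs, lhs)
    assert old in form, f"rule {label} is not applicable to {form}"
    return form.replace(old, new, 1)

def controlled_derivation(start: str, steps) -> str | None:
    """Apply the labelled steps; accept only if the label word is in CONTROL."""
    form = start
    for label, productive in steps:
        form = step(form, label, productive)
    label_word = "".join(label for label, _ in steps)
    return form if CONTROL.fullmatch(label_word) else None

# Productive derivation S => aSb => aaSbb => aacbb (control word p1 p1 p2):
assert controlled_derivation("S", [("p1", True), ("p1", True), ("p2", True)]) == "aacbb"
# A reductive step applies p2 backwards, rewriting c to S (control word p2):
assert controlled_derivation("aacbb", [("p2", False)]) == "aaSbb"
```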

Proceedings ArticleDOI
22 Aug 1988
TL;DR: A running system, named SAIL, for the development of natural language grammars is described; its parser treats grammar rules as processes that can be activated or deactivated and that can exchange information, structured as messages, among rules for long-distance analysis.
Abstract: A running system, named SAIL, for the development of Natural Language Grammars is described. Stress is put on the particular grammar rule model adopted, named Complex Grammar Units, and on the parsing algorithm that runs rules written according to this model. Moreover, the parser acts like a processor and sees grammar rules as processes which can be activated or deactivated, and can handle the exchange of information, structured as messages, among rules for long-distance analysis. A brief description of SIS, the framework of SAIL with which a user can interact, is also given. Finally, an example shows that different grammar formalisms can be implemented within the frame of SAIL.


Journal ArticleDOI
TL;DR: Some relative decision problems concerning LR(k), LL-regular, and LR-regular grammars and languages are shown to be undecidable, and iteration theorems are derived which allow the proof that certain languages are not LL-regular or LR-regular.