
Showing papers on "Context-sensitive grammar published in 1994"


Journal ArticleDOI
TL;DR: The result presented in this paper is that all four of the formalisms under consideration generate exactly the same class of string languages.
Abstract: There is currently considerable interest among computational linguists in grammatical formalisms with highly restricted generative power. This paper concerns the relationship between the class of string languages generated by several such formalisms, namely, combinatory categorial grammars, head grammars, linear indexed grammars, and tree adjoining grammars. Each of these formalisms is known to generate a larger class of languages than context-free grammars. The four formalisms under consideration were developed independently and appear superficially to be quite different from one another. The result presented in this paper is that all four of the formalisms under consideration generate exactly the same class of string languages.
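The shared class (the so-called mildly context-sensitive languages) properly extends the context-free languages; for instance, all four formalisms generate the counting language {a^n b^n c^n d^n}, which no context-free grammar can. A minimal direct recognizer sketch (purely illustrative, not from the paper):

```python
def in_anbncndn(s: str) -> bool:
    """Membership in {a^n b^n c^n d^n : n >= 0}, a language generated
    by TAGs, LIGs, head grammars, and CCGs but by no context-free grammar."""
    n, r = divmod(len(s), 4)
    if r != 0:
        return False
    return s == "a" * n + "b" * n + "c" * n + "d" * n
```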

246 citations


Proceedings ArticleDOI
04 Oct 1994
TL;DR: A formal semantics for constraint multiset grammars is given, the theoretical complexity of parsing with these grammars is investigated, and an incremental parsing algorithm is given.
Abstract: Constraint multiset grammars provide a general, high-level framework for the definition of visual languages. They are a new formalism based on multiset rewriting. We give a formal semantics for constraint multiset grammars, investigate the theoretical complexity of parsing with these grammars and give an incremental parsing algorithm.

139 citations


Book
01 Jan 1994
TL;DR: This book presents the learnability theorem and a theorem of finite elasticity, develops the basic theory of rigid grammars, and studies the learning of rigid, k-valued, and least-valued categorial grammars from structures and from strings.
Abstract: 1. Introduction 2. Learnability theorem 3. A theorem of finite elasticity 4. Classical categorial grammar 5. Basic theory of rigid grammars 6. Learning from structures I: rigid, k-valued, and least-valued grammar 7. Learning from structures II: Subclasses of the optimal grammars 8. Learning from strings 9. Variations 10. Conclusions Appendix.

50 citations


Proceedings ArticleDOI
04 Oct 1994
TL;DR: A class of relation grammars is defined that satisfies the context-freeness property, an essential condition for solving the membership problem in polynomial time, and a predictive parsing algorithm is designed for such grammars.
Abstract: We define a class of relation grammars that satisfy the context-freeness property, which is an essential condition to solve the membership problem in polynomial time. The context-freeness property is used to design a predictive parsing algorithm for such grammars. The algorithm has a polynomial time behaviour when applied to grammars which generate languages having the additional properties of connections and degree-boundedness. One remarkable result is that a polynomial time complexity is obtained without imposing (total or partial) ordering on the symbols of input sentences.

39 citations


BookDOI
01 May 1994
TL;DR: Contents include substitutions on words and languages with applications to cryptography, grammar systems as a multi-agent framework for natural language generation, and control mechanisms on #-context-free array grammars.
Abstract: Substitutions on words and languages: applications to cryptography (A. Atanasiu); grammar systems - a multi-agent framework for natural language generation (E. Csuhaj-Varju); control mechanisms on #-context-free array grammars (R. Freund); on contextual grammars with parallel derivation (L. Ilie); coloured Gauss and tangent codes (J. Kari and V. Niemi); matrix grammars versus parallel communicating grammar systems (V. Mihalache); reducts vs reducing operators (M. Novotny); on conditional grammars and conditional Petri nets (F.-L. Tiplea). (Partial contents.)

33 citations


Journal ArticleDOI
TL;DR: This paper compares the generative power of colonies under two cooperation strategies and under several types of selection of the alphabet for the common language.

27 citations


Journal ArticleDOI
TL;DR: This paper investigates the classical regulated rewriting mechanisms, like programmed grammars, matrix grammars and ordered grammars, considered as accepting devices, in contrast with the usual generating mode, and obtains that ordered grammars with context-free rules, admitting γ-productions, are computationally universal in accepting mode.
Abstract: In this paper, we investigate the classical regulated rewriting mechanisms, like programmed grammars, matrix grammars and ordered grammars, considered as accepting devices, in contrast with the usual generating mode. Whereas in the type-n grammars of the Chomsky hierarchy the descriptive power of generating and of accepting grammars coincides, this need not be true any more in regulated devices. We even obtain, e.g., that ordered grammars with context-free rules and γ-free productions accept all context-sensitive γ-free languages, and that ordered grammars with context-free rules, admitting γ-productions, are computationally universal in accepting mode.

27 citations


Posted Content
TL;DR: An efficient algorithm for the application of local grammars, represented as automata, to lemmatized texts is described and illustrated.
Abstract: Local grammars can be represented in a very convenient way by automata. This paper describes and illustrates an efficient algorithm for the application of local grammars put in this form to lemmatized texts.

22 citations


Journal ArticleDOI
TL;DR: It is shown that every S-HH hypergraph language of bounded (hyper-)degree can be generated by a (separated) CFHG grammar, which implies that these two types of grammar generate the same class of graph languages of bounded degree, but incomparable classes of hypergraph languages.

Book ChapterDOI
13 Nov 1994
TL;DR: A truly concurrent semantics for graph grammars, based on event structures, is proposed that generalizes to arbitrary consuming grammars (i.e., such that each production deletes some items) the semantics presented in [4] for the subclass of safe grammars.
Abstract: We propose a truly concurrent semantics for graph grammars, based on event structures, that generalizes to arbitrary consuming grammars (i.e., such that each production deletes some items) the semantics presented in [4] for the subclass of safe grammars. Also, parallel derivations are explicitly considered, instead of sequential ones only as in [4]. The “domain” and the “event structure” of a grammar are introduced independently, and one main result shows that they are strongly related, since the domain is the domain of finite configurations of the event structure. Another important result provides an abstract characterization of when two (parallel) graph derivations should be considered as equivalent from a true-concurrency perspective.

Journal ArticleDOI
19 Apr 1994
TL;DR: In this article, the authors present grammatical descriptions of the set of normal inhabitants of a given type under a given basis, both for the standard simple type system (in the partial discharge convention) and for the system in the total discharge convention (or the Prawitz style natural deduction system).
Abstract: We present grammatical (or equational) descriptions of the set of normal inhabitants {M | Γ ⊢ M : A, M in β-normal form} of a given type A under a given basis Γ, both for the standard simple type system (in the partial discharge convention) and for the system in the total discharge convention (or the Prawitz-style natural deduction system). It is shown that in the latter system we can describe the set by a (finite) context-free grammar, but for the standard system this is not necessarily the case, because we may need an infinite supply of fresh (bound) variables to describe the set. In both cases, however, our grammars reflect the structure of normal inhabitants in such a way that, when non-terminals are ignored, a derivation tree of the grammars yielding a λ-term M can be identified with the Böhm tree of M. We give some applications of the grammatical descriptions. Among others, we give simple algorithms for the emptiness/finiteness problem of the set of normal inhabitants of a given type (both for the standard and nonstandard systems).

Journal ArticleDOI
TL;DR: This paper extends previous work on the recognition of imperfect strings generated by fuzzy context-free grammars.

Journal ArticleDOI
TL;DR: The use of evolving algebra methods, especially distributed evolving algebras, for specifying grammars for natural languages is considered, and a reconstruction of some classic grammar formalisms in directly dynamic terms is given.
Abstract: We consider the use of evolving algebra methods of specifying grammars for natural languages. We are especially interested in distributed evolving algebras. We provide the motivation for doing this, and we give a reconstruction of some classic grammar formalisms in directly dynamic terms. Finally, we consider some technical questions arising from the use of direct dynamism in grammar formalisms.

01 Jan 1994
TL;DR: It is shown that dynamic grammars have the formal power of Turing machines; an experimental system which implements a non-ambiguous dynamic parser is sketched, and applications of this system to the resolution of some semantic analysis problems are shown.
Abstract: We define a dynamic grammar as a device which may generate an unbounded set of context-free grammars; each grammar is produced, while parsing a source text, by the recognition of some construct. It is shown that dynamic grammars have the formal power of Turing machines. For a given source text, a dynamic grammar, when non-ambiguous, may be seen as a sequence of usual context-free grammars specialized by this source text: an initial grammar is modified, little by little, while the program is parsed and is used to continue the parsing process. An experimental system which implements a non-ambiguous dynamic parser is sketched and applications of this system to the resolution of some semantic analysis problems are shown. Some of these examples are non-trivial (overloading resolution, derived types, polymorphism, ...) and indicate that this method may partly compete with other well-known techniques used in type-checking.
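As a toy illustration of the idea (a hypothetical sketch, not the paper's system), here is a recognizer whose grammar grows while parsing: recognizing a declaration extends the set of identifiers the remaining input may use, mimicking how an initial grammar is modified little by little by the source text:

```python
class DynamicParser:
    """Toy dynamic grammar: each `var <name>` line adds a new
    terminal to the grammar used for the rest of the parse."""

    def __init__(self):
        self.declared = set()  # terminals added so far

    def parse_line(self, line: str) -> bool:
        words = line.split()
        if words and words[0] == "var" and len(words) == 2:
            self.declared.add(words[1])  # the grammar grows here
            return True
        # a usage line is valid only over already-declared names
        return bool(words) and all(w in self.declared for w in words)
```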

Journal ArticleDOI
Mark Johnson1
TL;DR: This paper focuses on two widely-used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem and the parsing problem.
Abstract: A grammar is a formal device which both identifies a certain set of utterances as well-formed, and which also defines a transduction relation between these utterances and their linguistic representations. This paper focuses on two widely-used "formal" or "logical" representations of grammars in computational linguistics, Definite Clause Grammars and Feature Structure Grammars, and describes the way in which they express the recognition problem (the problem of determining if an utterance is in the language generated by a grammar) and the parsing problem (the problem of finding the analyses assigned by a grammar to an utterance). Although both approaches are 'constraint-based', one of them is based on the logical consequence relation, and the other is based on satisfiability. The main goal of this paper is to point out the different conceptual bases of these two ways of formalizing grammars, and to discuss some of their properties.


Journal ArticleDOI
TL;DR: Investigating connections between algorithmic identification in the limit of grammars from text presentation of recursively enumerable languages and standardizing operations on classes of recursively enumerable languages is the subject of this paper.

Proceedings ArticleDOI
16 Jul 1994
TL;DR: An algorithm to learn languages defined by structurally reversible deterministic context-free grammars from queries and counterexamples in time polynomial in input size and the size of the original grammar is presented.
Abstract: In this paper we present an algorithm to learn languages defined by structurally reversible deterministic context-free grammars from queries and counterexamples. The algorithm works in time polynomial in the input size and the size of the original grammar. A context-free grammar is said to be structurally reversible if, among all non-terminal strings that might derive a given terminal string, no one is an extension of another. The concept of learning from queries and counterexamples was introduced by D. Angluin in 1987. She showed that regular languages are polynomial-time learnable from queries and counterexamples. Since that paper there has been considerable interest in extending the result to a larger class of languages. Among structurally reversible grammars there are very simple grammars, which have recently been investigated towards learnability, and weighted grammars. As the complexity of the algorithm presented here does not depend on the terminal alphabet size, it is applicable to learning left Szilard languages. Weighted grammars are grammars with integer weights assigned to all symbols such that each rule preserves the weight. The vast majority of context-free languages used in practice (for example, most programming languages) can be generated by weighted grammars.
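The weight-preservation condition on weighted grammars can be stated directly: for every production, the weight of the left-hand side must equal the sum of the weights of the right-hand-side symbols. A small sketch (the grammar and weight assignment below are illustrative, not from the paper):

```python
def rule_preserves_weight(lhs, rhs, weight):
    """True iff the production lhs -> rhs preserves total weight,
    where `weight` maps every symbol to an integer."""
    return weight[lhs] == sum(weight[s] for s in rhs)

# Illustrative weighted grammar for balanced brackets: S -> ( S ) S
weights = {"S": 0, "(": 1, ")": -1}
```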


Book ChapterDOI
10 Jun 1994
TL;DR: Generalizing a technique introduced by D. Raz, it is shown for some classes of context-free grammars that their multiplicity equivalence problem is decidable.
Abstract: Two context-free grammars are called multiplicity equivalent iff all words over the common terminal alphabet are generated with the same degree of ambiguity. Generalizing a technique introduced by D. Raz, we show for some classes of context-free grammars that their multiplicity equivalence problem is decidable.
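For grammars in Chomsky normal form, the degree of ambiguity of a word can be computed by a counting variant of CYK; two grammars are then multiplicity equivalent exactly when these counts agree on every word. A brute-force sketch (illustrative only, not the paper's decision procedure):

```python
from functools import lru_cache

def ambiguity_degree(rules, start, w):
    """Number of distinct parse trees of w under a CNF grammar.
    rules: list of (A, rhs) with rhs a 1-tuple (terminal) or a
    2-tuple of non-terminals."""
    @lru_cache(maxsize=None)
    def count(a, i, j):
        if j - i == 1:  # one derivation per matching terminal rule
            return sum(1 for x, rhs in rules if x == a and rhs == (w[i],))
        total = 0
        for x, rhs in rules:
            if x == a and len(rhs) == 2:
                for k in range(i + 1, j):  # every split point
                    total += count(rhs[0], i, k) * count(rhs[1], k, j)
        return total
    return count(start, 0, len(w)) if w else 0
```

For example, under the ambiguous grammar S -> S S | a, the word "aaa" has two parse trees.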

Journal ArticleDOI
01 Nov 1994
TL;DR: The computational complexities of the universal recognition problems for parallel multiple context-free grammars, multiple context-free grammars, and their subclasses are discussed.
Abstract: A number of grammatical formalisms have been proposed to describe the syntax of natural languages, and the universal recognition problems for some of those classes of grammars have been studied. The universal recognition problem for a class G of grammars is the one to decide, taking a grammar G ∈ G and a string w as input, whether G can generate w or not. In this paper, the computational complexities of the universal recognition problems for parallel multiple context-free grammars, multiple context-free grammars, and their subclasses are discussed.
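For ordinary context-free grammars the universal recognition problem is decidable in polynomial time by CYK, with both the grammar and the string as input; the question the paper studies is how this complexity changes for the richer (parallel) multiple context-free classes. A baseline CYK sketch for CNF grammars (illustrative):

```python
def cyk_recognize(rules, start, w):
    """Universal recognition for CFGs in Chomsky normal form:
    both the grammar and the string are inputs.
    rules: list of (A, rhs) with rhs a 1-tuple (terminal),
    a 2-tuple of non-terminals, or () for A -> empty."""
    n = len(w)
    if n == 0:
        return any(a == start and rhs == () for a, rhs in rules)
    # T[i][j] = set of non-terminals deriving w[i:j]
    T = [[set() for _ in range(n + 1)] for _ in range(n)]
    for i, c in enumerate(w):
        T[i][i + 1] = {a for a, rhs in rules if rhs == (c,)}
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for a, rhs in rules:
                    if len(rhs) == 2 and rhs[0] in T[i][k] and rhs[1] in T[k][j]:
                        T[i][j].add(a)
    return start in T[0][n]
```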

Journal ArticleDOI
TL;DR: The power of the size-restricted criteria is characterized and used to prove that some classes of languages, which can be learned by converging in the limit to up to n + 1 nearly minimal size correct grammars, cannot be learned using this criterion even if these latter grammars are allowed to have a finite number of anomalies per grammar.

Book ChapterDOI
21 Sep 1994
TL;DR: The problem of inferring grammars from examples and counter-examples has been mostly studied for regular languages; here a classical approach, known as inference by enumeration, is applied to context-free grammars.
Abstract: The problem of inferring grammars from examples and counter-examples has been mostly studied for regular languages. Here we apply a classical approach, known as inference by enumeration, to context-free grammars. Structural containment is used as an ordering relation, and associated operators are defined. Special attention is paid to computation time; hence our system cannot always reach all the solutions for a given problem. Nevertheless, our system is able to generalize efficiently, and the solutions found constitute a good description of the sample.


Journal ArticleDOI
01 Nov 1994
TL;DR: An infinite hierarchy of languages is obtained that comprises the context-free languages as the first element and all the languages generated by TAGs as the second, and subclasses are characterized which can be used for on-line parsing of natural language.
Abstract: Coupled-context-free grammars are a natural generalisation of context-free grammars obtained by combining nonterminals into corresponding parentheses which can only be substituted simultaneously. Referring to the generative capacity of the grammars, we obtain an infinite hierarchy of languages that comprises the context-free languages as the first element and all the languages generated by TAGs as the second. Here, we present a generalization of the context-free LL(k) notion for coupled-context-free grammars, which leads to a characterization of subclasses of coupled-context-free grammars (and in this way of TAGs as well) which can be parsed in linear time. The parsing procedure described works incrementally, so that it can be used for on-line parsing of natural language. Examples show that important elements of the tree-adjoining languages can be generated by LL(k)-coupled-context-free grammars.

Proceedings ArticleDOI
29 Nov 1994
TL;DR: This paper describes a method for evaluating the quality of context-free grammars according to (i) the complexity of each grammar and (ii) the amount of disambiguation information necessary for each grammar to reproduce the training set.
Abstract: An infinite number of context-free grammars may be inferred from a given training set. The defensibility of any single grammar hinges on the ability to compare that grammar against others in a meaningful way. In keeping with the minimum description length principle, smaller grammars are preferred over larger ones, but only insofar as the small grammar does not over-generalise the language being studied. Furthermore, measures of size must incorporate the grammar's ability to cover sentences of the source language not included in the training set. This paper describes a method for evaluating the quality of context-free grammars according to (i) the complexity of each grammar and (ii) the amount of disambiguation information necessary for each grammar to reproduce the training set. The sum of the two evaluations is used as an objective measure of a grammar's information content. Three grammars are used as examples of this process.
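A minimum-description-length comparison needs a concrete size measure for a grammar. One simple, hypothetical choice is to count symbol occurrences over all productions; the paper's actual measure additionally charges for the disambiguation information needed to reproduce the training set. Sketch:

```python
def grammar_size(rules):
    """One symbol for each left-hand side plus one per right-hand-side
    symbol, summed over all productions (a simple illustrative
    complexity measure, not the paper's exact encoding)."""
    return sum(1 + len(rhs) for lhs, rhs in rules)

# Two illustrative grammars for {a^n b^n}: MDL prefers the compact one
compact = [("S", ["a", "S", "b"]), ("S", [])]
verbose = [("S", ["a", "T"]), ("T", ["S", "b"]), ("S", [])]
```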

DissertationDOI
01 Jan 1994
TL;DR: It is concluded that grammars are better suited to generating many alternative designs and searching large, unexplored design spaces, while expert systems function best in well-known domains when only one design is required.
Abstract: Just as grammars for natural languages use rules to form grammatical sentences from a dictionary of words, grammars for engineering design use rules to make structures from a dictionary of shapes, properties, labels, and other elements. Engineering grammars may be used to help the designer generate and evaluate ideas and concepts during the conceptual phase of the design process. A formal definition of a grammar is given and some properties of grammars are discussed. Natural language grammars, shape grammars, and engineering grammars are defined. Some group-theoretic properties of shapes and operations are derived. It is shown that some sets of shapes form Boolean algebras under the standard regularized set operations. Polygonal tracings, which are extensions of the outlines of two-dimensional polygons, form a ring under the shape union and convolution (or generalized Minkowski sum) operations. A subset of polygonal tracings which includes all convex tracings, along with the convolution and shape scaling operations, forms a vector space over the real numbers. The implications for grammar rules which use these types of shapes and operations are discussed. Grammars and expert systems are compared and contrasted. While the formalisms have similar definitions, some explicit differences exist. Furthermore, when the customary uses of the two systems are compared, large differences are evident. It is concluded that grammars are better suited to generating many alternative designs and searching large, unexplored design spaces, while expert systems function best in well-known domains when only one design is required. The formation and modification of grammatical rules is discussed, focusing on the relationships between form and function in design. Several strategies which could be used for the search for optimal designs in a grammar's language are considered. The significance of transformations used to apply rules is discussed.
An extended example of grammars used to generate configurations of modular reconfigurable robot arms is presented. The grammars generate all non-isomorphic assembly configurations, while simultaneously calculating kinematic properties of the arms. Several methods of quickly searching for arms to satisfy various requirements are discussed.

Book ChapterDOI
10 Jun 1994
TL;DR: An algorithm is presented that is a variation of the one of Senizergues and decides the NonTerminal Separation property of context-free grammars in polynomial time.
Abstract: An algorithm is presented that is a variation of the one of Senizergues in [4]. It decides the NonTerminal Separation property of context-free grammars in polynomial time. A straightforward generalization of the algorithm decides the NTS property of extended context-free grammars (but not in polynomial time).