
Tree-adjoining grammar

About: Tree-adjoining grammar is a research topic. Over its lifetime, 2,491 publications have been published within this topic, receiving 57,813 citations.


Papers
Journal ArticleDOI
TL;DR: LR(k) grammars are defined, which are perhaps the most general ones of this type, and they provide the basis for understanding all of the special tricks which have been used in the construction of parsing algorithms for languages with simple structure, e.g. algebraic languages.
Abstract: There has been much recent interest in languages whose grammar is sufficiently simple that an efficient left-to-right parsing algorithm can be mechanically produced from the grammar. In this paper, we define LR(k) grammars, which are perhaps the most general ones of this type, and they provide the basis for understanding all of the special tricks which have been used in the construction of parsing algorithms for languages with simple structure, e.g. algebraic languages. We give algorithms for deciding if a given grammar satisfies the LR(k) condition, for a given k, and also give methods for generating recognizers for LR(k) grammars. It is shown that the problem of whether or not a grammar is LR(k) for some k is undecidable, and the paper concludes by establishing various connections between LR(k) grammars and deterministic languages. In particular, the LR(k) condition is a natural analogue, for grammars, of the deterministic condition, for languages.

819 citations
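
To make the deterministic, left-to-right behaviour that the LR(k) condition guarantees concrete, here is a minimal hand-coded shift-reduce recognizer for the tiny grammar E -> E '+' 'n' | 'n' (a left-recursive grammar that an LR parser handles in one pass). This is an illustrative sketch under that assumed grammar, not the table-driven recognizer construction the paper describes; the function name and token set are invented for the example.

```python
def recognize(tokens):
    """Shift-reduce recognizer for the grammar  E -> E '+' 'n' | 'n'.

    A single left-to-right pass over the input; every shift/reduce decision
    is forced by the stack contents, which is the kind of determinism the
    LR(k) condition guarantees (this tiny grammar needs no lookahead at all).
    """
    stack = []
    toks = list(tokens) + ['$']                    # '$' marks the end of the input
    i = 0
    while True:
        if stack == ['n']:
            stack = ['E']                          # reduce  E -> 'n'
        elif stack[-3:] == ['E', '+', 'n']:
            stack = stack[:-3] + ['E']             # reduce  E -> E '+' 'n'
        elif toks[i] == '$':
            return stack == ['E']                  # accept iff all input reduced to E
        else:
            stack.append(toks[i])                  # shift the next input token
            i += 1

print(recognize(['n', '+', 'n', '+', 'n']))   # True
print(recognize(['n', 'n']))                  # False
```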

Journal ArticleDOI
TL;DR: In this report, certain properties of context-free (CF or type 2) grammars, like those of Chomsky, are investigated; a type of grammar intermediate between type 1 and type 2 is defined, shown to be essentially stronger than type 2 grammars, and shown to have the advantage over type 1 grammars that the phrase structure of a grammatical sentence is unique once the derivation is given.
Abstract: In this report, certain properties of context-free (CF or type 2) grammars, like those of Chomsky, are investigated. In particular, questions regarding structure, possible ambiguity and the relationship to finite automata are considered. The following results are presented. The language generated by a context-free grammar is linear in a sense that is defined precisely. The requirement of unambiguity (that every sentence has a unique phrase structure) weakens the grammar, in the sense that there exists a CF language that cannot be generated unambiguously by any CF grammar. The result that not every CF language is a finite automaton (FA) language is improved in the following way: there exists a CF language L such that for any L′ ⊆ L, if L′ is FA, an L″ ⊆ L can be found such that L″ is also FA, L′ ⊆ L″ and L″ contains infinitely many sentences not in L′. Finally, a type of grammar is defined that is intermediate between type 1 and type 2 grammars; it is shown to be essentially stronger than type 2 grammars, with the advantage over type 1 grammars that the phrase structure of a grammatical sentence is unique once the derivation is given.

788 citations
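
As a concrete illustration of the ambiguity discussed above: under the grammar S -> S S | 'a', the sentence "aaa" has two distinct phrase structures. The brute-force sketch below (an illustration only, with an invented grammar) enumerates the parse trees; note that it demonstrates ambiguity of a particular grammar, whereas the paper's stronger result is that some CF languages have no unambiguous CF grammar at all.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def parses(s):
    """All parse trees of s under the ambiguous grammar  S -> S S | 'a'."""
    trees = []
    if s == 'a':
        trees.append('a')                         # S -> 'a'
    for i in range(1, len(s)):                    # S -> S S, trying every split point
        for left in parses(s[:i]):
            for right in parses(s[i:]):
                trees.append((left, right))
    return trees

print(parses('aaa'))
# [('a', ('a', 'a')), (('a', 'a'), 'a')]  -- two distinct trees for one sentence
```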

Book ChapterDOI
01 Apr 1997
TL;DR: A tree generating system called tree-adjoining grammar (TAG) is described and a number of formal results have been established for TAGs, which are of interest to researchers in formal languages and automata, including those interested in tree grammars and tree automata.
Abstract: In this paper, we will describe a tree generating system called tree-adjoining grammar (TAG) and state some of the recent results about TAGs. The work on TAGs is motivated by linguistic considerations. However, a number of formal results have been established for TAGs which, we believe, would be of interest to researchers in formal languages and automata, including those interested in tree grammars and tree automata.

787 citations
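
The operation that distinguishes TAGs from plain context-free rewriting is adjunction: an auxiliary tree, whose root and foot node carry the same label, is spliced into an interior node of another tree, and the excised subtree is re-attached at the foot. The sketch below is a minimal, hypothetical encoding of that operation over nested tuples; the tree shapes, the path addressing, and the "Harry likes peanuts" sentence are chosen purely for illustration and are not taken from the paper.

```python
# Trees are nested tuples (label, children...); a leaf is a bare string.
# The auxiliary tree's foot node is the leaf whose label ends in '*'.

def adjoin(tree, path, aux):
    """Adjoin auxiliary tree `aux` at the node of `tree` addressed by `path`
    (a sequence of child indices): the subtree there is excised, `aux` is
    inserted in its place, and the excised subtree is plugged into aux's foot."""
    if not path:
        return _plug_foot(aux, tree)
    label, *children = tree
    i, rest = path[0], path[1:]
    children[i] = adjoin(children[i], rest, aux)
    return (label, *children)

def _plug_foot(aux, subtree):
    if isinstance(aux, str):
        return subtree if aux.endswith('*') else aux
    label, *children = aux
    return (label, *(_plug_foot(c, subtree) for c in children))

initial = ('S', ('NP', 'Harry'),
                ('VP', ('V', 'likes'), ('NP', 'peanuts')))
aux_vp  = ('VP', ('Adv', 'apparently'), 'VP*')      # adjoins at a VP node

print(adjoin(initial, (1,), aux_vp))
# ('S', ('NP', 'Harry'),
#       ('VP', ('Adv', 'apparently'), ('VP', ('V', 'likes'), ('NP', 'peanuts'))))
```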

Journal ArticleDOI
27 Apr 2006, Nature
TL;DR: It is shown that European starlings (Sturnus vulgaris) accurately recognize acoustic patterns defined by a recursive, self-embedding, context-free grammar, and this finding opens a new range of complex syntactic processing mechanisms to physiological investigation.
Abstract: Noam Chomsky's work on ‘generative grammar’ led to the concept of a set of rules that can generate a natural language with a hierarchical grammar, and the idea that this represents a uniquely human ability. In a series of experiments with European starlings, in which several types of ‘warble’ and ‘rattle’ took the place of words in a human language, the birds learnt to classify phrase structure grammars in a way that met the same criteria. Their performance can be said to be almost human on this yardstick. So if there are language processing capabilities that are uniquely human, they may lie beyond context-free grammars, at a higher level in the Chomsky hierarchy. Or perhaps there is no single property or processing capacity that differentiates human language from non-human communication systems. Humans regularly produce new utterances that are understood by other members of the same language community [1]. Linguistic theories account for this ability through the use of syntactic rules (or generative grammars) that describe the acceptable structure of utterances [2]. The recursive, hierarchical embedding of language units (for example, words or phrases within shorter sentences) that is part of the ability to construct new utterances minimally requires a ‘context-free’ grammar [2,3] that is more complex than the ‘finite-state’ grammars thought sufficient to specify the structure of all non-human communication signals. Recent hypotheses make the central claim that the capacity for syntactic recursion forms the computational core of a uniquely human language faculty [4,5]. Here we show that European starlings (Sturnus vulgaris) accurately recognize acoustic patterns defined by a recursive, self-embedding, context-free grammar. They are also able to classify new patterns defined by the grammar and reliably exclude agrammatical patterns. Thus, the capacity to classify sequences from recursive, centre-embedded grammars is not uniquely human. This finding opens a new range of complex syntactic processing mechanisms to physiological investigation.

510 citations
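
The contrast driving this experiment is between a centre-embedded, context-free pattern and a finite-state pattern; the stimuli in this line of work are usually described as A^n B^n versus (AB)^n strings built from rattle and warble motifs, and the sketch below takes those patterns as an assumption rather than quoting the paper's stimulus design. It shows why the first pattern needs unbounded counting while the second is recognizable by a two-state loop.

```python
# 'a' stands for a rattle motif, 'b' for a warble motif (illustrative labels).

def matches_anbn(s):
    """Recognize a^n b^n (n >= 1): needs an unbounded counter, so no single
    finite-state machine handles every n."""
    n = len(s) // 2
    return n >= 1 and s == 'a' * n + 'b' * n

def matches_abn(s):
    """Recognize (ab)^n (n >= 1): a two-state loop over 'a' then 'b' suffices."""
    return len(s) >= 2 and len(s) % 2 == 0 and all(
        c == ('a' if i % 2 == 0 else 'b') for i, c in enumerate(s))

for s in ['aabb', 'abab', 'aab', 'abba']:
    print(s, matches_anbn(s), matches_abn(s))
# aabb True False / abab False True / aab False False / abba False False
```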

Journal ArticleDOI
Alfred V. Aho
TL;DR: A new type of grammar for generating formal languages, called an indexed grammar, is presented, and the class of languages generated by indexed grammars has closure properties and decidability results similar to those for context-free languages.
Abstract: A new type of grammar for generating formal languages, called an indexed grammar, is presented. An indexed grammar is an extension of a context-free grammar, and the class of languages generated by indexed grammars has closure properties and decidability results similar to those for context-free languages. The class of languages generated by indexed grammars properly includes all context-free languages and is a proper subset of the class of context-sensitive languages. Several subclasses of indexed grammars generate interesting classes of languages.

476 citations
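
A standard way to see what indexed grammars add over context-free ones is the language { a^n b^n c^n : n >= 1 }, which is not context-free but is generated by a small indexed grammar in which each nonterminal carries a stack of indices. The sketch below follows one textbook-style grammar of that kind; it is an illustration under that assumption, not a grammar quoted from Aho's paper.

```python
# Indexed grammar (indices f, g; a nonterminal carries a stack of indices):
#   S -> T[g]          T[s] -> T[f s] | A[s] B[s] C[s]
#   A[f s] -> a A[s]   B[f s] -> b B[s]   C[f s] -> c C[s]
#   A[g]   -> a        B[g]   -> b        C[g]   -> c

def derive(n):
    """Follow the derivation above for a given n and return a^n b^n c^n."""
    assert n >= 1
    form = [('T', ('g',))]                         # S -> T[g]
    for _ in range(n - 1):
        form = [('T', ('f',) + form[0][1])]        # T[s] -> T[f s]
    s = form[0][1]                                 # shared index stack f^(n-1) g
    form = [('A', s), ('B', s), ('C', s)]          # T[s] -> A[s] B[s] C[s]
    out = []
    for sym, idx in form:                          # pop one index per terminal:
        out.extend(sym.lower() for _ in idx)       # A[f s] -> a A[s], ..., A[g] -> a
    return ''.join(out)

print(derive(3))   # 'aaabbbccc'
```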


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 85% related
Parsing: 21.5K papers, 545.4K citations, 85% related
Time complexity: 36K papers, 879.5K citations, 84% related
Semantics: 24.9K papers, 653K citations, 82% related
Tree (data structure): 44.9K papers, 749.6K citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
2023: 15
2022: 25
2021: 7
2020: 5
2019: 6
2018: 11