
Tree-adjoining grammar

About: Tree-adjoining grammar is a research topic. Over its lifetime, 2,491 publications have been published on this topic, receiving 57,813 citations.


Papers
Journal ArticleDOI
TL;DR: A generative grammar called the equal matrix grammar, which generates a class of languages that meets both the context-sensitive and the context-free languages, is defined, and the formal power series generated by context-free grammars is extended to grammars of this system.
Abstract: A generative grammar called the equal matrix grammar, which generates a class of languages that meets both the context-sensitive and the context-free languages, is defined, and the formal power series generated by context-free grammars is extended to grammars of this system. The Parikh mapping of this family of languages is shown to be semilinear. The Boolean and closure properties of a certain subfamily are examined; for this subfamily, the generative power of equal matrix grammars exceeds that of context-free grammars. For certain inherently ambiguous context-free languages, including that of Parikh, unambiguous grammars of this class exist. The application of equal matrix grammars to the generation of Tamil kernel sentences is given in the appendix.

70 citations
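As an illustration of the parallel rewriting described in the abstract above, the sketch below derives the classic non-context-free (but context-sensitive) language {a^n b^n c^n} with an equal-matrix-style grammar, in which each matrix rewrites the nonterminals A, B, and C in lockstep. The grammar, symbol names, and code are illustrative assumptions, not material from the paper.

```python
# Illustrative sketch (not from the paper): an equal-matrix-style grammar whose
# matrices rewrite the nonterminals A, B, C simultaneously, deriving
# {a^n b^n c^n | n >= 1}, a language that is context-sensitive but not context-free.

START = ["A", "B", "C"]          # start sentential form
MATRICES = [
    {"A": ["a", "A"], "B": ["b", "B"], "C": ["c", "C"]},  # grow all three at once
    {"A": ["a"],      "B": ["b"],      "C": ["c"]},       # terminate all three at once
]

def apply_matrix(form, matrix):
    """Rewrite every nonterminal mentioned in the matrix simultaneously."""
    out = []
    for sym in form:
        out.extend(matrix.get(sym, [sym]))   # terminals are copied unchanged
    return out

def derive(n):
    """Derive a^n b^n c^n: apply the growth matrix n-1 times, then terminate."""
    form = START
    for _ in range(n - 1):
        form = apply_matrix(form, MATRICES[0])
    return "".join(apply_matrix(form, MATRICES[1]))

if __name__ == "__main__":
    print(derive(3))  # -> aaabbbccc
```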

Proceedings ArticleDOI
06 Aug 2009
TL;DR: A novel method is presented for learning a type of Synchronous Tree Adjoining Grammar and associated probabilities from aligned tree/string training data, together with a method for converting these grammars to a weakly equivalent tree transducer for decoding.
Abstract: Tree Adjoining Grammars have well-known advantages, but are typically considered too difficult for practical systems. We demonstrate that, when done right, adjoining improves translation quality without becoming computationally intractable. Using adjoining to model optionality allows general translation patterns to be learned without the clutter of endless variations of optional material. The appropriate modifiers can later be spliced in as needed. In this paper, we describe a novel method for learning a type of Synchronous Tree Adjoining Grammar and associated probabilities from aligned tree/string training data. We introduce a method of converting these grammars to a weakly equivalent tree transducer for decoding. Finally, we show that adjoining results in an end-to-end improvement of +0.8 Bleu over a baseline statistical syntax-based MT model on a large-scale Arabic/English MT task.

69 citations
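To make the adjoining operation referred to above concrete, the following sketch shows TAG-style adjunction on simple labelled trees: an auxiliary tree with a foot node is spliced in at an interior node, and the subtree originally at that node moves down to the foot, which is how optional modifiers can be added after the fact. The tree representation, node labels, and helper functions are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): TAG-style adjunction on labelled trees.
# An auxiliary tree has a foot node sharing the label of its root; adjoining splices
# the auxiliary tree in at an interior node with that label, and the subtree that
# was there moves down to the foot position.

class Node:
    def __init__(self, label, children=None, foot=False):
        self.label = label
        self.children = children or []
        self.foot = foot  # True for the foot node of an auxiliary tree

    def __repr__(self):
        if not self.children:
            return self.label + ("*" if self.foot else "")
        return f"({self.label} {' '.join(map(repr, self.children))})"

def copy_tree(t):
    return Node(t.label, [copy_tree(c) for c in t.children], t.foot)

def replace_foot(t, subtree):
    """Put `subtree` where the foot node of auxiliary tree `t` is."""
    for i, c in enumerate(t.children):
        if c.foot:
            t.children[i] = subtree
            return True
        if replace_foot(c, subtree):
            return True
    return False

def adjoin(node, aux):
    """Adjoin `aux` at the first interior descendant of `node` whose label matches
    the root of `aux`; returns True if an adjunction site was found."""
    for i, child in enumerate(node.children):
        if child.children and child.label == aux.label:
            new = copy_tree(aux)
            replace_foot(new, child)       # displaced subtree moves to the foot
            node.children[i] = new
            return True
        if adjoin(child, aux):
            return True
    return False

if __name__ == "__main__":
    # Initial tree for "John sleeps"; auxiliary VP tree for the modifier "soundly".
    initial = Node("S", [Node("NP", [Node("John")]),
                         Node("VP", [Node("V", [Node("sleeps")])])])
    aux = Node("VP", [Node("VP", foot=True), Node("ADV", [Node("soundly")])])
    adjoin(initial, aux)
    print(initial)  # -> (S (NP John) (VP (VP (V sleeps)) (ADV soundly)))
```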

Proceedings ArticleDOI
05 May 1969
TL;DR: This paper defines the tree analogue of a nondeterministic generalized sequential machine, obtains results about the domain and range of such a mapping, and relates these results to the theory of generalized finite automata [6].
Abstract: In this paper we discuss still another version of indexed grammars [1] and macro grammars [3], gaining some geometric intuition about the structure of these systems. An ordinary context-free grammar is a rewriting system for strings; we find that a macro grammar is a rewriting system for trees. CF grammars on strings form a special case, since strings can be thought of as trees without branching nodes. We consider the special case of finite-state grammars in this report. We define the tree analogue of a nondeterministic generalized sequential machine and obtain results about the domain and range of such a mapping. We relate these results to the theory of generalized finite automata [6].

69 citations
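The observation in the abstract above that strings are simply trees without branching nodes can be made concrete. The sketch below encodes a string as a monadic tree and applies a small top-down relabelling mapping to it, a deterministic special case of the kind of tree machine the paper studies; the encoding and function names are illustrative assumptions, not the paper's constructions.

```python
# Illustrative sketch (not from the paper): a string as a non-branching tree,
# plus a tiny deterministic top-down relabelling that acts like a sequential
# machine, rewriting each symbol while copying the tree's shape.

def string_to_tree(s):
    """'abc' -> ('a', ('b', ('c', None)))  -- a monadic (non-branching) tree."""
    return None if not s else (s[0], string_to_tree(s[1:]))

def relabel(tree, table):
    """Top-down mapping: rewrite each node label via `table`, preserving shape."""
    if tree is None:
        return None
    label, child = tree
    return (table.get(label, label), relabel(child, table))

if __name__ == "__main__":
    t = string_to_tree("aab")
    print(relabel(t, {"a": "x", "b": "y"}))  # -> ('x', ('x', ('y', None)))
```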

Book ChapterDOI
01 Jan 1987
TL;DR: Tree Adjoining Grammars (TAG) is a formalism that factors recursion and dependencies in a special way, leading to a kind of locality and the possibility of incremental generation.
Abstract: Grammatical formalisms can be viewed as neutral with respect to comprehension or generation, or they can be investigated from the point of view of their suitability for comprehension or generation. Tree Adjoining Grammars (TAG) is a formalism that factors recursion and dependencies in a special way, leading to a kind of locality and the possibility of incremental generation. We will examine the relevance of these properties from the point of view of sentence generation.

68 citations

Proceedings Article
23 Feb 2000
TL;DR: The authors extract different LTAGs from the Penn Treebank and show that certain strategies yield an improved extracted LTAG in terms of compactness, broad coverage, and supertagging accuracy.
Abstract: The accuracy of statistical parsing models can be improved with the use of lexical information. Statistical parsing using Lexicalized tree adjoining grammar (LTAG), a kind of lexicalized grammar, has remained relatively unexplored. We believe that this is largely due to the absence of large corpora accurately bracketed in terms of a perspicuous yet broad-coverage LTAG. Our work attempts to alleviate this difficulty. We extract different LTAGs from the Penn Treebank. We show that certain strategies yield an improved extracted LTAG in terms of compactness, broad coverage, and supertagging accuracy. Furthermore, we perform a preliminary investigation in smoothing these grammars by means of an external linguistic resource, namely the tree families of an XTAG grammar, a hand-built grammar of English.

68 citations
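Supertagging, used above as an evaluation criterion, assigns each word its most likely elementary tree (supertag) before parsing. The sketch below shows a most-frequent-supertag baseline and how tagging accuracy is computed; the toy data and the supertag names are hypothetical, and this is not the paper's extracted grammar or evaluation code.

```python
# Illustrative sketch (toy data, hypothetical supertag names): a most-frequent-
# supertag baseline. Each word receives the supertag it appeared with most often
# in training; accuracy is the fraction of test tokens tagged correctly.
from collections import Counter, defaultdict

def train(tagged_sentences):
    counts = defaultdict(Counter)          # word -> Counter of supertags
    overall = Counter()                    # global counts, used as a fallback
    for sentence in tagged_sentences:
        for word, supertag in sentence:
            counts[word][supertag] += 1
            overall[supertag] += 1
    model = {w: c.most_common(1)[0][0] for w, c in counts.items()}
    return model, overall.most_common(1)[0][0]

def accuracy(model, fallback, tagged_sentences):
    correct = total = 0
    for sentence in tagged_sentences:
        for word, gold in sentence:
            correct += (model.get(word, fallback) == gold)
            total += 1
    return correct / total

if __name__ == "__main__":
    train_data = [[("the", "alpha_Det"), ("cat", "alpha_NP"), ("sleeps", "alpha_intrans_V")]]
    test_data  = [[("the", "alpha_Det"), ("cat", "alpha_NP"), ("sleeps", "alpha_trans_V")]]
    model, fallback = train(train_data)
    print(accuracy(model, fallback, test_data))  # -> 0.666...
```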


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 85% related
Parsing: 21.5K papers, 545.4K citations, 85% related
Time complexity: 36K papers, 879.5K citations, 84% related
Semantics: 24.9K papers, 653K citations, 82% related
Tree (data structure): 44.9K papers, 749.6K citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2023    15
2022    25
2021    7
2020    5
2019    6
2018    11