Topic

Tree-adjoining grammar

About: Tree-adjoining grammar is a research topic. Over the lifetime, 2491 publications have been published within this topic receiving 57813 citations.


Papers
01 Jan 1975
TL;DR: Shape grammars as mentioned in this paper provide a means for the recursive specification of shapes: where a phrase structure grammar is defined over an alphabet of symbols and generates a language of sequences of symbols, a shape grammar is defined over an alphabet of shapes and generates a language of shapes.
Abstract: Shape grammars provide a means for the recursive specification of shapes. The formalism for shape grammars is designed to be easily usable and understandable by people and at the same time to be adaptable for use in computer programs. Shape grammars are similar to phrase structure grammars, which were developed by Chomsky [1956, 1957]. Where a phrase structure grammar is defined over an alphabet of symbols and generates a language of sequences of symbols, a shape grammar is defined over an alphabet of shapes and generates a language of shapes. This dissertation explores the uses of shape grammars. The dissertation is divided into three sections and an appendix. In the first section: Shape grammars are defined. Some simple examples are given for instructive purposes. Shape grammars are used to generate a new class of reversible figures. Shape grammars are given for some well-known mathematical curves (the Snowflake curve, a variation of Peano's curve, and Hilbert's curve). To show the general computational power of shape grammars, a procedure is presented that, given any Turing machine, constructs a shape grammar simulating the operation of that Turing machine. Related work on various formalisms for picture grammars is described. A symbolic characterization of shape grammars is given that is useful for implementing shape grammars in computer programs.
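To convey the flavour of recursive specification in code, here is a loose sketch. It is not Stiny's shape-grammar formalism (which rewrites sub-shapes geometrically rather than substrings); it is a plain symbolic rewriting system, in the spirit of the symbolic characterization the abstract mentions, whose rule table and turtle-graphics symbols are invented for the example and which approximates the Snowflake curve cited above.

```python
# Loose illustration only (not Stiny's shape grammars): a symbolic rewriting
# system that generates turtle-graphics commands approximating the Snowflake
# (Koch) curve. Symbols: F = draw forward, + = turn left 60 degrees,
# - = turn right 60 degrees.

def rewrite(axiom: str, rules: dict, steps: int) -> str:
    """Apply the production rules to every symbol of the string, `steps` times."""
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

axiom = "F--F--F"            # start from an equilateral triangle
rules = {"F": "F+F--F+F"}    # each edge is replaced by the Koch motif

if __name__ == "__main__":
    commands = rewrite(axiom, rules, steps=3)
    print(f"{len(commands)} drawing commands after 3 rewriting steps")
```

A shape grammar proper matches and replaces sub-shapes in the plane; the string rewriting above only mirrors the recursive structure of the derivation.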

26 citations

Journal ArticleDOI
Barry K. Rosen
TL;DR: This work assigns context-free grammars to recursion schemes in such a way that schemes are tree-equivalent iff their grammars generate the same language, and shows that this relation is also narrow enough to imply input-output equivalence.

26 citations

Proceedings ArticleDOI
26 Apr 2007
TL;DR: This work provides a conceptual basis for thinking of machine translation in terms of synchronous grammars in general, and probabilistic synchronous tree-adjoining grammars in particular; evidence for this view is found in the structure of bilingual dictionaries of the last several millennia.
Abstract: We provide a conceptual basis for thinking of machine translation in terms of synchronous grammars in general, and probabilistic synchronous tree-adjoining grammars in particular. Evidence for the view is found in the structure of bilingual dictionaries of the last several millennia.
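A toy example can make the paired-derivation idea concrete. The sketch below uses a synchronous context-free grammar rather than the tree-adjoining grammars the paper advocates, and its rule table, vocabulary, and function names are invented for illustration: each rule pairs a source right-hand side with a target right-hand side, and linked nonterminals are expanded together.

```python
# Toy synchronous context-free grammar (invented rules; the paper argues for
# synchronous *tree-adjoining* grammars, which are more expressive). Each rule
# pairs an English right-hand side with a French one; a nonterminal appearing
# on both sides is expanded in lockstep, which models paired derivations.

RULES = {
    "S":   (["NP", "is", "here"], ["NP", "est", "ici"]),
    "NP":  (["the", "ADJ", "N"],  ["la", "N", "ADJ"]),   # adjective-noun reordering
    "ADJ": (["red"], ["rouge"]),
    "N":   (["car"], ["voiture"]),
}

def expand(symbol: str) -> tuple[list[str], list[str]]:
    """Derive the source and target strings in lockstep from one nonterminal."""
    src_rhs, tgt_rhs = RULES[symbol]
    sub = {x: expand(x) for x in src_rhs if x in RULES}   # linked nonterminals
    src = [w for x in src_rhs for w in (sub[x][0] if x in sub else [x])]
    tgt = [w for x in tgt_rhs for w in (sub[x][1] if x in sub else [x])]
    return src, tgt

if __name__ == "__main__":
    english, french = expand("S")
    print(" ".join(english))   # the red car is here
    print(" ".join(french))    # la voiture rouge est ici
```

A probabilistic version would attach a weight to each paired rule and score competing derivations; replacing the flat right-hand sides with paired elementary trees is what distinguishes the synchronous tree-adjoining setting.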

26 citations

01 Jan 1998
TL;DR: The work here focuses on extending a syntactic grammar to handle phenomena occurring within a single sentence which have punctuation as an integral component. Punctuation marks are treated as full-fledged lexical items in a Lexicalized Tree Adjoining Grammar, which is an extremely well-suited formalism for encoding punctuation in the sentence grammar.
Abstract: INCORPORATING PUNCTUATION INTO THE SENTENCE GRAMMAR: A LEXICALIZED TREE ADJOINING GRAMMAR PERSPECTIVE. Christine D. Doran. Supervisor: Aravind K. Joshi. Punctuation helps us to structure, and thus to understand, texts. Many uses of punctuation straddle the line between syntax and discourse, because they serve to combine multiple propositions within a single orthographic sentence; they allow us to insert discourse-level relations at the level of a single sentence. Just as people make use of information from punctuation in processing what they read, computers can use information from punctuation in processing texts automatically. Most current natural language processing systems fail to take punctuation into account at all, losing a valuable source of information about the text; those which do mostly do so in a superficial way, again failing to fully exploit the information conveyed by punctuation. To be able to make use of such information in a computational system, we must first characterize its uses and find a suitable representation for encoding them. The work here focuses on extending a syntactic grammar to handle phenomena occurring within a single sentence which have punctuation as an integral component. Punctuation marks are treated as full-fledged lexical items in a Lexicalized Tree Adjoining Grammar, which is an extremely well-suited formalism for encoding punctuation in the sentence grammar. Each mark anchors its own elementary trees and imposes constraints on the surrounding lexical items. I have analyzed data representing a wide variety of constructions and added treatments of them to the large English grammar which is part of the XTAG system. The advantages of using LTAG are that its elementary units are structured trees of a suitable size for stating the constraints we are interested in, and the derivation histories it produces contain information the discourse grammar will need about which elementary units have been used and how they have been combined. I also consider in detail a few particularly interesting constructions where the sentence and discourse grammars meet: appositives, reported speech, and uses of parentheses. My results confirm that punctuation can…
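A minimal sketch may help make concrete the claim that each punctuation mark anchors its own elementary trees. The tree shapes, labels, and helper functions below are invented and are far simpler than the XTAG analyses the dissertation describes: a comma-anchored appositive tree has two NP substitution sites, filled by the host and the appositive.

```python
# Minimal sketch (invented labels and tree shapes, much simpler than XTAG):
# an elementary tree anchored by commas for appositives. A tree is a pair
# (label, children); a leaf is a lexical anchor, and a label ending in "!"
# marks a substitution site.

APPOSITIVE_TREE = ("NP", [("NP!", []), (",", []), ("NP!", []), (",", [])])

def substitute(tree, fillers):
    """Fill the substitution sites (labels ending in '!') left to right."""
    label, children = tree
    if label.endswith("!"):
        return fillers.pop(0)
    return (label, [substitute(child, fillers) for child in children])

def tree_yield(tree):
    """Read the terminal string off the leaves of a tree."""
    label, children = tree
    return [label] if not children else [w for c in children for w in tree_yield(c)]

host = ("NP", [("my", []), ("dog", [])])
appositive = ("NP", [("a", []), ("beagle", [])])

result = substitute(APPOSITIVE_TREE, [host, appositive])
print(" ".join(tree_yield(result)))   # -> my dog , a beagle ,
```

In the grammar described above, each mark would additionally impose constraints on the surrounding lexical items and participate in adjunction as well as substitution; both are omitted here.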

26 citations

Journal ArticleDOI
TL;DR: In this paper, the authors explore the implications of taking the answer to be negative and derive an explanation for a range of coordination facts that have remained quite mysterious since they were discovered by J. R. Ross some 15 years ago.
Abstract: A traditional concern of grammarians has been the question of whether the members of given pairs of expressions belong to the same or different syntactic categories. Consider the following example sentences. (a) I think Fido destroyed the kennel. (b) The kennel, I think Fido destroyed. Are the two underlined expressions members of the same syntactic category or not? The generative grammarians of the last quarter century have, almost without exception, taken the answer to be affirmative. In the present paper I explore the implications of taking the answer to be negative. The changes consequent upon this negative answer turn out to be very far-reaching: (i) it becomes as simple to state rules for constructions of the general type exemplified in (b) as it is for the canonical NP VP construction in (a); (ii) we immediately derive an explanation for a range of coordination facts that have remained quite mysterious since they were discovered by J. R. Ross some 15 years ago; (iii) our grammars can entirely dispense with the class of rules known as transformations; (iv) our grammars can be shown to be formally equivalent to what are known as the context-free phrase structure grammars; (v) this latter consequence has the effect of making potentially relevant to natural language grammars a whole literature of mathematical results on the parsability and learnability of context-free phrase structure grammars.
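Point (iv), the formal equivalence with context-free phrase structure grammars, can be illustrated with a toy grammar in the GPSG tradition that generates a simplified version of sentence (b) using a "slash" category for the missing object, with no transformation. The category names and lexicon are invented for the example, and the embedding under "think" is deliberately dropped.

```python
# Toy context-free grammar with a "slash" category S/NP (a clause missing an
# NP), so a topicalized pattern like "the kennel Fido destroyed" is stated with
# ordinary phrase-structure rules. Invented categories and lexicon; the grammar
# overgenerates (e.g. "Fido destroyed Fido") and only illustrates the rule format.
import itertools

GRAMMAR = {
    "S":     [["NP", "VP"], ["NP", "S/NP"]],   # canonical and topicalized patterns
    "S/NP":  [["NP", "VP/NP"]],
    "VP":    [["V", "NP"]],
    "VP/NP": [["V"]],                          # verb phrase with an object gap
    "NP":    [["Fido"], ["the kennel"]],
    "V":     [["destroyed"]],
}

def expand(symbol):
    """Yield every terminal string derivable from `symbol` (finite here)."""
    if symbol not in GRAMMAR:
        yield [symbol]
        return
    for rhs in GRAMMAR[symbol]:
        for parts in itertools.product(*(list(expand(x)) for x in rhs)):
            yield [word for part in parts for word in part]

if __name__ == "__main__":
    for words in expand("S"):
        print(" ".join(words))   # includes "the kennel Fido destroyed"
```

Because the slash rules are just additional context-free rules, the parsability and learnability results mentioned in (v) carry over unchanged.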

26 citations


Network Information
Related Topics (5)
Graph (abstract data type): 69.9K papers, 1.2M citations, 85% related
Parsing: 21.5K papers, 545.4K citations, 85% related
Time complexity: 36K papers, 879.5K citations, 84% related
Semantics: 24.9K papers, 653K citations, 82% related
Tree (data structure): 44.9K papers, 749.6K citations, 81% related
Performance
Metrics
No. of papers in the topic in previous years
Year: Papers
2023: 15
2022: 25
2021: 7
2020: 5
2019: 6
2018: 11