Open Access

A Lexicalized Tree Adjoining Grammar for English

TLDR
This paper presents a representation of prepositional complements based on extended elementary trees, and shows how to deal with semantic non-compositionality in verb-particle combinations, light verb constructions, and idioms without losing the internal syntactic composition of these structures.
Abstract
This paper presents a sizable grammar for English written in the Tree Adjoining Grammar (TAG) formalism. The grammar uses a TAG that is both lexicalized (Schabes, Abeille, Joshi 1988) and feature-based (Vijay-Shanker, Joshi 1988). In this paper, we describe a wide range of phenomena that it covers.

A Lexicalized TAG (LTAG) is organized around a lexicon, which associates sets of elementary trees (instead of just simple categories) with the lexical items. A Lexicalized TAG consists of a finite set of trees associated with lexical items, and operations (adjunction and substitution) for composing the trees. A lexical item is called the anchor of its corresponding tree and directly determines both the tree's structure and its syntactic features. In particular, the trees define the domain of locality over which constraints are specified, and these constraints are local with respect to their anchor.

In this paper, the basic tree structures of the English LTAG are described, along with some relevant features. The interaction between the morphological and the syntactic components of the lexicon is also explained. Next, the properties of the different tree structures are discussed. The exclusive use of S complements allows us to take full advantage of the treatment of unbounded dependencies originally presented in Joshi (1985) and Kroch and Joshi (1985). Structures for auxiliaries and raising verbs which use adjunction trees are also discussed.

We present a representation of prepositional complements that is based on extended elementary trees. This representation avoids the need for preposition incorporation in order to account for double wh-questions (preposition stranding and pied-piping) and the pseudo-passive. A treatment of light verb constructions is also given, similar to what Abeille (1988c) has presented. Again, neither noun nor adjective incorporation is needed to handle double passives and to account for CNPC violations in these constructions. TAG's extended domain of locality allows us to handle, within a single level of syntactic description, phenomena that in other frameworks require either dual analyses or reanalysis. In addition, following Abeille and Schabes (1989), we describe how to deal with semantic non-compositionality in verb-particle combinations, light verb constructions and idioms, without losing the internal syntactic composition of these structures.

The last sections discuss current work on PRO, case, anaphora and negation, and outline future work on copula constructions and small clauses, optional arguments, adverb movement and the nature of syntactic rules in a lexicalized framework.

Comments: University of Pennsylvania Department of Computer and Information Science Technical Report No. MS-CIS-90-24. This technical report is available at ScholarlyCommons: http://repository.upenn.edu/cis_reports/527

A Lexicalized Tree Adjoining Grammar for English
MS-CIS-90-24, LINC LAB 170
Anne Abeillé, Kathleen Bishop, Sharon Cote, Yves Schabes
Department of Computer and Information Science, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, PA 19104-6389
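The LTAG composition mechanism summarized in the abstract above (elementary trees anchored by lexical items, combined by substitution and adjunction) can be made concrete with a small sketch. The Python fragment below is not from the paper or its grammar: the Node class, the helper functions and the example trees are illustrative assumptions, and it omits the feature structures that the English LTAG unifies at each node during composition.

```python
# Minimal sketch (not from the paper) of how an LTAG lexicon might associate
# elementary trees with lexical anchors and compose them by substitution and
# adjunction. All class, function and tree names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Node:
    label: str                       # syntactic category, e.g. "S", "NP", "VP"
    children: List["Node"] = field(default_factory=list)
    lexical: Optional[str] = None    # anchor or substituted word, if a leaf
    subst: bool = False              # marked as a substitution site (NP↓)
    foot: bool = False               # foot node of an auxiliary tree (VP*)

    def show(self, depth: int = 0) -> str:
        mark = "↓" if self.subst else ("*" if self.foot else "")
        head = f"{self.label}{mark}" + (f" '{self.lexical}'" if self.lexical else "")
        return "\n".join(["  " * depth + head] +
                         [c.show(depth + 1) for c in self.children])


def find(node: Node, pred) -> Optional[Node]:
    """Return the first node (preorder) satisfying pred."""
    if pred(node):
        return node
    for child in node.children:
        hit = find(child, pred)
        if hit:
            return hit
    return None


def substitute(tree: Node, label: str, initial: Node) -> None:
    """Plug an initial tree into an open substitution site of matching category."""
    site = find(tree, lambda n: n.subst and n.label == label)
    assert site is not None, f"no open {label} substitution site"
    site.children, site.lexical, site.subst = initial.children, initial.lexical, False


def adjoin(tree: Node, label: str, auxiliary: Node) -> None:
    """Adjoin an auxiliary tree at an internal node of matching category."""
    target = find(tree, lambda n: n.label == label and n.children)
    foot = find(auxiliary, lambda n: n.foot)
    assert target is not None and foot is not None
    # The foot node takes over the material below the adjunction site,
    # and the auxiliary tree replaces that material in place.
    foot.children, foot.lexical, foot.foot = target.children, target.lexical, False
    target.children, target.lexical = auxiliary.children, auxiliary.lexical


# Elementary tree anchored by the transitive verb "likes": the anchor itself
# determines that the tree contains a subject and an object NP position.
likes = Node("S", [
    Node("NP", subst=True),
    Node("VP", [Node("V", lexical="likes"), Node("NP", subst=True)]),
])

# Initial trees anchored by nouns, and an auxiliary tree anchored by an adverb.
john = Node("NP", [Node("N", lexical="John")])
music = Node("NP", [Node("N", lexical="music")])
really = Node("VP", [Node("Adv", lexical="really"), Node("VP", foot=True)])

substitute(likes, "NP", john)    # fill the subject NP↓ site
substitute(likes, "NP", music)   # fill the object NP↓ site
adjoin(likes, "VP", really)      # adjoin "really" at the VP node
print(likes.show())              # derived tree for "John really likes music"
```

The sketch is meant to reflect the locality point made in the abstract: the tree anchored by "likes" already contains both of its NP argument positions, so constraints between the verb and its arguments can be stated within that single elementary tree, while material such as adverbs, auxiliaries and raising verbs is factored into auxiliary trees that adjoin into it.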


Citations
Book Chapter

Tree-adjoining grammars

TL;DR: A tree-generating system called tree-adjoining grammar (TAG) is described, together with a number of formal results that have been established for TAGs; these are of interest to researchers in formal languages and automata, including those interested in tree grammars and tree automata.
Journal Article

CCGbank: A Corpus of CCG Derivations and Dependency Structures Extracted from the Penn Treebank

TL;DR: This article presents an algorithm for translating the Penn Treebank into a corpus of Combinatory Categorial Grammar (CCG) derivations augmented with local and long-range word-word dependencies, and discusses the implications of the findings for the extraction of other linguistically expressive grammars from the Treebank, and for the design of future treebanks.
Journal Article

Supertagging: an approach to almost parsing

TL;DR: This paper proposes novel methods for robust parsing that integrate the flexibility of linguistically motivated lexical descriptions with the robustness of statistical techniques.
Journal Article

A large-scale classification of English verbs

TL;DR: The result is a comprehensive Levin-style classification for English verbs that provides over 90% token coverage of the Proposition Bank data and can therefore be highly useful for practical applications.
Proceedings Article

An Open Source Grammar Development Environment and Broad-coverage English Grammar Using HPSG

TL;DR: An outline is given of the LinGO English grammar and the LKB system, which supports collaborative development on many levels, and the ways in which they are currently being used are discussed.