Book

Head-driven phrase structure grammar

TL;DR: This book presents the most complete exposition of the theory of head-driven phrase structure grammar, introduced in the authors' "Information-Based Syntax and Semantics," and demonstrates the applicability of the HPSG approach to a wide range of empirical problems.
Abstract: This book presents the most complete exposition of the theory of head-driven phrase structure grammar (HPSG), introduced in the authors' "Information-Based Syntax and Semantics." HPSG provides an integration of key ideas from the various disciplines of cognitive science, drawing on results from diverse approaches to syntactic theory, situation semantics, data type theory, and knowledge representation. The result is a conception of grammar as a set of declarative and order-independent constraints, a conception well suited to modelling human language processing. This self-contained volume demonstrates the applicability of the HPSG approach to a wide range of empirical problems, including a number which have occupied center-stage within syntactic theory for well over twenty years: the control of "understood" subjects, long-distance dependencies conventionally treated in terms of "wh"-movement, and syntactic constraints on the relationship between various kinds of pronouns and their antecedents. The authors make clear how their approach compares with and improves upon approaches undertaken in other frameworks, including in particular the government-binding theory of Noam Chomsky.
Citations
Book
28 May 1999
TL;DR: This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear and provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations.
Abstract: Statistical approaches to processing natural language text have become dominant in recent years. This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.

9,295 citations

Posted Content
TL;DR: NLTK, the Natural Language Toolkit, is a suite of open source program modules, tutorials and problem sets, providing ready-to-use computational linguistics courseware that covers symbolic and statistical natural language processing.
Abstract: NLTK, the Natural Language Toolkit, is a suite of open source program modules, tutorials and problem sets, providing ready-to-use computational linguistics courseware. NLTK covers symbolic and statistical natural language processing, and is interfaced to annotated corpora. Students augment and replace existing components, learn structured programming by example, and manipulate sophisticated models from the outset.
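
To give a feel for the ready-to-use courseware the abstract describes, here is a minimal sketch of a first NLTK exercise: tokenize a sentence and tag it with parts of speech. The example sentence is invented; the punkt and averaged_perceptron_tagger data packages must be fetched once with nltk.download.

    import nltk

    # First run only: fetch the tokenizer and tagger models.
    nltk.download('punkt')
    nltk.download('averaged_perceptron_tagger')

    tokens = nltk.word_tokenize("Kim gave Sandy a book.")
    print(nltk.pos_tag(tokens))
    # roughly: [('Kim', 'NNP'), ('gave', 'VBD'), ('Sandy', 'NNP'),
    #           ('a', 'DT'), ('book', 'NN'), ('.', '.')]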

3,345 citations

Book
01 Dec 1999
TL;DR: It is now clear that HAL's creator, Arthur C. Clarke, was a little optimistic in predicting when an artificial agent such as HAL would be available, as discussed by the authors.
Abstract: The HAL 9000 computer in Stanley Kubrick's film 2001: A Space Odyssey is one of the most recognizable characters in 20th-century cinema. HAL is an artificial agent capable of such advanced language behavior as speaking and understanding English, and at a crucial moment in the plot, even reading lips. It is now clear that HAL's creator, Arthur C. Clarke, was a little optimistic in predicting when an artificial agent such as HAL would be available. But just how far off was he? What would it take to create at least the language-related parts of HAL? We call programs like HAL that converse with humans in natural…

3,077 citations

Proceedings ArticleDOI
17 Jul 2006
TL;DR: The Natural Language Toolkit has been rewritten, simplifying many linguistic data structures and taking advantage of recent enhancements in the Python language.
Abstract: The Natural Language Toolkit is a suite of program modules, data sets and tutorials supporting research and teaching in computational linguistics and natural language processing. NLTK is written in Python and distributed under the GPL open source license. Over the past year the toolkit has been rewritten, simplifying many linguistic data structures and taking advantage of recent enhancements in the Python language. This paper reports on the simplified toolkit and explains how it is used in teaching NLP.
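
As a rough illustration of the simplified data structures the paper mentions, the sketch below builds a toy context-free grammar and parses one sentence with the toolkit's chart parser; the grammar and sentence are invented for illustration.

    import nltk

    # A toy grammar in NLTK's CFG notation (invented for illustration).
    grammar = nltk.CFG.fromstring("""
        S -> NP VP
        NP -> 'Kim' | 'Sandy'
        VP -> V NP
        V -> 'saw'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse(['Kim', 'saw', 'Sandy']):
        print(tree)  # (S (NP Kim) (VP (V saw) (NP Sandy)))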

2,835 citations


Cites background from "Head-driven phrase structure gramma..."

  • Infrastructure for grammar development has a long history in unification-based (or constraint-based) grammar frameworks, from DCG (Pereira and Warren, 1980) to HPSG (Pollard and Sag, 1994).

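As a rough sketch of what such infrastructure looks like in practice (using NLTK's feature-grammar notation, not the DCG or HPSG formalisms themselves), the toy grammar below enforces subject-verb agreement through unification of a shared NUM feature; the grammar and sentences are invented.

    import nltk

    # Toy unification grammar: the variable ?n forces NP and VP to agree in number.
    fcfg = nltk.grammar.FeatureGrammar.fromstring("""
        % start S
        S -> NP[NUM=?n] VP[NUM=?n]
        NP[NUM=sg] -> 'Kim'
        NP[NUM=pl] -> 'dogs'
        VP[NUM=?n] -> V[NUM=?n]
        V[NUM=sg] -> 'sleeps'
        V[NUM=pl] -> 'sleep'
    """)

    parser = nltk.parse.FeatureChartParser(fcfg)
    print(len(list(parser.parse(['Kim', 'sleeps']))))  # 1 -- agreement succeeds
    print(len(list(parser.parse(['Kim', 'sleep']))))   # 0 -- agreement fails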

Journal ArticleDOI
TL;DR: A mechanistic account of dialogue, the interactive alignment account, is proposed and used to derive a number of predictions about basic language processes, and the need for a grammatical framework that is designed to deal with language in dialogue rather than monologue is considered.
Abstract: Traditional mechanistic accounts of language processing derive almost entirely from the study of monologue. Yet, the most natural and basic form of language use is dialogue. As a result, these accounts may only offer limited theories of the mechanisms that underlie language processing in general. We propose a mechanistic account of dialogue, the interactive alignment account, and use it to derive a number of predictions about basic language processes. The account assumes that, in dialogue, the linguistic representations employed by the interlocutors become aligned at many levels, as a result of a largely automatic process. This process greatly simplifies production and comprehension in dialogue. After considering the evidence for the interactive alignment model, we concentrate on three aspects of processing that follow from it. It makes use of a simple interactive inference mechanism, enables the development of local dialogue routines that greatly simplify language processing, and explains the origins of self-monitoring in production. We consider the need for a grammatical framework that is designed to deal with language in dialogue rather than monologue, and discuss a range of implications of the account.

2,222 citations


Cites methods from "Head-driven phrase structure gramma..."

  • Instead, the interactive alignment model is compatible with constraint-based grammar approaches in which syntax, semantics, and phonology form separate but equal parts of a multidimensional sign (Gazdar et al. 1985; Kaplan & Bresnan 1982; Pollard & Sag 1994).


References
Book
01 Jan 1981

7,936 citations

Journal ArticleDOI
01 Dec 1985-Language
TL;DR: In this volume, twelve articles are grouped into three sections; section "II. Syntactic Representation" includes Lexical-Functional Grammar: A Formal Theory for Grammatical Representation (R. Kaplan and J. Bresnan) and Control and Complementation (J. Bresnan).
Abstract: The editor of this volume, who is also author or coauthor of five of the contributions, has provided an introduction that not only affords an overview of the separate articles but also interrelates the basic issues in linguistics, psycholinguistics and cognitive studies that are addressed in this volume. The twelve articles are grouped into three sections, as follows: "I. Lexical Representation:" The Passive in Lexical Theory (J. Bresnan); On the Lexical Representation of Romance Reflexive Clitics (J. Grimshaw); and Polyadicity (J. Bresnan). "II. Syntactic Representation:" Lexical-Functional Grammar: A Formal Theory for Grammatical Representation (R. Kaplan and J. Bresnan); Control and Complementation (J. Bresnan); Case Agreement in Russian (C. Neidle); The Representation of Case in Icelandic (A. Andrews); Grammatical Relations and Clause Structure in Malayalam (K. P. Mohanan); and Sluicing: A Lexical Interpretation Procedure (L. Levin). "III. Cognitive Processing of Grammatical Representations:" A Theory of the Acquisition of Lexical Interpretive Grammars (S. Pinker); Toward a Theory of Lexico-Syntactic Interactions in Sentence Perception (M. Ford, J. Bresnan, and R. Kaplan); and Sentence Planning Units: Implications for the Speaker's Representation of Meaningful Relations Underlying Sentences (M. Ford).

1,908 citations

Book
01 Jan 1985
TL;DR: "Generalized Phrase Structure Grammar" provides the definitive exposition of the theory of grammar originally proposed by Gerald Gazdar and developed during half a dozen years' work with his colleagues Ewan Klein, Geoffrey Pullum, and Ivan Sag.
Abstract: "Generalized Phrase Structure Grammar" provides the definitive exposition of the theory of grammar originally proposed by Gerald Gazdar and developed during half a dozen years' work with his colleagues Ewan Klein, Geoffrey Pullum, and Ivan Sag. This long-awaited book contains both detailed specifications of the theory and extensive illustrations of its power to describe large parts of English grammar. Experts who wish to evaluate the theory and students learning GPSP for the first time will find this book an invaluable guide.The initial chapters lay out the theoretical machinery of GPSP in a readily intelligible way. Combining informal discussion with precise formalization, the authors describe all major aspects of their grammatical system, including a complete theory of syntactic features, phrase structure rules, meta rules, and feature instantiation principles. The book then shows just what a GPSP analysis of English syntax can accomplish. Topics include the internal structure of phrases, unbounded dependency constructions of many varieties, and coordinate conjunction a construction long considered the sticking point for phrase structure approaches to syntax.The book concludes with a well developed proposal for a model theoretic semantic system to go along with GPSP syntax. Throughout, the authors maintain the highest standards of explicitness and rigor in developing and assessing their grammatical system. Their aim is to provide the best possible test of the hypothesis that syntactic description can be accomplished in a single-level system. And more generally, it is their intention to formulate a grammatical framework in which linguistic universals follow directly from the form of the system and therefore require no explicit statement. Their book sets new methodological standards for work in generative grammar while presenting a grammatical system of extraordinary scope."

1,856 citations


"Head-driven phrase structure gramma..." refers background or methods in this paper

  • As in the tradition of Gazdar et al. (1985), the analyses are detailed and made precise, so that a fair evaluation is possible. Moreover, the formalism used is similar enough to by-now traditional NLP grammar formalisms (in particular feature-based grammars) that readers without prior knowledge of HPSG implementations can have a reasonable idea of how to implement the analyses. As mentioned, the book carefully compares P&S's proposals to standard analyses within Principles and Parameters. Unfortunately, there is much less explicit comparison with work done within frameworks intellectually closer to HPSG, such as lexical-functional grammar (LFG) (Bresnan 1982). Finally, although the book focuses mainly on English syntax, P&S attempt to provide a cross-linguistic perspective in several chapters, in particular when dealing with agreement and relative clauses. To the novice reader, HPSG is a constraint-based grammatical formalism that belongs to the growing family of frameworks using feature structures as their basic data structure. In contrast to other feature-based frameworks, such as PATR-II (Shieber 1986; Shieber et al. 1983), LFG, or GPSG, HPSG does not rely on a context-free backbone; constituency is only one of the attributes of the linguistic object par excellence, the linguistic sign. It is on a par with other syntactic and semantic attributes. Moreover, HPSG is characterized by a systematic use of typing of feature structures (not unlike the use of templates in PATR-II). It is similar to DATR (Evans and Gazdar 1989a, 1989b) in this respect. HPSG uses a multiple-inheritance scheme over those types to cross-classify linguistic objects. Although they leave the question open, P&S de facto use a strict inheritance scheme in their analyses, rather than the default inheritance sometimes used in similar approaches (see Briscoe, de Paiva, and Copestake 1993). Overall, then, the formalism that HPSG uses, which owes a lot to King (1989), is very close to that discussed in detail in Carpenter (1992), for example. But for typing, it uses a logic similar to that developed by Kasper and Rounds (1986) or Keller (1993). Typing in HPSG is used to factor out shared properties of linguistic objects, be they words, phrases, or anything else, into appropriate classes. [A feature-structure sketch follows these excerpts.]

  • Technically, the modeling of unbounded dependencies in HPSG (leaving aside the reformulation in chapter 9, for now) is similar to the classic SLASH percolation method used by Gazdar et al. (1985). There are three major differences between the two treatments.

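The feature-structure machinery described in these excerpts can be made concrete with a small sketch, assuming nothing of P&S's actual type system: NLTK's FeatStruct supports the two operations the review highlights, monotonic unification of partial descriptions and structure sharing (reentrancy) between attributes. The attribute names below are illustrative, not P&S's.

    import nltk

    # Two partial descriptions of one sign; the variable ?a makes the AGR
    # value under SYNSEM and under SUBJ token-identical (structure sharing).
    fs1 = nltk.FeatStruct("[SYNSEM=[CAT=verb, AGR=?a], SUBJ=[AGR=?a]]")
    fs2 = nltk.FeatStruct("[SYNSEM=[AGR=[NUM=sg, PER=3]]]")

    # Unification accumulates constraints monotonically; in the result, AGR
    # is a single reentrant value shared by SYNSEM and SUBJ.
    print(fs1.unify(fs2))

    # Incompatible constraints are never overridden; unification simply fails.
    print(fs1.unify(nltk.FeatStruct("[SYNSEM=[CAT=noun]]")))  # None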

Book
01 Jan 1986
TL;DR: This book surveys the important concept of unification as it relates to linguistic theory and, in particular, to Functional Unification Grammar, Definite-Clause Grammars, Lexical-Functional Grammar, Generalized Phrase Structure Grammar, and Head-Driven Phrase Structure Grammar.
Abstract: This book surveys the important concept of unification as it relates to linguistic theory and, in particular, to Functional Unification Grammar, Definite-Clause Grammars, Lexical-Functional Grammar, Generalized Phrase Structure Grammar, and Head-Driven Phrase Structure Grammar. The notes include careful and correct definitions, as well as well-chosen examples of actual grammars, and a discussion of the relationships of computational systems and linguistic theories which use ideas from unification.

902 citations

Book
01 Sep 1992
TL;DR: The Logic of Typed Feature Structures is the first monograph to bring all the main theoretical ideas into one place, where they can be related and compared in a unified setting.
Abstract: For those of us who belonged to the "Bay Area (Computational) Linguistics Community," the early eighties were a heady time. Local researchers working on linguistics, computational linguistics, and logic programming were investigating notions of category, type, feature, term, and partial specification that appeared to converge to a powerful new approach for describing (linguistic) objects and their relationships by monotonic accumulation of constraints between their features. The seed notions had almost independently arisen in generalized phrase structure grammar (GPSG) (Gazdar et al. 1985), lexical-functional grammar (LFG) (Bresnan and Kaplan 1982), functional unification grammar (FUG) (Kay 1985), logic programming (Colmerauer 1978, Pereira and Warren 1980), and terminological reasoning systems (Ait-Kaci 1984). It took, however, a lot of experimental and theoretical work to identify precisely what the core notions were, how particular systems related to the core notions, and what were the most illuminating mathematical accounts of that core. The development of the unification-based formalism PATR-II (Shieber 1984) was an early step toward the definition of the core, but its mathematical analysis, and the clarification of the connections between the various systems, are only now coming to a reasonable closure. The Logic of Typed Feature Structures is the first monograph that brings all the main theoretical ideas into one place where they can be related and compared in a unified setting. Carpenter's book touches most of the crucial questions of the developments during the decade, provides proofs for central results, and reaches right up to the edge of current research in the field. These contributions alone make it an indispensable compendium for the researcher or graduate student working on constraint-based grammatical formalisms, and they also make it a very useful reference work for researchers in object-oriented databases and logic programming. Having discharged the main obligation of the reviewer of saying who should read the book under review and why, I will now survey each of the book's four parts while raising some more general questions impinging on the whole book as they arise from the discussion of each part.

849 citations