SciSpace - Formally Typeset
Topic

Phrase structure grammar

About: Phrase structure grammar is a research topic. Over its lifetime, 1,459 publications have been published within this topic, receiving 48,733 citations.


Papers
Book
01 May 1965
TL;DR: Presents generative grammars as theories of linguistic competence, as distinct from theories of performance, and develops the organization and justification of a generative grammar, its base component and deep structures, and the grammatical transformations that relate them.
Abstract: Contents: Methodological preliminaries: Generative grammars as theories of linguistic competence; theory of performance; organization of a generative grammar; justification of grammars; formal and substantive grammars; descriptive and explanatory theories; evaluation procedures; linguistic theory and language learning; generative capacity and its linguistic relevance. Categories and relations in syntactic theory: Scope of the base; aspects of deep structure; illustrative fragment of the base component; types of base rules. Deep structures and grammatical transformations. Residual problems: Boundaries of syntax and semantics; structure of the lexicon.

12,225 citations

Journal ArticleDOI
TL;DR: It is found that no finite-state Markov process that produces symbols with transition from state to state can serve as an English grammar, and the particular subclass of such processes that produce n-order statistical approximations to English does not come closer, with increasing n, to matching the output of an English grammar.
Abstract: We investigate several conceptions of linguistic structure to determine whether or not they can provide simple and "revealing" grammars that generate all of the sentences of English and only these. We find that no finite-state Markov process that produces symbols with transition from state to state can serve as an English grammar. Furthermore, the particular subclass of such processes that produce n-order statistical approximations to English do not come closer, with increasing n, to matching the output of an English grammar. We formalize the notions of "phrase structure" and show that this gives us a method for describing language which is essentially more powerful, though still representable as a rather elementary type of finite-state process. Nevertheless, it is successful only when limited to a small subset of simple sentences. We study the formal properties of a set of grammatical transformations that carry sentences with phrase structure into new sentences with derived phrase structure, showing that transformational grammars are processes of the same elementary type as phrase-structure grammars; that the grammar of English is materially simplified if phrase structure description is limited to a kernel of simple sentences from which all other sentences are constructed by repeated transformations; and that this view of linguistic structure gives a certain insight into the use and understanding of language.

2,140 citations
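The notion of "phrase structure" in the abstract above can be made concrete with a toy grammar: a handful of rewrite rules that expand symbols until only words remain. The rules and vocabulary below are illustrative inventions for the sketch, not the fragment analyzed in the paper:

```python
import random

# Toy phrase structure grammar (context-free rewrite rules).
# Nonterminals map to lists of alternative right-hand sides;
# anything not in GRAMMAR is a terminal word.
GRAMMAR = {
    "S":     [["NP", "VP"]],
    "NP":    [["Det", "N"], ["Det", "N", "RelCl"]],
    "RelCl": [["that", "VP"]],           # source of self-embedding
    "VP":    [["V", "NP"], ["V"]],
    "Det":   [["the"], ["a"]],
    "N":     [["dog"], ["cat"], ["linguist"]],
    "V":     [["saw"], ["chased"], ["slept"]],
}

def generate(symbol="S", depth=0, max_depth=5):
    """Expand `symbol` by repeatedly choosing rewrite rules at random."""
    if symbol not in GRAMMAR:            # terminal: emit the word itself
        return [symbol]
    rules = GRAMMAR[symbol]
    if depth >= max_depth:
        rule = min(rules, key=len)       # bias toward short rules to terminate
    else:
        rule = random.choice(rules)
    words = []
    for sym in rule:
        words.extend(generate(sym, depth + 1, max_depth))
    return words
```

Each call to `generate()` yields one terminal string licensed by the rules; the recursion through `RelCl` produces nested structures of a kind that no fixed n-order statistical approximation captures.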

Journal ArticleDOI
TL;DR: Four ostensibly different theoretical models of induction are presented, in which the problem dealt with is the extrapolation of a very long sequence of symbols—presumably containing all of the information to be used in the induction.
Abstract: 1. Summary. In Part I, four ostensibly different theoretical models of induction are presented, in which the problem dealt with is the extrapolation of a very long sequence of symbols, presumably containing all of the information to be used in the induction. Almost all, if not all, problems in induction can be put in this form. Some strong heuristic arguments have been obtained for the equivalence of the last three models. One of these models is equivalent to a Bayes formulation, in which a priori probabilities are assigned to sequences of symbols on the basis of the lengths of inputs to a universal Turing machine that are required to produce the sequence of interest as output. Though it seems likely, it is not certain whether the first of the four models is equivalent to the other three. Few rigorous results are presented. Informal investigations are made of the properties of these models. There are discussions of their consistency and meaningfulness, of their degree of independence of the exact nature of the Turing machine used, and of the accuracy of their predictions in comparison to those of other induction methods. In Part II these models are applied to the solution of three problems: prediction of the Bernoulli sequence, extrapolation of a certain kind of Markov chain, and the use of phrase structure grammars for induction. Though some approximations are used, the first of these problems is treated most rigorously. The result is Laplace's rule of succession. The solution to the second problem uses less certain approximations, but the properties of the solution that are discussed are fairly independent of these approximations. The third application, using phrase structure grammars, is least exact of the three. First a formal solution is presented. Though it appears to have certain deficiencies, it is hoped that presentation of this admittedly inadequate model will suggest acceptable improvements in it. This formal solution is then applied in an approximate way to the determination of the "optimum" phrase structure grammar for a given set of strings. The results that are obtained are plausible, but subject to the uncertainties of the approximation used.

1,927 citations
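The Bernoulli-sequence result mentioned in the abstract, Laplace's rule of succession, is simple enough to state in code: after observing k occurrences of a symbol in n trials, the predicted probability of seeing it next is (k + 1)/(n + 2). A minimal sketch (the function name is ours):

```python
from fractions import Fraction

def laplace_rule(sequence, symbol="1"):
    """Laplace's rule of succession: having observed `symbol` k times in
    n trials, predict it next with probability (k + 1) / (n + 2)."""
    n = len(sequence)
    k = list(sequence).count(symbol)
    return Fraction(k + 1, n + 2)
```

With no observations at all the rule gives 1/2, and with more data it approaches the empirical frequency k/n, which is the behavior the Bayes formulation with a uniform prior predicts.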

Journal ArticleDOI
TL;DR: The use of augmented transition network grammars for the analysis of natural language sentences is described, and structure-building actions associated with the arcs of the grammar network allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing.
Abstract: The use of augmented transition network grammars for the analysis of natural language sentences is described. Structure-building actions associated with the arcs of the grammar network allow for the reordering, restructuring, and copying of constituents necessary to produce deep-structure representations of the type normally obtained from a transformational analysis, and conditions on the arcs allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing. The advantages of this model for natural language analysis are discussed in detail and illustrated by examples. An implementation of an experimental parsing system for transition network grammars is briefly described.

1,369 citations
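The underlying mechanism can be sketched as a bare recursive transition network: each named network is a small finite automaton whose arcs either consume a word of a given lexical category or push into a subnetwork. This sketch deliberately omits the registers and structure-building actions that make such networks "augmented"; the networks and lexicon below are illustrative assumptions, not taken from the paper:

```python
LEXICON = {"the": "DET", "dog": "N", "cat": "N", "saw": "V"}

# Each network: state -> list of (label, next_state). A label is either a
# lexical category (CAT arc) or another network's name (PUSH arc);
# next_state None means the network is complete (POP).
NETWORKS = {
    "S":  {0: [("NP", 1)], 1: [("VP", None)]},
    "NP": {0: [("DET", 1)], 1: [("N", None)]},
    "VP": {0: [("V", 1), ("V", None)], 1: [("NP", None)]},
}

def parse(net, words, i=0):
    """Try to traverse `net` over words[i:]; return the end index or None."""
    def walk(state, i):
        if state is None:                 # POP: subnetwork accepted
            return i
        for label, nxt in NETWORKS[net].get(state, []):
            if label in NETWORKS:         # PUSH arc: recurse into subnetwork
                j = parse(label, words, i)
                if j is not None:
                    end = walk(nxt, j)
                    if end is not None:
                        return end
            elif i < len(words) and LEXICON.get(words[i]) == label:
                end = walk(nxt, i + 1)    # CAT arc: consume one word
                if end is not None:
                    return end
        return None
    return walk(0, i)

def accepts(sentence):
    words = sentence.split()
    return parse("S", words) == len(words)
```

In a full ATN, arcs would additionally carry conditions and register-setting actions, which is what lets the traversal reorder and copy constituents into a deep-structure representation rather than merely accept or reject.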

Journal ArticleDOI
TL;DR: A sequence of restrictions that limit grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources are shown to be increasingly heavy.
Abstract: A grammar can be regarded as a device that enumerates the sentences of a language. We study a sequence of restrictions that limit grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources (finite automata). These restrictions are shown to be increasingly heavy in the sense that the languages that can be generated by grammars meeting a given restriction constitute a proper subset of those that can be generated by grammars meeting the preceding restriction. Various formulations of phrase structure description are considered, and the source of their excess generative power over finite state sources is investigated in greater detail.

1,330 citations
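A classic way to see the "excess generative power" of phrase structure over finite-state sources is the language of strings aⁿbⁿ (n ≥ 1): it is generated by the phrase structure rules S → aSb | ab, yet no finite automaton accepts it, since that would require counting an unbounded number of a's. A recognizer that directly mirrors the grammar (an illustration of the hierarchy, not code from the paper):

```python
def matches_S(s):
    """True iff s is derivable from the rules S -> 'a' S 'b' | 'a' 'b'."""
    if s == "ab":                         # base rule: S -> a b
        return True
    # recursive rule: S -> a S b, i.e. strip a matching outer a...b pair
    return len(s) >= 4 and s[0] == "a" and s[-1] == "b" and matches_S(s[1:-1])
```

Each recursive call strips one matched a...b pair, so acceptance depends on the nesting depth of the whole string, which is exactly the kind of dependency a finite-state Markov source cannot track.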


Network Information
Related Topics (5)
Semantics
24.9K papers, 653K citations
82% related
Parsing
21.5K papers, 545.4K citations
81% related
Natural language
31.1K papers, 806.8K citations
80% related
Syntax
16.7K papers, 518.6K citations
79% related
Graph (abstract data type)
69.9K papers, 1.2M citations
78% related
Performance Metrics
No. of papers in the topic in previous years
Year  Papers
2023  1
2022  6
2021  3
2020  3
2019  11
2018  9