Topic

Context-free grammar

About: Context-free grammar is a research topic. Over its lifetime, 3,449 publications have been published within this topic, receiving 92,951 citations. The topic is also known as: context free grammar & CFG.
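
To make the topic concrete, here is a minimal illustrative sketch (a toy example, not drawn from the publications listed below): the context-free grammar S -> "(" S ")" S | ε generates exactly the balanced-parenthesis strings, and each production maps directly onto a clause of a small recursive-descent recognizer. The function names are hypothetical.

def recognize(s: str) -> bool:
    # Grammar: S -> "(" S ")" S | ε  (balanced parentheses)
    def parse_S(i: int) -> int:
        # Try S -> "(" S ")" S
        if i < len(s) and s[i] == "(":
            j = parse_S(i + 1)          # inner S
            if j < len(s) and s[j] == ")":
                return parse_S(j + 1)   # trailing S
        # Otherwise use S -> ε and consume nothing
        return i
    return parse_S(0) == len(s)

print(recognize("(()())"))  # True
print(recognize("(()"))     # False
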
Papers

Book
01 May 1965
Abstract: Contents: Methodological preliminaries: Generative grammars as theories of linguistic competence; theory of performance; organization of a generative grammar; justification of grammars; formal and substantive grammars; descriptive and explanatory theories; evaluation procedures; linguistic theory and language learning; generative capacity and its linguistic relevance. Categories and relations in syntactic theory: Scope of the base; aspects of deep structure; illustrative fragment of the base component; types of base rules. Deep structures and grammatical transformations. Residual problems: Boundaries of syntax and semantics; structure of the lexicon.

12,204 citations


Book
01 Jan 1969
TL;DR: The theory of formal languages is presented as a coherent theory and its relationship to automata theory is made explicit, with the Turing machine covered in detail along with certain advanced topics in language theory.
Abstract: From the Preface (See Front Matter for full Preface) The study of formal languages constitutes an important subarea of computer science. This area sprang to life around 1956 when Noam Chomsky gave a mathematical model of a grammar in connection with his study of natural languages. Shortly afterwards, the concept of a grammar was found to be of great importance to the programmer when the syntax of the programming language ALGOL was defined by a context-free grammar. This development led naturally to syntax-directed compiling and the concept of a compiler compiler. Since then a considerable flurry of activity has taken place, the results of which have related formal languages and automata theory to such an extent that it is impossible to treat the areas separately. By now, no serious study of computer science would be complete without a knowledge of the techniques and results from language and automata theory. This book presents the theory of formal languages as a coherent theory and makes explicit its relationship to automata. The book begins with an explanation of the notion of a finite description of a language. The fundamental descriptive device--the grammar--is explained, as well as its three major subclasses--regular, context-free, and context-sensitive grammars. The context-free grammars are treated in detail, and such topics as normal forms, derivation trees, and ambiguity are covered. Four types of automata equivalent to the four types of grammars are described. These automata are the finite automaton, the pushdown automaton, the linear bounded automaton, and the Turing machine. The Turing machine is covered in detail, and unsolvability of the halting problem shown. The book concludes with certain advanced topics in language theory--closure properties, computational complexity, deterministic pushdown automata, LR(k) grammars, stack automata, and decidability.

1,583 citations
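
The grammar/automaton correspondence the book treats can be illustrated with a small sketch (an assumption-laden toy, not the book's own presentation): the context-free language { a^n b^n : n >= 0 }, generated by the grammar S -> a S b | ε, is recognized by a procedure that uses a stack in the manner of a pushdown automaton, whereas no finite automaton can recognize it. The function name is invented for the example.

def accepts_anbn(s: str) -> bool:
    # Stack-based recognizer for { a^n b^n }, generated by S -> a S b | ε.
    stack = []
    i = 0
    while i < len(s) and s[i] == "a":             # push one marker per leading 'a'
        stack.append("A")
        i += 1
    while i < len(s) and s[i] == "b" and stack:   # pop one marker per 'b'
        stack.pop()
        i += 1
    # Accept only if the whole input is consumed and every 'a' was matched.
    return i == len(s) and not stack

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
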


Journal ArticleDOI
Abstract: A parsing algorithm which seems to be the most efficient general context-free algorithm known is described. It is similar to both Knuth's LR(k) algorithm and the familiar top-down algorithm. It has a time bound proportional to n³ (where n is the length of the string being parsed) in general; it has an n² bound for unambiguous grammars; and it runs in linear time on a large class of grammars, which seems to include most practical context-free programming language grammars. In an empirical comparison it appears to be superior to the top-down and bottom-up algorithms studied by Griffiths and Petrick.

1,481 citations
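
For orientation, here is a minimal chart-based recognizer in the spirit of the algorithm described above (an illustrative sketch, not the paper's implementation): the toy arithmetic grammar, the item representation, and the function name are assumptions for the example, and this simplified version makes no attempt at the paper's complexity guarantees.

GRAMMAR = {
    "S": [["S", "+", "M"], ["M"]],
    "M": [["M", "*", "T"], ["T"]],
    "T": [["a"]],
}
START = "S"

def earley_recognize(tokens):
    # chart[i] holds items (lhs, rhs, dot, origin): the rule lhs -> rhs with
    # rhs[:dot] already matched, having started at input position `origin`.
    # (Epsilon productions are ignored in this toy version.)
    n = len(tokens)
    chart = [set() for _ in range(n + 1)]
    for rhs in GRAMMAR[START]:
        chart[0].add((START, tuple(rhs), 0, 0))
    for i in range(n + 1):
        changed = True
        while changed:  # run predictor and completer to a fixed point
            changed = False
            for (lhs, rhs, dot, origin) in list(chart[i]):
                if dot < len(rhs) and rhs[dot] in GRAMMAR:
                    # Predictor: expand the nonterminal after the dot.
                    for alt in GRAMMAR[rhs[dot]]:
                        item = (rhs[dot], tuple(alt), 0, i)
                        if item not in chart[i]:
                            chart[i].add(item)
                            changed = True
                elif dot == len(rhs):
                    # Completer: advance items that were waiting on this nonterminal.
                    for (l2, r2, d2, o2) in list(chart[origin]):
                        if d2 < len(r2) and r2[d2] == lhs:
                            item = (l2, r2, d2 + 1, o2)
                            if item not in chart[i]:
                                chart[i].add(item)
                                changed = True
        if i < n:
            # Scanner: match the next input token against a terminal after the dot.
            for (lhs, rhs, dot, origin) in chart[i]:
                if dot < len(rhs) and rhs[dot] not in GRAMMAR and rhs[dot] == tokens[i]:
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))
    return any(lhs == START and dot == len(rhs) and origin == 0
               for (lhs, rhs, dot, origin) in chart[n])

print(earley_recognize("a + a * a".split()))  # True
print(earley_recognize("a + *".split()))      # False
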


Journal ArticleDOI
TL;DR: The use of augmented transition network grammars for the analysis of natural language sentences is described, and structure-building actions associated with the arcs of the grammar network allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing.
Abstract: The use of augmented transition network grammars for the analysis of natural language sentences is described. Structure-building actions associated with the arcs of the grammar network allow for the reordering, restructuring, and copying of constituents necessary to produce deep-structure representations of the type normally obtained from a transformational analysis, and conditions on the arcs allow for a powerful selectivity which can rule out meaningless analyses and take advantage of semantic information to guide the parsing. The advantages of this model for natural language analysis are discussed in detail and illustrated by examples. An implementation of an experimental parsing system for transition network grammars is briefly described.

1,353 citations
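
As a rough illustration of the transition-network idea (a heavily simplified sketch under assumed conventions, not the formalism or system the paper describes): each nonterminal gets a small network of states, an arc either consumes a word of a given category or pushes into another network, and an action on the arc stores the matched constituent in a register so a structure is built during parsing. The lexicon, the three tiny networks, and all names are invented for the example, and the arc conditions and restructuring actions discussed in the abstract are omitted.

LEXICON = {"the": "Det", "dog": "N", "cat": "N", "saw": "V"}

# state -> list of arcs; an arc is (kind, label, next_state, register)
NETWORKS = {
    "S":  {"q0": [("push", "NP", "q1", "subject")],
           "q1": [("push", "VP", "q2", "predicate")],
           "q2": []},
    "NP": {"q0": [("cat", "Det", "q1", "det")],
           "q1": [("cat", "N", "q2", "noun")],
           "q2": []},
    "VP": {"q0": [("cat", "V", "q1", "verb")],
           "q1": [("push", "NP", "q2", "object")],
           "q2": []},
}
FINAL = {"S": "q2", "NP": "q2", "VP": "q2"}

def parse(net, words, i):
    # Traverse network `net` starting at word index i.
    # Returns (register contents, next index) or None on failure.
    def walk(state, i, registers):
        if state == FINAL[net]:
            return dict(registers), i
        for kind, label, nxt, reg in NETWORKS[net][state]:
            if kind == "cat" and i < len(words) and LEXICON.get(words[i]) == label:
                found = walk(nxt, i + 1, {**registers, reg: words[i]})
                if found:
                    return found
            elif kind == "push":
                sub = parse(label, words, i)
                if sub:
                    structure, j = sub
                    found = walk(nxt, j, {**registers, reg: structure})
                    if found:
                        return found
        return None
    return walk("q0", i, {})

print(parse("S", "the dog saw the cat".split(), 0))
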


Journal ArticleDOI
TL;DR: A sequence of restrictions is studied that limits grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources; these restrictions are shown to be increasingly heavy.
Abstract: A grammar can be regarded as a device that enumerates the sentences of a language. We study a sequence of restrictions that limit grammars first to Turing machines, then to two types of system from which a phrase structure description of the generated language can be drawn, and finally to finite state Markov sources (finite automata). These restrictions are shown to be increasingly heavy in the sense that the languages that can be generated by grammars meeting a given restriction constitute a proper subset of those that can be generated by grammars meeting the preceding restriction. Various formulations of phrase structure description are considered, and the source of their excess generative power over finite state sources is investigated in greater detail.

1,254 citations
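
For orientation, the sequence of restrictions the abstract refers to corresponds, in standard modern terminology (an illustrative summary, not wording from the paper), to the following production forms, with each language class a proper subset of the one before it:

Type 0 (unrestricted; Turing-machine power): any production alpha -> beta
Type 1 (context-sensitive): alpha A beta -> alpha gamma beta, rewriting a nonterminal only in context
Type 2 (context-free): A -> gamma, a single nonterminal on the left-hand side
Type 3 (regular; finite-state): A -> a or A -> a B
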


Network Information
Related Topics (5)
Tree-adjoining grammar: 2.4K papers, 57.8K citations, 92% related
L-attributed grammar: 2.5K papers, 58.5K citations, 90% related
Linear grammar: 55 papers, 646 citations, 90% related
Parser combinator: 2.2K papers, 66.7K citations, 89% related
Formal language: 5.7K papers, 154.1K citations, 89% related
Performance Metrics
No. of papers in the topic in previous years:

Year  Papers
2022  1
2021  31
2020  46
2019  54
2018  38
2017  64

Top Attributes


Topic's top 5 most impactful authors:

Grzegorz Rozenberg: 35 papers, 822 citations
Alexander Okhotin: 34 papers, 742 citations
Alexander Meduna: 25 papers, 142 citations
Giorgio Satta: 21 papers, 383 citations
Jürgen Dassow: 15 papers, 142 citations