Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have been published on this topic, receiving 45,911 citations. The topic is also known as: CSG.


Papers
01 Apr 1984
TL;DR: The separation of syntactic and semantic rules is intended to promote modularity, simplicity and clarity of definition, and ease of modification as compared to Definite Clause Grammars, Metamorphosis Grammars, and Restriction Grammars.
Abstract: In this paper we introduce Definite Clause Translation Grammars, a new class of logic grammars which generalizes Definite Clause Grammars and which may be thought of as a logical implementation of Attribute Grammars. Definite Clause Translation Grammars permit the specification of the syntax and semantics of a language: the syntax is specified as in Definite Clause Grammars, but the semantics is specified by one or more semantic rules in the form of Horn clauses attached to each node of the parse tree (automatically created during syntactic analysis), which control traversal(s) of the parse tree and computation of attributes of each node. The semantic rules attached to a node therefore constitute a local data base for that node. The separation of syntactic and semantic rules is intended to promote modularity, simplicity and clarity of definition, and ease of modification as compared to Definite Clause Grammars, Metamorphosis Grammars, and Restriction Grammars.
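The core idea of attaching semantic rules to parse-tree nodes can be sketched outside of logic programming. The toy below is my own illustration in Python, not the paper's Prolog-based notation: each node carries a small rule (standing in for the Horn clauses of a DCTG) that computes the node's attribute from its children's attributes during a tree traversal.

```python
# Illustrative sketch (assumed names, not the paper's formalism): each
# parse-tree node carries its own semantic rule, evaluated after the tree
# has been built by syntactic analysis.

class Node:
    def __init__(self, label, children=(), rule=None):
        self.label = label
        self.children = list(children)
        # 'rule' plays the role of the Horn clauses attached to the node:
        # it computes this node's attribute from its children's attributes.
        self.rule = rule or (lambda vals: vals[0] if vals else None)

    def attribute(self):
        # Traverse the tree bottom-up, applying each node's local rule.
        return self.rule([c.attribute() for c in self.children])

# Parse tree for "2 + 3 * 4", as syntactic analysis might produce it.
tree = Node("expr", [
    Node("num", rule=lambda _: 2),
    Node("expr", [
        Node("num", rule=lambda _: 3),
        Node("num", rule=lambda _: 4),
    ], rule=lambda vals: vals[0] * vals[1]),
], rule=lambda vals: vals[0] + vals[1])

print(tree.attribute())  # 2 + 3 * 4 = 14
```

The point of the design, as in the paper, is that each node's semantics is a local data base: the rules for multiplication live at the multiplication node, independent of the grammar's other rules.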

56 citations

Journal ArticleDOI
TL;DR: Every system of language equations of this kind, resolved with respect to variables and containing the operations of concatenation, union and intersection, is proved to have a least fixed point, and the equivalence of these systems to conjunctive grammars is established.
Abstract: This paper studies systems of language equations that are resolved with respect to variables and contain the operations of concatenation, union and intersection. Every system of this kind is proved to have a least fixed point, and the equivalence of these systems to conjunctive grammars is established. This allows us to obtain an algebraic characterization of the language family generated by conjunctive grammars.
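The least-fixed-point construction can be made concrete. The sketch below is an assumption-laden toy, not the paper's formal development: it iterates a resolved system of language equations upward from the empty languages, truncating to strings of bounded length so each step stays finite. The intersection in the rule for S is the conjunctive ingredient, and it carves out the non-context-free language { aⁿbⁿcⁿ }.

```python
# Toy least-fixed-point iteration for a resolved system of language
# equations over concatenation (cat), union (|), and intersection (&),
# restricted to strings of length <= MAX so every step is finite.

MAX = 9

def cat(X, Y):
    # Truncated concatenation of two finite languages.
    return {x + y for x in X for y in Y if len(x) + len(y) <= MAX}

def step(v):
    A, B, C, D, S = v["A"], v["B"], v["C"], v["D"], v["S"]
    return {
        "A": cat({"a"}, A) | {""},               # A = aA | eps  ->  a*
        "B": cat({"b"}, cat(B, {"c"})) | {""},   # B = bBc | eps ->  b^n c^n
        "D": cat({"a"}, cat(D, {"b"})) | {""},   # D = aDb | eps ->  a^n b^n
        "C": cat({"c"}, C) | {""},               # C = cC | eps  ->  c*
        # Conjunctive rule: the intersection selects exactly a^n b^n c^n.
        "S": cat(A, B) & cat(D, C),
    }

# Iterate from the bottom (all-empty) valuation until nothing changes.
v = {k: set() for k in "ABCDS"}
while True:
    nv = step(v)
    if nv == v:
        break
    v = nv

print(sorted(v["S"], key=len))  # ['', 'abc', 'aabbcc', 'aaabbbccc']
```

Because all three operations are monotone and the truncated universe is finite, the iteration is guaranteed to stop at the least fixed point, which is the content of the paper's existence result.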

55 citations

Journal ArticleDOI
TL;DR: In this paper, six different types of shape grammars are defined by considering different kinds of restrictions on rule format and rule ordering, and the effects that these different restrictions have on the generative power, practicality, pedagogical value, and other characteristics of a shape grammar are discussed.
Abstract: The issue of decidability in relation to shape grammars is considered here. Decidability concerns, first, the identification of different types of grammars and, second, the answerability or solvability of questions about these types of grammars. In this paper, the first of these two topics is examined. Six different types of shape grammars are defined by considering different kinds of restrictions on rule format and rule ordering. The effects that these different restrictions have on the generative power, practicality, pedagogical value, and other characteristics of a shape grammar are discussed. In a subsequent paper, “Shape grammars: five questions” (Knight, 1998), the answerabilities of various questions about the types of shape grammars outlined here are explored. The decidability issues addressed in this paper and the subsequent one are key to the practical use of shape grammars in design projects where specific goals and constraints need to be satisfied.

55 citations

Book ChapterDOI
TL;DR: What is currently known about natural language morphology and syntax from the perspective of formal language theory is surveyed, and recent developments such as feature-theory, the use of extension and unification, default mechanisms, and metagrammatical techniques are outlined.
Abstract: This paper surveys what is currently known about natural language morphology and syntax from the perspective of formal language theory. Firstly, the position of natural language word-sets and sentence-sets on the formal language hierarchy is discussed. Secondly, the contemporary use by linguists of a range of formal grammars (from finite state transducers to indexed grammars) in both word-syntax (i.e. morphology) and sentence-syntax is sketched. Finally, recent developments such as feature-theory, the use of extension and unification, default mechanisms, and metagrammatical techniques are outlined.
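A concrete reference point for such hierarchy discussions is the textbook context-sensitive grammar for { aⁿbⁿcⁿ : n ≥ 1 }, a language that no context-free grammar can generate. The sketch below is my own illustration (not from the paper): since context-sensitive rules never shrink the string, all derivations up to a length bound can be enumerated by exhaustive rewriting.

```python
# Textbook context-sensitive grammar for { a^n b^n c^n : n >= 1 },
# explored by brute-force rewriting bounded by string length.

RULES = [
    ("S", "aSBC"), ("S", "aBC"),
    ("CB", "BC"),
    ("aB", "ab"), ("bB", "bb"),
    ("bC", "bc"), ("cC", "cc"),
]
MAX = 9  # derivations never shorten strings, so this bound is safe

def generated(max_len=MAX):
    seen, frontier, terminal = set(), {"S"}, set()
    while frontier:
        nxt = set()
        for s in frontier:
            for lhs, rhs in RULES:
                # Apply the rule at every position where lhs occurs.
                i = s.find(lhs)
                while i != -1:
                    t = s[:i] + rhs + s[i + len(lhs):]
                    if len(t) <= max_len and t not in seen:
                        seen.add(t)
                        nxt.add(t)
                        if t.islower():      # no nonterminals remain
                            terminal.add(t)
                    i = s.find(lhs, i + 1)
        frontier = nxt
    return terminal

print(sorted(generated(), key=len))  # ['abc', 'aabbcc', 'aaabbbccc']
```

The interleaving rule CB → BC is what makes the grammar context-sensitive rather than context-free: it reorders nonterminals based on their neighbors, which is exactly the power needed for the crossed agreement among the three letter blocks.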

55 citations

Book ChapterDOI
01 Jan 2007
TL;DR: This work has argued that a principal function for intuitive theories, just as for grammars for natural languages, is to generate a constrained space of hypotheses that people consider in carrying out a class of cognitively central and otherwise severely underconstrained inductive inference tasks.
Abstract: In the previous chapter (Tenenbaum, Griffiths, & Niyogi, this volume), we introduced a framework for thinking about the structure, function, and acquisition of intuitive theories inspired by an analogy to the research program of generative grammar in linguistics. We argued that a principal function for intuitive theories, just as for grammars for natural languages, is to generate a constrained space of hypotheses that people consider in carrying out a class of cognitively central and otherwise severely underconstrained inductive inference tasks. Linguistic grammars generate a hypothesis space of syntactic structures considered in sentence comprehension; intuitive theories generate a hypothesis space of causal network structures considered in causal induction. Both linguistic grammars and intuitive causal theories must also be reliably learnable from primary data available to people. In our view, these functional characteristics of intuitive theories should strongly constrain the content and form of the knowledge they represent, leading to representations somewhat like those used in generative grammars for language. However, until now we have not presented any specific proposals for formalizing the knowledge content or representational form of “causal grammars.” That is our goal here. Just as linguistic grammars encode the principles that implicitly underlie all grammatical utterances in a language, so do causal grammars express knowledge more abstract than any one causal network in a domain. Consequently, existing approaches for representing causal knowledge based on Bayesian networks defined over observable events, properties, or variables are not sufficient to characterize causal grammars.
Causal grammars are in some sense analogous to the “framework theories” for core domains that have been studied in cognitive development (Wellman & Gelman, 1992): the domain-specific concepts and principles that allow learners to construct appropriate causal networks for reasoning about

55 citations


Network Information
Related Topics (5)
- Graph (abstract data type): 69.9K papers, 1.2M citations, 80% related
- Time complexity: 36K papers, 879.5K citations, 79% related
- Concurrency: 13K papers, 347.1K citations, 78% related
- Model checking: 16.9K papers, 451.6K citations, 77% related
- Directed graph: 12.2K papers, 302.4K citations, 77% related
Performance Metrics
No. of papers in the topic in previous years:
Year  Papers
2023  11
2022  12
2021  1
2020  4
2019  1
2018  1