Topic

Context-sensitive grammar

About: Context-sensitive grammar is a research topic. Over its lifetime, 1,938 publications have been published within this topic, receiving 45,911 citations. The topic is also known as: CSG.
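As a quick orientation, the classic noncontracting grammar for the language { a^n b^n c^n : n >= 1 } is the textbook example of something context-sensitive grammars can do that context-free grammars cannot. The Python sketch below is an illustration written for this page, not taken from any paper listed here; it encodes the productions as string-rewriting rules and tests membership with a depth-bounded search:

# Classic noncontracting grammar for { a^n b^n c^n : n >= 1 },
# weakly equivalent to a context-sensitive grammar.
RULES = [
    ("S", "aSBC"), ("S", "aBC"),   # build a^n followed by (BC)^n
    ("CB", "BC"),                  # move Bs left past Cs
    ("aB", "ab"), ("bB", "bb"),    # rewrite B to b in left context
    ("bC", "bc"), ("cC", "cc"),    # rewrite C to c in left context
]

def derives(target, form="S", depth=12):
    """Depth-bounded search for a derivation of `target` from `form`."""
    if form == target:
        return True
    if depth == 0 or len(form) > len(target):  # rules never shrink the form
        return False
    for lhs, rhs in RULES:
        i = form.find(lhs)
        while i != -1:
            if derives(target, form[:i] + rhs + form[i + len(lhs):], depth - 1):
                return True
            i = form.find(lhs, i + 1)
    return False

print(derives("abc"))     # True
print(derives("aabbcc"))  # True
print(derives("aabbc"))   # False

Because every right-hand side is at least as long as its left-hand side, sentential forms never shrink; this noncontracting property is what lets the search prune any form longer than the target and still remain complete.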


Papers
Proceedings ArticleDOI
07 Jul 2003
TL;DR: A MetaGrammar is introduced, which allows the grammar writer to specify, in a compact manner, syntactic properties that are potentially framework-independent and to some extent language-independent, and from which grammars for several frameworks and languages are automatically generated offline.
Abstract: We introduce a MetaGrammar, which allows us to automatically generate, from a single and compact MetaGrammar hierarchy, parallel Lexical Functional Grammars (LFG) and Tree-Adjoining Grammars (TAG) for French and for English: the grammar writer specifies, in a compact manner, syntactic properties that are potentially framework-independent and to some extent language-independent (such as subcategorization, valency alternations and realization of syntactic functions), from which grammars for several frameworks and languages are automatically generated offline.

19 citations
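The abstract above describes an architecture rather than an algorithm, but the compilation idea can be sketched. The toy below is hypothetical: the description format and both output notations are invented here and are not the authors' actual MetaGrammar formalism. It shows a single framework-neutral lexical description compiled offline into an LFG-style lexical entry and a TAG-style initial tree.

# Hypothetical sketch: one framework-neutral description compiled into
# two framework-specific grammar fragments. All notation is invented.
SPEC = {"verb": "eat", "subcat": ["subj", "obj"]}  # shared syntactic property

def to_lfg(spec):
    # LFG-style lexical entry with a PRED value listing the arguments
    args = ",".join(a.upper() for a in spec["subcat"])
    return f"{spec['verb']}: V  (^PRED) = '{spec['verb']}<{args}>'"

def to_tag(spec):
    # TAG-style initial tree: one substitution node (!) per argument
    subj, *rest = spec["subcat"]
    comps = " ".join(f"NP_{a}!" for a in rest)
    return f"alpha_{spec['verb']}: (S NP_{subj}! (VP (V {spec['verb']}) {comps}))"

print(to_lfg(SPEC))  # eat: V  (^PRED) = 'eat<SUBJ,OBJ>'
print(to_tag(SPEC))  # alpha_eat: (S NP_subj! (VP (V eat) NP_obj!))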

Journal ArticleDOI
TL;DR: Methodological considerations on crucial issues in string- and graph-grammar-based syntactic methods are presented, and recommendations are formulated concerning the enhancement of context-free grammars and the construction of parsable and inducible classes of graph grammars.
Abstract: Fundamental open problems, which constitute the frontiers of syntactic pattern recognition, are discussed in the paper. Methodological considerations on crucial issues in the areas of string- and graph-grammar-based syntactic methods are made. As a result, recommendations concerning an enhancement of context-free grammars, as well as the construction of parsable and inducible classes of graph grammars, are formulated.

19 citations

Proceedings ArticleDOI
22 Jun 1993
TL;DR: It is shown that the generative capacity of fts' is equal to that of nc-lfg's, and at least one NP-complete language is generated by fts'.
Abstract: A number of grammatical formalisms were introduced to define the syntax of natural languages. Among them are parallel multiple context-free grammars (pmcfg's) and lexical-functional grammars (lfg's). Pmcfg's and their subclass called multiple context-free grammars (mcfg's) are natural extensions of cfg's, and pmcfg's are known to be recognizable in polynomial time. Some subclasses of lfg's have been proposed, but they were shown to generate an NP-complete language. Finite state translation systems (fts') were introduced as a computational model of transformational grammars. In this paper, three subclasses of lfg's called nc-lfg's, dc-lfg's and fc-lfg's are introduced and the generative capacities of the above mentioned grammatical formalisms are investigated. First, we show that the generative capacity of fts' is equal to that of nc-lfg's. As relations among subclasses of those formalisms, it is shown that the generative capacities of deterministic fts', dc-lfg's, and pmcfg's are equal to each other, and the generative capacity of fc-lfg's is equal to that of mcfg's. It is also shown that at least one NP-complete language is generated by fts'. Consequently, deterministic fts', dc-lfg's and fc-lfg's can be recognized in polynomial time. However, fts' (and nc-lfg's) cannot, if P ≠ NP.

19 citations
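To make the mcfg mechanism in the abstract above concrete, here is an assumed illustration, not drawn from the paper: an mcfg nonterminal may derive a tuple of strings rather than a single string, which already suffices to generate { a^n b^n c^n }, a language beyond context-free power, while recognition stays polynomial.

# Assumed illustration of the mcfg mechanism; the rules appear as comments.
def A(n):
    """Arity-3 nonterminal: derives the string triple (a^n, b^n, c^n)."""
    if n == 1:
        return ("a", "b", "c")            # A(a, b, c).
    x, y, z = A(n - 1)
    return ("a" + x, "b" + y, "c" + z)    # A(a x, b y, c z) :- A(x, y, z).

def S(n):
    """Start symbol concatenates the components: S(x y z) :- A(x, y, z)."""
    return "".join(A(n))

print(S(3))  # aaabbbccc

The key point is that the three blocks grow in lockstep because they are components of one derived tuple, something a single context-free nonterminal cannot enforce.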

Proceedings ArticleDOI
15 Sep 1992
TL;DR: A two-dimensional extension of the Cocke-Kasami-Younger parser for context-free languages is used to parse figures using these grammars.
Abstract: Generalized two-dimensional context-free grammars, an extension of context-free grammars to two dimensions, are described. This extension generalizes Tomita's two-dimensional context-free grammars (M. Tomita, 1989) and fits better into the families of graph grammars described by Crimi (1990) (Relation Grammars) and by Flasinski (1988) (edNLC Grammars). Figure grammars are particularly useful for applications such as handwritten mathematical expressions. A two-dimensional extension of the Cocke-Kasami-Younger parser for context-free languages is used to parse figures using these grammars.

19 citations
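For reference, below is a minimal sketch of the one-dimensional Cocke-Kasami-Younger recognizer that the paper extends to two dimensions. The toy grammar (in Chomsky normal form) and lexicon are invented for illustration; only the chart-filling scheme is the standard algorithm.

# Standard CKY recognizer over an invented toy CNF grammar.
GRAMMAR = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
LEXICON = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "chases": {"V"}}

def cky(words):
    n = len(words)
    # chart[i][j] holds the nonterminals that span words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEXICON.get(w, ()))
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):  # try every split point
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        chart[i][j] |= GRAMMAR.get((b, c), set())
    return "S" in chart[0][n]

print(cky("the dog chases the cat".split()))  # True

The two-dimensional generalization in the paper replaces the one-dimensional spans words[i:j] with rectangular regions of the figure, keeping the same bottom-up chart-filling idea.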

Proceedings Article
01 Jan 2009
TL;DR: The results demonstrate robust implicit learning of recursively embedded structures (context-free grammar) and recursive structures with cross-dependencies (context-sensitive grammar) in an artificial grammar learning task spanning 9 days.
Abstract: A Matter of Time: Implicit Acquisition of Recursive Sequence Structures. Julia Udden, Susana Araujo, Christian Forkstam, Martin Ingvar, Peter Hagoort, and Karl Magnus Petersson (Max Planck Institute for Psycholinguistics, Nijmegen; Stockholm Brain Institute, Karolinska Institutet, Stockholm; Donders Institute for Brain, Cognition and Behaviour, Radboud University Nijmegen; Universidade do Algarve, Faro). A dominant hypothesis in empirical research on the evolution of language is the following: the fundamental difference between animal and human communication systems is captured by the distinction between regular and more complex non-regular grammars. Studies reporting successful artificial grammar learning of nested recursive structures, and imaging studies of the same, have methodological shortcomings, since they typically allow explicit problem-solving strategies, which subsequent behavioral studies have shown to account for the learning effect. The present study overcomes these shortcomings by using subtle violations of agreement structure in a preference classification task. In contrast to the studies conducted so far, we use an implicit learning paradigm, allowing the time needed for both abstraction processes and consolidation to take place. Our results demonstrate robust implicit learning of recursively embedded structures (context-free grammar) and recursive structures with cross-dependencies (context-sensitive grammar) in an artificial grammar learning task spanning 9 days.
Keywords: implicit artificial grammar learning; centre embedded; cross-dependency; implicit learning; context-sensitive grammar; context-free grammar; regular grammar; non-regular grammar

19 citations
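The two dependency patterns contrasted in the study are easy to sketch (an assumed illustration, not the authors' stimulus-generation code): nested dependencies pair each B with the most recently opened A, which a context-free grammar can express, while crossed dependencies pair elements by serial position, which requires context-sensitive power.

# Assumed illustration of the two agreement patterns from the abstract.
def nested(n):
    # each B agrees with the most recently opened A: last in, first out
    return [f"A{i}" for i in range(1, n + 1)] + [f"B{i}" for i in range(n, 0, -1)]

def crossed(n):
    # each B agrees with the A in the same serial position
    return [f"A{i}" for i in range(1, n + 1)] + [f"B{i}" for i in range(1, n + 1)]

print(" ".join(nested(3)))   # A1 A2 A3 B3 B2 B1
print(" ".join(crossed(3)))  # A1 A2 A3 B1 B2 B3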


Network Information
Related Topics (5)
Graph (abstract data type)
69.9K papers, 1.2M citations
80% related
Time complexity
36K papers, 879.5K citations
79% related
Concurrency
13K papers, 347.1K citations
78% related
Model checking
16.9K papers, 451.6K citations
77% related
Directed graph
12.2K papers, 302.4K citations
77% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    11
2022    12
2021    1
2020    4
2019    1
2018    1