Topic

Context-sensitive language

About: Context-sensitive language is a research topic. Over the lifetime, 426 publications have been published within this topic, receiving 9,115 citations.


Papers
Journal ArticleDOI
TL;DR: In this report, certain properties of context-free (CF, or type 2) grammars in the sense of Chomsky are investigated, and it is shown that a newly defined type of grammar, intermediate between type 1 and type 2, is essentially stronger than type 2 grammars while having the advantage over type 1 grammars that the phrase structure of a grammatical sentence is unique once the derivation is given.
Abstract: In this report, certain properties of context-free (CF, or type 2) grammars in the sense of Chomsky are investigated. In particular, questions regarding structure, possible ambiguity, and the relationship to finite automata are considered. The following results are presented. (1) The language generated by a context-free grammar is linear, in a sense that is defined precisely. (2) The requirement of unambiguity (that every sentence has a unique phrase structure) weakens the grammar, in the sense that there exists a CF language that cannot be generated unambiguously by a CF grammar. (3) The result that not every CF language is a finite automaton (FA) language is improved in the following way: there exists a CF language L such that for any L′ ⊆ L, if L′ is FA, an L″ ⊆ L can be found such that L″ is also FA, L′ ⊆ L″, and L″ contains infinitely many sentences not in L′. (4) A type of grammar is defined that is intermediate between type 1 and type 2 grammars. It is shown that this type of grammar is essentially stronger than type 2 grammars and has the advantage over type 1 grammars that the phrase structure of a grammatical sentence is unique once the derivation is given.
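To make the type-1 versus type-2 gap above concrete, the canonical example is {a^n b^n c^n : n >= 1}, a language that is context-sensitive but provably not context-free. The Python membership test below is our own illustration, not material from the paper:

    def in_anbncn(s: str) -> bool:
        # Decide membership in {a^n b^n c^n : n >= 1}, a language that is
        # context-sensitive (type 1) but, by the pumping lemma, not
        # context-free (type 2).
        n = len(s) // 3
        return n >= 1 and s == "a" * n + "b" * n + "c" * n

    assert in_anbncn("aabbcc")
    assert not in_anbncn("aabbc")    # unequal block lengths
    assert not in_anbncn("abcabc")   # right symbols, wrong order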

788 citations

Book ChapterDOI
21 Sep 1994
TL;DR: A new algorithm is proposed which allows for the identification of any stochastic deterministic regular language as well as the determination of the probabilities of the strings in the language.
Abstract: We propose a new algorithm which allows for the identification of any stochastic deterministic regular language, as well as the determination of the probabilities of the strings in the language. The algorithm builds the prefix tree acceptor from the sample set and systematically merges equivalent states. Experimentally, it proves very fast, and the time needed grows only linearly with the size of the sample set.
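The first phase of the algorithm, building the prefix tree acceptor with frequency counts from the sample set, can be sketched in Python as follows. This is our own illustrative reconstruction (all names are ours); the second phase, the statistical compatibility test that drives the systematic merging of equivalent states, is omitted here.

    from collections import defaultdict

    def build_fpta(sample):
        # Frequency prefix tree acceptor: one state per distinct prefix,
        # with traversal and end-of-string counts. These counts are what
        # a state-merging test later compares between candidate states.
        trans = defaultdict(dict)   # state -> {symbol: next state}
        count = defaultdict(int)    # times each state is traversed
        final = defaultdict(int)    # times a string ends in each state
        next_id = 1                 # state 0 is the root (empty prefix)
        for string in sample:
            state = 0
            count[state] += 1
            for symbol in string:
                if symbol not in trans[state]:
                    trans[state][symbol] = next_id
                    next_id += 1
                state = trans[state][symbol]
                count[state] += 1
            final[state] += 1
        return trans, count, final

    trans, count, final = build_fpta(["ab", "ab", "a", "abb"])
    # final[s] / count[s] estimates the stopping probability at state s,
    # from which the string probabilities of the language follow.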

378 citations

Journal ArticleDOI
06 Jun 2002-Nature
TL;DR: Understanding how Darwinian evolution gives rise to human language requires the integration of formal language theory, learning theory and evolutionary dynamics.
Abstract: Language is our legacy. It is the main evolutionary contribution of humans, and perhaps the most interesting trait that has emerged in the past 500 million years. Understanding how Darwinian evolution gives rise to human language requires the integration of formal language theory, learning theory and evolutionary dynamics. Formal language theory provides a mathematical description of language and grammar. Learning theory formalizes the task of language acquisition; it can be shown that no procedure can learn an unrestricted set of languages. Universal grammar specifies the restricted set of languages learnable by the human brain. Evolutionary dynamics can be formulated to describe the cultural evolution of language and the biological evolution of universal grammar.
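A common formalization of the evolutionary dynamics mentioned here, as we recall it from this line of work (notation ours, intended only as a sketch), is a replicator-style language dynamical equation:

    \dot{x}_j = \sum_{i=1}^{n} f_i(x) Q_{ij} x_i - \phi(x) x_j, \qquad \phi(x) = \sum_{i=1}^{n} f_i(x) x_i

where x_i is the fraction of the population using grammar G_i, f_i(x) is the communicative payoff of G_i against the current population, Q_{ij} is the probability that a child learning from a speaker of G_i ends up acquiring G_j (the fidelity of language acquisition), and subtracting the average fitness \phi(x) keeps the frequencies summing to one.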

377 citations

Journal ArticleDOI
TL;DR: It follows from the theory of the evolution of language that the size of the core part of language, the ‘kernel lexicon’, does not vary as language evolves, and the two regimes in the distribution naturally emerge from the evolutionary dynamics of the word web.
Abstract: Human language may be described as a complex network of linked words: each distinct word in language is a vertex of this web, and interacting words in sentences are connected by edges. The empirical distribution of the number of connections of words in this network has a peculiar form that includes two pronounced power-law regions. Here we propose a theory of the evolution of language which treats language as a self-organizing network of interacting words. Within this framework, we completely describe the observed word web structure without any fitting. We show that the two regimes in the distribution emerge naturally from the evolutionary dynamics of the word web. It follows from our theory that the size of the core part of language, the 'kernel lexicon', does not vary as language evolves.
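The mechanism can be conveyed with a toy simulation. At each time step a new word attaches preferentially to an existing word, and a number of extra edges roughly proportional to the elapsed time t appears between preferentially chosen old words; it is this accelerating edge growth that separates the two degree regimes. The Python sketch below is our own illustration, not the authors' code:

    import random

    def grow_word_web(steps=2000, c=0.05):
        # Toy accelerated preferential-attachment growth in the spirit of
        # the word-web model: sampling uniformly from the stub list is
        # equivalent to picking a word with probability ~ its degree.
        degree = [1, 1]              # two initial linked words
        stubs = [0, 1]
        for t in range(2, steps):
            new = len(degree)
            degree.append(1)         # new word arrives with one edge
            old = random.choice(stubs)
            degree[old] += 1
            stubs.extend([new, old])
            for _ in range(int(c * t)):   # ~c*t extra edges among old words
                u, v = random.choice(stubs), random.choice(stubs)
                degree[u] += 1
                degree[v] += 1
                stubs.extend([u, v])
        return degree

    deg = grow_word_web()
    # A log-log histogram of deg should display two distinct power-law
    # regions, qualitatively matching the empirical word web.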

264 citations

Journal ArticleDOI
14 Mar 1997-Science
TL;DR: The proposition that grammaticality equals optimality sheds light on a wide range of phenomena, from the gulf between production and comprehension in child language, to language learnability, to the fundamental questions of linguistic theory.
Abstract: Can concepts from the theory of neural computation contribute to formal theories of the mind? Recent research has explored the implications of one principle of neural computation, optimization, for the theory of grammar. Optimization over symbolic linguistic structures provides the core of a new grammatical architecture, optimality theory. The proposition that grammaticality equals optimality sheds light on a wide range of phenomena, from the gulf between production and comprehension in child language, to language learnability, to the fundamental questions of linguistic theory: What is it that the grammars of all languages share, and how may they differ?
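The claim that grammaticality equals optimality has a compact computational reading: candidates are scored on ranked, violable constraints, and violation profiles are compared lexicographically, so a single violation of a higher-ranked constraint outweighs any number of violations of lower-ranked ones. A minimal Python sketch, with toy constraints and forms that are hypothetical and our own:

    def ot_winner(candidates, ranked_constraints):
        # The optimal (grammatical) candidate minimizes its violation
        # profile under lexicographic comparison in ranking order.
        return min(candidates,
                   key=lambda c: tuple(con(c) for con in ranked_constraints))

    # Toy constraints: syllables are '.'-separated, '+' marks epenthesis.
    NO_CODA = lambda form: sum(not syl.endswith(("a", "e", "i", "o", "u"))
                               for syl in form.split("."))
    DEP = lambda form: form.count("+")   # penalize inserted material

    candidates = ["pat", "pa.t+a"]       # faithful vs. vowel-epenthesized
    print(ot_winner(candidates, [NO_CODA, DEP]))   # -> pa.t+a
    print(ot_winner(candidates, [DEP, NO_CODA]))   # -> pat

Re-ranking the same constraints yields different grammatical outputs, which is how optimality theory models cross-linguistic variation.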

252 citations


Network Information
Related Topics (5)
Semantics
24.9K papers, 653K citations
79% related
Concurrency
13K papers, 347.1K citations
79% related
Model checking
16.9K papers, 451.6K citations
78% related
Parsing
21.5K papers, 545.4K citations
77% related
Time complexity
36K papers, 879.5K citations
76% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2021    2
2017    9
2016    5
2015    11
2014    8
2013    10