Open Access · Journal Article · DOI

Varieties of crossing dependencies: structure dependence and mild context sensitivity

Edward P. Stabler
01 Sep 2004 · Vol. 28, Iss. 5, pp. 699-720
TLDR
Four different kinds of grammars that can define crossing dependencies in human language are compared, and results bearing on the viability of mildly context-sensitive analyses, together with some open questions, are reviewed.
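As a hedged illustration of the crossing dependencies the article discusses (this example is not taken from the paper): the copy language {ww : w over {a, b}} is a standard case, since the i-th symbol of the first half must match the i-th symbol of the second half, giving dependencies that cross rather than nest. It is not context-free, but mildly context-sensitive formalisms can generate it. A minimal membership check:

```python
def in_copy_language(s: str) -> bool:
    """Return True iff s = ww for some string w over {a, b}.

    The copy language is a textbook example of crossing dependencies:
    position i in the first half depends on position i in the second
    half, so the dependency lines cross when drawn over the string.
    """
    if len(s) % 2 != 0 or any(c not in "ab" for c in s):
        return False
    half = len(s) // 2
    return s[:half] == s[half:]

print(in_copy_language("abab"))  # True: w = "ab" (crossing pattern)
print(in_copy_language("abba"))  # False: nested, not crossing
```

Recognition here is trivial; the formal-language point is about which grammar formalisms can *generate* such languages, which is where mild context sensitivity enters.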
About
This article was published in Cognitive Science on 2004-09-01 and is currently open access. It has received 49 citations to date. The article focuses on the topics: Phrase structure grammar & Context-sensitive grammar.


Citations
Monograph · DOI

The Evolution of Language

TL;DR: The authors exploit newly available massive natural language corpora to model language as a language-evolution phenomenon, though their work is limited to a subset of the languages in the corpus.
Journal Article · DOI

CCGbank: A Corpus of CCG Derivations and Dependency Structures Extracted from the Penn Treebank

TL;DR: This article presents an algorithm for translating the Penn Treebank into a corpus of Combinatory Categorial Grammar (CCG) derivations augmented with local and long-range word-word dependencies, and discusses the implications of the findings for the extraction of other linguistically expressive grammars from the Treebank, and for the design of future treebanks.
Journal Article · DOI

Uncertainty about the rest of the sentence.

TL;DR: A word-by-word human sentence processing complexity metric is presented, formalizing the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole.
Journal Article · DOI

Hierarchical processing in music, language, and action: Lashley revisited

TL;DR: Although the precise computational function of the lateral prefrontal regions in action syntax remains debated, Lashley's notion—that this cortical region implements a working‐memory buffer or stack scannable by posterior and subcortical brain regions—is consistent with considerable experimental data.
Journal Article · DOI

Artificial grammar learning meets formal language theory: an overview

TL;DR: Formal language theory (FLT) has much to offer scientists interested in rigorous empirical investigations of human cognition from a neuroscientific and comparative perspective; the authors suggest that progress has been hampered by a pervasive conflation of distinct issues.
References
Book

Introduction to Automata Theory, Languages, and Computation

TL;DR: This book is a rigorous exposition of formal languages and models of computation, with an introduction to computational complexity, appropriate for upper-level computer science undergraduates who are comfortable with mathematical arguments.
Book Chapter · DOI

Reducibility Among Combinatorial Problems

TL;DR: The work of Dantzig, Fulkerson, Hoffman, Edmonds, Lawler and other pioneers on network flows, matching and matroids acquainted me with the elegant and efficient algorithms that were sometimes possible.
Book

Derivation by Phase

Noam Chomsky

Lecture Notes in Artificial Intelligence

P. Brezillon, +1 more
TL;DR: The topics in LNAI include automated reasoning, automated programming, algorithms, knowledge representation, agent-based systems, intelligent systems, expert systems, machine learning, natural-language processing, machine vision, robotics, search systems, knowledge discovery, data mining, and related programming languages.