Open Access · Journal Article · DOI

The neurobiology of syntax: beyond string sets.

Karl Magnus Petersson, +1 more
- 19 Jul 2012
- Vol. 367, Iss. 1598, pp. 1971-1983
TLDR
The brain represents grammars in its connectivity, and its ability for syntax rests on neurobiological infrastructure for structured sequence processing; the acquisition of this ability is accounted for within an adaptive dynamical systems framework.
Abstract
Understanding the human capacity to acquire language is an outstanding scientific challenge. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or more precisely, the parser/generator) in its connectivity, and that its ability for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this ability is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, it is necessary to combine and constrain the interpretation of ALL results with theoretical models and empirical studies of natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty.
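The artificial grammar learning (AGL) paradigms mentioned in the abstract typically expose participants to strings generated by a small finite-state grammar (the regular level of the Chomsky hierarchy) and then test whether they can classify novel strings as grammatical or not. A minimal sketch in Python, assuming an illustrative Reber-style transition table (invented here for illustration, not the grammar used in the authors' studies):

```python
import random

# Illustrative finite-state grammar: state -> list of (emitted symbol, next state).
# A next state of None marks the exit (end of string).
GRAMMAR = {
    0: [("T", 1), ("V", 2)],
    1: [("P", 1), ("T", 3)],
    2: [("X", 2), ("V", 3)],
    3: [("S", None)],
}

def generate(rng: random.Random) -> str:
    """Walk the grammar from state 0 until the exit, collecting emitted symbols."""
    state, symbols = 0, []
    while state is not None:
        symbol, state = rng.choice(GRAMMAR[state])
        symbols.append(symbol)
    return "".join(symbols)

def is_grammatical(string: str) -> bool:
    """Membership test: does some path through the grammar emit this string?"""
    states = {0}
    for ch in string:
        states = {nxt for s in states
                  for sym, nxt in GRAMMAR.get(s, [])
                  if sym == ch}
        if not states:
            return False
    return None in states  # accepted only if a path reached the exit

rng = random.Random(1)
training_items = [generate(rng) for _ in range(5)]
assert all(is_grammatical(s) for s in training_items)
assert is_grammatical("TTS")
assert not is_grammatical("TXS")
```

In an AGL experiment, strings from `generate` would serve as the exposure set, and the test phase would mix held-out grammatical strings with violations such as "TXS"; `is_grammatical` is simply nondeterministic finite-automaton simulation, tracking the set of states reachable after each symbol.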


Citations
Journal Article · DOI

The Now-or-Never bottleneck: A fundamental constraint on language.

TL;DR: It is argued that, to deal with this “Now-or-Never” bottleneck, the brain must compress and recode linguistic input as rapidly as possible, which implies that language acquisition is learning to process, rather than inducing, a grammar.
Journal Article · DOI

MUC (Memory, Unification, Control) and beyond

TL;DR: A neurobiological model of language is discussed that overcomes the shortcomings of the classical Wernicke-Lichtheim-Geschwind model based on a subdivision of language processing into three components: Memory, Unification, and Control.
Journal Article · DOI

Nodes and networks in the neural architecture for language: Broca's region and beyond

TL;DR: Current views on the neurobiological underpinnings of language are discussed that deviate in a number of ways from the classical Wernicke-Lichtheim-Geschwind model, in which core regions of language processing need to interact with other networks to establish full functionality of language and communication.
Proceedings Article · DOI

Understanding understanding source code with functional magnetic resonance imaging

TL;DR: This paper explores whether functional magnetic resonance imaging (fMRI), a well-established method in cognitive neuroscience, can soundly measure program comprehension, and finds a clear, distinct activation pattern across five brain regions that fits well with current understanding of program comprehension.
Journal Article · DOI

Disruption of hierarchical predictive coding during sleep

TL;DR: It is discovered that both short-term and long-term brain responses to auditory prediction errors are disrupted during non-rapid eye movement and rapid eye movement sleep; however, the brain still exhibits detectable auditory responses and a capacity to habituate to frequently repeated sounds.
References
Book

Introduction to Automata Theory, Languages, and Computation

TL;DR: This book is a rigorous exposition of formal languages and models of computation, with an introduction to computational complexity, appropriate for upper-level computer science undergraduates who are comfortable with mathematical arguments.
Journal Article · DOI

On Computable Numbers, with an Application to the Entscheidungsproblem

TL;DR: This work introduces the universal computing machine and applies the diagonal process to the computation of circular and circle-free numbers.
Journal Article · DOI

Attitudinal effects of mere exposure.

TL;DR: The exposure-attitude hypothesis, as discussed by the authors, holds that mere repeated exposure of an individual to a stimulus object enhances his or her attitude toward it, where "exposure" means a condition making the stimulus accessible to the individual's perception.
Journal Article · DOI

Stochastic differential equations : an introduction with applications

TL;DR: This book opens with mathematical preliminaries and covers Itô integrals, the Itô formula, the martingale representation theorem, and stochastic differential equations.
Book

Dynamical Systems in Neuroscience

TL;DR: This book explains the relationship of electrophysiology, nonlinear dynamics, and the computational properties of neurons, with each concept presented in terms of both neuroscience and mathematics and illustrated using geometrical intuition, providing a link between the two disciplines.