Open Access Journal Article (DOI)

Neural systems underlying British Sign Language and audio‐visual English processing in native users

TLDR
In this first neuroimaging study of the perception of British Sign Language (BSL), brain activation was measured using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. The results suggest that left-temporal auditory regions may be privileged for processing heard speech even in hearing native signers.
Abstract
In order to understand the evolution of human language, it is necessary to explore the neural systems that support language processing in its many forms. In particular, it is informative to separate those mechanisms that may have evolved for sensory processing (hearing) from those that have evolved to represent events and actions symbolically (language). To what extent are the brain systems that support language processing shaped by auditory experience and to what extent by exposure to language, which may not necessarily be acoustically structured? In this first neuroimaging study of the perception of British Sign Language (BSL), we explored these questions by measuring brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. Eight hearing, non-signing subjects performed an analogous task that involved audio-visual English sentences. The data support the argument that there are both modality-independent and modality-dependent language localization patterns in native users. In relation to modality-independent patterns, regions activated by both BSL in deaf signers and by spoken English in hearing non-signers included inferior prefrontal regions bilaterally (including Broca's area) and superior temporal regions bilaterally (including Wernicke's area). Lateralization patterns were similar for the two languages. There was no evidence of enhanced right-hemisphere recruitment for BSL processing in comparison with audio-visual English. In relation to modality-specific patterns, audio-visual speech in hearing subjects generated greater activation in the primary and secondary auditory cortices than BSL in deaf signers, whereas BSL generated enhanced activation in the posterior occipito-temporal regions (V5), reflecting the greater movement component of BSL. 
The influence of hearing status on the recruitment of sign language processing systems was explored by comparing deaf and hearing adults who had BSL as their first language (native signers). Deaf native signers demonstrated greater activation in the left superior temporal gyrus in response to BSL than hearing native signers. This important finding suggests that left-temporal auditory regions may be privileged for processing heard speech even in hearing native signers. However, in the absence of auditory input this region can be recruited for visual processing.



Citations

The myth of language universals: Language diversity and its importance for cognitive science

TL;DR: This target article summarizes decades of cross-linguistic work by typologists and descriptive linguists, showing just how few and unprofound the universal characteristics of language are, once the authors honestly confront the diversity offered to us by the world's 6,000 to 8,000 languages.

Neural reorganization following sensory loss: the opportunity of change.

TL;DR: Crossmodal neuroplasticity with regard to behavioural adaptation after sensory deprivation is discussed, and the possibility of maladaptive consequences within the context of rehabilitation is highlighted.

Cognition counts: A working memory system for ease of language understanding (ELU)

TL;DR: The present paper focuses on four aspects of the model which have led to the current, updated version: the language generality assumption; the mismatch assumption; chronological age; and the episodic buffer function of rapid, automatic multimodal binding of phonology (RAMBPHO).

Music and language side by side in the brain: a PET study of the generation of melodies and sentences

TL;DR: A comparative model of shared, parallel, and distinctive features of the neural systems supporting music and language is outlined, assuming that music and language show parallel combinatoric generativity for complex sound structures but distinctly different informational content (semantics).
References

Cortical Representation of Sign Language: Comparison of Deaf Signers and Hearing Non-signers

TL;DR: It is shown that regions within the classical left-hemisphere language areas are also activated in hearing non-signers during passive viewing of signs that, for them, are linguistically meaningless.

Regional Cerebral Blood Flow in Sign Language Users

TL;DR: Cerebral activation was measured by recording regional cerebral blood flow during both spoken and sign language comprehension; sign language was found to activate the cortex in a way similar to spoken language when the listener watches the speaker.

What's right about the neural organization of sign language? A perspective on recent neuroimaging results.

TL;DR: The main points are that the vast majority of behavioral, neuropsychological, and functional imaging data support the hypothesis that the left hemisphere is dominant for lexical and grammatical aspects of sign language perception and production, and that the within-hemisphere organization of signed and spoken language is in many respects the same, but not in all respects.

Impairment of motor imagery in putamen lesions in humans

TL;DR: Studies of patients with putamen or cortical lesions provide evidence that the basal ganglia, as well as cortical structures, play an important role in the neural network mediating motor imagery.

Linking sight and sound: fMRI evidence of primary auditory cortex activation during visual word recognition.

TL;DR: A brain region involved in the most basic aspects of auditory processing appears to be engaged in reading even when there is no environmental oral or auditory component.