Open Access Journal Article (DOI)

Neural systems underlying British Sign Language and audio‐visual English processing in native users

TLDR
This first neuroimaging study of the perception of British Sign Language (BSL) measures brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task, and suggests that left temporal auditory regions may be privileged for processing heard speech even in hearing native signers.
Abstract
In order to understand the evolution of human language, it is necessary to explore the neural systems that support language processing in its many forms. In particular, it is informative to separate those mechanisms that may have evolved for sensory processing (hearing) from those that have evolved to represent events and actions symbolically (language). To what extent are the brain systems that support language processing shaped by auditory experience and to what extent by exposure to language, which may not necessarily be acoustically structured? In this first neuroimaging study of the perception of British Sign Language (BSL), we explored these questions by measuring brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. Eight hearing, non-signing subjects performed an analogous task that involved audio-visual English sentences. The data support the argument that there are both modality-independent and modality-dependent language localization patterns in native users. In relation to modality-independent patterns, regions activated by both BSL in deaf signers and by spoken English in hearing non-signers included inferior prefrontal regions bilaterally (including Broca's area) and superior temporal regions bilaterally (including Wernicke's area). Lateralization patterns were similar for the two languages. There was no evidence of enhanced right-hemisphere recruitment for BSL processing in comparison with audio-visual English. In relation to modality-specific patterns, audio-visual speech in hearing subjects generated greater activation in the primary and secondary auditory cortices than BSL in deaf signers, whereas BSL generated enhanced activation in the posterior occipito-temporal regions (V5), reflecting the greater movement component of BSL. The influence of hearing status on the recruitment of sign language processing systems was explored by comparing deaf and hearing adults who had BSL as their first language (native signers). Deaf native signers demonstrated greater activation in the left superior temporal gyrus in response to BSL than hearing native signers. This important finding suggests that left temporal auditory regions may be privileged for processing heard speech even in hearing native signers. However, in the absence of auditory input this region can be recruited for visual processing.
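The central contrasts described in the abstract (for example, deaf versus hearing native signers viewing BSL) amount to voxelwise between-group comparisons of activation maps. The following is a minimal sketch of such a comparison using hypothetical NumPy arrays of per-subject contrast values; it illustrates the kind of test implied by the design, not the authors' actual analysis pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject activation maps (contrast values), one row per subject.
# Shapes: (n_subjects, n_voxels); real analyses use preprocessed, spatially registered fMRI data.
rng = np.random.default_rng(0)
deaf_signers = rng.normal(loc=0.4, scale=1.0, size=(9, 5000))     # 9 deaf native signers
hearing_signers = rng.normal(loc=0.1, scale=1.0, size=(9, 5000))  # 9 hearing native signers

# Voxelwise two-sample t-test: where is BSL-related activation greater in deaf signers?
t_vals, p_vals = stats.ttest_ind(deaf_signers, hearing_signers, axis=0)

# Crude Bonferroni threshold as a stand-in for proper multiple-comparison control.
alpha = 0.05
n_voxels = deaf_signers.shape[1]
significant = (p_vals < alpha / n_voxels) & (t_vals > 0)
print(f"voxels with deaf > hearing surviving Bonferroni at alpha={alpha}: {significant.sum()}")
```

In practice, multiple-comparison control in studies of this kind typically relies on cluster-level or permutation-based thresholds rather than a plain Bonferroni correction.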



Citations
Journal Article

Neural correlates of human action observation in hearing and deaf subjects

TL;DR: During human motion processing, deaf individuals may engage specialized neural systems that allow for rapid, online differentiation of meaningful linguistic actions from non-linguistic human movements.
Journal Article

Visual stimuli can impair auditory processing in cochlear implant users.

TL;DR: The ability to segregate conflicting auditory and visual information was assessed and significant differences were observed between the non-proficient cochlear implant users and their matched controls when the accompanying visual stimuli consisted of a moving random-dot pattern or incongruent lip movements.
Journal Article

Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications.

TL;DR: Recent data suggesting that a combination of the connectivity bias and sensitivity to task-distinctive features might account for TSSI plasticity in the sensory cortices as a whole are reviewed.
Journal Article

Visual speech circuits in profound acquired deafness: a possible role for latent multimodal connectivity.

TL;DR: It is suggested that functional compensation of sensory deprivation does not require slowly progressive colonization of superior temporal regions by visual inputs, but can exploit a switch to pre-existing latent multimodal connectivity.
Journal Article

Psycholinguistic, cognitive, and neural implications of bimodal bilingualism

TL;DR: The bimodal bilingual brain differs from the unimodal bilingual brain with respect to the degree and extent of neural overlap for the two languages, with less overlap for bimodal bilinguals.
References
Journal Article

Global, voxel, and cluster tests, by theory and permutation, for a difference between two groups of structural MR images of the brain

TL;DR: Almost entirely automated procedures are described for estimating global, voxel, and cluster-level statistics to test the null hypothesis of zero neuroanatomical difference between two groups of structural magnetic resonance imaging (MRI) data.
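The permutation strategy referred to in this reference can be illustrated generically: recompute a between-group statistic many times under random relabelling of subjects to build a null distribution, then compare the observed statistic against it. The sketch below uses hypothetical per-subject summary measures and a simple difference of means; it does not reproduce the cited method's specific global, voxel, or cluster statistics.

```python
import numpy as np

def permutation_p_value(group_a, group_b, n_perm=5000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    observed = np.mean(group_a) - np.mean(group_b)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                                  # random relabelling of subjects
        diff = pooled[:n_a].mean() - pooled[n_a:].mean()
        if abs(diff) >= abs(observed):
            exceed += 1
    return (exceed + 1) / (n_perm + 1)                       # add-one (conservative) correction

# Hypothetical regional grey-matter measures for two groups of subjects.
rng = np.random.default_rng(1)
group_1 = rng.normal(1.00, 0.20, size=12)
group_2 = rng.normal(1.15, 0.20, size=12)
print(f"permutation p-value: {permutation_p_value(group_1, group_2):.4f}")
```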
Journal Article

Activation of Auditory Cortex During Silent Lipreading

TL;DR: Three experiments suggest that auditory cortical areas are not engaged when an individual is viewing nonlinguistic facial movements but are activated by silent, meaningless speech-like movements (pseudospeech), supporting psycholinguistic evidence that seen speech influences the perception of heard speech at a prelexical stage.
Journal Article

Nonlinear event-related responses in fMRI

TL;DR: The theory and techniques of nonlinear system identification using Volterra series are described, and their implications for the design and analysis of event-related fMRI experiments are discussed.
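As a rough illustration of the Volterra-series idea behind nonlinear event-related models, the sketch below evaluates a response built from a first-order (linear convolution) term plus a second-order term that captures interactions between pairs of past inputs, which is how nonlinear effects such as saturation for closely spaced events can be expressed. The kernels and event train are hypothetical; this is not the cited paper's estimation procedure.

```python
import numpy as np

def volterra_response(x, h1, h2):
    """Second-order Volterra expansion:
    y[t] = sum_i h1[i] * x[t - i] + sum_{i,j} h2[i, j] * x[t - i] * x[t - j]."""
    T, K = len(x), len(h1)
    y = np.zeros(T)
    for t in range(T):
        # Lagged inputs x[t], x[t-1], ..., x[t-K+1], zero-padded before stimulus onset.
        lags = np.array([x[t - i] if t - i >= 0 else 0.0 for i in range(K)])
        y[t] = h1 @ lags + lags @ h2 @ lags
    return y

# Hypothetical kernels and a sparse event train (1.0 marks a stimulus onset).
K = 8
h1 = np.exp(-np.arange(K) / 3.0)           # first-order (linear) kernel
h2 = -0.05 * np.outer(h1, h1)              # second-order kernel modelling response saturation
events = np.zeros(60)
events[[5, 8, 30]] = 1.0                   # closely spaced events produce sub-additive responses
print(volterra_response(events, h1, h2).round(3))
```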