Neural systems underlying British Sign Language and audio‐visual English processing in native users
Mairéad MacSweeney, Bencie Woll, Ruth Campbell, Philip McGuire, Anthony S. David, Steven Williams, John Suckling, Gemma A. Calvert, Michael Brammer
TLDR
This first neuroimaging study of the perception of British Sign Language (BSL) measured brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. The findings suggest that left-temporal auditory regions may be privileged for processing heard speech even in hearing native signers.
Abstract
In order to understand the evolution of human language, it is necessary to explore the neural systems that support language processing in its many forms. In particular, it is informative to separate those mechanisms that may have evolved for sensory processing (hearing) from those that have evolved to represent events and actions symbolically (language). To what extent are the brain systems that support language processing shaped by auditory experience and to what extent by exposure to language, which may not necessarily be acoustically structured? In this first neuroimaging study of the perception of British Sign Language (BSL), we explored these questions by measuring brain activation using functional MRI in nine hearing and nine congenitally deaf native users of BSL while they performed a BSL sentence-acceptability task. Eight hearing, non-signing subjects performed an analogous task that involved audio-visual English sentences. The data support the argument that there are both modality-independent and modality-dependent language localization patterns in native users. In relation to modality-independent patterns, regions activated by both BSL in deaf signers and by spoken English in hearing non-signers included inferior prefrontal regions bilaterally (including Broca's area) and superior temporal regions bilaterally (including Wernicke's area). Lateralization patterns were similar for the two languages. There was no evidence of enhanced right-hemisphere recruitment for BSL processing in comparison with audio-visual English. In relation to modality-specific patterns, audio-visual speech in hearing subjects generated greater activation in the primary and secondary auditory cortices than BSL in deaf signers, whereas BSL generated enhanced activation in the posterior occipito-temporal regions (V5), reflecting the greater movement component of BSL. 
The influence of hearing status on the recruitment of sign language processing systems was explored by comparing deaf and hearing adults who had BSL as their first language (native signers). Deaf native signers demonstrated greater activation in the left superior temporal gyrus in response to BSL than hearing native signers. This important finding suggests that left-temporal auditory regions may be privileged for processing heard speech even in hearing native signers. However, in the absence of auditory input this region can be recruited for visual processing.
Citations
Journal Article
An fMRI study of perception and action in deaf signers.
Kayoko Okada, Corianne Rogalsky, Lucinda O'Grady, Leila Hanaumi, Ursula Bellugi, David P. Corina, Gregory Hickok
TL;DR: It is concluded that the activation in Broca's area during ASL observation is not causally related to sign language understanding, challenging the claim that the motor system participates in language perception.
Journal Article
Is speech and language therapy meeting the needs of language minorities? The case of deaf people with neurological impairments.
TL;DR: The results suggest that many Deaf people are not gaining access to SLT after neurological impairment, and the instigation of a national team specializing in BSL impairments is recommended.
Signs in the brain: Hearing signers’ cross-linguistic semantic integration strategies
TL;DR: This neurolinguistic study aimed to establish basic knowledge about semantic integration mechanisms across speech and sign language in hearing native and non-native signers, using electrocortical brain activation and behavioral decisions in three groups of participants: hearing native signers (children of deaf adults, CODAs), hearing late-learned signers, and hearing non-signing controls.
Journal Article
Language lateralization of hearing native signers: A functional transcranial Doppler sonography (fTCD) study of speech and sign production.
Eva Gutierrez-Sigut, Richard E. Daws, Heather Payne, Jonathan Blott, Chloë Marshall, Mairéad MacSweeney
TL;DR: The current data demonstrate stronger left hemisphere lateralization for producing signs than speech, which was not primarily driven by motoric articulatory demands.
Journal Article
Poststroke Aphasia Rehabilitation: Why All Talk and No Action?
TL;DR: A combinatorial hand-arm-language paradigm that capitalizes on shared neural networks may therefore prove beneficial for aphasia recovery in stroke patients and warrants further exploration.
References
Journal Article
Co-Planar Stereotaxic Atlas of the Human Brain: 3-Dimensional Proportional System. An Approach to Cerebral Imaging. J. Talairach and P. Tournoux. Georg Thieme Verlag, New York (1988), 122 pp., 130 figs.
Journal Article
Global, voxel, and cluster tests, by theory and permutation, for a difference between two groups of structural MR images of the brain
Edward T. Bullmore, John Suckling, S. Overmeyer, Sophia Rabe-Hesketh, Eric Taylor, Michael Brammer
TL;DR: Almost entirely automated procedures for estimation of global, voxel, and cluster-level statistics to test the null hypothesis of zero neuroanatomical difference between two groups of structural magnetic resonance imaging (MRI) data are described.
Journal Article
Activation of Auditory Cortex During Silent Lipreading
Gemma A. Calvert, Edward T. Bullmore, Michael Brammer, Ruth Campbell, Steven Williams, Philip McGuire, Peter W.R. Woodruff, Susan D. Iversen, Anthony S. David
TL;DR: Three experiments suggest that these auditory cortical areas are not engaged when an individual is viewing nonlinguistic facial movements but appear to be activated by silent meaningless speechlike movements (pseudospeech), which supports psycholinguistic evidence that seen speech influences the perception of heard speech at a prelexical stage.
Journal Article
Nonlinear event-related responses in fMRI
TL;DR: The theory and techniques of nonlinear system identification using Volterra series are described, and the implications for experimental design and analysis are discussed.