Journal ArticleDOI

The signing brain: the neurobiology of sign language

01 Nov 2008-Trends in Cognitive Sciences (Elsevier)-Vol. 12, Iss: 11, pp 432-440
TL;DR: The authors found that the neural systems supporting signed and spoken languages are very similar: both involve a predominantly left-lateralised perisylvian network. But they also highlighted processing differences between languages in these different modalities.
About: This article was published in Trends in Cognitive Sciences on 2008-11-01 and has received 226 citations to date. It focuses on the topics: Spoken language & Manually coded language.
Citations
Journal ArticleDOI
TL;DR: It is found that in congenitally blind individuals, the left visual cortex behaves similarly to classic language regions; the authors conclude that brain regions thought to have evolved for vision can take on language processing as a result of early experience.
Abstract: Humans are thought to have evolved brain regions in the left frontal and temporal cortex that are uniquely capable of language processing. However, congenitally blind individuals also activate the visual cortex in some verbal tasks. We provide evidence that this visual cortex activity in fact reflects language processing. We find that in congenitally blind individuals, the left visual cortex behaves similarly to classic language regions: (i) BOLD signal is higher during sentence comprehension than during linguistically degraded control conditions that are more difficult; (ii) BOLD signal is modulated by phonological information, lexical semantic information, and sentence-level combinatorial structure; and (iii) functional connectivity with language regions in the left prefrontal cortex and thalamus is increased relative to sighted individuals. We conclude that brain regions that are thought to have evolved for vision can take on language processing as a result of early experience. Innate microcircuit properties are not necessary for a brain region to become involved in language processing.

367 citations


Cites background from "The signing brain: the neurobiology..."

  • ...During language processing, speakers of English, Mandarin, and sign languages activate a left-lateralized network of brain regions in the prefrontal, lateral temporal, and temporoparietal cortices (5, 6)....


Journal ArticleDOI
14 Jun 2013-Science
TL;DR: It is argued that the evidence for the endophenotype account is unconvincing, not least because there is little support for strong genetic influences on individual differences in cerebral asymmetry.
Abstract: In most people, language is processed predominantly by the left hemisphere of the brain, but we do not know how or why. A popular view is that developmental language disorders result from a poorly lateralized brain, but until recently, evidence has been weak and indirect. Modern neuroimaging methods have made it possible to study normal and abnormal development of lateralized function in the developing brain and have confirmed links with language and literacy impairments. However, there is little evidence that weak cerebral lateralization has common genetic origins with language and literacy impairments. Our understanding of the association between atypical language lateralization and developmental disorders may benefit if we reconceptualize the nature of cerebral asymmetry to recognize its multidimensionality and consider variation in lateralization over developmental time. Contrary to popular belief, cerebral lateralization may not be a highly heritable, stable characteristic of individuals; rather, weak lateralization may be a consequence of impaired language learning.

310 citations


Cites background from "The signing brain: the neurobiology..."

  • ...lateralization to the processing of a spoken language (84)....


Journal ArticleDOI
TL;DR: It is suggested that these anterior and posterior perisylvian areas, identified since the mid-19th century as the core of the brain's language system, are not in fact committed to language processing, but may function as a modality-independent semiotic system that plays a broader role in human communication, linking meaning with symbols.
Abstract: Symbolic gestures, such as pantomimes that signify actions (e.g., threading a needle) or emblems that facilitate social transactions (e.g., finger to lips indicating “be quiet”), play an important role in human communication. They are autonomous, can fully take the place of words, and function as complete utterances in their own right. The relationship between these gestures and spoken language remains unclear. We used functional MRI to investigate whether these two forms of communication are processed by the same system in the human brain. Responses to symbolic gestures, to their spoken glosses (expressing the gestures' meaning in English), and to visually and acoustically matched control stimuli were compared in a randomized block design. General Linear Models (GLM) contrasts identified shared and unique activations and functional connectivity analyses delineated regional interactions associated with each condition. Results support a model in which bilateral modality-specific areas in superior and inferior temporal cortices extract salient features from vocal-auditory and gestural-visual stimuli respectively. However, both classes of stimuli activate a common, left-lateralized network of inferior frontal and posterior temporal regions in which symbolic gestures and spoken words may be mapped onto common, corresponding conceptual representations. We suggest that these anterior and posterior perisylvian areas, identified since the mid-19th century as the core of the brain's language system, are not in fact committed to language processing, but may function as a modality-independent semiotic system that plays a broader role in human communication, linking meaning with symbols whether these are words, gestures, images, sounds, or objects.

240 citations


Cites background from "The signing brain: the neurobiology..."

  • ...The unanimous interpretation, reinforced by the sign aphasia literature (6), has been that the perisylvian cortices process languages that possess this canonical, rule-based structure, independent of the modality in which they are expressed....


  • ...Over the past decade, neuroimaging studies have clearly and reproducibly demonstrated that American Sign Language, British Sign Language, Langue des Signes Québécoise, indeed all sign languages studied thus far, elicit patterns of activity in core perisylvian areas that are, for the most part, indistinguishable from those accompanying the production and comprehension of spoken language (6)....


  • ...Similar patterns of activation are also characteristically observed for comprehension of signed language (6) (see SI Note 3), and lesions of these regions typically result in aphasia for both spoken and signed languages (6, 18)....


  • ...The IFG plays an unambiguous role in language processing and damage to the area typically results in spoken and sign language aphasia (6, 13, 14)....


  • ..., comprehension of words or of simple syntactic structures (6, 13, 14)....


Journal ArticleDOI
TL;DR: This paper briefly summarizes some of the issues that form the background for talks presented in a symposium at the Annual Meeting of the Society for Neuroscience, aiming to describe promising new areas of investigation in which the neurosciences intersect with linguistic research more closely than before.
Abstract: Theoretical advances in language research and the availability of increasingly high-resolution experimental techniques in the cognitive neurosciences are profoundly changing how we investigate and conceive of the neural basis of speech and language processing. Recent work closely aligns language research with issues at the core of systems neuroscience, ranging from neurophysiological and neuroanatomic characterizations to questions about neural coding. Here we highlight, across different aspects of language processing (perception, production, sign language, meaning construction), new insights and approaches to the neurobiology of language, aiming to describe promising new areas of investigation in which the neurosciences intersect with linguistic research more closely than before. This paper summarizes in brief some of the issues that constitute the background for talks presented in a symposium at the Annual Meeting of the Society for Neuroscience. It is not a comprehensive review of any of the issues that are discussed in the symposium.

230 citations


Cites methods from "The signing brain: the neurobiology..."

  • ...Modality dependence and independence: the perspective of sign languages. The study of sign languages has provided a powerful tool for investigating the neurobiology of human language (for review, see Emmorey, 2002; MacSweeney et al., 2008; Emmorey and McCullough, 2009)....


Journal ArticleDOI
TL;DR: Cognitive Hearing Science is illustrated in research on three general topics: (1) language processing in challenging listening conditions; (2) use of auditory communication technologies or the visual modality to boost performance; (3) changes in performance with development, aging, and rehabilitative training.
Abstract: Cognitive Hearing Science or Auditory Cognitive Science is an emerging field of interdisciplinary research concerning the interactions between hearing and cognition. It follows a trend over the last half century for interdisciplinary fields to develop, beginning with Neuroscience, then Cognitive Science, then Cognitive Neuroscience, and then Cognitive Vision Science. A common theme is that an interdisciplinary approach is necessary to understand complex human behaviors, to develop technologies incorporating knowledge of these behaviors, and to find solutions for individuals with impairments that undermine typical behaviors. Accordingly, researchers in traditional academic disciplines, such as Psychology, Physiology, Linguistics, Philosophy, Anthropology, and Sociology benefit from collaborations with each other, and with researchers in Computer Science and Engineering working on the design of technologies, and with health professionals working with individuals who have impairments. The factors that triggered the emergence of Cognitive Hearing Science include the maturation of the component disciplines of Hearing Science and Cognitive Science, new opportunities to use complex digital signal-processing to design technologies suited to performance in challenging everyday environments, and increasing social imperatives to help people whose communication problems span hearing and cognition. Cognitive Hearing Science is illustrated in research on three general topics: (1) language processing in challenging listening conditions; (2) use of auditory communication technologies or the visual modality to boost performance; (3) changes in performance with development, aging, and rehabilitative training. Future directions for modeling and the translation of research into practice are suggested.

212 citations

References
Journal ArticleDOI
TL;DR: As discussed in this paper, the mirror-neuron mechanism appears to play a fundamental role in both action understanding and imitation, a faculty that underlies human culture and the ability to learn by imitation.
Abstract: A category of stimuli of great importance for primates, humans in particular, is that formed by actions done by other individuals. If we want to survive, we must understand the actions of others. Furthermore, without action understanding, social organization is impossible. In the case of humans, there is another faculty that depends on the observation of others’ actions: imitation learning. Unlike most species, we are able to learn by imitation, and this faculty is at the basis of human culture. In this review we present data on a neurophysiological mechanism—the mirror-neuron mechanism—that appears to play a fundamental role in both action understanding and imitation. We describe first the functional properties of mirror neurons in monkeys. We review next the characteristics of the mirror-neuron system in humans. We stress, in particular, those properties specific to the human mirror-neuron system that might explain the human capacity to learn by imitation. We conclude by discussing the relationship between the mirror-neuron system and language.

3,161 citations

Journal ArticleDOI
TL;DR: A motor theory of speech perception, initially proposed to account for results of early experiments with synthetic speech, is now extensively revised to accommodate recent findings, and to relate the assumptions of the theory to those that might be made about other perceptual modes.

2,523 citations

Book
01 Jan 1979
TL;DR: This book examines the two faces of sign and the structure of sign language; the authors compare Chinese and American signs, present a feature analysis of handshapes, and compare the rates of speaking and signing.
Abstract (Table of Contents):
Introduction
PART I: The Two Faces of Sign — 1. Iconicity in Signs and Signing; 2. Properties of Symbols in a Silent Language; 3. Historical Change: From Iconic to Arbitrary
PART II: The Structure of the Sign — 4. Remembering without Words: Manual Memory; 5. Slips of the Hands; 6. A Comparison of Chinese and American Signs; 7. A Feature Analysis of Handshapes; 8. The Rate of Speaking and Signing
PART III: Grammatical Processes — 9. On the Creation of New Lexical Items by Compounding; 10. Linguistic Expression of Category Levels; 11. Aspectual Modulations on Adjectival Predicates; 12. The Structured Use of Space and Movement: Morphological Processes
PART IV: The Heightened Use of Language — 13. Wit and Plays on Signs; 14. Poetry and Song in a Language without Sound
Appendix A: Notation; Appendix B: Conventions Employed in Illustrations; Notes; References; Index

1,598 citations

Book
01 Jan 2004
TL;DR: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration with broad underlying principles that govern this interaction, regardless of the specific senses involved.
Abstract: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration. After many years of using a modality-specific "sense-by-sense" approach, researchers across different disciplines in neuroscience and psychology now recognize that perception is fundamentally a multisensory experience. To understand how the brain synthesizes information from the different senses, we must study not only how information from each sensory modality is decoded but also how this information interacts with the sensory processing taking place within other sensory channels. The findings cited in The Handbook of Multisensory Processes suggest that there are broad underlying principles that govern this interaction, regardless of the specific senses involved. The book is organized thematically into eight sections; each of the 55 chapters presents a state-of-the-art review of its topic by leading researchers in the field. The key themes addressed include multisensory contributions to perception in humans; whether the sensory integration involved in speech perception is fundamentally different from other kinds of multisensory integration; multisensory processing in the midbrain and cortex in model species, including rat, cat, and monkey; behavioral consequences of multisensory integration; modern neuroimaging techniques, including EEG, PET, and fMRI, now being used to reveal the many sites of multisensory processing in the brain; multisensory processes that require postnatal sensory experience to emerge, with examples from multiple species; brain specialization and possible equivalence of brain regions; and clinical studies of such breakdowns of normal sensory integration as brain damage and synesthesia.

1,026 citations