SciSpace (formerly Typeset)
Author

Barry E. Stein

Bio: Barry E. Stein is an academic researcher. The author has contributed to research in the topics of Perception and Synesthesia. The author has an h-index of 1 and has co-authored 1 publication, which has received 1,023 citations.

Papers
Book
01 Jan 2004
TL;DR: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration with broad underlying principles that govern this interaction, regardless of the specific senses involved.
Abstract: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration. After many years of using a modality-specific "sense-by-sense" approach, researchers across different disciplines in neuroscience and psychology now recognize that perception is fundamentally a multisensory experience. To understand how the brain synthesizes information from the different senses, we must study not only how information from each sensory modality is decoded but also how this information interacts with the sensory processing taking place within other sensory channels. The findings cited in The Handbook of Multisensory Processes suggest that there are broad underlying principles that govern this interaction, regardless of the specific senses involved. The book is organized thematically into eight sections; each of the 55 chapters presents a state-of-the-art review of its topic by leading researchers in the field. The key themes addressed include multisensory contributions to perception in humans; whether the sensory integration involved in speech perception is fundamentally different from other kinds of multisensory integration; multisensory processing in the midbrain and cortex in model species, including rat, cat, and monkey; behavioral consequences of multisensory integration; modern neuroimaging techniques, including EEG, PET, and fMRI, now being used to reveal the many sites of multisensory processing in the brain; multisensory processes that require postnatal sensory experience to emerge, with examples from multiple species; brain specialization and possible equivalence of brain regions; and clinical studies of such breakdowns of normal sensory integration as brain damage and synesthesia.

1,026 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, the authors analyzed 120 functional neuroimaging studies focusing on semantic processing and identified reliable areas of activation in these studies using the activation likelihood estimate (ALE) technique. These activations formed a distinct, left-lateralized network comprising 7 regions: posterior inferior parietal lobe, middle temporal gyrus, fusiform and parahippocampal gyri, dorsomedial prefrontal cortex, inferior frontal gyrus, ventromedial prefrontal cortex, and posterior cingulate gyrus.
Abstract: Semantic memory refers to knowledge about people, objects, actions, relations, self, and culture acquired through experience. The neural systems that store and retrieve this information have been studied for many years, but a consensus regarding their identity has not been reached. Using strict inclusion criteria, we analyzed 120 functional neuroimaging studies focusing on semantic processing. Reliable areas of activation in these studies were identified using the activation likelihood estimate (ALE) technique. These activations formed a distinct, left-lateralized network comprised of 7 regions: posterior inferior parietal lobe, middle temporal gyrus, fusiform and parahippocampal gyri, dorsomedial prefrontal cortex, inferior frontal gyrus, ventromedial prefrontal cortex, and posterior cingulate gyrus. Secondary analyses showed specific subregions of this network associated with knowledge of actions, manipulable artifacts, abstract concepts, and concrete concepts. The cortical regions involved in semantic processing can be grouped into 3 broad categories: posterior multimodal and heteromodal association cortex, heteromodal prefrontal cortex, and medial limbic regions. The expansion of these regions in the human relative to the nonhuman primate brain may explain uniquely human capacities to use language productively, plan, solve problems, and create cultural and technological artifacts, all of which depend on the fluid and efficient retrieval and manipulation of semantic knowledge.

3,283 citations

Journal ArticleDOI
TL;DR: This review critically summarizes the main challenges linked to lifelong learning for artificial learning systems and compares existing neural network approaches that alleviate, to different extents, catastrophic forgetting.

2,095 citations

Journal ArticleDOI
TL;DR: This review examines the notion that neocortical operations are essentially multisensory, a view that forces us to abandon the idea that the senses ever operate independently during real-world cognition.

1,332 citations

Journal ArticleDOI
TL;DR: The literature reviewed here supports the view that crossmodal correspondences need to be considered, alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.
Abstract: In many everyday situations, our senses are bombarded by many different unisensory signals at any given time. To gain the most veridical, and least variable, estimate of environmental stimuli/properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping those estimates belonging to different objects or events separate. How, though, does the brain “know” which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the roles that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide yet another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities. For example, people consistently match high-pitched sounds with small, bright objects that are located high up in space. The literature reviewed here supports the view that crossmodal correspondences need to be considered alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.

1,133 citations

Journal ArticleDOI
TL;DR: Understanding of the acquisition and use of multisensory integration in the midbrain and cerebral cortex of mammals has been aided by a multiplicity of approaches; this review also examines some of the challenging questions that remain.
Abstract: Multisensory integration allows information from multiple senses to be combined, with benefits for nervous-system processing. Stein and Stanford discuss the principles of multisensory integration in single neurons in the CNS and consider the questions that the field must address.

1,110 citations