Journal ArticleDOI

Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons

TL;DR: The richness of perceptual experience, as well as its usefulness for guiding behaviour, depends on the synthesis of information across multiple senses; recent studies of visual–vestibular cue integration have considerably narrowed the gap between psychophysical and physiological accounts of how the brain combines sensory cues.
Abstract: The richness of perceptual experience, as well as its usefulness for guiding behaviour, depends on the synthesis of information across multiple senses. Recent decades have witnessed a surge in our understanding of how the brain combines sensory cues. Much of this research has been guided by one of two distinct approaches: one is driven primarily by neurophysiological observations, and the other is guided by principles of mathematical psychology and psychophysics. Conflicting results and interpretations have contributed to a conceptual gap between psychophysical and physiological accounts of cue integration, but recent studies of visual-vestibular cue integration have narrowed this gap considerably.
Citations
Journal ArticleDOI
07 Oct 2015-Neuron
TL;DR: This work proposes that BSC (bodily self-consciousness) includes body-centered perception (hand, face, and trunk), based on the integration of proprioceptive, vestibular, and visual bodily inputs, and involves spatio-temporal mechanisms integrating multisensory bodily stimuli within peripersonal space (PPS).

475 citations

Journal ArticleDOI
30 Apr 2015-Nature
TL;DR: Combining mechanosensory and nociceptive cues synergistically enhances the selection of the fastest mode of escape locomotion in Drosophila larvae; the authors propose that the multilevel multimodal convergence architecture may be a general feature of multisensory circuits, enabling complex input–output functions and selective tuning to ecologically relevant combinations of cues.
Abstract: Natural events present multiple types of sensory cues, each detected by a specialized sensory modality. Combining information from several modalities is essential for the selection of appropriate actions. Key to understanding multimodal computations is determining the structural patterns of multimodal convergence and how these patterns contribute to behaviour. Modalities could converge early, late or at multiple levels in the sensory processing hierarchy. Here we show that combining mechanosensory and nociceptive cues synergistically enhances the selection of the fastest mode of escape locomotion in Drosophila larvae. In an electron microscopy volume that spans the entire insect nervous system, we reconstructed the multisensory circuit supporting the synergy, spanning multiple levels of the sensory processing hierarchy. The wiring diagram revealed a complex multilevel multimodal convergence architecture. Using behavioural and physiological studies, we identified functionally connected circuit nodes that trigger the fastest locomotor mode, and others that facilitate it, and we provide evidence that multiple levels of multimodal integration contribute to escape mode selection. We propose that the multilevel multimodal convergence architecture may be a general feature of multisensory circuits enabling complex input-output functions and selective tuning to ecologically relevant combinations of cues.

417 citations

Dissertation
01 Dec 2014

289 citations

Journal ArticleDOI
TL;DR: Neurons in a newborn's brain are not capable of multisensory integration; studies in the midbrain show that the development of this capability is not predetermined but depends critically on cross-modal experience.
Abstract: The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes multisensory integrative capabilities for the environment in which the animal will function.

280 citations

Journal ArticleDOI
TL;DR: Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex and unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition.
Abstract: To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
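The hierarchy described here corresponds to the standard Bayesian Causal Inference model of Körding et al. (2007), in which a forced-fusion estimate and a segregation estimate are mixed according to the posterior probability that the two signals share a common cause. A minimal sketch of that model in Python, assuming Gaussian likelihoods and a zero-mean Gaussian spatial prior (the parameter values are illustrative, not taken from this study):

```python
import numpy as np

def bci_localize(x_a, x_v, sig_a=2.0, sig_v=1.0, sig_p=10.0, p_common=0.5):
    """Model-averaged auditory location estimate under Bayesian Causal
    Inference (Gaussian likelihoods, zero-mean Gaussian spatial prior)."""
    va, vv, vp = sig_a**2, sig_v**2, sig_p**2

    # C = 1 (common source): forced fusion of both cues with the prior.
    s_fused = (x_a/va + x_v/vv) / (1/va + 1/vv + 1/vp)
    # C = 2 (independent sources): audition is combined with the prior only.
    s_seg = (x_a/va) / (1/va + 1/vp)

    # Marginal likelihoods of the pair of measurements under each causal
    # structure (closed-form Gaussians).
    z1 = va*vv + va*vp + vv*vp
    like_c1 = np.exp(-0.5*((x_v - x_a)**2*vp + x_v**2*va + x_a**2*vv)/z1) \
              / (2*np.pi*np.sqrt(z1))
    like_c2 = np.exp(-0.5*(x_v**2/(vv+vp) + x_a**2/(va+vp))) \
              / (2*np.pi*np.sqrt((vv+vp)*(va+vp)))

    # Posterior probability of a common cause, then model averaging.
    post_c1 = like_c1*p_common / (like_c1*p_common + like_c2*(1 - p_common))
    return post_c1*s_fused + (1 - post_c1)*s_seg

print(bci_localize(x_a=3.0, x_v=1.0))   # small conflict: fused, pulled toward vision
print(bci_localize(x_a=15.0, x_v=1.0))  # large conflict: near-segregated estimate
```

The three stages in the fMRI results map onto the three quantities computed here: s_seg (segregation in sensory areas), s_fused (forced fusion in posterior intraparietal sulcus), and the final model-averaged estimate (anterior intraparietal sulcus).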

264 citations

References
Book
01 Jan 1966
TL;DR: This book develops statistical decision theory as an account of sensory processes, providing the foundations of signal detection theory in psychophysics.
Abstract: A book on statistical decision theory and sensory processes, foundational to signal detection theory and psychophysics.
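For readers unfamiliar with the framework, its central quantity is the sensitivity index d′, which separates the detectability of a signal from the observer's response bias. A minimal illustration under the standard equal-variance Gaussian model (the hit and false-alarm rates below are made up for demonstration):

```python
from statistics import NormalDist

# Equal-variance Gaussian signal detection: d' is the distance between the
# signal and noise distributions in standard-deviation units, independent of
# where the observer places the decision criterion.
hits, false_alarms = 0.84, 0.16
z = NormalDist().inv_cdf              # probit (inverse-CDF) transform
d_prime = z(hits) - z(false_alarms)   # ~1.99: signal is readily detectable
criterion = -0.5 * (z(hits) + z(false_alarms))  # ~0: unbiased observer
print(round(d_prime, 2), round(criterion, 2))
```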

11,820 citations

Journal ArticleDOI
24 Jan 2002-Nature
TL;DR: The nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator, and this model behaved very similarly to humans in a visual–haptic task.
Abstract: When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
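The maximum-likelihood integrator described here has a simple closed form: each cue is weighted by its inverse variance, and the combined estimate is never more variable than either cue alone. A minimal sketch with illustrative numbers (not the paper's data):

```python
def mle_combine(est_v, var_v, est_h, var_h):
    """Inverse-variance (maximum-likelihood) combination of a visual and a
    haptic estimate of the same property, e.g. object height."""
    w_v = var_h / (var_v + var_h)            # weight on vision
    w_h = var_v / (var_v + var_h)            # weight on haptics
    est = w_v*est_v + w_h*est_h
    var = var_v*var_h / (var_v + var_h)      # always <= min(var_v, var_h)
    return est, var

# Vision is the more reliable cue here, so it dominates the combined estimate.
print(mle_combine(est_v=55.0, var_v=1.0, est_h=53.0, var_h=4.0))  # (54.6, 0.8)
```

In these terms, the visual dominance reported in the paper is simply the case where the visual variance is lower than the haptic variance, so the visual weight is larger.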

4,142 citations

Book
01 Jan 1950

3,843 citations

Journal ArticleDOI
TL;DR: This review highlights how noise affects neuronal networks, the principles the nervous system uses to counter its detrimental effects, and noise's potential benefits.
Abstract: Noise — random disturbances of signals — poses a fundamental problem for information processing and affects all aspects of nervous-system function. However, the nature, amount and impact of noise in the nervous system have only recently been addressed in a quantitative manner. Experimental and computational methods have shown that multiple noise sources contribute to cellular and behavioural trial-to-trial variability. We review the sources of noise in the nervous system, from the molecular to the behavioural level, and show how noise contributes to trial-to-trial variability. We highlight how noise affects neuronal networks and the principles the nervous system applies to counter detrimental effects of noise, and briefly discuss noise's potential benefits.
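As a toy illustration of the trial-to-trial variability the review quantifies, spike counts are commonly modeled as Poisson, so the count variance tracks the mean (Fano factor near 1) and single trials give noisy rate estimates; averaging across trials is one way detrimental effects of noise are countered. A sketch under that common modeling assumption (not code from the review):

```python
import numpy as np

rng = np.random.default_rng(0)
rate, window, trials = 20.0, 0.5, 200     # spikes/s, seconds, repeats

# Poisson spiking: variance equals the mean, so any single trial is a
# noisy estimate of the underlying firing rate.
counts = rng.poisson(rate*window, size=trials)
print(counts.var() / counts.mean())       # Fano factor, ~1.0
print(counts[0] / window)                 # one trial: noisy rate estimate
print(counts.mean() / window)             # trial average: close to 20 spikes/s
```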

2,350 citations