Journal ArticleDOI

The Ventriloquist Effect Results from Near-Optimal Bimodal Integration

03 Feb 2004-Current Biology (Cell Press)-Vol. 14, Iss: 3, pp 257-262
TL;DR: This study investigates spatial localization of audio-visual stimuli and finds that while vision normally captures sound, for severely blurred visual stimuli the reverse holds: sound captures vision; for intermediate blurs, neither sense dominates and perception follows the mean position.
About: This article is published in Current Biology. The article was published on 2004-02-03 and is currently open access. It has received 1,642 citations to date. The article focuses on the topics: Visual capture & Visual perception.
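The "near-optimal bimodal integration" in the title refers to the standard maximum-likelihood cue-combination rule (stated here from the general literature, not quoted from this page): the bimodal position estimate is an inverse-variance weighted average of the unimodal estimates,

```latex
\hat{S}_{VA} = w_V \hat{S}_V + w_A \hat{S}_A,
\qquad
w_V = \frac{\sigma_V^{-2}}{\sigma_V^{-2} + \sigma_A^{-2}},
\qquad
w_A = \frac{\sigma_A^{-2}}{\sigma_V^{-2} + \sigma_A^{-2}}.
```

Sharp visual stimuli (small $\sigma_V$) thus dominate the combined percept, while blurring raises $\sigma_V$ until the auditory estimate takes over.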
Citations
Journal ArticleDOI
TL;DR: This target article critically examines this "hierarchical prediction machine" approach, concluding that it offers the best clue yet to the shape of a unified science of mind and action.
Abstract: Brains, it has recently been argued, are essentially prediction machines. They are bundles of cells that support perception and action by constantly attempting to match incoming sensory inputs with top-down expectations or predictions. This is achieved using a hierarchical generative model that aims to minimize prediction error within a bidirectional cascade of cortical processing. Such accounts offer a unifying model of perception and action, illuminate the functional role of attention, and may neatly capture the special contribution of cortical processing to adaptive success. This target article critically examines this "hierarchical prediction machine" approach, concluding that it offers the best clue yet to the shape of a unified science of mind and action. Sections 1 and 2 lay out the key elements and implications of the approach. Section 3 explores a variety of pitfalls and challenges, spanning the evidential, the methodological, and the more properly conceptual. The paper ends (sections 4 and 5) by asking how such approaches might impact our more general vision of mind, experience, and agency.
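The prediction-error minimization the abstract describes can be sketched in a deliberately minimal single-unit form (an illustrative toy, not code from the target article): a unit holds an estimate of a hidden cause, generates a top-down prediction of the input, and adjusts its estimate by gradient descent on the squared prediction error.

```python
# Minimal one-unit predictive-coding sketch (illustrative assumption, not
# the article's model): the unit's estimate `mu` predicts the input as
# w * mu and is updated to reduce the bottom-up prediction error.

def settle(x, w=2.0, lr=0.05, steps=200):
    mu = 0.0                      # top-down estimate of the hidden cause
    for _ in range(steps):
        eps = x - w * mu          # bottom-up prediction error
        mu += lr * w * eps        # gradient step on squared error
    return mu

mu = settle(1.0)                  # converges toward x / w = 0.5
print(round(mu, 3))
```

A full hierarchical model stacks such units, with each level's estimate serving as the input predicted by the level above — the "bidirectional cascade" the abstract describes.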

3,640 citations


Cites background from "The Ventriloquist Effect Results fr..."

  • ...This is an important finding that has now been repeated in many domains, including the sound-induced flash illusion (Shams and Beierholm (2005)), ventriloquism effects (Alais and Burr (2004)) and the impact of figure-ground convexity cues in depth perception (Burge et al (2010))....


Journal ArticleDOI
TL;DR: How noise affects neuronal networks and the principles the nervous system applies to counter detrimental effects of noise are highlighted, and noise's potential benefits are discussed.
Abstract: Noise — random disturbances of signals — poses a fundamental problem for information processing and affects all aspects of nervous-system function. However, the nature, amount and impact of noise in the nervous system have only recently been addressed in a quantitative manner. Experimental and computational methods have shown that multiple noise sources contribute to cellular and behavioural trial-to-trial variability. We review the sources of noise in the nervous system, from the molecular to the behavioural level, and show how noise contributes to trial-to-trial variability. We highlight how noise affects neuronal networks and the principles the nervous system applies to counter detrimental effects of noise, and briefly discuss noise's potential benefits.
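The link between noise and behavioural trial-to-trial variability can be illustrated with a toy detector (an assumption for illustration, not a model from the review): additive sensor noise turns a hard detection threshold into probabilistic responses, so identical stimuli yield variable outcomes across trials.

```python
import numpy as np

# Illustrative sketch: Gaussian noise added to a fixed signal makes a
# threshold detector respond variably from trial to trial.

rng = np.random.default_rng(0)

def hit_rate(signal, threshold=1.0, sigma=0.5, trials=10_000):
    """Fraction of trials on which signal + noise exceeds threshold."""
    noisy = signal + rng.normal(0.0, sigma, trials)
    return float(np.mean(noisy > threshold))

# Near threshold, responses split roughly half-and-half across trials.
for s in (0.5, 1.0, 1.5):
    print(s, hit_rate(s))
```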

2,350 citations

Journal ArticleDOI
TL;DR: A major challenge for neuroscientists is to test experimentally how this might be achieved in populations of neurons, and so to determine whether and how neurons code information about sensory uncertainty.

2,067 citations

Journal ArticleDOI
TL;DR: It is shown that, depending on the type of information, different combination and integration strategies are used and that prior knowledge is often required for interpreting the sensory signals.

1,628 citations

Journal ArticleDOI
TL;DR: The literature reviewed here supports the view that crossmodal correspondences need to be considered alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.
Abstract: In many everyday situations, our senses are bombarded by many different unisensory signals at any given time. To gain the most veridical, and least variable, estimate of environmental stimuli/properties, we need to combine the individual noisy unisensory perceptual estimates that refer to the same object, while keeping those estimates belonging to different objects or events separate. How, though, does the brain “know” which stimuli to combine? Traditionally, researchers interested in the crossmodal binding problem have focused on the roles that spatial and temporal factors play in modulating multisensory integration. However, crossmodal correspondences between various unisensory features (such as between auditory pitch and visual size) may provide yet another important means of constraining the crossmodal binding problem. A large body of research now shows that people exhibit consistent crossmodal correspondences between many stimulus features in different sensory modalities. For example, people consistently match high-pitched sounds with small, bright objects that are located high up in space. The literature reviewed here supports the view that crossmodal correspondences need to be considered alongside semantic and spatiotemporal congruency, among the key constraints that help our brains solve the crossmodal binding problem.

1,133 citations

References
Book
01 Jan 1993
TL;DR: This book presents bootstrap methods for estimation using simple arguments and provides Minitab macros for implementing them.
Abstract: This article presents bootstrap methods for estimation, using simple arguments. Minitab macros for implementing these methods are given.
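The bootstrap idea summarized above — resample the data with replacement and use the spread of the resampled statistic as its standard error — can be sketched briefly (the book's own examples use Minitab macros; this Python version is just illustrative):

```python
import random
import statistics

# Minimal bootstrap sketch: estimate the standard error of the sample
# mean by resampling the data with replacement many times.

def bootstrap_se(data, n_boot=2000, seed=0):
    rng = random.Random(seed)
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]  # sample with replacement
        means.append(statistics.fmean(resample))
    return statistics.stdev(means)                   # bootstrap SE of the mean

data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7]     # hypothetical sample
print(round(bootstrap_se(data), 3))
```

The same resampling loop works for any statistic — median, correlation, regression slope — which is what makes the method so general.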

37,183 citations

Journal ArticleDOI
24 Jan 2002-Nature
TL;DR: The nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator, and this model behaved very similarly to humans in a visual–haptic task.
Abstract: When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
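The maximum-likelihood integrator described in the abstract can be checked numerically (illustrative variances, not the paper's measured data): each cue gives an unbiased but noisy estimate, and the optimal combination weights each cue by its inverse variance, yielding a combined estimate less variable than either cue alone.

```python
import numpy as np

# Numerical sketch of maximum-likelihood (inverse-variance weighted)
# cue combination with hypothetical visual and haptic variances.

rng = np.random.default_rng(1)

def mle_combine(est_v, est_h, var_v, var_h):
    """Inverse-variance weighted combination of two cue estimates."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    return w_v * est_v + (1 - w_v) * est_h

true_height = 50.0                 # mm, hypothetical stimulus
var_v, var_h = 1.0, 4.0            # vision more reliable than haptics here

trials = 100_000
est_v = true_height + rng.normal(0, np.sqrt(var_v), trials)
est_h = true_height + rng.normal(0, np.sqrt(var_h), trials)
combined = mle_combine(est_v, est_h, var_v, var_h)

# Combined variance approaches var_v * var_h / (var_v + var_h) = 0.8,
# lower than either single-cue variance.
print(round(float(combined.var()), 2))
```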

4,142 citations


"The Ventriloquist Effect Results fr..." refers background in this paper

  • ...compared with performance by the introduction of noise (following [9])....


Journal ArticleDOI
14 Dec 2000-Nature
TL;DR: It is shown that auditory information can qualitatively alter the perception of an unambiguous visual stimulus to create a striking visual illusion, indicating that visual perception can be manipulated by other sensory modalities.
Abstract: Vision is believed to dominate our multisensory perception of the world. Here we overturn this established view by showing that auditory information can qualitatively alter the perception of an unambiguous visual stimulus to create a striking visual illusion. Our findings indicate that visual perception can be manipulated by other sensory modalities.

1,080 citations

Journal ArticleDOI
TL;DR: In this article, the smallest angular separation that can be detected between the sources of two successive tone pulses (the minimum audible angle) was determined for each of three subjects, and the threshold angles were analyzed in terms of the corresponding threshold changes in the phase, time, and intensity of the tone at the ears of the subject.
Abstract: The difference limen for the azimuth of a source of pure tones was measured as a function of the frequency of the tone and the direction of the source. Tone pulses between 250 and 10 000 cps were sounded in the horizontal plane around the head of a subject seated in an anechoic chamber. The smallest angular separation that can be detected between the sources of two successive tone pulses (the minimum audible angle) was determined for each of three subjects. These threshold angles are analyzed in terms of the corresponding threshold changes in the phase, time, and intensity of the tone at the ears of the subject. A comparison of these thresholds with those reported for dichotic stimulation indicates that the resolution of the direction of a source is determined, at frequencies below about 1400 cps, by interaural differences in phase or time, and at higher frequencies by differences in intensity. When the conditions are optimal for temporal discrimination, the threshold for an interaural difference in time is about 10μsec, and when the conditions are optimal for intensity discrimination, the threshold for an interaural difference in intensity is about 0.5 db.
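The ~10 µs interaural time-difference threshold quoted in the abstract lines up with a ~1° minimum audible angle under the textbook spherical-head approximation (an illustration from the general literature, not the paper's own analysis), where the ITD for a source at azimuth θ is about (r/c)(θ + sin θ) for head radius r and speed of sound c.

```python
import math

# Spherical-head ITD approximation (assumed r and c values); shows the
# ITD change produced by a 1-degree shift in azimuth near straight ahead.

R = 0.0875        # assumed head radius in metres
C = 343.0         # speed of sound in m/s

def itd(azimuth_deg):
    theta = math.radians(azimuth_deg)
    return (R / C) * (theta + math.sin(theta))   # seconds

delta_us = (itd(1.0) - itd(0.0)) * 1e6           # microseconds per degree
print(round(delta_us, 1))                        # roughly 10 microseconds
```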

864 citations


"The Ventriloquist Effect Results fr..." refers background in this paper

  • ...We regularly experience the effect when watching television and movies, where the voices seem to emanate from the actors' lips rather than from the actual... ...ization of spectrally rich stimuli such as click trains produces discrimination thresholds on the order of 1° [7, 8]....
