
Journal ArticleDOI

Eating with our ears: assessing the importance of the sounds of consumption on our perception and enjoyment of multisensory flavour experiences

03 Mar 2015-Flavour (BioMed Central)-Vol. 4, Iss: 1, pp 3

Abstract: Sound is the forgotten flavour sense. You can tell a lot about the texture of a food—think crispy, crunchy, and crackly—from the mastication sounds heard while biting and chewing. The latest techniques from the field of cognitive neuroscience are revolutionizing our understanding of just how important what we hear is to our experience and enjoyment of food and drink. A growing body of research now shows that by synchronizing eating sounds with the act of consumption, one can change a person’s experience of what they think that they are eating.



Citations

Journal ArticleDOI
26 Mar 2015-Cell
TL;DR: This Perspective explores the contributions of distinct senses to our perception of food, and the growing realization that the same rules of multisensory integration that have been thoroughly explored in interactions between audition, vision, and touch may also explain the combination of the (admittedly harder to study) flavor senses.
Abstract: The perception of flavor is perhaps the most multisensory of our everyday experiences. The latest research by psychologists and cognitive neuroscientists increasingly reveals the complex multisensory interactions that give rise to the flavor experiences we all know and love, demonstrating how they rely on the integration of cues from all of the human senses. This Perspective explores the contributions of distinct senses to our perception of food and the growing realization that the same rules of multisensory integration that have been thoroughly explored in interactions between audition, vision, and touch may also explain the combination of the (admittedly harder to study) flavor senses. Academic advances are now spilling out into the real world, with chefs and food industry increasingly taking the latest scientific findings on board in their food design.

219 citations


Cites background from "Eating with our ears: assessing the..."

  • ...It is almost 60 years since researchers first started thinking about the putative role of audition in the experience of food and drink (see Spence, 2015, for a review)....


  • ...Hearing always comes at the bottom of the list when people—whether they be professional sensory scientists or regular consumers—are asked to rank the relative importance of each of the senses to flavor perception (see Spence, 2015 on this point)....


  • ...It is important to remember that disconfirmed expectations can occur in both the sensory-discriminative and hedonic domains (Zellner et al., 2004; see Piqueras-Fiszman and Spence, 2015 for a review)....


  • ...In the intervening years, a large body of sensory science research has been published, demonstrating that auditory cues do indeed play an important role in the multisensory perception of food attributes such as crispy, crackly, crunchy, carbonated, and even creamy (see Spence, 2015)....


  • ...…(that has most typically been studied in the laboratory), in the real world, cognitive factors such as branding, labeling, packaging, and pricing also play an important role in determining our sensory-discriminative and hedonic expectations (see Piqueras-Fiszman and Spence, 2015 for a review)....



Journal ArticleDOI
22 Apr 2015-Flavour
Abstract: Colour is the single most important product-intrinsic sensory cue when it comes to setting people’s expectations regarding the likely taste and flavour of food and drink. To date, a large body of laboratory research has demonstrated that changing the hue or intensity/saturation of the colour of food and beverage items can exert a sometimes dramatic impact on the expectations, and hence on the subsequent experiences, of consumers (or participants in the lab). However, should the colour not match the taste, then the result may well be a negatively valenced disconfirmation of expectation. Food colours can have rather different meanings and hence give rise to differing expectations, in different age groups, not to mention in different cultures. Genetic differences, such as in a person’s taster status, can also modulate the psychological impact of food colour on flavour perception. By gaining a better understanding of the sensory and hedonic expectations elicited by food colour in different groups of individuals, researchers are coming to understand more about why it is that what we see modulates the multisensory perception of flavour, as well as our appetitive and avoidance-related food behaviours.

167 citations


Cites background from "Eating with our ears: assessing the..."

  • ...The smell and aroma of food and drink are clearly important here, as are, on occasion, the sounds of food preparation (see [22], for a review)....



Journal ArticleDOI
TL;DR: This review, with the focus squarely on the domain of Human-Computer Interaction (HCI), summarizes the state-of-the-art in the area and suggests that mixed reality solutions are currently the most plausible as far as delivering flavour experiences digitally is concerned.
Abstract: Many people are understandably excited by the suggestion that the chemical senses can be digitized, be it to deliver ambient fragrances (e.g., in virtual reality or health-related applications), or else to transmit flavour experiences via the internet. However, to date, progress in this area has been surprisingly slow. Furthermore, the majority of the attempts at successful commercialization have failed, often in the face of consumer ambivalence over the perceived benefits/utility. In this review, with the focus squarely on the domain of Human-Computer Interaction (HCI), we summarize the state-of-the-art in the area. We highlight the key possibilities and pitfalls as far as stimulating the so-called lower senses of taste, smell, and the trigeminal system are concerned. Ultimately, we suggest that mixed reality solutions are currently the most plausible as far as delivering (or rather modulating) flavour experiences digitally is concerned. The key problems with digital fragrance delivery are related to attention and attribution (i.e., being aware of stimulation, and believing that it is doing the work): people often fail to detect fragrances when they are concentrating on something else, and even when they detect that their chemical senses have been stimulated, there is always a danger that they attribute their experience (e.g., pleasure) to one of the other senses; this is what we call the fundamental attribution error. We conclude with an outlook on digitizing the chemical senses and summarize a set of open-ended questions that the HCI community has to address in future explorations of smell and taste as interaction modalities.

77 citations


Journal ArticleDOI
TL;DR: The latest evidence concerning the various ways in which what we hear can influence what we taste leads to the growing realization that the crossmodal influences of music and noise on food perception and consumer behaviour may have some important, if as yet unrecognized, implications for public health.
Abstract: Food product-extrinsic sounds (i.e., those auditory stimuli that are not linked directly to a food or beverage product, or its packaging) have been shown to exert a significant influence over various aspects of food perception and consumer behaviour, often operating outside of conscious awareness. In this review, we summarise the latest evidence concerning the various ways in which what we hear can influence what we taste. According to one line of empirical research, background noise interferes with tasting, due to attentional distraction. A separate body of marketing-relevant research demonstrates that music can be used to bias consumers' food perception, judgments, and purchasing/consumption behaviour in various ways. Some of these effects appear to be driven by the arousal elicited by loud music as well as the entrainment of people's behaviour to the musical beat. However, semantic priming effects linked to the type and style of music are also relevant. Another route by which music influences food perception comes from the observation that our liking/preference for the music that we happen to be listening to carries over to influence our hedonic judgments of what we are tasting. A final route by which hearing influences tasting relates to the emerging field of 'sonic seasoning'. A developing body of research now demonstrates that people often rate tasting experiences differently when listening to soundtracks that have been designed to be (or are chosen because they are) congruent with specific flavour experiences (e.g., when compared to when listening to other soundtracks, or else when tasting in silence). Taken together, such results lead to the growing realization that the crossmodal influences of music and noise on food perception and consumer behaviour may have some important if, as yet, unrecognized implications for public health.

59 citations


Journal ArticleDOI
TL;DR: Support is provided for the claim that ambient sound influences taste judgments, and the approach outlined here may help researchers and experience designers to obtain more profound effects of the auditory or multisensory atmosphere.
Abstract: All of the senses can potentially contribute to the perception and experience of food and drink. Sensory influences come both from the food or drink itself, and from the environment in which that food or drink is tasted and consumed. In this study, participants initially had to pair each of three soundtracks with one of three chocolates (varying on the bitter-sweet dimension). In a second part of the study, the impact of the various music samples on these participants’ ratings of the taste of various chocolates was assessed. The results demonstrate that what people hear exerts a significant influence over their rating of the taste of the chocolate. Interestingly, when the results were analysed based on the participants’ individual music-chocolate matches (rather than the average response of the whole group), more robust crossmodal effects were revealed. These results therefore provide support for the claim that ambient sound influences taste judgments, and potentially provide useful insights concerning the future design of multisensory tasting experiences. Practical Applications: The approach outlined here follows the increasing demand from the field of gastronomy for greater influence over the general multisensory atmosphere surrounding eating/drinking experiences. One of the novel contributions of the present research is to show how, by considering a participant's individual response, further insight for user-studies in gastrophysics may be provided. Increasing the personalization of such experiments in the years to come may help researchers to design individualized “sonic seasoning” experiences that are even more effective. In the future, then, the approach outlined here may help researchers and experience designers to obtain more profound effects of the auditory or multisensory atmosphere.

56 citations


References

Journal ArticleDOI
24 Jan 2002-Nature
TL;DR: The nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator, and this model behaved very similarly to humans in a visual–haptic task.
Abstract: When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This principle is realized by using maximum-likelihood estimation to combine the inputs. To investigate cue combination quantitatively, we first measured the variances associated with visual and haptic estimation of height. We then used these measurements to construct a maximum-likelihood integrator. This model behaved very similarly to humans in a visual-haptic task. Thus, the nervous system seems to combine visual and haptic information in a fashion that is similar to a maximum-likelihood integrator. Visual dominance occurs when the variance associated with visual estimation is lower than that associated with haptic estimation.
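The maximum-likelihood integration rule this abstract describes can be sketched in a few lines: each cue is weighted by its inverse variance, and the combined estimate always has lower variance than either cue alone. This is an illustrative sketch with made-up variance values, not the authors' actual model code.

```python
def mle_combine(est_v, var_v, est_h, var_h):
    """Combine two independent cue estimates by inverse-variance
    (maximum-likelihood) weighting. Returns the combined estimate
    and its variance."""
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)
    w_h = 1.0 - w_v
    combined = w_v * est_v + w_h * est_h
    # Combined variance is always below either unimodal variance.
    var_combined = (var_v * var_h) / (var_v + var_h)
    return combined, var_combined

# Vision more reliable (lower variance): the percept is pulled toward
# the visual estimate of height, as in the visual-dominance case.
est, var = mle_combine(est_v=55.0, var_v=1.0, est_h=50.0, var_h=4.0)
# est -> 54.0 (closer to vision); var -> 0.8 (< both 1.0 and 4.0)
```

This makes concrete why "visual dominance" falls out of the model: vision dominates exactly when its variance is the smaller of the two, with no special status for either sense.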

3,731 citations


Book
22 Jan 1993
TL;DR: The authors draw on their own experiments to illustrate how sensory inputs converge on individual neurons in different areas of the brain, how these neurons integrate their inputs, the principles by which this integration occurs, and what this may mean for perception and behavior.
Abstract: Bringing together neural, perceptual, and behavioral studies, The Merging of the Senses provides the first detailed review of how the brain assembles information from different sensory systems in order to produce a coherent view of the external world. Stein and Meredith marshal evidence from a broad array of species to show that interactions among senses are the most ancient scheme of sensory organization, an integrative system reflecting a general plan that supersedes structure and species. Most importantly, they explore what is known about the neural processes by which interactions among the senses take place at the level of the single cell. The authors draw on their own experiments to illustrate how sensory inputs converge (from visual, auditory, and somatosensory modalities, for instance) on individual neurons in different areas of the brain, how these neurons integrate their inputs, the principles by which this integration occurs, and what this may mean for perception and behavior. Neurons in the superior colliculus and cortex are emphasized as models of multiple sensory integrators. Barry E. Stein is Professor of Physiology and M. Alex Meredith is Associate Professor of Anatomy, both at the Medical College of Virginia, Virginia Commonwealth University.

2,132 citations


Book
11 Sep 2013

1,764 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...The percentages tell their own story: Crocker [9] 0%; Amerine, Pangborn, and Roessler [10] <1%; Delwiche [11] 3%; Verhagen and Engelen [5] <1%; Stevenson [3] 2%; Shepherd [4] 1%; and Stuckey [12] 4% (these percentages were calculated by dividing the number of book pages given over to audition by the total number of book pages)....


  • ...Westport: Avi Publishing; 1961 (cited in Amerine et al., 1965). doi:10.1186/2044-7248-4-3 Cite this article as: Spence: Eating with our ears: assessing the importance of the sounds of consumption on our perception and enjoyment of multisensory flavour experiences....


  • ...Amerine MA, Pangborn RM, Roessler EB: Principles of Sensory Evaluation of Food....


  • ...While many people like the sound nowadays [94], traditionally, it was apparently judged to be rather unattractive (see [10], p....



Journal ArticleDOI
TL;DR: This study investigates spatial localization of audio-visual stimuli and finds that, for severely blurred visual stimuli, sound captures vision rather than the reverse; for less blurred stimuli, neither sense dominates and perception follows the mean position.
Abstract: Ventriloquism is the ancient art of making one's voice appear to come from elsewhere, an art exploited by the Greek and Roman oracles, and possibly earlier. We regularly experience the effect when watching television and movies, where the voices seem to emanate from the actors' lips rather than from the actual sound source. Originally, ventriloquism was explained by performers projecting sound to their puppets by special techniques, but more recently it is assumed that ventriloquism results from vision "capturing" sound. In this study we investigate spatial localization of audio-visual stimuli. When visual localization is good, vision does indeed dominate and capture sound. However, for severely blurred visual stimuli (that are poorly localized), the reverse holds: sound captures vision. For less blurred stimuli, neither sense dominates and perception follows the mean position. Precision of bimodal localization is usually better than either the visual or the auditory unimodal presentation. All the results are well explained not by one sense capturing the other, but by a simple model of optimal combination of visual and auditory information.
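The blur-dependent reversal reported here falls out of the same optimal-combination rule: blurring the visual stimulus inflates its variance, so the visual weight drops below one half and sound "captures" vision. A minimal sketch, with illustrative (not measured) variance values:

```python
def visual_weight(var_v, var_a):
    """Inverse-variance (maximum-likelihood) weight given to the
    visual cue when combined with an auditory cue."""
    return (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)

# Sharp visual stimulus: low visual variance, so vision dominates.
sharp = visual_weight(var_v=0.5, var_a=2.0)    # 0.8 -> ventriloquism

# Severely blurred stimulus: visual variance inflated fourfold past
# the auditory variance, so the weighting flips and sound dominates.
blurred = visual_weight(var_v=8.0, var_a=2.0)  # 0.2 -> reverse capture
```

No "capture" mechanism is needed in this account: the same weighting formula produces visual dominance, auditory dominance, or an intermediate percept, depending only on the relative reliabilities of the two cues.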

1,509 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...This is an audiotactile version of the phenomenon that we all experience when our brain glues the voice we hear onto the lips we see on the cinema screen despite the fact that the sounds actually originate from elsewhere in the auditorium [107]....



Book
01 Jan 2004
TL;DR: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration with broad underlying principles that govern this interaction, regardless of the specific senses involved.
Abstract: This landmark reference work brings together for the first time in one volume the most recent research from different areas of the emerging field of multisensory integration. After many years of using a modality-specific "sense-by-sense" approach, researchers across different disciplines in neuroscience and psychology now recognize that perception is fundamentally a multisensory experience. To understand how the brain synthesizes information from the different senses, we must study not only how information from each sensory modality is decoded but also how this information interacts with the sensory processing taking place within other sensory channels. The findings cited in The Handbook of Multisensory Processes suggest that there are broad underlying principles that govern this interaction, regardless of the specific senses involved. The book is organized thematically into eight sections; each of the 55 chapters presents a state-of-the-art review of its topic by leading researchers in the field. The key themes addressed include multisensory contributions to perception in humans; whether the sensory integration involved in speech perception is fundamentally different from other kinds of multisensory integration; multisensory processing in the midbrain and cortex in model species, including rat, cat, and monkey; behavioral consequences of multisensory integration; modern neuroimaging techniques, including EEG, PET, and fMRI, now being used to reveal the many sites of multisensory processing in the brain; multisensory processes that require postnatal sensory experience to emerge, with examples from multiple species; brain specialization and possible equivalence of brain regions; and clinical studies of such breakdowns of normal sensory integration as brain damage and synesthesia.

1,023 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...It seems plausible to look for an explanation of these findings in terms of the well-established principles of multisensory integration [23,72]....


  • ...the ventriloquist’s dummy and beeping flashing lights (see [72,73], for reviews)....
