Journal ArticleDOI

Eating with our ears: assessing the importance of the sounds of consumption on our perception and enjoyment of multisensory flavour experiences

03 Mar 2015-Flavour (BioMed Central)-Vol. 4, Iss: 1, pp 3
TL;DR: A growing body of research now shows that by synchronizing eating sounds with the act of consumption, one can change a person's experience of what they think that they are eating.
Abstract: Sound is the forgotten flavour sense. You can tell a lot about the texture of a food—think crispy, crunchy, and crackly—from the mastication sounds heard while biting and chewing. The latest techniques from the field of cognitive neuroscience are revolutionizing our understanding of just how important what we hear is to our experience and enjoyment of food and drink. A growing body of research now shows that by synchronizing eating sounds with the act of consumption, one can change a person’s experience of what they think that they are eating.
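
The sound-modification paradigm summarized above can be made concrete with a minimal offline sketch (not the authors' real-time apparatus): boosting the high-frequency content of a recorded biting sound is the kind of manipulation reported to make a food sound, and hence seem, crisper. The file names, cutoff frequency, and gain below are illustrative assumptions rather than values taken from the paper.

```python
# Hedged sketch: offline boost of the high-frequency content of a bite sound,
# the sort of manipulation the review describes for altering perceived crispness.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

def boost_crispness(samples, rate, cutoff_hz=2000.0, gain_db=12.0):
    """Amplify the portion of the signal above cutoff_hz by roughly gain_db."""
    sos = butter(4, cutoff_hz, btype="highpass", fs=rate, output="sos")
    high = sosfilt(sos, samples)
    boosted = samples + (10 ** (gain_db / 20.0) - 1.0) * high
    return boosted / np.max(np.abs(boosted))  # normalise to avoid clipping

rate, samples = wavfile.read("bite.wav")              # hypothetical mono recording
crisper = boost_crispness(samples.astype(np.float64), rate)
wavfile.write("bite_crisper.wav", rate, (crisper * 32767).astype(np.int16))
```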


Citations
Journal ArticleDOI
26 Mar 2015-Cell
TL;DR: This Perspective explores the contributions of the distinct senses to our perception of food, and the growing realization that the same rules of multisensory integration that have been thoroughly explored in interactions between audition, vision, and touch may also explain the combination of the (admittedly harder to study) flavor senses.

279 citations


Cites background from "Eating with our ears: assessing the..."

  • ...It is almost 60 years since researchers first started thinking about the putative role of audition in the experience of food and drink (see Spence, 2015, for a review)....

  • ...Hearing always comes at the bottom of the list when people—whether they be professional sensory scientists or regular consumers—are asked to rank the relative importance of each of the senses to flavor perception (see Spence, 2015 on this point)....

  • ...It is important to remember that disconfirmed expectations can occur in both the sensory-discriminative and hedonic domains (Zellner et al., 2004; see Piqueras-Fiszman and Spence, 2015 for a review)....

  • ...In the intervening years, a large body of sensory science research has been published, demonstrating that auditory cues do indeed play an important role in the multisensory perception of food attributes such as crispy, crackly, crunchy, carbonated, and even creamy (see Spence, 2015)....

  • ...(that has most typically been studied in the laboratory), in the real world, cognitive factors such as branding, labeling, packaging, and pricing also play an important role in determining our sensory-discriminative and hedonic expectations (see Piqueras-Fiszman and Spence, 2015 for a review)....

Journal ArticleDOI
22 Apr 2015-Flavour
TL;DR: This article reviews a large body of laboratory research demonstrating that changing the hue or intensity/saturation of the colour of food and beverage items can exert a sometimes dramatic impact on the expectations, and hence on the subsequent experiences, of consumers.
Abstract: Colour is the single most important product-intrinsic sensory cue when it comes to setting people’s expectations regarding the likely taste and flavour of food and drink. To date, a large body of laboratory research has demonstrated that changing the hue or intensity/saturation of the colour of food and beverage items can exert a sometimes dramatic impact on the expectations, and hence on the subsequent experiences, of consumers (or participants in the lab). However, should the colour not match the taste, then the result may well be a negatively valenced disconfirmation of expectation. Food colours can have rather different meanings and hence give rise to differing expectations, in different age groups, not to mention in different cultures. Genetic differences, such as in a person’s taster status, can also modulate the psychological impact of food colour on flavour perception. By gaining a better understanding of the sensory and hedonic expectations elicited by food colour in different groups of individuals, researchers are coming to understand more about why it is that what we see modulates the multisensory perception of flavour, as well as our appetitive and avoidance-related food behaviours.

250 citations


Cites background from "Eating with our ears: assessing the..."

  • ...The smell and aroma of food and drink are clearly important here, as are, on occasion, the sounds of food preparation (see [22], for a review)....

Journal ArticleDOI
TL;DR: This review, with the focus squarely on the domain of Human-Computer Interaction (HCI), summarizes the state-of-the-art in the area and suggests that mixed reality solutions are currently the most plausible as far as delivering flavour experiences digitally is concerned.
Abstract: This review paper assesses the possibilities and pitfalls around the digitization of the chemical senses. Possibilities include the delivery of ambient fragrance and digital flavour experiences. We highlight how the majority of the attempts at successful commercialization have failed, often in the face of consumer ambivalence over the perceived benefits/utility. Ultimately, we suggest that mixed reality solutions are currently the most plausible as far as delivering (or rather modulating) flavour experiences digitally is concerned. We identify key problems with digital fragrance delivery related to attention and attribution (i.e., being aware of stimulation and believing that it is doing the work). Many people are understandably excited by the suggestion that the chemical senses can be digitized, be it to deliver ambient fragrances (e.g., in virtual reality or health-related applications), or else to transmit flavour experiences via the internet. However, to date, progress in this area has been surprisingly slow. Furthermore, the majority of the attempts at successful commercialization have failed, often in the face of consumer ambivalence over the perceived benefits/utility. In this review, with the focus squarely on the domain of Human-Computer Interaction (HCI), we summarize the state of the art in the area. We highlight the key possibilities and pitfalls as far as stimulating the so-called lower senses of taste, smell, and the trigeminal system is concerned. Ultimately, we suggest that mixed reality solutions are currently the most plausible as far as delivering (or rather modulating) flavour experiences digitally is concerned. The key problems with digital fragrance delivery are related to attention and attribution. People often fail to detect fragrances when they are concentrating on something else; and even when they do detect that their chemical senses have been stimulated, there is always a danger that they attribute their experience (e.g., pleasure) to one of the other senses; this is what we call the fundamental attribution error. We conclude with an outlook on digitizing the chemical senses and summarize a set of open-ended questions that the HCI community has to address in future explorations of smell and taste as interaction modalities.

97 citations

Journal ArticleDOI
TL;DR: The latest evidence concerning the various ways in which what we hear can influence what we taste leads to the growing realization that the crossmodal influences of music and noise on food perception and consumer behaviour may have some important, if as yet unrecognized, implications for public health.
Abstract: Food product-extrinsic sounds (i.e., those auditory stimuli that are not linked directly to a food or beverage product, or its packaging) have been shown to exert a significant influence over various aspects of food perception and consumer behaviour, often operating outside of conscious awareness. In this review, we summarise the latest evidence concerning the various ways in which what we hear can influence what we taste. According to one line of empirical research, background noise interferes with tasting, due to attentional distraction. A separate body of marketing-relevant research demonstrates that music can be used to bias consumers' food perception, judgments, and purchasing/consumption behaviour in various ways. Some of these effects appear to be driven by the arousal elicited by loud music as well as the entrainment of people's behaviour to the musical beat. However, semantic priming effects linked to the type and style of music are also relevant. Another route by which music influences food perception comes from the observation that our liking/preference for the music that we happen to be listening to carries over to influence our hedonic judgments of what we are tasting. A final route by which hearing influences tasting relates to the emerging field of 'sonic seasoning'. A developing body of research now demonstrates that people often rate tasting experiences differently when listening to soundtracks that have been designed to be (or are chosen because they are) congruent with specific flavour experiences (e.g., when compared to when listening to other soundtracks, or else when tasting in silence). Taken together, such results lead to the growing realization that the crossmodal influences of music and noise on food perception and consumer behaviour may have some important if, as yet, unrecognized implications for public health.

91 citations

Journal ArticleDOI
14 Jun 2019-Foods
TL;DR: A new framework of multisensory flavour integration is proposed, focusing not on the food-intrinsic/extrinsic divide, but rather on whether the sensory information is perceived to originate from within or outside the body.
Abstract: When it comes to eating and drinking, multiple factors from diverse sensory modalities have been shown to influence multisensory flavour perception and liking. These factors have heretofore been strictly divided into either those that are intrinsic to the food itself (e.g., food colour, aroma, texture), or those that are extrinsic to it (e.g., related to the packaging, receptacle or external environment). Given the obvious public health need for sugar reduction, the present review aims to compare the relative influences of product-intrinsic and product-extrinsic factors on the perception of sweetness. Evidence of intrinsic and extrinsic sensory influences on sweetness are reviewed. Thereafter, we take a cognitive neuroscience perspective and evaluate how differences may occur in the way that food-intrinsic and extrinsic information become integrated with sweetness perception. Based on recent neuroscientific evidence, we propose a new framework of multisensory flavour integration focusing not on the food-intrinsic/extrinsic divide, but rather on whether the sensory information is perceived to originate from within or outside the body. This framework leads to a discussion on the combinability of intrinsic and extrinsic influences, where we refer to some existing examples and address potential theoretical limitations. To conclude, we provide recommendations to those in the food industry and propose directions for future research relating to the need for long-term studies and understanding of individual differences.

75 citations


Cites background from "Eating with our ears: assessing the..."

  • ..., the sounds that we hear when eating) can contribute to our perception of crispness, freshness and pleasantness for foods such as crisps, biscuits and fruit [70–73] (see Reference [74] for a review)....

References
Journal ArticleDOI
TL;DR: In this paper, a panel of 60 subjects classified eight foods according to their texture as crispy, crunchy, or crackly, and these textural characteristics were then described in terms of the spectral characteristics of the biting sounds.
Abstract: Separate air and bone conducted food sounds generated by six subjects biting into eight foods were recorded and analysed by a fast Fourier transform (FFT) signal analyser. A panel of 60 subjects classified the 8 foods according to their texture: crispy, crunchy and crackly and these textural characteristics were described by spectral characteristics of biting sounds. Crispy foods (such as extruded flat breads) were found to generate high pitched sounds that show a high level of frequencies higher than 5 kHz, especially for air conduction. Crunchy foods (such as raw carrot) generate low pitched sounds with a characteristic peak on frequency range 1.25 to 2 kHz for air conduction. And crackly foods (such as dry biscuits) generate low pitched sounds with a high level of bone conduction. We hypothesize that discrimination between crunchy and crackly foods could be due to vibrations propagated by bone conduction that also generated vibrotactile sensations.
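
As a rough illustration of the spectral summary described in this abstract, the sketch below computes the share of a biting sound's energy above 5 kHz (associated here with crispness) and in the 1.25-2 kHz band (associated with crunchiness). The frequency bands follow the abstract; the file name, and any rule for turning these fractions into a texture label, are assumptions for illustration only.

```python
# Minimal sketch: band-energy fractions of a biting sound's power spectrum.
import numpy as np
from scipy.io import wavfile

def band_energy_fractions(samples, rate):
    spectrum = np.abs(np.fft.rfft(samples)) ** 2            # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    total = spectrum.sum()
    return {
        "above_5kHz": spectrum[freqs > 5000].sum() / total,        # "crispy" band
        "1.25-2kHz": spectrum[(freqs >= 1250) & (freqs <= 2000)].sum() / total,  # "crunchy" band
    }

rate, samples = wavfile.read("bite_air_conducted.wav")       # hypothetical recording
print(band_energy_fractions(samples.astype(np.float64), rate))
```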

136 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...9 kHz and above when crushed mechanically [59,104]....

  • ...[59,100], it will certainly be interesting in future research to determine whether there are ways in which they can either be cancelled out, or else modified, while...

  • ...Different languages just use different terms, or else simply have no terms at all, to capture some of these textural distinctions: To give you some idea of the problems that one faces when working in this area, the French describe the texture of lettuce as craquante (crackly) or croquante (crunchy) but not as croustillant, which would be the direct translation of crispy [59,64]....

  • ...Basically, she found that those foods that are associated with higher-pitched biting sounds are more likely to be described as ‘crispy’ than as ‘crunchy’ ([55,57,58]; see also [59,60])....

Journal Article
TL;DR: The results of a number of studies show that the modulation of the auditory cues elicited by our contact or interaction with different surfaces (such as abrasive sandpapers or even our own skin) and products can dramatically change the way in which they are perceived, despite the fact that we are often unaware of the influence of such auditory cues on our perception.
Abstract: The sounds that are elicited when we touch or use many everyday objects typically convey potentially useful information regarding the nature of the stimuli with which we are interacting. Here we review the rapidly-growing literature demonstrating the influence of auditory cues (such as overall sound level and the spectral distribution of the sounds) on multisensory product perception. The results of a number of studies now show that the modulation of the auditory cues elicited by our contact or interaction with different surfaces (such as abrasive sandpapers or even our own skin) and products (including electric toothbrushes, aerosol sprays, food mixers, and cars) can dramatically change the way in which they are perceived, despite the fact that we are often unaware of the influence of such auditory cues on our perception. The auditory cues generated by products can also be modified in order to change people's perception of the quality/efficiency of those products. The principles of sound design have also been used recently to alter people's perception of a variety of foodstuffs. Findings such as these demonstrate the automatic and obligatory nature of multisensory integration, and show how the cues available in one sensory modality can modulate people's perception of stimuli in other sensory modalities (despite the fact that they may not be aware of the importance of such crossmodal influences). We also highlight evidence showing that auditory cues can influence product perception at a more semantic level, as demonstrated by research on signature sounds and emotional product sound design.

133 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...I, for one, am convinced that the chocolate crackling sound is accentuated in the Magnum adverts [34,35]....

  • ...In a way, the approach to the auditory design of foods is one that the car industry have been utilizing for decades, as they have tried to perfect the sound of the car door as it closes [99] or the distinctive sound of the engine for the driver of a high-end marque (see [35], for a review)....

Journal ArticleDOI
TL;DR: It is demonstrated that sensory crispness in almonds is an amalgam of acoustic and mechanical effects occurring during chewing, and that the chemometric approach is a powerful method for the objective analysis of large, complex data sets in the context of human sensory studies.
Abstract: This study combines passive acoustic and mechanical measures of sensory crispness. We show that the acoustic signal is dominated by ‘bursts’ of sound associated with crack failure events in the product which also release measurable amounts of elastic energy. One-way analysis of variance (ANOVA) and principal component analysis (PCA) were performed on the sensory, acoustical, mechanical and compositional parameters. We show that this chemometric approach is a powerful method for the objective analysis of large, complex data sets in the context of human sensory studies and the objective measure of a sensory parameter; in this case crispness. We demonstrate that sensory crispness in almonds is an amalgam of acoustic and mechanical effects occurring during chewing. We show that our method is capable of predicting the crispness of roasted almonds. Copyright © 2007 John Wiley & Sons, Ltd.
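
A hedged sketch of the chemometric workflow outlined here, not the study's actual pipeline: standardise a table of acoustic and mechanical measurements per sample, reduce it with PCA, and regress panel-rated crispness onto the leading components. The data file and column names are hypothetical.

```python
# Illustrative chemometric-style analysis: PCA on acoustic/mechanical features,
# then a linear fit of panel crispness on the principal-component scores.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("almond_measurements.csv")               # hypothetical data set
features = df[["burst_count", "burst_energy", "peak_force", "moisture"]]
crispness = df["panel_crispness"]

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(features))
model = LinearRegression().fit(scores, crispness)
print("R^2 of crispness predicted from the first two PCs:", model.score(scores, crispness))
```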

132 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...Despite the informational richness contained in the auditory feedback provided by biting into and/or chewing a food, people are typically unaware of the effect that such sounds have on their multisensory perception or evaluation of particular stimuli (see also [71])....

Journal ArticleDOI
TL;DR: These results provide the first demonstration of the tactile capture of audition, and demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event.
Abstract: Previous research has demonstrated that the localization of auditory or tactile stimuli can be biased by the simultaneous presentation of a visual stimulus from a different spatial position. We investigated whether auditory localization judgments could also be affected by the presentation of spatially displaced tactile stimuli, using a procedure designed to reveal perceptual interactions across modalities. Participants made left-right discrimination responses regarding the perceived location of sounds, which were presented either in isolation or together with tactile stimulation to the fingertips. The results demonstrate that the apparent location of a sound can be biased toward tactile stimulation when it is synchronous, but not when it is asynchronous, with the auditory event. Directing attention to the tactile modality did not increase the bias of sound localization toward synchronous tactile stimulation. These results provide the first demonstration of the tactile capture of audition.

131 citations


"Eating with our ears: assessing the..." refers background in this paper

  • ...from the headphones, due to the well-known ventriloquism illusion [82]....
