Neural coding of continuous speech in auditory cortex during monaural and dichotic listening
Nai Ding, Jonathan Z. Simon, et al.
TL;DR: These findings characterize how the spectrotemporal features of speech are encoded in human auditory cortex and establish a single-trial-based paradigm to study the neural basis underlying the cocktail party phenomenon.
Abstract:
The cortical representation of the acoustic features of continuous speech is the foundation of speech perception. In this study, noninvasive magnetoencephalography (MEG) recordings are obtained from human subjects actively listening to spoken narratives, in both simple and cocktail party-like auditory scenes. By modeling how acoustic features of speech are encoded in ongoing MEG activity as a spectrotemporal response function, we demonstrate that the slow temporal modulations of speech in a broad spectral region are represented bilaterally in auditory cortex by a phase-locked temporal code. For speech presented monaurally to either ear, this phase-locked response is always more faithful in the right hemisphere, but with a shorter latency in the hemisphere contralateral to the stimulated ear. When different spoken narratives are presented to each ear simultaneously (dichotic listening), the resulting cortical neural activity precisely encodes the acoustic features of both of the spoken narratives, but slightly weakened and delayed compared with the monaural response. Critically, the early sensory response to the attended speech is considerably stronger than that to the unattended speech, demonstrating top-down attentional gain control. This attentional gain is substantial even during the subjects' very first exposure to the speech mixture and therefore largely independent of knowledge of the speech content. Together, these findings characterize how the spectrotemporal features of speech are encoded in human auditory cortex and establish a single-trial-based paradigm to study the neural basis underlying the cocktail party phenomenon.
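The spectrotemporal response function (STRF) model named in the abstract treats the neural response as a linear filtering of the stimulus spectrogram: the response at each time point is a weighted sum of stimulus energy across frequency bands and time lags. A minimal sketch of that forward model (function and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def predict_response(spectrogram, strf):
    """Predict one neural response channel under a linear STRF model.

    The predicted response at time t is the sum, over frequency bands f
    and time lags u, of strf[f, u] * spectrogram[f, t - u].

    spectrogram : (n_freq, n_times) stimulus spectrogram
    strf        : (n_freq, n_lags) filter weights
    returns     : (n_times,) predicted response
    """
    n_freq, n_times = spectrogram.shape
    pred = np.zeros(n_times)
    for f in range(n_freq):
        # convolve each frequency band with its lag filter,
        # truncated to the stimulus length (causal model)
        pred += np.convolve(spectrogram[f], strf[f], mode="full")[:n_times]
    return pred
```

Fitting the model then amounts to choosing the STRF weights that make this prediction match the measured MEG signal, which is how the study quantifies how faithfully each hemisphere tracks the speech envelope.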
Citations
Journal Article
Cortical tracking of hierarchical linguistic structures in connected speech
Nai Ding, Lucia Melloni, Hang Zhang, Xing Tian, David Poeppel, et al.
TL;DR: It is found that, during listening to connected speech, cortical activity at different timescales concurrently tracked the time course of abstract linguistic structures at different hierarchical levels, such as words, phrases and sentences.
Journal Article
Mechanisms Underlying Selective Neuronal Tracking of Attended Speech at a “Cocktail Party”
Elana Zion Golumbic, Nai Ding, Stephan Bickel, Peter Lakatos, Catherine A. Schevon, Guy M. McKhann, Robert R. Goodman, Ronald G. Emerson, Ashesh D. Mehta, Jonathan Z. Simon, David Poeppel, Charles E. Schroeder, et al.
TL;DR: It is found that brain activity dynamically tracks speech streams using both low-frequency phase and high-frequency amplitude fluctuations and that optimal encoding likely combines the two.
Journal Article
Emergence of neural encoding of auditory objects while listening to competing speakers
Nai Ding, Jonathan Z. Simon, et al.
TL;DR: Recording from subjects selectively listening to one of two competing speakers using magnetoencephalography indicates that concurrent auditory objects, even if spectrotemporally overlapping and not resolvable at the auditory periphery, are neurally encoded individually in auditory cortex and emerge as fundamental representational units for top-down attentional modulation and bottom-up neural adaptation.
Journal Article
Attentional Selection in a Cocktail Party Environment Can Be Decoded from Single-Trial EEG
James O’Sullivan, Alan J. Power, Nima Mesgarani, Siddharth Rajaram, John J. Foxe, Barbara G. Shinn-Cunningham, Malcolm Slaney, Shihab A. Shamma, Edmund C. Lalor, et al.
TL;DR: It is shown that single-trial unaveraged EEG data can be decoded to determine attentional selection in a naturalistic multispeaker environment and a significant correlation between the EEG-based measure of attention and performance on a high-level attention task is shown.
Journal Article
Speech rhythms and multiplexed oscillatory sensory coding in the human brain.
Joachim Gross, Nienke Hoogenboom, Gregor Thut, Philippe G. Schyns, Stefano Panzeri, Pascal Belin, Simon Garrod, et al.
TL;DR: A neuroimaging study reveals how coupled brain oscillations at different frequencies align with quasi-rhythmic features of continuous speech such as prosody, syllables, and phonemes.
References
Journal Article
Auditory Attentional Control and Selection during Cocktail Party Listening
Kevin T. Hill, Lee M. Miller, et al.
TL;DR: Functional magnetic resonance imaging is used to examine auditory attention to natural speech under such high processing-load conditions, revealing a left-dominant fronto-parietal network with a bias toward spatial processing in dorsal precentral sulcus and superior parietal lobule, and a bias toward pitch processing in inferior frontal gyrus.
Journal Article
Right Hemispheric Laterality of Human 40 Hz Auditory Steady-state Responses
TL;DR: It is demonstrated that asymmetric organization in the cerebral auditory cortex is already established on the level of sensory processing and likely reflects periodic stimulus attributes and might be relevant for pitch processing based on temporal stimulus regularities.
Journal Article
Ultra-fine frequency tuning revealed in single neurons of human auditory cortex.
TL;DR: It is reported that frequency tuning in single neurons recorded from human auditory cortex in response to random-chord stimuli is far narrower than that typically described in any other mammalian species (besides bats), and substantially exceeds that attributed to the human auditory periphery.
Journal Article
Estimating sparse spectro-temporal receptive fields with natural stimuli.
TL;DR: A new, computationally efficient algorithm for estimating tuning properties, boosting, is compared to a more commonly used algorithm, normalized reverse correlation, and it is found that models estimated by boosting also predict responses to non-speech stimuli more accurately.
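The boosting algorithm this reference describes builds a sparse receptive field greedily: starting from all-zero weights, it repeatedly nudges the single coefficient (by a small fixed step, in either direction) that most reduces prediction error, and stops when no nudge helps. A minimal one-dimensional sketch of that idea, assuming a fixed step size and iteration budget (parameter names and the stopping rule are illustrative simplifications, not the published implementation):

```python
import numpy as np

def boost_trf(stimulus, response, n_lags, step=0.01, n_iter=200):
    """Coordinate-wise boosting sketch for a sparse temporal response function.

    Each iteration tries nudging every lag coefficient by +/- step and
    keeps the single change that most reduces mean squared error; it
    stops early once no change improves the fit.
    """
    n_times = len(stimulus)
    # lagged design matrix: X[t, u] = stimulus[t - u]
    X = np.zeros((n_times, n_lags))
    for u in range(n_lags):
        X[u:, u] = stimulus[: n_times - u]
    w = np.zeros(n_lags)
    resid = response - X @ w
    for _ in range(n_iter):
        best = None
        best_err = np.mean(resid ** 2)
        for u in range(n_lags):
            for s in (step, -step):
                err = np.mean((resid - s * X[:, u]) ** 2)
                if err < best_err:
                    best_err, best = err, (u, s)
        if best is None:  # no move improves the fit: converged
            break
        u, s = best
        w[u] += s
        resid -= s * X[:, u]
    return w
```

Because only coefficients that demonstrably reduce error ever move away from zero, the estimate stays sparse, which is what makes the approach well suited to natural stimuli with limited data.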
Journal Article
Neural correlates of auditory perceptual awareness under informational masking.
TL;DR: It is shown that neural correlates of auditory awareness in informational masking emerge between early and late stages of processing within the auditory cortex, presumably from the primary auditory cortex.