Journal ArticleDOI

Pitch perception

01 Nov 2009
TL;DR: A review of the psychophysical study of pitch perception can be found in this article, where the authors show that the pitch of complex stimuli is likely based on the temporal regularities in a sound's waveform, with the strongest pitches occurring for stimuli with low-frequency components.
Abstract: This article is a review of the psychophysical study of pitch perception. The history of the study of pitch has seen a continual competition between spectral and temporal theories of pitch perception. The pitch of complex stimuli is likely based on the temporal regularities in a sound’s waveform, with the strongest pitches occurring for stimuli with low-frequency components. Thus, temporal models, especially those based on autocorrelation-like processes, appear to account for the majority of the data.
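The autocorrelation account favored here can be illustrated with a short sketch: a periodic waveform's autocorrelation peaks at the lag equal to its period, so picking the strongest peak in a plausible lag range yields a pitch estimate. This is a minimal illustration under my own assumptions (function name, search range), not any specific model from the literature:

```python
import numpy as np

def estimate_pitch_autocorr(signal, sr, fmin=50.0, fmax=500.0):
    """Pick the lag of the largest autocorrelation peak within a
    plausible pitch-period range and return it as a frequency."""
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lag_lo = int(sr / fmax)          # shortest period considered
    lag_hi = int(sr / fmin)          # longest period considered
    best_lag = lag_lo + np.argmax(ac[lag_lo:lag_hi])
    return sr / best_lag

sr = 16000
t = np.arange(0, 0.1, 1 / sr)
# Harmonic complex tone, f0 = 200 Hz (harmonics 1-4)
x = sum(np.sin(2 * np.pi * 200 * k * t) for k in range(1, 5))
print(estimate_pitch_autocorr(x, sr))  # 200.0
```

The estimate lands on the waveform's repetition period, which is the quantity temporal models treat as the basis of pitch.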


Citations
Journal ArticleDOI
TL;DR: A conceptual framework for understanding such effects based on mechanisms of neural plasticity is offered, and initial data from a new study motivated by the OPERA hypothesis is presented, focusing on the impact of musical training on speech perception in cochlear-implant users.

229 citations

Journal ArticleDOI
TL;DR: The mismatch negativity elicited by lexical pitch contrast was lateralized to the left hemisphere, which is consistent with the pattern of function-dependent brain asymmetry (i.e., left hemisphere lateralization for speech processing) in nontonal language speakers.

53 citations

Journal ArticleDOI
TL;DR: It is important to maintain an updated understanding of the scope of the issues present in this population and to continue to identify those solutions that can provide measurable improvements in the lives of Veterans who have been exposed to high-intensity blasts during the course of their military service.
Abstract: Auditory system functions, from peripheral sensitivity to central processing capacities, are all at risk from a blast event. Accurate encoding of auditory patterns in time, frequency, and space are required for a clear understanding of speech and accurate localization of sound sources in environments with background noise, multiple sound sources, and/or reverberation. Further work is needed to refine the battery of clinical tests sensitive to the sorts of central auditory dysfunction observed in individuals with blast exposure. Treatment options include low-gain hearing aids, remote-microphone technology, and auditory-training regimens, but clinical evidence does not yet exist for recommending one or more of these options. As this population ages, the natural aging process and other potential brain injuries (such as stroke and blunt trauma) may combine with blast-related brain changes to produce a population for which the current clinical diagnostic and treatment tools may prove inadequate. It is important to maintain an updated understanding of the scope of the issues present in this population and to continue to identify those solutions that can provide measurable improvements in the lives of Veterans who have been exposed to high-intensity blasts during the course of their military service.

52 citations

Journal ArticleDOI
TL;DR: Simulation of empirical studies that investigated the processing of harmonic structures revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods, as well as with developmental and cross-cultural approaches, can be accounted for by the auditory memory model, leading the authors to question whether current research on musical syntax can really be compared with linguistic processing.
Abstract: During the last decade, it has been argued that (1) music processing involves syntactic representations similar to those observed in language, and (2) that music and language share similar syntactic-like processes and neural resources. This claim is important for understanding the origin of music and language abilities and, furthermore, it has clinical implications. The Western musical system, however, is rooted in psychoacoustic properties of sound, and this is not the case for linguistic syntax. Accordingly, musical syntax processing could be parsimoniously understood as an emergent property of auditory memory rather than a property of abstract processing similar to linguistic processing. To support this view, we simulated numerous empirical studies that investigated the processing of harmonic structures, using a model based on the accumulation of sensory information in auditory memory. The simulations revealed that most of the musical syntax manipulations used with behavioral and neurophysiological methods as well as with developmental and cross-cultural approaches can be accounted for by the auditory memory model. This led us to question whether current research on musical syntax can really be compared with linguistic processing. Our simulation also raises methodological and theoretical challenges to study musical syntax while disentangling the confounded low-level sensory influences. In order to investigate syntactic abilities in music comparable to language, research should preferentially use musical material with structures that circumvent the tonal effect exerted by psychoacoustic properties of sounds.

48 citations

Journal ArticleDOI
TL;DR: The way in which musical pitch works as a peculiar form of cognition that reflects upon the organization of the surrounding world as perceived by the majority of music users within a socio-cultural formation is revealed.
Abstract: This paper reveals the way in which musical pitch works as a peculiar form of cognition that reflects upon the organization of the surrounding world as perceived by the majority of music users within a socio-cultural formation. The evidence from music theory, ethnography, archeology, organology, anthropology, psychoacoustics, and evolutionary biology is plotted against experimental evidence. Much of the methodology for this investigation comes from studies conducted within the territory of the former USSR. To date, this methodology has remained solely confined to Russian-speaking scholars. A brief overview of pitch-set theory demonstrates the need to distinguish between vertical and horizontal harmony, laying out the framework for virtual music space that operates according to the perceptual laws of tonal gravity. Brought to life by bifurcation of music and speech, tonal gravity passed through eleven discrete stages of development until the onset of tonality in the seventeenth century. Each stage presents its own method of integration of separate musical tones into an auditory-cognitive unity. The theory of "melodic intonation" is set forth as a counterpart to the harmonic theory of chords. Notions of tonality, modality, key, diatonicity, chromaticism, alteration, and modulation are defined in terms of their perception, and categorized according to the way in which they have developed historically. Tonal organization in music and perspective organization in fine arts are explained as products of the same underlying mental process. Music seems to act as a unique medium of symbolic representation of reality through the concept of pitch. Tonal organization of pitch reflects the culture of thinking, adopted as a standard within a community of music users.
Tonal organization might be a naturally formed system of optimizing individual perception of reality within a social group and its immediate environment, setting conventional standards of intellectual and emotional intelligence.

38 citations

References
Journal ArticleDOI
TL;DR: A subjective scale for the measurement of pitch was constructed from determinations of the half-value of pitches at various frequencies as mentioned in this paper, which differs from both the musical scale and the frequency scale, neither of which is subjective.
Abstract: A subjective scale for the measurement of pitch was constructed from determinations of the half‐value of pitches at various frequencies. This scale differs from both the musical scale and the frequency scale, neither of which is subjective. Five observers fractionated tones of 10 different frequencies at a loudness level of 60 dB. From these fractionations a numerical scale was constructed which is proportional to the perceived magnitude of subjective pitch. In numbering the scale the 1000‐cycle tone was assigned the pitch of 1000 subjective units (mels). The close agreement of the pitch scale with an integration of the differential thresholds (DL's) shows that, unlike the DL's for loudness, all DL's for pitch are of uniform subjective magnitude. The agreement further implies that pitch and differential sensitivity to pitch are both rectilinear functions of extent on the basilar membrane. The correspondence of the pitch scale and the experimentally determined location of the resonant areas of the basilar membrane suggests that, in cutting a pitch in half, the observer adjusts the tone until it stimulates a position half‐way from the original locus to the apical end of the membrane. Measurement of the subjective size of musical intervals (such as octaves) in terms of the pitch scale shows that the intervals become larger as the frequency of the mid‐point of the interval increases (except in the two highest audible octaves). This result confirms earlier judgments as to the relative size of octaves in different parts of the frequency range.
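The mel scale described here was defined empirically by fractionation, but a later closed-form fit is widely used as shorthand; it preserves the anchor of 1000 mels at 1000 Hz. A hedged sketch — the formula below is that later convention, not the paper's own construction:

```python
import math

def hz_to_mel(f_hz):
    # Common closed-form approximation to the mel scale (a later
    # convention, not Stevens's original fractionation data),
    # anchored so that 1000 Hz maps to ~1000 mels.
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

# The anchor point: 1000 Hz -> ~1000 mels
print(round(hz_to_mel(1000.0)))  # 1000

# Octaves span more mels at higher frequencies, echoing the paper's
# finding that musical intervals grow with the interval's mid-point
low_octave = hz_to_mel(220.0) - hz_to_mel(110.0)
high_octave = hz_to_mel(3520.0) - hz_to_mel(1760.0)
print(high_octave > low_octave)  # True
```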

1,036 citations

Journal ArticleDOI
TL;DR: Comparison of recent psychoacoustic data on consonance with those on roughness reveals that “psychoacoustic consonance” merely corresponds to the absence of roughness and is only slightly and indirectly correlated with musical intervals, so psychoacoustic consonance cannot be considered as the basis of the sense of musical intervals.
Abstract: Comparison of recent psychoacoustic data on consonance with those on roughness reveals that “psychoacoustic consonance” merely corresponds to the absence of roughness and is only slightly and indirectly correlated with musical intervals. Thus, psychoacoustic consonance cannot be considered as the basis of the sense of musical intervals. The basis of that sense seems to be provided by the concept of virtual pitch. This concept is introduced with a model. The concept accounts for many psychoacoustic and musical phenomena as, e.g., the ambiguity of pitch of complex tones, the “residue,” the pitch of inharmonic signals, the dominance of certain harmonics, pitch shifts, the sense for musical intervals, octave periodicity, octave enlargement, “stretching” of musical scales, and the “tonal meaning” of chords in music.
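The virtual-pitch concept can be caricatured as subharmonic coincidence: each spectral component votes for its subharmonics as candidate pitches, and the best-supported candidate wins. This is a toy sketch only — the 1-Hz rounding bin and the `max_sub` limit are my simplifications, not the actual weighted model:

```python
from collections import Counter

def virtual_pitch(partials_hz, max_sub=8):
    """Toy subharmonic-coincidence sketch: each partial votes for its
    subharmonics f/n as candidate virtual pitches; the candidate with
    the most votes wins (crude 1-Hz binning via rounding)."""
    votes = Counter()
    for f in partials_hz:
        for n in range(1, max_sub + 1):
            votes[round(f / n)] += 1
    pitch, support = votes.most_common(1)[0]
    return pitch, support

# Harmonics 4-6 of a 200-Hz fundamental; the fundamental itself is absent
print(virtual_pitch([800, 1000, 1200]))  # (200, 3)
```

All three partials share 200 Hz as a subharmonic, so the "residue" pitch emerges even though no component lies at 200 Hz — the same ambiguity and inharmonic-shift phenomena the abstract lists arise when the votes nearly tie or the coincidences are inexact.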

677 citations

Journal ArticleDOI
TL;DR: A software package with a modular architecture has been developed to support perceptual modeling of the fine-grain spectro-temporal information observed in the auditory nerve, including new forms of periodicity-sensitive temporal integration that generate stabilized auditory images.
Abstract: A software package with a modular architecture has been developed to support perceptual modeling of the fine‐grain spectro‐temporal information observed in the auditory nerve. The package contains both functional and physiological modules to simulate auditory spectral analysis, neural encoding, and temporal integration, including new forms of periodicity‐sensitive temporal integration that generate stabilized auditory images. Combinations of the modules enable the user to approximate a wide variety of existing, time‐domain, auditory models. Sequences of auditory images can be replayed to produce cartoons of auditory perceptions that illustrate the dynamic response of the auditory system to everyday sounds.

594 citations

Journal ArticleDOI
25 Aug 2005-Nature
TL;DR: Neurons in the auditory cortex of marmoset monkeys are shown to respond to both pure tones and missing-fundamental harmonic complex sounds with the same f0, providing a neural correlate for pitch constancy.
Abstract: Pitch perception plays a critical role in identifying and segregating auditory objects [1], especially in the context of music and speech. The perception of pitch is not unique to humans and has been experimentally demonstrated in several animal species [2,3]. Pitch is the subjective attribute of a sound’s fundamental frequency (f0), which is determined by both the temporal regularity and average repetition rate of its acoustic waveform. Spectrally dissimilar sounds can have the same pitch if they share a common f0. Even when the acoustic energy at f0 is removed (“missing fundamental”), the same pitch is still perceived [1]. Despite its importance for hearing, how pitch is represented in the cerebral cortex remains unknown. Here we show the existence of neurons in the auditory cortex of marmoset monkeys that respond to both pure tones and missing fundamental harmonic complex sounds (MFs) with the same f0, providing a neural correlate for pitch constancy [1]. These pitch-selective neurons are located in a restricted low-frequency cortical region near the anterolateral border of primary auditory cortex (AI), consistent with the location of a pitch-selective area identified in recent human imaging studies [4,5].
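The missing-fundamental stimulus described above is easy to reproduce numerically: a complex of harmonics 3-5 of 200 Hz contains no energy at 200 Hz, yet its waveform repeats every 1/200 s, so an autocorrelation peak appears at that period. A sketch (sample rate, duration, and search range are arbitrary choices of mine):

```python
import numpy as np

sr = 16000
t = np.arange(0, 0.1, 1 / sr)
# Harmonics 3-5 of f0 = 200 Hz: components at 600, 800, 1000 Hz,
# with no spectral energy at 200 Hz itself
x = sum(np.sin(2 * np.pi * 200 * k * t) for k in (3, 4, 5))

ac = np.correlate(x, x, mode="full")[len(t) - 1:]
lag_lo, lag_hi = sr // 400, sr // 50   # search pitches from 400 down to 50 Hz
best_lag = lag_lo + np.argmax(ac[lag_lo:lag_hi])
print(sr / best_lag)  # 200.0 -- the "missing" fundamental
```

The common f0 shows up as the waveform's repetition rate even with its spectral component removed, which is the invariance the pitch-selective neurons appear to encode.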

547 citations

Journal ArticleDOI
TL;DR: Intrinsic oscillations of short duration, i.e., regularly timed discharges of units in response to stimuli without a corresponding temporal structure, were frequently observed in the ICC, and their oscillation intervals were commonly found to be integer multiples of 0.4 ms.
Abstract: 1. Temporal properties of single- and multiple-unit responses were investigated in the inferior colliculus (IC) of the barbiturate-anesthetized cat. Approximately 95% of recording sites were located in the central nucleus of the inferior colliculus (ICC). Responses to contralateral stimulation with tone bursts and amplitude-modulated tones (100% sinusoidal modulation) were recorded. Five response parameters were determined for neurons at each location: 1) characteristic frequency (CF); 2) onset latency of responses to CF-tones 60 dB above threshold; 3) Q10 dB (CF divided by bandwidth of tuning curve 10 dB above threshold); 4) best modulation frequency for firing rate (rBMF or BMF; amplitude modulation frequency that elicited the highest firing rate); and 5) best modulation frequency for synchronization (sBMF; amplitude modulation frequency that elicited the highest degree of phase-locking to the modulation frequency). 2. Response characteristics for single units and multiple units corresponded closely. A BMF was obtained at almost all recording sites. For units with a similar CF, a range of BMFs was observed. The upper limit of BMF increased approximately proportional to CF/4 up to BMFs as high as 1 kHz. The lower limit of encountered BMFs for a given CF also increased slightly with CF. BMF ranges for single-unit and multiple-unit responses were similar. Twenty-three percent of the responses revealed rBMFs between 10 and 30 Hz, 51% between 30 and 100 Hz, 18% between 100 and 300 Hz, and 8% between 300 and 1000 Hz. 3. For single units with modulation transfer functions of bandpass characteristics, BMFs determined for firing rate and synchronization were similar (r2 = 0.95). 4. Onset latencies for responses to CF tones 60 dB above threshold varied between 4 and 120 ms. Ninety percent of the onset latencies were between 5 and 18 ms. A range of onset latencies was recorded for different neurons with any given CF. The onset response latency of a given unit or unit cluster was significantly correlated with the period of the BMF and the period of the CF (P less than 0.05). 5. "Intrinsic oscillations" of short duration, i.e., regularly timed discharges of units in response to stimuli without a corresponding temporal structure, were frequently observed in the ICC. Oscillation intervals were commonly found to be integer multiples of 0.4 ms. Changes of stimulus frequency or intensity had only minor influences on these intrinsic oscillations. (ABSTRACT TRUNCATED AT 400 WORDS)
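The "best modulation frequency for synchronization" above rests on quantifying phase-locking; a standard measure is vector strength, which treats each spike as a unit vector at its phase within the modulation cycle and averages. A sketch with synthetic spike trains (the modulator frequency and spike times are illustrative, not data from the study):

```python
import numpy as np

def vector_strength(spike_times, mod_freq_hz):
    """Vector strength: average of unit vectors at each spike's phase
    within the modulation cycle. 1.0 = perfect phase-locking,
    near 0 = spikes unrelated to the modulation."""
    phases = 2.0 * np.pi * mod_freq_hz * np.asarray(spike_times)
    return float(np.abs(np.mean(np.exp(1j * phases))))

# Hypothetical spike trains for a 100-Hz modulator
locked = np.arange(50) / 100.0            # one spike per cycle, fixed phase
rng = np.random.default_rng(0)
unlocked = rng.uniform(0.0, 0.5, 50)      # spikes unrelated to the modulator

print(vector_strength(locked, 100.0))     # ~1.0
print(vector_strength(unlocked, 100.0))   # much smaller
```

Sweeping the modulator frequency and taking the argmax of this measure is, in outline, how an sBMF is read off a synchronization transfer function.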

547 citations