Journal ISSN: 0730-7829

Music Perception

University of California Press
About: Music Perception is an academic journal published by University of California Press. The journal publishes mainly in the areas of melody and music psychology. It has the ISSN identifier 0730-7829. Over its lifetime, 582 publications have appeared, receiving 32,291 citations.


Papers
Journal ArticleDOI
TL;DR: A theory of the processing of temporal tone sequences, partly implemented as a computer program, is presented; a basic assumption is that perceivers try to generate an internal clock while listening to a temporal pattern.
Abstract: To gain insight into the internal representation of temporal patterns, we studied the perception and reproduction of tone sequences in which only the tone-onset intervals were varied. A theory of the processing of such sequences, partly implemented as a computer program, is presented. A basic assumption of the theory is that perceivers try to generate an internal clock while listening to a temporal pattern. This internal clock is of a flexible nature that adapts itself to certain characteristics of the pattern under consideration. The distribution of accented events perceived in the sequence is supposed to determine whether a clock can (and which clock will) be generated internally. Further it is assumed that if a clock is induced in the perceiver, it will be used as a measuring device to specify the temporal structure of the pattern. The nature of this specification is formalized in a tentative coding model. Three experiments are reported that test different aspects of the model. In Experiment 1, subjects reproduced various temporal patterns that only differed structurally in order to test the hypothesis that patterns more readily inducing an internal clock will give rise to more accurate percepts. In Experiment 2, clock induction is manipulated experimentally to test the clock notion more directly. Experiment 3 tests the coding portion of the model by correlating theoretical complexity of temporal patterns based on the coding model with complexity judgments. The experiments yield data that support the theoretical ideas.
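The internal-clock idea above can be sketched in a few lines of Python. This is a hedged illustration, not the authors' implementation: the accent rule, the weight `w_silence`, and the helper names `accents` and `counterevidence` are simplifying assumptions chosen to show how a candidate clock (a period and phase over a binary onset grid) might be scored against a temporal pattern — less counterevidence meaning a more readily induced clock.

```python
# Hedged sketch of clock-induction scoring in the spirit of the
# internal-clock theory described above. The accent rule and weights
# are illustrative assumptions, not the published formulation.

def accents(pattern):
    """Mark accented onsets in a binary grid (1 = tone onset).

    Simplified rule (an assumption): an onset immediately followed
    by a silent grid point is accented. The grid is treated as cyclic.
    """
    n = len(pattern)
    return [pattern[i] == 1 and pattern[(i + 1) % n] == 0
            for i in range(n)]

def counterevidence(pattern, period, phase, w_silence=4):
    """Score evidence *against* a clock (period, phase): ticks that
    fall on silence weigh w_silence, ticks on unaccented tones weigh 1.
    Lower scores mean the clock fits the pattern better."""
    acc = accents(pattern)
    score = 0
    for tick in range(phase, len(pattern), period):
        if pattern[tick] == 0:
            score += w_silence      # tick on silence: strong counterevidence
        elif not acc[tick]:
            score += 1              # tick on an unaccented tone: weak
    return score

# Usage: a 16-point grid; compare two candidate clock periods.
pattern = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0]
print(counterevidence(pattern, period=4, phase=0))  # → 5
print(counterevidence(pattern, period=3, phase=0))  # → 10
```

Under these assumed weights, the period-4 clock accrues less counterevidence than the period-3 clock, so it would be the one induced for this pattern.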

637 citations

Journal ArticleDOI
TL;DR: This paper found that listeners are sensitive to musically expressed emotion in an unfamiliar tonal system, and that this sensitivity is facilitated by psychophysical cues such as tempo, rhythmic complexity, melodic complexity, and pitch range.
Abstract: Studies of the link between music and emotion have primarily focused on listeners' sensitivity to emotion in the music of their own culture. This sensitivity may reflect listeners' enculturation to the conventions of their culture's tonal system. However, it may also reflect responses to psychophysical dimensions of sound that are independent of musical experience. A model of listeners' perception of emotion in music is proposed in which emotion in music is communicated through a combination of universal and cultural cues. Listeners may rely on either of these cues, or both, to arrive at an understanding of musically expressed emotion. The current study addressed the hypotheses derived from this model using a cross-cultural approach. The following questions were investigated: Can people identify the intended emotion in music from an unfamiliar tonal system? If they can, is their sensitivity to intended emotions associated with perceived changes in psychophysical dimensions of music? Thirty Western listeners rated the degree of joy, sadness, anger, and peace in 12 Hindustani raga excerpts (field recordings obtained in North India). In accordance with the raga-rasa system, each excerpt was intended to convey one of the four moods or "rasas" that corresponded to the four emotions rated by listeners. Listeners also provided ratings of four psychophysical variables: tempo, rhythmic complexity, melodic complexity, and pitch range. Listeners were sensitive to the intended emotion in ragas when that emotion was joy, sadness, or anger. Judgments of emotion were significantly related to judgments of psychophysical dimensions, and, in some cases, to instrument timbre. The findings suggest that listeners are sensitive to musically expressed emotion in an unfamiliar tonal system, and that this sensitivity is facilitated by psychophysical cues.

585 citations

Journal ArticleDOI
TL;DR: In this paper, the effects of tempo and mode on spatial ability, arousal, and mood were examined. A Mozart sonata performed by a skilled pianist was recorded as a MIDI file and edited to produce four versions that varied in tempo (fast or slow) and mode (major or minor).
Abstract: We examined effects of tempo and mode on spatial ability, arousal, and mood. A Mozart sonata was performed by a skilled pianist and recorded as a MIDI file. The file was edited to produce four versions that varied in tempo (fast or slow) and mode (major or minor). Participants listened to a single version and completed measures of spatial ability, arousal, and mood. Performance on the spatial task was superior after listening to music at a fast rather than a slow tempo, and when the music was presented in major rather than minor mode. Tempo manipulations affected arousal but not mood, whereas mode manipulations affected mood but not arousal. Changes in arousal and mood paralleled variation on the spatial task. The findings are consistent with the view that the "Mozart effect" is a consequence of changes in arousal and mood.

529 citations

Journal ArticleDOI
TL;DR: In this paper, six cyclically repeating interonset-interval patterns were presented at six different note rates (very slow to very fast), and listeners were asked to tap along with the underlying beat or pulse.
Abstract: In Experiment 1, six cyclically repeating interonset interval patterns (1, 2:1, 2:1:1, 3:2:1, 3:1:2, and 2:1:1:2) were each presented at six different note rates (very slow to very fast). Each trial began at a random point in the rhythmic cycle. Listeners were asked to tap along with the underlying beat or pulse. The number of times a given pulse (period, phase) was selected was taken as a measure of its perceptual salience. Responses gravitated toward a moderate pulse period of about 700 ms. At faster tempi, taps coincided more often with events followed by longer interonset intervals. In Experiment 2, listeners heard the same set of rhythmic patterns, plus a single sound in a different timbre, and were asked whether the extra sound fell on or off the beat. The position of the downbeat was found to be quite ambiguous. A quantitative model was developed from the following assumptions. The phenomenal accent of an event depends on the interonset interval that follows it, saturating for interonset intervals greater than about 1 s. The salience of a pulse sensation depends on the number of events matching a hypothetical isochronous template, and on the period of the template—pulse sensations are most salient in the vicinity of roughly 100 events per minute (moderate tempo). The metrical accent of an event depends on the saliences of pulse sensations including that event. Calculated pulse saliences and metrical accents according to the model agree well with experimental results (r > 0.85). The model may be extended to cover perceived meter, perceptible subdivisions of a beat, categorical perception, expressive timing, temporal precision and discrimination, and primacy/recency effects. The sensation of pulse may be the essential factor distinguishing musical rhythm from nonrhythm.
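The pulse-salience model outlined in the abstract can be illustrated with a short sketch. The saturation curve, the tempo window, and the helper names `phenomenal_accent`, `tempo_weight`, and `pulse_salience` below are illustrative assumptions, not the published equations; the sketch only shows the structure of the model — accents grow with the following interonset interval and saturate near 1 s, and template salience peaks near a moderate pulse period of roughly 700 ms.

```python
import math

# Hedged sketch of the pulse-salience model outlined in the abstract.
# The exponential saturation and log-Gaussian tempo window below are
# illustrative choices, not the published equations.

def phenomenal_accent(ioi):
    """Accent of an event grows with the interonset interval (in s)
    that follows it, saturating toward 1.0 for long intervals."""
    return 1.0 - math.exp(-ioi / 0.5)

def tempo_weight(period, preferred=0.7, sigma=0.6):
    """Salience window over pulse period: maximal near a moderate
    period of ~0.7 s, Gaussian in log-period (assumed form)."""
    return math.exp(-0.5 * (math.log(period / preferred) / sigma) ** 2)

def pulse_salience(onsets, period, phase, tol=0.05):
    """Salience of an isochronous template (period, phase): sum of
    phenomenal accents of the onsets it matches, scaled by the tempo
    window. `onsets` is a sorted list of onset times in seconds."""
    total = 0.0
    t = phase
    while t <= onsets[-1] + tol:
        for i, onset in enumerate(onsets):
            if abs(onset - t) <= tol:
                nxt = onsets[i + 1] if i + 1 < len(onsets) else onset + period
                total += phenomenal_accent(nxt - onset)
        t += period
    return tempo_weight(period) * total

# Usage: a repeating 3:1 rhythm (IOIs of 0.6 s and 0.2 s). A moderate
# 0.8-s pulse should be more salient than a rapid 0.2-s subdivision.
onsets = [0.0, 0.6, 0.8, 1.4, 1.6, 2.2, 2.4, 3.0]
print(pulse_salience(onsets, period=0.8, phase=0.0) >
      pulse_salience(onsets, period=0.2, phase=0.0))  # → True
```

The rapid subdivision matches more onsets, but the tempo window penalizes its short period heavily, so the moderate pulse wins — mirroring the abstract's finding that responses gravitate toward periods near 700 ms.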

494 citations

Performance Metrics

No. of papers from the journal in previous years:

Year    Papers
2023    14
2022    27
2021    31
2020    27
2019    19
2018    13