
Showing papers presented at "International Conference on Auditory Display in 2002"


Proceedings Article
01 Jul 2002
TL;DR: Three sonifications are presented within this paper: spectral mapping sonification, which offers a fairly direct inspection of the recorded data; distance matrix sonification, which allows nonlinear long-range correlations to be detected at high time resolution; and differential sonification, which summarizes the comparison of EEG measurements under different conditions for each subject.
Abstract: This paper presents techniques to render acoustic representations for EEG data. In our case, data are obtained from psycholinguistic experiments where subjects are exposed to three different conditions based on different auditory stimuli. The goal of this research is to uncover elements of neural processing correlated with high-level cognitive activity. Three sonifications are presented within this paper: spectral mapping sonification, which offers a fairly direct inspection of the recorded data; distance matrix sonification, which allows nonlinear long-range correlations to be detected at high time resolution; and differential sonification, which summarizes the comparison of EEG measurements under different conditions for each subject. This paper describes the techniques and presents sonification examples for experimental data.

67 citations
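
To make the first of these techniques concrete, below is a minimal sketch (Python/NumPy, not the authors' implementation) of what a spectral mapping sonification could look like: the band power of one EEG channel in the canonical delta, theta, alpha, and beta bands is mapped to the amplitudes of four audible sine carriers, with the time axis compressed. The window length, carrier frequencies, and compression factor are illustrative assumptions.

```python
import numpy as np

def spectral_mapping_sonification(eeg, fs, audio_fs=44100, time_compression=20.0,
                                  carrier_hz=(110, 220, 440, 880)):
    """Illustrative sketch (not the paper's code): map EEG band power of one
    channel (delta, theta, alpha, beta) to the amplitudes of four audible
    sine carriers, with the time axis compressed."""
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]      # Hz, canonical EEG bands
    win = int(fs)                                       # 1 s analysis window
    hop = win // 2
    n_frames = (len(eeg) - win) // hop                  # assumes eeg longer than 1 s
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    power = np.zeros((n_frames, len(bands)))
    for i in range(n_frames):
        seg = eeg[i * hop:i * hop + win] * np.hanning(win)
        spec = np.abs(np.fft.rfft(seg)) ** 2
        for b, (lo, hi) in enumerate(bands):
            power[i, b] = spec[(freqs >= lo) & (freqs < hi)].sum()
    power /= power.max() + 1e-12                        # normalize to [0, 1]

    # Render: each analysis frame becomes a short, time-compressed audio segment.
    seg_len = int((hop / fs) / time_compression * audio_fs)
    t = np.arange(seg_len) / audio_fs
    audio = np.concatenate([
        sum(power[i, b] * np.sin(2 * np.pi * carrier_hz[b] * t) for b in range(len(bands)))
        for i in range(n_frames)
    ])
    return audio / (np.abs(audio).max() + 1e-12)
```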


Proceedings Article
01 Jan 2002
TL;DR: The Data-Solid Sonification Model is introduced, which provides an acoustic representation of the local neighborhood relations in high-dimensional datasets for binary classification problems and is parameterized by a reduced data representation obtained from a growing neural gas network.
Abstract: This paper presents a new interface for controlling sonification models. A haptic controller interface is developed which makes it possible both to manipulate a sonification model, e.g. by interacting with it, and to provide a haptic representation of the data. A variety of input types are supported with a hand-sized interface, including shaking, squeezing, hammering, moving, rotating and accelerating. The paper presents details on the interface under development and demonstrates the application of the device for controlling a sonification model. For this purpose, the Data-Solid Sonification Model is introduced, which provides an acoustic representation of the local neighborhood relations in high-dimensional datasets for binary classification problems. The model is parameterized by a reduced data representation obtained from a growing neural gas network. Sound examples are given to demonstrate the device and the sonification model.

32 citations
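
As an illustration of the general idea only, the sketch below stands in for the Data-Solid Sonification Model with a much simpler construction: the reduced representation is a random subsample of the data (where the paper uses a growing neural gas network), and each prototype is rendered as a decaying tone whose pitch encodes the local class purity of the binary labels and whose amplitude encodes how many points it represents. All names and parameters are assumptions for illustration, not the authors' model.

```python
import numpy as np

def data_solid_sketch(X, y, n_protos=16, audio_fs=44100, dur=0.4, seed=None):
    """Illustrative stand-in: prototypes are a random subsample (the paper uses
    a growing neural gas); each prototype becomes one decaying tone. Pitch
    encodes local class purity of binary labels y, amplitude encodes occupancy."""
    rng = np.random.default_rng(seed)
    protos = X[rng.choice(len(X), size=n_protos, replace=False)]
    # assign every data point to its nearest prototype
    d = np.linalg.norm(X[:, None, :] - protos[None, :, :], axis=2)
    owner = d.argmin(axis=1)

    t = np.arange(int(dur * audio_fs)) / audio_fs
    tones = []
    for k in range(n_protos):
        members = y[owner == k]
        if len(members) == 0:
            continue
        purity = abs(members.mean() - 0.5) * 2        # 0 = mixed classes, 1 = pure
        freq = 220 * 2 ** (2 * purity)                # mixed neighborhoods sound low
        amp = len(members) / len(X)
        tones.append(amp * np.sin(2 * np.pi * freq * t) * np.exp(-6 * t))
    audio = np.concatenate(tones) if tones else np.zeros(1)
    return audio / (np.abs(audio).max() + 1e-12)
```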


Proceedings Article
01 Jul 2002
TL;DR: This paper reviews results of listening tests on auditory spatial impression that describe the relation between individual characteristics of spatial impression and the precedence effect, and concludes that measurements of ICC within 1/3-octave bands are preferred for estimating ASW.
Abstract: This paper reviews results of listening tests on auditory spatial impression (ASI) that describe the relation between individual characteristics of spatial impression and the precedence effect. ASI is a general concept defined as the spatial extent of the sound image, and comprises at least two components. One is auditory source width (ASW), defined as the width of a sound image fused temporally and spatially with the direct (preceding) sound image; the other is listener envelopment (LEV), defined as the degree of fullness of the sound image surrounding the listener, and which excludes the direct sound image for which ASW is judged. Listeners can perceive these two components of ASI separately, and their subjective reports demonstrate that they can distinguish between them. The perception of ASW and LEV has a close connection with the precedence effect (the law of the first wave front). Acoustic signal components that arrive within the time and amplitude limits of the effect contribute to ASW, and those beyond the upper limits contribute to LEV. It is possible to control ASW and LEV independently by controlling physical factors that influence each of the components. It is well known, for example, that the degree of interaural cross-correlation (ICC) is an important physical factor in the control of ASI. ASW can be predicted from ICC (and thereby controlled by the manipulation of ICC) regardless of the number and directions of arrival of sound sources. But measurements of ICC within 1/3-octave bands are preferred for estimating ASW, whereas the use of wide-band and 1-octave-band signals, as described in the ISO standard, is not. On the other hand, LEV cannot be controlled only through manipulation of ICC, as LEV is also affected by the spatial distribution of sounds (e.g., front/back energy ratio).

23 citations
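
A hedged sketch of the 1/3-octave-band ICC measurement the abstract refers to: the interaural cross-correlation coefficient is commonly taken as the maximum of the normalized cross-correlation between the band-passed ear signals over lags of +/-1 ms. The filter order and the example band centers below are assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def iacc_third_octave(left, right, fs, center_hz):
    """Sketch (illustrative, not the paper's measurement code): interaural
    cross-correlation coefficient within one 1/3-octave band, taken as the
    maximum of the normalized cross-correlation over lags of +/- 1 ms."""
    lo, hi = center_hz / 2 ** (1 / 6), center_hz * 2 ** (1 / 6)   # 1/3-octave band edges
    b, a = butter(4, [lo, hi], btype='bandpass', fs=fs)
    l, r = filtfilt(b, a, left), filtfilt(b, a, right)

    max_lag = int(round(1e-3 * fs))                               # +/- 1 ms lag window
    xcorr = np.correlate(l, r, mode='full')
    mid = len(r) - 1                                              # index of zero lag
    window = xcorr[mid - max_lag:mid + max_lag + 1]
    norm = np.sqrt(np.sum(l ** 2) * np.sum(r ** 2)) + 1e-12
    return np.abs(window).max() / norm

# Assumed usage: IACC per 1/3-octave band from 125 Hz upward for a binaural recording
# centers = 125 * 2 ** (np.arange(16) / 3)
# iacc = [iacc_third_octave(p_left, p_right, 48000, c) for c in centers]
```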


Proceedings Article
01 Jan 2002
TL;DR: The Crystallization Sonification model is introduced, a sonification model for exploratory analysis of high-dimensional datasets designed to provide information about the intrinsic data dimensionality and the global data dimensionality, as well as the transitions between a local and global view on a dataset.
Abstract: This paper introduces Crystallization Sonification, a sonification model for exploratory analysis of high-dimensional datasets. The model is designed to provide information about the intrinsic data dimensionality (which is a local feature) and the global data dimensionality, as well as the transitions between a local and global view on a dataset. Furthermore, the sound allows the cluster structure of high-dimensional datasets to be displayed. The model defines a crystal growth process in the high-dimensional data space which starts at a user-selected “condensation nucleus” and incrementally includes neighboring data according to some growth criterion. The sound summarizes the temporal evolution of this crystal growth process. For introducing the model, a simple growth law is used. Other growth laws used in the context of hierarchical clustering are also suitable, and their application in Crystallization Sonification offers new ways to inspect the results of data clustering as an alternative to dendrogram plots. In this paper, the sonification model is described and example sonifications are presented for some synthetic high-dimensional datasets.

21 citations
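
A minimal sketch of the growth process described above, assuming the simplest nearest-neighbour growth law (the paper leaves the growth criterion open): starting from a chosen condensation nucleus, the point closest to the current crystal is absorbed at each step and the absorption distance is recorded; that distance sequence is the raw material that would be sonified, e.g. mapped to pitch over time. Function and variable names are illustrative.

```python
import numpy as np

def crystal_growth_sequence(X, nucleus_idx):
    """Illustrative nearest-neighbour growth law (not necessarily the paper's):
    repeatedly absorb the data point closest to the current crystal and record
    the distance at which it was absorbed."""
    n = len(X)
    in_crystal = np.zeros(n, dtype=bool)
    in_crystal[nucleus_idx] = True
    # dist[i] = distance from point i to the nearest crystal member so far
    dist = np.linalg.norm(X - X[nucleus_idx], axis=1)
    order, absorb_dist = [nucleus_idx], [0.0]
    for _ in range(n - 1):
        dist[in_crystal] = np.inf
        nxt = int(dist.argmin())
        order.append(nxt)
        absorb_dist.append(float(dist[nxt]))
        in_crystal[nxt] = True
        # update neighbourhood distances with the newly absorbed point
        dist = np.minimum(dist, np.linalg.norm(X - X[nxt], axis=1))
    return np.array(order), np.array(absorb_dist)
```

In such a sketch, plateaus in the absorption distances correspond to dense local neighbourhoods and sudden jumps mark the transition to more distant groups of points, which is what lets a sonification of the sequence expose cluster structure and the shift from a local to a global view of the data.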


Proceedings Article
Alan Dorin1
01 Jan 2002
TL;DR: The selection of the rules of the cellular automata which enable rhythmic outcomes to be generated by the system, and the motivation for organizing the cells in their current configuration are discussed.
Abstract: This paper describes a MIDI instrument for the generative composition of polyrhythmic patterns. Patterns are determined by the activation of cells in a non-homogeneous set of interconnected cellular automata grids which form the faces of a cube in virtual three-dimensional space. This paper discusses the selection of the cellular automata rules which enable rhythmic outcomes to be generated by the system, and the motivation for organizing the cells in their current configuration. The fluid sonification and visualization of the cellular automata arrays was a specific aim of this work for gallery installation. Hence the behaviour of the rigidly defined, deterministic automata was intended to be visualized and sonified in such a way as to convey a sense of liquidity emerging from the mechanism. A brief discussion of these issues appears in this paper.

14 citations
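
As a minimal, single-grid sketch of the underlying mechanism (the installation itself uses several interconnected, non-homogeneous CA grids on the faces of a cube), an elementary cellular automaton can be read row by row as a step sequencer, one column per rhythmic voice. The rule number, grid width, and the MIDI mapping in the usage comment are assumptions for illustration.

```python
import numpy as np

def ca_rhythm(rule=110, width=16, steps=32, seed_col=8):
    """Single-grid sketch: run an elementary cellular automaton with wrap-around
    boundaries and read each generation as one step of a rhythmic pattern."""
    rule_bits = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
    row = np.zeros(width, dtype=np.uint8)
    row[seed_col] = 1
    pattern = [row.copy()]
    for _ in range(steps - 1):
        left, right = np.roll(row, 1), np.roll(row, -1)
        row = rule_bits[(left << 2) | (row << 1) | right]   # apply the CA rule
        pattern.append(row.copy())
    return np.array(pattern)          # shape (steps, width): 1 = trigger this voice

# Assumed usage: column j of each step drives a hypothetical MIDI note 36 + j.
# for step in ca_rhythm():
#     for voice, on in enumerate(step):
#         if on:
#             send_midi_note(36 + voice)   # hypothetical MIDI send helper
```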


Proceedings Article
01 Jun 2002
TL;DR: Fundamental experiments on sound localization with distance confirmed that the horopter curves in auditory space are not always physically straight either, and that the form of the curves depends on the distance between the subject and the sound sources.
Abstract: For binocular visual space, it is a well-known phenomenon that the horopter curves are not always physically straight, and that the form of the curves depends on the distance. A similar phenomenon is also known for tactile space. We conducted fundamental experiments on sound localization with distance, and confirmed that the horopter curves in auditory space are not always physically straight either, and that the form of the curves depends on the distance between the subject and the sound sources. We also clarified that the form of the horopter for auditory space is not identical to, but relatively similar to, that for visual space, and is quite different from that for tactile space.

5 citations


Proceedings Article
01 Jul 2002

4 citations