
Showing papers presented at the International Conference on Auditory Display (ICAD) in 2006


Proceedings Article
01 Jun 2006
TL;DR: The authors discuss aesthetic issues of sonification and the relationship between sonification (ars informatica) and music and sound art (ars musica), positing that many sonifications have suffered from poor internal ecological validity, which makes listening more difficult and thereby results in poorer data extraction and inference on the part of the listener.
Abstract: This paper discusses aesthetic issues of sonifications and the relationships between sonification (ars informatica) and music & sound art (ars musica). It is posited that many sonifications have suffered from poor internal ecological validity which makes listening more difficult, thereby resulting in poorer data extraction and inference on the part of the listener. Lessons are drawn from the …

56 citations


Proceedings Article
01 Jun 2006
TL;DR: A novel approach to EEG data sonification for process monitoring and exploratory as well as comparative data analysis, using an excitatory/articulatory speech model and a particularly selected parameter mapping to obtain auditory gestalts that correspond to features in the multivariate signals.
Abstract: We introduce a novel approach to EEG data sonification for process monitoring and exploratory as well as comparative data analysis. The approach uses an excitatory/articulatory speech model and a particularly selected parameter mapping to obtain auditory gestalts (or auditory objects) that correspond to features in the multivariate signals. The sonification is adaptable to patient-specific data patterns, so that only characteristic deviations from background behavior (pathologic features) are involved in the sonification rendering. Thus the approach combines data mining techniques and case-dependent sonification design to give an application-specific solution with high potential for clinical use. We explain the sonification technique in detail and present sound examples from clinical data sets.
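As a rough illustration of the kind of parameter mapping described above, the following Python sketch maps two normalised per-frame EEG features to a crude vowel-like additive synthesis (pitch and first-formant position). It is only a minimal sketch under assumed inputs, not the authors' excitatory/articulatory speech model; the feature columns, frequency ranges and function names are hypothetical.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)

    def vowel_frame(f0, formants, dur=0.2):
        """Additive vowel approximation: harmonics of f0 whose amplitudes are
        shaped by Gaussian peaks at the given formant frequencies."""
        t = np.arange(int(SR * dur)) / SR
        harmonics = np.arange(1, 40) * f0
        env = sum(np.exp(-0.5 * ((harmonics - fc) / 120.0) ** 2) for fc in formants)
        frame = sum(a * np.sin(2 * np.pi * h * t) for h, a in zip(harmonics, env))
        return frame * np.hanning(len(t))

    def sonify_eeg(features):
        """features: (n_frames, 2) array of EEG descriptors normalised to 0..1
        (assumed, e.g. band power and a spike score)."""
        sound = []
        for f1, f2 in features:
            f0 = 90 + 60 * f1                   # pitch follows feature 1
            formants = (300 + 1900 * f2, 2500)  # first formant follows feature 2
            sound.append(vowel_frame(f0, formants))
        out = np.concatenate(sound)
        return out / np.max(np.abs(out))

    # toy usage: random values standing in for per-frame EEG features
    audio = sonify_eeg(np.random.rand(20, 2))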

44 citations


Proceedings Article
01 Jun 2006
TL;DR: The sonification of electromyographic (EMG) data was found to be effective in displaying known characteristics of the data; an experiment was conducted to verify its efficacy as an auditory display and to test whether known parameters in the data can be detected in a sound-only display.
Abstract: This paper describes the sonification of electromyographic (EMG) data and an experiment that was conducted to verify its efficacy as an auditory display of the data. A real-time auditory display for EMG has two main advantages over the graphical representation: it frees the eyes of the analyst, or the physiotherapist, and it can be heard by the patient too, who can then try to match the target sound of a healthy person with his/her movement. The sonification was found to be effective in displaying known characteristics of the data. The 'roughness' of the sound was found to be related to the age of the patients. The sound produced by the sonification was also judged to be appropriate as an audio metaphor of the data it displays, a factor that contributes to its potential to become a useful feedback tool for the patients.

1. INTRODUCTION

Physiotherapists use EMG sensors to monitor the electrical activity of the muscles of patients. Electrodes attached to the skin of the subject detect the electrical signals from the muscles below the skin and send them to a computer, where the signal is transformed into digital information. The computer typically runs an application that receives the data, performs some basic statistics on it and displays it in graphical form. Such applications nowadays display the data in real time as it is gathered and can also store it for later analysis. The physiotherapist will try to spot irregularities as the data is gathered, but the data can only be thoroughly analysed at a later time because of its complexity. EMG signals are believed to be full of information about the muscle activity, and it is hypothesised that this visual analysis does not exploit to the full the information contained in the data.

1.1. Traditional analysis of the raw signal

The raw signal contains all the possible information we can have from EMG. "[The analyst] should monitor the raw signal, even though other signal processing may be used, so that artefacts can be detected and controlled as necessary" [1, p. 74]. "In the past, probably the most common way to interpret EMG was by visual inspection of the raw signal. The observer should be able to identify when the raw signal indicates that a muscle is active and when it is relaxed. The relative amount of activity may be classified either by words, such as nil, negligible, slight, moderate, marked or very marked, or by numerical values, such as 0-5, with 0 being no activity and 5 being maximal activity. Such visual observations are based on signal amplitude and frequency." [1, p. 74]

The raw signal should be monitored for all investigations, because the investigator can pick out major artefacts and eliminate that area or part of the signal. Monitoring the raw signal in real time means looking at a graph that contains a lot of noise. The expert analyst is used to checking for anomalies in the signal, but this monitoring requires a lot of experience and focus (since the analyst cannot look away from the screen). Sound can be a good alternative for monitoring the raw signal, and one that allows vital eye contact and focus with the patient to be maintained.

2. SONIFICATION OF EMG DATA

One aim of this research is to study whether it is possible to meaningfully map EMG data to sound parameters in order to create an informative sound display of the data. We aim to study known parameters in the data (such as the effect of a patient's age on their muscle condition) and test whether these can be detected in a sound-only display. If found to be effective, this new display has the potential to become a useful new tool for more general analysis and monitoring of EMG, in particular when coupled with other standard analysis methods.

Previous work with EMG and sound is covered in [6] (using EMG as a musical input) and in [7], which considers EMG signals as sound-ready data: "EMG signals are in the audible range. From the origin of the technique long time ago, a loudspeaker is connected to the output of the amplifier and in this way the patient can hear his or her own 'muscle noise' in real time. The patient can hear the variation with the increase of effort or the rhythmical appearance of the discharge of Motor Unit Potentials (the basic functional units of the neuromuscular system). This activity is highly informative for the physician carrying the test." [7]

As we have shown in [8], the analysis of complex data sets, which are normally displayed visually, can be enhanced by using an audio display, and an interesting observation concerning this is made in [9]: "the interference patterns of a surface electromyographic (EMG) signal are too complex to permit visual analysis." [9]
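As a minimal illustration of mapping EMG data to sound parameters, the sketch below (Python, NumPy only) drives loudness from the windowed RMS of a raw EMG trace and a roughness-like amplitude-modulation rate from its zero-crossing rate. The mapping, window size and carrier frequency are assumptions for illustration, not the sonification evaluated in the paper.

    import numpy as np

    SR = 22050   # audio sample rate (Hz)
    WIN = 256    # EMG analysis window length (samples)

    def sonify_emg(emg):
        """Each EMG window becomes 50 ms of sound: window RMS sets loudness,
        zero-crossing rate sets an amplitude-modulation rate heard as roughness."""
        out = []
        t = np.arange(int(SR * 0.05)) / SR
        for i in range(0, len(emg) - WIN, WIN):
            w = emg[i:i + WIN]
            rms = np.sqrt(np.mean(w ** 2))
            zcr = np.mean(np.abs(np.diff(np.sign(w)))) / 2        # crossings per sample
            am = 1 + 0.8 * np.sin(2 * np.pi * (5 + 60 * zcr) * t)  # "roughness"
            out.append(rms * am * np.sin(2 * np.pi * 220 * t))     # fixed 220 Hz carrier
        out = np.concatenate(out)
        return out / (np.max(np.abs(out)) + 1e-12)

    # toy usage: shaped noise standing in for a raw EMG recording
    audio = sonify_emg(np.random.randn(8000) * np.hanning(8000))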

41 citations


Proceedings Article
01 Jun 2006
TL;DR: The device allows continuous localized interaction by providing a malleable interaction surface, and acts as a tangible user interface object, integrated into a tangible computing framework called tDesk.
Abstract: This article introduces a novel human-computer interaction device, developed in the scope of a Master's thesis. The device allows continuous localized interaction by providing a malleable interaction surface. Diverse multi-finger as well as multi-handed manipulations can be applied. Furthermore, the device acts as a tangible user interface object, integrated into a tangible computing framework called tDesk. Software to convert the malleable element's shape into an internal surface representation has been developed. Malleable interactions are applied to a new Model-based Sonification approach for exploratory data analysis. High-dimensional data are acoustically explored via the informative interaction sounds that result from the user's excitation.

21 citations


Proceedings Article
01 Jun 2006
TL;DR: A sonification model following the Model-based Sonification approach is developed that allows high-dimensional data distributions to be scanned by means of a physical object held in the user's hand, creating a strong metaphor for understanding and relating feedback sounds to the user's own activity, position and orientation.
Abstract: In this paper we develop a sonification model, following the Model-based Sonification approach, that allows the user to scan high-dimensional data distributions by means of a physical object held in the hand. In the sonification model, the user is immersed in a 3D space of invisible but acoustically active objects which he or she can excite. Tangible computing allows the excitation object (e.g. a geometric surface) to be identified with a physical object used as controller, and thus creates a strong metaphor for understanding and relating feedback sounds in response to the user's own activity, position and orientation. We explain the technique and our current implementation in detail and give examples using synthetic and real-world data sets.
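The excitation idea can be sketched in a few lines of Python: data points near the current probe position each contribute a short decaying partial, with one data attribute mapped to pitch and proximity mapped to level. This is a hedged toy version of the general Model-based Sonification excitation principle, not the paper's implementation; the radius, frequency range and decay constant are made-up parameters.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)

    def excite(data, probe, radius=0.3, dur=0.6):
        """Data points within `radius` of the probe position contribute one
        exponentially decaying partial each; frequency encodes a chosen data
        attribute, level encodes proximity to the probe."""
        t = np.arange(int(SR * dur)) / SR
        out = np.zeros_like(t)
        for x in data:
            d = np.linalg.norm(x[:3] - probe)   # distance in the 3D scan space
            if d < radius:
                freq = 200 + 1800 * x[3]        # 4th dimension -> pitch
                amp = 1 - d / radius            # closer points sound louder
                out += amp * np.exp(-6 * t) * np.sin(2 * np.pi * freq * t)
        m = np.max(np.abs(out))
        return out / m if m > 0 else out

    # toy usage: 200 points in a 4-D data set, probe held at the centre of the space
    data = np.random.rand(200, 4)
    audio = excite(data, probe=np.array([0.5, 0.5, 0.5]))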

20 citations


Proceedings Article
01 Jun 2006
TL;DR: The authors have explored sonification of program slices in an attempt to determine if it is practical to offload some of the visual information in the development environment, resulting in an understanding of how to sonify slices in a manner appropriate for the software developer undertaking program comprehension activities.
Abstract: Comprehending a computer program can be a daunting task. There is much to understand, including the interaction among different portions of the code. Program slicing can help one to understand this interaction. Because present-day visual development environments tend to become cluttered, the authors have explored sonification of program slices in an attempt to determine if it is practical to offload some of the visual information. Three slice sonification techniques were developed, resulting in an understanding of how to sonify slices in a manner appropriate for the software developer undertaking program comprehension activities. The investigation has also produced a better understanding of sonification techniques that are musical yet non-melodic and non-harmonic. These techniques were demonstrated to a small set of developers, each reporting that the techniques are promising and useful.
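One simple way to realise a non-melodic, non-harmonic slice sonification is to step through the program line by line and let only in-slice lines trigger short noise bursts. The Python sketch below is a hypothetical illustration of that idea, not one of the three techniques developed in the paper; line indices and timing are arbitrary.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)

    def sonify_slice(num_lines, slice_lines, step=0.12):
        """Play the program top to bottom, one time step per source line.
        Lines inside the slice trigger a short decaying noise burst
        (non-melodic, non-harmonic); lines outside the slice stay silent."""
        rng = np.random.default_rng(0)
        n_step = int(SR * step)
        out = np.zeros(num_lines * n_step)
        for line in slice_lines:
            burst = rng.standard_normal(n_step) * np.exp(-np.linspace(0, 8, n_step))
            out[line * n_step:(line + 1) * n_step] += 0.5 * burst
        return out / (np.max(np.abs(out)) + 1e-12)

    # toy usage: a 40-line program whose slice touches these line indices
    audio = sonify_slice(40, slice_lines=[2, 3, 7, 8, 9, 15, 22, 23, 31])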

17 citations


Proceedings Article
01 Jan 2006
TL;DR: In this paper, the authors examine the IEC 60601-1-8 standard for medical equipment alarms, which incorporates a long-standing suggestion that alarms should indicate their source through distinctive melodies.
Abstract: A newly-released international standard for medical equipment alarms, IEC 60601-1-8, incorporates a long-standing suggestion that alarms should indicate their source through distinctive melodies. In this paper we examine this suggestion. We describe the proposed alarms, outline the history of the idea, and review recent research on the effectiveness of the alarms, some of it performed in our laboratory. Finally we discuss the concept of “urgency mapping” for alarms, noting where it may and may not be effective.
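For illustration only, the Python sketch below renders a three-note melodic alarm whose melody depends on the alarm source and whose note length and level depend on urgency. The source-to-melody table and the urgency mapping are placeholders invented for this example; they do not reproduce the melodies or timing specified in IEC 60601-1-8.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)
    NOTE = {"c4": 261.63, "d4": 293.66, "e4": 329.63, "g4": 392.00, "a4": 440.00}

    # Hypothetical source-to-melody table for illustration only; it does NOT
    # reproduce the melodies defined in IEC 60601-1-8.
    MELODIES = {"cardiac": ["c4", "e4", "g4"], "ventilation": ["c4", "a4", "d4"]}

    def alarm(source, urgency="medium"):
        """Render a three-note alarm burst; higher urgency -> shorter, louder notes."""
        dur = 0.15 if urgency == "high" else 0.25
        level = 0.9 if urgency == "high" else 0.5
        t = np.arange(int(SR * dur)) / SR
        notes = [level * np.sin(2 * np.pi * NOTE[n] * t) * np.hanning(len(t))
                 for n in MELODIES[source]]
        return np.concatenate(notes)

    # toy usage
    audio = alarm("cardiac", urgency="high")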

14 citations


Proceedings Article
01 Jun 2006
TL;DR: An attempt to improve the interaction experience by adding auditory cues that inform the user about the progress and progression of gestural commands; preliminary trials indicate increased user satisfaction and comprehension when using auditory-enhanced gestures over non-enhanced gestures.
Abstract: The use of the mouse to allow interaction via gestures has attracted much interest recently and the popular FIREFOX web browser has been enhanced by an extension supporting mouse gestures. These gestures reveal an interaction problem: feedback is limited (often only a terse message in the browser’s status bar) and navigation errors easily result when the user unknowingly executes a gesture when trying to accomplish some other task (e.g. copying text from a web page). This paper describes an attempt to improve the interaction experience by adding auditory cues to inform the user about the progress and progression of gestural commands. FIREFOX was chosen as it has an open extension architecture that is easily modified. Preliminary trials indicate increased user satisfaction and comprehension when using auditory-enhanced gestures over the non-enhanced gestures.
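A minimal sketch of such auditory cues (in Python rather than as a browser extension, purely for illustration): each recorded stroke of a mouse gesture triggers a short rising tone, and completion plays either a confirmation tone or a low buzz if the gesture is not recognised. Function names and frequencies are assumptions, not the cues used in the FIREFOX extension described here.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)

    def tone(freq, dur=0.08, level=0.4):
        """Short windowed sine tone used as an auditory cue."""
        t = np.arange(int(SR * dur)) / SR
        return level * np.sin(2 * np.pi * freq * t) * np.hanning(len(t))

    def gesture_cues(strokes, recognised):
        """One short, rising tone per recorded stroke so the user hears the
        gesture building up; a low buzz if the finished gesture is not recognised."""
        cues = [tone(440 * 2 ** (i / 12)) for i, _ in enumerate(strokes)]
        cues.append(tone(660, dur=0.2) if recognised else tone(110, dur=0.3))
        return np.concatenate(cues)

    # toy usage: a right-down-right mouse gesture that maps to a known command
    audio = gesture_cues(strokes=["R", "D", "R"], recognised=True)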

5 citations


Proceedings Article
01 Jan 2006
TL;DR: Event-based mapping of parameters is found to be informative in terms of auto- and cross-correlations of the multivariate data, and the sound synthesis is suitable for online sonification.
Abstract: We describe techniques to sonify rhythmic activity of epileptic seizures as measured by human EEG. Event-based mapping of parameters is found to be informative in terms of auto- and cross-correlations of the multivariate data. For the study, a group of patients with childhood absence seizures was selected. We find consistent intra-patient conservation of the rhythmic pattern as well as inter-patient variations, especially in terms of cross-correlations. The sound synthesis is suitable for online sonification; thus, the application of the proposed sonification in clinical monitoring is possible.
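A minimal sketch of event-based mapping, under assumed inputs: upward threshold crossings in each EEG channel trigger short decaying tones, with channel index mapped to pitch, so rhythmic structure and cross-channel relations remain audible. The threshold, grain length and pitch mapping are illustrative choices, not the parameters used in the study.

    import numpy as np

    SR = 22050  # audio sample rate (Hz)

    def sonify_events(eeg, eeg_rate=256, thresh=2.0):
        """Every upward threshold crossing of the z-scored channel triggers a
        short decaying tone at that channel's pitch, placed at the event's time."""
        n_ch, n_samp = eeg.shape
        out = np.zeros(int(SR * n_samp / eeg_rate) + SR)
        grain_t = np.arange(int(SR * 0.1)) / SR
        for ch in range(n_ch):
            z = (eeg[ch] - eeg[ch].mean()) / (eeg[ch].std() + 1e-12)
            events = np.where((z[1:] >= thresh) & (z[:-1] < thresh))[0]
            grain = np.exp(-30 * grain_t) * np.sin(2 * np.pi * (300 + 120 * ch) * grain_t)
            for e in events:
                start = int(e / eeg_rate * SR)
                out[start:start + len(grain)] += grain
        return out / (np.max(np.abs(out)) + 1e-12)

    # toy usage: 4 channels of 10 s synthetic EEG-like noise at 256 Hz
    audio = sonify_events(np.random.randn(4, 2560))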

4 citations