Interactive sonification
J Multimodal User Interfaces (2012) 5:85–86. DOI 10.1007/s12193-012-0095-7
EDITORIAL
Roberto Bresin · Thomas Hermann · Andy Hunt
Published online: 20 April 2012. © OpenInterface Association 2012
In October 2010, Roberto Bresin, Thomas Hermann and Andy Hunt launched a call for papers for a special issue on Interactive Sonification of the Journal on Multimodal User Interfaces (JMUI). The call was published in eight major mailing lists in the field of Sound and Music Computing and on related websites. Twenty manuscripts were submitted for review, and eleven of them have been accepted for publication after further improvements. Three of the papers are further developments of works presented at ISon 2010, the Interactive Sonification workshop. Most of the papers went through a three-stage review process.
The papers give an interesting overview of the field of Interactive Sonification as it is today. Their topics include the sonification of data exploration and of motion, a new sound synthesis model suitable for interactive sonification applications, a study on perception in the everyday periphery of attention, and the proposal of a conceptual framework for interactive sonification.
R. Bresin (✉)
KTH Royal Institute of Technology, Stockholm, Sweden
e-mail: [email protected]
T. Hermann
Ambient Intelligence Group, CITEC, Bielefeld University, Bielefeld, Germany
e-mail: [email protected]
A. Hunt
Department of Electronics, University of York, Heslington, York YO10 5DD, UK
e-mail: [email protected]
1 Data exploration
This is the traditional topic of Interactive Sonification, and is reflected by the number of papers in this category, five out of eleven. In an interesting study, Grond and Hermann apply voice synthesis (vowel sounds) to the sonification of mathematical functions. In another work, Ferguson and co-authors discuss the interaction design of multi-touch computing for exploring and interacting with representations of time-series data simultaneously in both the visual and auditory modalities. Metatla and co-workers focus on data exploration, and present an approach to designing hierarchy-based sonification for supporting non-visual interaction with relational diagrams. In another study, Ness and colleagues propose the use of a multimodal tangible interface that allows users to explore data in both the time and space dimensions while receiving immediate sonic feedback on their actions. This interface is applied to phenology, the study of periodic biological processes, and can be used to explore the effects of climate change. Sonification of the environment for delivering location-based information to mobile users is the challenge faced by El-Shimy and collaborators in their research work. They propose a system that allows for increased awareness of the environment for users with limited vision capabilities or whose visual attention is occupied by other tasks.
2 Motion
One of the papers presenting a follow-up study of work from ISon 2010 is that by Dubus, in which he presents an evaluation of four sound models for the sonification of elite rowing. The sonified data were those of the movement (speed and acceleration) of a single scull rowing boat. Results show a good ability of athletes to efficiently extract basic characteristics of the sonified data, and highlight the important issue of aesthetics in interactive sonification design. In another paper dedicated to motion, Varni and colleagues present three interactive sonification models of the synchronisation of gestures between two people each shaking a mobile phone. Interactive sonification of their hand movements helped users to keep synchronised with each other. In a study on the sonification of everyday actions, e.g., pressing a button on an ATM, Susini and co-workers found that the level of usability (low vs. high) of the user interface affects the choice of sounds that best deliver a sense of naturalness of the interaction.
3 Sound synthesis
The representation of continuous processes in interaction and interface design often uses liquid metaphors, such as the dripping or streaming of fluids. In an original work, Drioli and Rocchesso present a physics-based sound synthesis model of liquid phenomena suitable for interactive sonification of this class of processes.
4 Perception
In a follow-up study of their work presented at ISon 2010, Bakker and colleagues present a qualitative study on the everyday periphery of attention. They found that sound plays a major role, supporting their approach of using interactive sonification as an interaction style for peripheral interaction.
5 Conceptual framework
In a further development of their study presented at ISon 2010, Diniz and co-authors present in more detail their theoretical foundations, combining gestalt-based electroacoustic composition techniques, user body-centred spatial exploration and mediation technology, for the definition of a conceptual framework for interactive sonification.
6 Concluding comments
The diversity of topics brought together in this special issue demonstrates the richness and interdisciplinarity of research in Interactive Sonification. The guest editors hope that the articles will provide inspiration and trigger new cross-disciplinary ideas.