
Sonification of Haptic Interaction in a Virtual Scene

Emma Frid, Roberto Bresin
Sound and Music Computing, CSC, KTH Royal Institute of Technology
Stockholm, Sweden
[email protected], [email protected]

Jonas Moll, Eva-Lotta Sällnäs Pysander
Interaction Design, CSC, KTH Royal Institute of Technology
Stockholm, Sweden
[email protected], [email protected]

ABSTRACT

This paper presents a brief overview of work-in-progress for a study on correlations between visual and haptic spatial attention in a multimodal single-user application comparing different modalities. The aim is to gain insight into how auditory and haptic versus visual representations of temporal events may affect task performance and spatial attention. For this purpose, a 3D application involving one haptic model and two different sound models for interactive sonification is developed.

Keywords: interactive sonification, haptic feedback, spatial attention

1. BACKGROUND

Integration of haptic feedback in computer music applications, especially in the context of Digital Musical Instruments (DMIs), is a growing research field (see e.g. [1, 2]). Numerous studies have focused on how force feedback devices, i.e. controllers that read position information and provide continuous force feedback as a response to user movements, can be used in applications involving both sound and haptics [3, 4, 5, 6].

Audio-tactile and audio-proprioceptive interaction has been found to play an important role for spatial orientation in virtual scenes [7]. Moreover, it has been suggested that auditory and tactile signals are more effective than visual signals when it comes to drawing cross-modal attention to particular positions [8]. The current study is motivated by the fact that few previous investigations have focused on cross-modal links in spatial attention for sonified 3D haptic interfaces.

2. AIM

The purpose of this study is to investigate how visual spatial attention and haptic spatial attention correlate in a single-user application comparing combinations of different modalities. We aim to investigate how different representations of temporal events affect task performance by triggering a shift of attention. The following hypotheses will be tested: 1) providing auditory and/or haptic feedback will trigger a visual attention shift, and 2) auditory feedback can elicit an increased sense of effort; a user's gestures can be affected by ecological knowledge of the sound-producing events related to the implemented sound model.

Copyright: ©2014 Emma Frid et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 Unported License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

3. METHOD

A SensAble™ Phantom® Desktop haptic device 1 is used together with eye-tracking technology to analyze how focus of attention is affected by combinations of different modalities. The haptic device has a pen-like stylus, attached to a robotic arm, which is used to haptically interact with objects in virtual environments. A 3D application based on a simple task where the user is supposed to throw a ball into a goal (see Figure 1) has been developed. The application provides haptic, visual and auditory feedback.

Eye-tracking data will be correlated with haptic tracking data in order to investigate hypothesis 1), i.e. whether focus might shift from the ball to the goal depending on the provided feedback. Hypothesis 2) will be tested through comparison between the haptic and non-haptic conditions.
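As an illustration of how such a correlation could be computed offline, a minimal sketch is given below (our own, not code from the paper; the variable names, the per-frame alignment of the two logs and the choice of a Pearson correlation are assumptions). It compares two time-aligned series, for example gaze-to-goal distance and stylus-to-goal distance per recorded frame:

    #include <cmath>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Pearson correlation between two equally long, time-aligned series,
    // e.g. gaze-to-goal distance and stylus-to-goal distance per frame.
    double pearson(const std::vector<double>& a, const std::vector<double>& b) {
        const double n = static_cast<double>(a.size());
        const double meanA = std::accumulate(a.begin(), a.end(), 0.0) / n;
        const double meanB = std::accumulate(b.begin(), b.end(), 0.0) / n;
        double cov = 0.0, varA = 0.0, varB = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) {
            cov  += (a[i] - meanA) * (b[i] - meanB);
            varA += (a[i] - meanA) * (a[i] - meanA);
            varB += (b[i] - meanB) * (b[i] - meanB);
        }
        return cov / std::sqrt(varA * varB);
    }

    int main() {
        // Hypothetical, time-aligned samples (one value per frame).
        std::vector<double> gazeToGoal   = {0.9, 0.8, 0.6, 0.4, 0.2};
        std::vector<double> stylusToGoal = {1.0, 0.7, 0.5, 0.4, 0.1};
        std::printf("r = %.3f\n", pearson(gazeToGoal, stylusToGoal));
        return 0;
    }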

Figure 1: Experimental setup with the SensAble Phantom Desktop, Tobii X2-60 eye-tracker and 3D application.

Experiments with first-year students from the Computer Science program at KTH Royal Institute of Technology will be carried out. Initially, pilot experiments involving vocal sketching [9] will be conducted. The pilot tests will provide ideas for the design of two different sound models, but also serve as a first evaluation of the entire setup.

1 http://www.dentsable.com/haptic-phantom-desktop.htm


The subsequent experiments will contain auditory-haptic, auditory-visual, auditory-visual-haptic, haptic-visual and visual-only conditions. A between-group design will be adopted, where each group will solve the task in one of the conditions.

Subjects will be given a period of 5 minutes for practice before the actual experiment starts. After the practice trial, they will be instructed to try to throw the ball into the goal 40 times. Task performance, defined as the ratio between the 40 hits and the total number of trials, will be computed for each subject. We define visual attention in terms of the time a user spends focusing on a specific area of the screen.
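One way to read this definition (our own formalization; the notation below is not taken from the paper) is that a subject keeps throwing until 40 hits have been scored, so that for a total of N_trials throws

    P = \frac{40}{N_{\mathrm{trials}}}, \qquad 0 < P \le 1,

while visual attention toward a given area of interest is the accumulated dwell time of gaze samples falling inside that area.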

3.1 Apparatus

The 3D application, based on the haptic software library Chai3D [10], is written in C++. As previously mentioned, a SensAble Phantom Desktop device will be used to provide force feedback. The sound models for providing auditory feedback have been developed in Max, and sound synthesis is done on a separate computer. Communication between Max and the 3D application is done via Open Sound Control (OSC) [11]. A pair of Sennheiser HD 433 headphones will be used for auditory feedback. Eye-tracking data will be recorded using a commercial X2-60 eye-tracker from Tobii Technology 2. The Morae software 3 for usability testing will be used to set up, record and analyze study data.
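The paper does not include implementation details beyond this description; as a minimal sketch of how the C++ application could send control data to the Max patch over OSC (here using the oscpack library, which is an assumption on our part, as are the address pattern, port and parameter names):

    #include "ip/UdpSocket.h"
    #include "osc/OscOutboundPacketStream.h"

    // Send one sonification update to the Max patch, assumed to listen on UDP port 7400.
    void sendBallState(UdpTransmitSocket& socket,
                       float x, float y, float z, float velocity) {
        char buffer[1024];
        osc::OutboundPacketStream p(buffer, sizeof(buffer));
        // "/ball/state" is a hypothetical address pattern, not taken from the paper.
        p << osc::BeginMessage("/ball/state")
          << x << y << z << velocity
          << osc::EndMessage;
        socket.Send(p.Data(), p.Size());
    }

    int main() {
        UdpTransmitSocket socket(IpEndpointName("127.0.0.1", 7400));
        sendBallState(socket, 0.1f, -0.2f, 0.05f, 0.8f);  // one example frame
        return 0;
    }

On the Max side, such messages can typically be received with a udpreceive object and routed by their address pattern.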

3.2 Sound Design

A summary of different interaction events and suggested corresponding auditory feedback can be seen in Table 1. Most interaction sounds were designed as Earcons [12], since many of the sound-triggering events in the 3D application have no intuitive mapping to an auditory event. As for the sonification of the interaction with the haptic ball, i.e. the gesture where the user is aiming at the target, we compare two sound models: one simple model based on filtered white noise (simulating a whooshing sound), and one sound model designed using the friction preset from the Sound Design Toolkit [13].

The models are designed in such a manner that the sound changes in terms of stereo panning and frequency depending on movements in the x- and y-direction respectively. Velocity is mapped to volume, and a specific mapping for movement along the z-axis is adopted for each sound model (see Table 1).
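As a schematic illustration of such a mapping (our own sketch; the workspace range, frequency range and scaling constants are assumptions rather than values reported in the paper), the parameters sent to the synthesis engine could be derived as follows, with the z-axis then driving the model-specific parameter listed in Table 1:

    #include <algorithm>
    #include <cmath>

    struct SonificationParams {
        float pan;    // -1 (left) .. +1 (right)
        float freq;   // mapped frequency in Hz
        float gain;   // 0 .. 1, controls volume
    };

    // x -> stereo panning, y -> frequency, movement speed -> volume.
    // The workspace is assumed to span roughly +/- 0.1 m around the origin.
    SonificationParams mapStylusToSound(float x, float y, float speed) {
        SonificationParams s;
        s.pan = std::clamp(x / 0.1f, -1.0f, 1.0f);
        // Map y exponentially onto an assumed 200 Hz - 2 kHz range.
        const float t = std::clamp((y + 0.1f) / 0.2f, 0.0f, 1.0f);
        s.freq = 200.0f * std::pow(10.0f, t);
        // Full volume reached at an assumed speed of 0.5 m/s.
        s.gain = std::clamp(speed / 0.5f, 0.0f, 1.0f);
        return s;
    }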

Table 1: Auditory feedback and mapping.

    EVENT           AUDIO MESSAGE
    goal            MIDI sequence, increasing pitch
    miss            MIDI sequence, decreasing pitch
    hit wall        impact sound model: dissonant bell (SDT)
    grasp ball      filtered noise, increasing frequency + click
    ball bouncing   impact sound model: wood (SDT)
    aim at target   velocity mapped to volume;
                    movement in x, y, z mapped to:
                    panning, frequency, comb-filter characteristics (*)
                    panning, frequency, rubbing force (†)

    (*) = filtered noise model, (†) = friction model (SDT)

4. PRELIMINARY RESULTS

Pilot tests involving vocal sketching are being performed at the time of writing. Initial findings have led to conclusions regarding adjustments that are required in order to ensure robust behaviour and reliable interaction in the virtual 3D environment. Improvements to the application as well as the sound models will be made in an iterative manner as the pilot tests proceed, until the setup is stable enough for the actual experiments to be carried out.

2 http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-x2-60-eye-tracker/

3 http://www.techsmith.com/morae.html

5. FUTURE WORK

As a continuation of this study, future investigations could involve assessment of how visual spatial attention could be affected by auditory and haptic feedback in a multi-user setting.

Acknowledgments

This work was supported by the Swedish Research Council (Grant No. D0511301).

6. REFERENCES

[1] L. L. Chu, “Haptic feedback in computer music performance,” in Proceedings of the International Computer Music Conference. ICMA, 1996, pp. 57–58.

[2] S. M. O’Modhrain and C. Chafe, “Incorporating haptic feedback into interfaces for music applications,” in Proceedings of the International Symposium on Robotics with Applications, World Automation Conference, 2000.

[3] M. Giordano, S. Sinclair, and M. M. Wanderley, “Bowing a vibration-enhanced force feedback device,” in Proceedings of the New Interfaces for Musical Expression Conference (NIME), 2012, pp. 445–448.

[4] P. Moss and B. Cunitz, “Haptic theremin: Developing a haptic musical controller using the Sensable Phantom Omni,” in Proceedings of the International Computer Music Conference, 2005, pp. 275–277.

[5] A. Kontogeorgakopoulos and G. Kouroupetroglou, “Low cost force-feedback interaction with haptic digital audio effects,” in Gesture and Sign Language in Human-Computer Interaction and Embodied Communication, ser. Lecture Notes in Computer Science, E. Efthimiou, G. Kouroupetroglou, and S.-E. Fotinea, Eds. Springer Berlin Heidelberg, 2012, vol. 7206, pp. 48–56.

[6] K. Crommentuijn and F. Winberg, “Designing auditory displays to facilitate object localization in virtual haptic 3D environments,” in Proceedings of the 8th International ACM SIGACCESS Conference on Computers and Accessibility. ACM, 2006, pp. 255–256.


[7] M. E. Altinsoy and M. Stamm, “Touch the sound: The role of audio-tactile and audio-proprioceptive interaction on the spatial orientation in virtual scenes,” in Proceedings of Meetings on Acoustics, vol. 19, no. 1. Acoustical Society of America, 2013, pp. 3458–3458.

[8] C. Spence and J. Driver, “Cross-modal links in attention between audition, vision, and touch: Implications for interface design,” International Journal of Cognitive Ergonomics, vol. 4, pp. 351–373, 1997.

[9] I. Ekman and M. Rinott, “Using vocal sketching for designing sonic interactions,” in Proceedings of the 8th ACM Conference on Designing Interactive Systems, ser. DIS ’10. New York, USA: ACM, 2010, pp. 123–131.

[10] F. Conti, F. Barbagli, D. Morris, and C. Sewell, “CHAI: An open-source library for the rapid development of haptic scenes,” in Proceedings of the IEEE World Haptics Conference, Pisa, Italy, March 2005, available: http://www.chai3d.org/.

[11] M. Wright, “Open Sound Control: an enabling technology for musical networking,” Organised Sound, vol. 10, no. 3, pp. 193–200, 2005.

[12] M. M. Blattner, D. A. Sumikawa, and R. M. Greenberg, “Earcons and icons: Their structure and common design principles,” Human–Computer Interaction, vol. 4, no. 1, pp. 11–44, 1989.

[13] S. Monache, P. Polotti, and D. Rocchesso, “A toolkit for explorations in sonic interaction design,” in Proceedings of the 5th Audio Mostly Conference: A Conference on Interaction with Sound, New York, NY, USA, 2010, pp. 7–13.
