Aalborg Universitet

Haptic Feedback for Enhancing Realism of Walking Simulations

Turchet, Luca; Burelli, Paolo; Serafin, Stefania

Published in: IEEE Transactions on Haptics

DOI: 10.1109/TOH.2012.51

Publication date: 2013

Document version: Early version, also known as pre-print

Link to publication: https://doi.org/10.1109/TOH.2012.51

Citation for published version (APA): Turchet, L., Burelli, P., & Serafin, S. (2013). Haptic Feedback for Enhancing Realism of Walking Simulations. IEEE Transactions on Haptics. DOI: 10.1109/TOH.2012.51

General rights: Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights. Users may download and print one copy of any publication from the public portal for the purpose of private study or research. You may not further distribute the material or use it for any profit-making activity or commercial gain. You may freely distribute the URL identifying the publication in the public portal.

Take-down policy: If you believe that this document breaches copyright, please contact us at [email protected] providing details, and we will remove access to the work immediately and investigate your claim.

Downloaded from vbn.aau.dk on: July 15, 2018



IEEE TRANSACTIONS ON HAPTICS, VOL. 6, NO. 1, JANUARY 2007 1

Haptic feedback for enhancing realism of walking simulations

    Luca Turchet, Paolo Burelli and Stefania Serafin

Abstract: In this paper we describe several experiments whose goal is to evaluate the role of plantar vibrotactile feedback in enhancing the realism of walking experiences in multimodal virtual environments. In order to achieve this goal we built an interactive and a non-interactive multimodal feedback system. While during the use of the interactive system subjects physically walked, during the use of the non-interactive system the locomotion was simulated while subjects were sitting on a chair. In both configurations subjects were exposed to auditory and audio-visual stimuli presented with and without haptic feedback. Results of the experiments show a clear preference for the simulations enhanced with haptic feedback, indicating that the haptic channel can lead to more realistic experiences in both interactive and non-interactive configurations. The majority of subjects clearly appreciated the added feedback. However, some subjects found the added feedback disturbing and annoying. This might be due, on the one hand, to the limits of the haptic simulation and, on the other, to individual differences in the desire to be involved in the simulations. Our findings can be applied to physical navigation in multimodal virtual environments, as well as to enhancing the user experience of watching a movie or playing a video game.

Index Terms: Haptic feedback, realism, virtual environments, physics-based models.

    1 INTRODUCTION

WHILE five years ago the design, development, incorporation into virtual worlds, and perceptual evaluation of haptic technologies were still in an early phase [1], nowadays research on haptic feedback is reaching a more mature state. However, the role of haptic feedback in enhancing realism in multimodal virtual environments has not been extensively investigated in the research community. A notable exception is the work presented in [2], where realism is improved by simulating tapping with event-based haptic feedback. In addition, results presented in [3] show that haptic force feedback significantly increases the sense of presence in collaborative distributed virtual environments.

On the other hand, the use of senses other than vision and hearing has remained relatively unexplored in movies and video games, despite a continuous request from the public and gamers for richer experiences while interacting with such media. Nevertheless, recent research efforts have focused on the design and development of non-interactive haptic interfaces for enhancing movies, providing evidence that haptic feedback can play a relevant role in augmenting the theatre experience beyond the visual and auditory modalities [4], [5], [6]. In addition, the importance of

L. Turchet and S. Serafin are with the Department of Architecture, Design and Media Technology, Aalborg University Copenhagen, Lautrupvang 15, 2750 Ballerup. E-mail: tur,[email protected]

P. Burelli is with the Center for Computer Games Research of the IT University of Copenhagen, Rued Langgaards Vej 7, 2300 Copenhagen. E-mail: [email protected]

the haptic feedback in video games was raised by Chang [7], who suggested that haptic technologies will become an integral part of the video game design process.

In the research community, haptic feedback in a multimodal context has mostly been connected to its interaction with vision. Nevertheless, lately the interest in investigating the interaction between touch and audition has grown. The possibility of investigating the interaction between auditory and haptic feedback has been facilitated by the rapid progress of haptic technology, together with the development of efficient and accurate simulation algorithms. However, research on the interaction between touch and audition has focused mainly on hand-based interactions [8], [9], [10], while few studies have been conducted on the interaction of these two sensory modalities at the feet. One exception is the work of Giordano and co-workers, who showed that the feet are effective at probing the world with discriminative touch, with and without access to auditory information [11]. In particular, their results suggested that vibrotaction plays a relevant role in the discrimination of floor surfaces.

Recently, a small number of foot-haptic interfaces for navigation in virtual environments have been proposed. Typically, the haptic feedback is provided during the user's locomotion by means of augmented floors [12] or enhanced shoes [13], [14]. Research has shown that plantar cutaneous vibration feedback is sufficient to elicit a percept of compliance during walking [15]. However, the role of haptic feedback in enhancing the realism of the walking experience in multimodal virtual environments has

Digital Object Identifier 10.1109/ToH.2012.51 1939-1412/12/$31.00 2012 IEEE

    This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication.


not been empirically demonstrated.

In this paper, we are interested in investigating

subjects' reactions to synthetic auditory and haptic stimuli which simulate the sensation of walking on different ground surfaces, in both a non-interactive and an interactive configuration. Recently we developed a system which can provide the combined auditory and haptic sensations that arise while walking on aggregate and solid surfaces. The system is composed of an audio-haptic synthesis engine [16] and a pair of shoes enhanced with sensors and actuators [13].

In a previous study [17], we presented the results of a preliminary surface recognition experiment. This experiment was conducted under three different conditions: auditory feedback, haptic feedback, and both. Participants were sitting in a chair, passively receiving the stimuli through headphones and the mentioned shoes. By presenting the stimuli to the participants non-interactively, we introduced a high degree of control over the simulation. However, this method of delivery is highly contrived, since it eliminates the tight sensorimotor coupling that is natural during walking and foot interactions. This is true for the auditory channel, but even more so for the haptic channel. In spite of these drastically constrained conditions, subjects were able to recognize most of the stimuli in the audition-only condition, and some of the material properties, such as hardness, in the haptic-only condition. Nevertheless, the combination of auditory and haptic cues did not significantly improve recognition.

A follow-up experiment was run in another study [18], allowing subjects to walk in a controlled laboratory where their steps were tracked and used to drive the simulation. Subjects were asked to recognize the different simulated surfaces they were exposed to with uni-modal (auditory or haptic) and bi-modal (auditory and haptic) cues. Overall, results showed that subjects were able to recognize most of the synthesized surfaces with high accuracy. Results moreover showed that, using the proposed system, the auditory modality dominated the haptic one and that the haptic task was more difficult than the other two. Indeed, subjects performed the recognition task better when using auditory feedback versus haptic feedback, and the combination of the two types of feedback significantly enhanced recognition only in some conditions.

In both [17] and [18] subjects were also asked to rate the degree of realism of the stimuli. Results of both studies showed that the combination of auditory and haptic feedback did not always enhance the realism of the simulation compared to the uni-modal conditions. However, those studies followed a between-subjects experimental design, where each subject was presented with either auditory, haptic, or audio-haptic stimuli. Therefore it was not possible to draw any clear conclusion about the effect of the haptic feedback in enhancing the realism of the simulations based on

the auditory feedback alone.

Starting from those results, in the present study we describe an experiment where subjects were asked to compare stimuli involving haptic feedback with stimuli in which the haptic feedback was not present. The role of the haptic feedback in enhancing the realism of a walking experience was investigated in both an interactive and a non-interactive context. In order to achieve this goal we built an interactive feedback system and a non-interactive feedback system. While during the use of the interactive system subjects physically walked, during the use of the non-interactive system the locomotion was passively simulated while subjects were sitting on a chair. For this latter purpose, two case studies which are frequent in both video games and movies were analyzed: walking and running.

2 THE INTERACTIVE FEEDBACK SYSTEM

Recently, a multimodal interactive architecture has been developed with the goal of creating audio-haptic-visual simulations of walking-based interactions [19] (see Figure 1). The architecture used during the proposed experiments consisted of a motion capture system (MoCap) (Optitrack by Naturalpoint), an nVisor SX head-mounted display (HMD) with 1280x1024 resolution in each eye and a diagonal FOV of 60 degrees, two soundcards (RME Fireface 800), twelve loudspeakers (Dynaudio BM5A), an Arduino Diecimila board, a pair of Pyle Pro PCA14 mini 2X15 W stereo amplifiers, two haptic shoes and two computers.

This system was placed in an acoustically isolated laboratory consisting of a control room and a larger interaction room (5.45 m wide, 5.55 m long and 2.85 m high) where the setup was installed and where the experiments were performed. The control room hosted two desktop computers. The first computer ran the motion capture software and the visual engine, while the second ran the audio-haptic synthesis engine. Users were required to wear both the HMD and a pair of shoes enhanced with sensors and actuators able to provide haptic feedback during the act of walking [13]. Specifically, the shoes were a pair of light-weight sandals (Model Arpenaz-50, Decathlon, Villeneuve-d'Ascq, France). This particular model was chosen since it has light, stiff foam soles in which it is relatively easy to insert sensors and actuators. Four cavities were made in the sole to accommodate four vibrotactile actuators (Haptuator, Tactile Labs Inc., Deux-Montagnes, Qc, Canada). These electromagnetic recoil-type actuators have an operational, linear bandwidth of 50-500 Hz and can provide up to 3 G of acceleration when connected to light loads [20]. Two actuators were placed under the heel and the other two under the toe of each foot. They were bonded in place to ensure good transmission of the vibrations

    This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication.

  • IEEE TRANSACTIONS ON HAPTICS, VOL. 6, NO. 1, JANUARY 2007 3

inside the soles. When activated, vibrations propagated far into the foam. In addition, the sole had two force-sensitive resistors intended to pick up the foot-floor interaction force in order to drive the audio and haptic synthesis. The two sensors were placed at the heel and toe, respectively, of each shoe (see Figure 2).

Markers were placed on top of the HMD to track the orientation and position of the head. The tracked coordinates were sent from the first to the second computer, which processed them in order to control both the visual and the audio-haptic engine.

Concerning the surround sound system, eight loudspeakers were placed on the ground at the vertices of a regular octagon, while four loudspeakers were placed at the corners of the rectangular floor at a height of 1.40 m. All the loudspeakers were used for the delivery of the soundscapes, while only the eight on the ground were responsible for the diffusion of the footstep sounds. This configuration was chosen according to the results of preliminary studies on footstep sound delivery methods.
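As a small geometric aside, the positions of the eight floor loudspeakers on a regular octagon can be computed from equally spaced angles. This is an illustrative sketch only; the paper does not state the octagon's radius, so the value below is an assumption.

```python
import math

def octagon_positions(radius_m=2.5):
    """Vertices of a regular octagon centred on the walking area,
    one per floor loudspeaker; radius is illustrative, not from the
    paper. Returns (x, y) pairs in metres."""
    return [(round(radius_m * math.cos(2 * math.pi * k / 8), 3),
             round(radius_m * math.sin(2 * math.pi * k / 8), 3))
            for k in range(8)]
```

With a unit radius, the first vertex lies on the positive x-axis and successive vertices advance by 45 degrees.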

Fig. 2. A picture of one pressure sensor and two actuators embedded in the shoe at the heel.

2.1 Multimodal interactive feedback

A multimodal synthesis engine able to reproduce visual, auditory and haptic feedback was also developed on the basis of the architecture described above. The auditory feedback was obtained by the combination of a footstep sound synthesis engine and a soundscape sound synthesis engine, implemented in the Max/MSP sound synthesis and multimedia real-time platform1. The footstep sound synthesizer employed was the one proposed in [16], which makes it possible to simulate, both offline and in real time, footstep sounds on several different materials, both aggregate and solid. Concerning the real-time simulation, various systems for the generation of such input have been developed and tested [16], [21].

    1. www.cycling74.com

In the proposed interactive experiments, the footstep sound synthesis was driven by the locomotion of the subject wearing the above-mentioned shoes. The description of the control algorithms, based on the analysis of the pressure sensor values coming from the shoes, can be found in [13].

As concerns the soundscape diffusion, the environmental sounds were delivered dynamically using a sound diffusion algorithm based on ambisonics. Specifically, to achieve this dynamism we used the ambisonic tools for Max/MSP2, which allow the movement of virtual sound sources along trajectories defined in two- and three-dimensional space [22].
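The ambisonic diffusion mentioned above rests on encoding each source's azimuth into B-format components before decoding to the loudspeaker ring. As a rough illustration of the standard first-order (horizontal) encoding equations only, not of the Max/MSP tools' actual API:

```python
import math

def foa_encode(sample, azimuth_rad):
    """First-order ambisonic (horizontal B-format) encoding of one mono
    sample at a given azimuth: W is the omnidirectional component, X and
    Y the figure-of-eight components. A textbook sketch, hypothetical
    helper names."""
    w = sample / math.sqrt(2.0)        # conventional -3 dB W weighting
    x = sample * math.cos(azimuth_rad)
    y = sample * math.sin(azimuth_rad)
    return w, x, y
```

Moving a virtual source along a trajectory then amounts to updating the azimuth over time while re-encoding each sample.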

Regarding the haptic feedback, it was provided by means of the haptic shoes previously described. The haptic synthesis was driven by the same engine used for the synthesis of footstep sounds, and is able to simulate the haptic sensation of walking on different surfaces, as illustrated in [13]. This audio-haptic footstep synthesizer has been extensively tested by means of several audio-haptic experiments, whose results can be found in [17], [18], [23], [24]. During the interactive set of experiments, the laboratory surface on which participants trampled was a carpet. For the purpose of the experiment, the audio-haptic synthesis engine was set to simulate three aggregate surfaces: forest floor, snow and sand (see Figure 3). These simulations are based on the physically informed sonic models (PhiSM) algorithm [25]. This algorithm simulates particle interactions by using a stochastic parameterization, thereby avoiding the need to model each of many particles explicitly. Instead, the particles are assigned a probability of creating an acoustic waveform. In the case of many particles, the interaction can be represented using a simple Poisson distribution, where the sound probability is constant at each time step, giving rise to an exponential probability weighting of the time between events. Nevertheless, the audio frequency range, viz. 20 Hz-20 kHz, is far wider than the vibrotactile frequency range, viz. 10 Hz-1.0 kHz. In order to simulate the three materials at the haptic level, the audio signals were converted into vibrotactile signals by means of spectrum truncation, pitch shifting and amplitude adjustment. The spectrum truncation was the result of the frequency response of both the actuators and the shoes. The amounts of shift and amplitude were set to preserve the structural features of the original signals simulating the three materials. Indeed, our goal in the design of the haptic feedback was to accomplish multimodal congruence between audio and haptic cues, with particular regard to their matching in amplitude envelope and temporal evolution.
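The Poisson event scheduling described above can be sketched in a few lines: inter-event times are drawn from an exponential distribution, and each event triggers a short decaying noise burst. This is a minimal illustration of the stochastic idea, not the authors' PhiSM implementation; all names and parameter values are assumptions.

```python
import numpy as np

def phism_sketch(duration_s=1.0, sr=44100, rate_hz=80.0, seed=0):
    """Stochastic particle-impact synthesis in the spirit of PhiSM:
    impact times follow a Poisson process (exponential gaps between
    events); each impact adds a 10 ms exponentially decaying noise
    burst of random amplitude."""
    rng = np.random.default_rng(seed)
    out = np.zeros(int(duration_s * sr))
    t = rng.exponential(1.0 / rate_hz)              # first event time
    while t < duration_s:
        start = int(t * sr)
        n = min(int(0.01 * sr), len(out) - start)   # 10 ms burst
        burst = rng.standard_normal(n) * np.exp(-np.arange(n) / (0.002 * sr))
        out[start:start + n] += rng.uniform(0.2, 1.0) * burst
        t += rng.exponential(1.0 / rate_hz)         # exponential gap
    return out
```

Raising `rate_hz` crowds the impacts together, which is the single knob that moves the texture from sparse gravel-like crunches towards a dense, noisy aggregate.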

Concerning the visual feedback provided by the HMD, three outdoor scenarios were developed using

    2. www.icst.net/research/projects/ambisonics-tools/



    Fig. 1. Schematic representation of the overall architecture developed for the interactive experiments.

Fig. 3. Typical waveforms (left) and spectra (right) of the three simulated materials. The duration of the waveforms is expressed in milliseconds, the magnitude of the spectra in decibels.

the Unity3D engine3 in order to render the visual sensation of exploring different landscapes. The goal of these outdoor scenarios was to provide a visual representation of the physically simulated surfaces

    3. http://unity3d.com/

provided in the audio-haptic engine. In more detail, during the experiments a forest, a snowy landscape and a beach were visually rendered to match the physically simulated forest floor, snow and sand. The area fully seen by the cameras delimited the zone available for the users to walk. It consisted of a



rectangle of 2.50x2.60 m, whose perimeter was indicated in the simulated landscapes by means of a fence.

3 THE NON-INTERACTIVE FEEDBACK SYSTEM

The non-interactive feedback system (see Figure 4) was designed to convey the visual-audio-haptic feedback to the users without any interaction. Users were sitting on a chair, passively receiving the multimodal feedback, which was provided through the screen of a 15-inch MacBook Pro laptop, a set of headphones4, and the haptic shoes previously mentioned.

Fig. 4. Schematic representation of the overall architecture developed for the non-interactive experiments.

The visual feedback was developed using the Unity3D game engine and consisted of the same landscapes presented during the interactive experiments with, in addition, an avatar walking or running over a flat surface. In particular, the experiments included two avatar models to match the participants' gender. The male model was composed of 3754 triangles, the female model of 4488. Both models featured a skeleton with 39 joints. The walk and run animations were based on motion capture data and were developed by Mixamo using Stefano Corrazza's patented technology [26]. An example of the visual feedback provided in the experiments can be seen in Figure 5. The auditory feedback consisted of audio files containing both soundscapes and footstep sounds. The soundscapes were the same as those used for the interactive experiments, but converted to stereo format. The footstep sounds were created by means of the offline use of the footstep sound synthesizer utilized during the interactive experiments. This synthesizer is based on physical models which are driven

    4. Sennheiser HD 600, http://www.sennheiser.com

Fig. 5. Three examples of the visual feedback provided in the experiments. From top to bottom: a female avatar walking on a beach, a male avatar running in a forest, and a male avatar walking on snow.

by a signal, in the audio domain, expressing the ground reaction force (GRF), i.e., the reaction force supplied by the ground at every step [27]. In our simulations the GRF corresponds to the amplitude envelope extracted from an audio signal containing a footstep sound.

In order to produce the walk or the run over different surface materials, we created different audio files using recordings of real footstep sounds on concrete. These sounds were chosen among those available in the Hollywood Edge sound effects library.5 As an example, Figure 6 shows both the waveform and the corresponding extracted GRF of a footstep sound on a concrete floor during the walk and during the run.
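Since the GRF control signal above is an amplitude envelope of a footstep recording, one common way to extract such an envelope is to rectify the signal and smooth it. A minimal sketch under that assumption; the paper does not specify its extraction method, and the window length here is illustrative.

```python
import numpy as np

def grf_envelope(x, win=256):
    """GRF-like control signal as the amplitude envelope of a footstep
    recording: full-wave rectify, then smooth with a short moving
    average (window length in samples is an assumption)."""
    kernel = np.ones(win) / win
    return np.convolve(np.abs(x), kernel, mode="same")
```

Applied to a recorded footstep, the result is a slowly varying non-negative curve in which the heel and toe strikes appear as the two humps visible in Figure 6.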

    5. http://www.hollywoodedge.com



The time interval between each step was set in order to match the walk and the run of the avatar appearing at the visual level, precisely 566.66 and 333.33 milliseconds for the walk and the run respectively. The same audio files were also used during the experiments without the visual feedback. In this regard, in our previous research on audio and haptic simulation of bumps, holes and flat surfaces, we found that a surface profile is perceived as flat, at both the audio and the haptic level, when the simulated steps are placed at equal temporal distances [24].

For the purpose of the experiments, the engine was configured to synthesize footstep sounds on the same surface materials as the interactive experiment (forest floor, snow and sand).

Regarding the haptic feedback, it was generated by means of the same files produced for the auditory feedback, involving the same procedures mentioned in section 2.1 to convert the audio signals into vibrotactile ones. In addition, at the haptic level each step was provided alternating the left and the right shoe, coherently with the order of the steps produced visually by the avatar.
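The fixed-interval, alternating-foot step presentation described above can be sketched as a simple scheduler. `step_schedule` is a hypothetical helper; only the walk and run intervals (566.66 and 333.33 ms) come from the paper.

```python
def step_schedule(duration_s, interval_ms, start_foot="left"):
    """List of (time_s, foot) haptic step events, alternating left and
    right at a fixed interval, e.g. ~566.66 ms for the walk and
    ~333.33 ms for the run."""
    feet = ("left", "right") if start_foot == "left" else ("right", "left")
    events, t, i = [], 0.0, 0
    while t < duration_s:
        events.append((round(t, 5), feet[i % 2]))  # alternate shoes
        t += interval_ms / 1000.0
        i += 1
    return events
```

Each event would then trigger the corresponding shoe's actuators with the vibrotactile file for the current surface material.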

4 INTERACTIVE FEEDBACK EXPERIMENTS

We conducted three between-subjects experiments whose goal was to assess the role of the haptic feedback in enhancing the realism of the footstep sounds interactively generated during the user's locomotion. The simulated surface materials were snow, sand and forest floor. Experiment 1 consisted of the presentation of the audio-haptic footstep simulations alone, while in experiments 2 and 3 the soundscapes, and the soundscapes plus the visual feedback, were added respectively. Each experiment was divided into two parts. During the first part, participants were exposed to 8 trials, each lasting 14 seconds. Each trial consisted of two parts, each lasting 6 seconds, divided by 2 seconds without interactive feedback. In three trials the three simulated surfaces were presented in the auditory-only condition in the first part and in the audio-haptic condition in the second part (referred to as trials A-AH from now on). In three other trials they were presented in the audio-haptic condition in the first part and in the auditory-only condition in the second part (referred to as trials AH-A from now on). The remaining two trials were used as control conditions, and they consisted of the presentation, in both parts, of the auditory feedback alone and of the audio-haptic feedback respectively. For these two trials the forest floor material was utilized. After the presentation of each trial, participants were asked to compare the two parts of the walk by answering the following questions:

Have you noticed any difference between the first and the second part of the walk?

    If yes, what has changed?

Which of the two parts do you prefer? Why?

At the conclusion of the first part of the experiment,

participants were told that the difference consisted of the use of the haptic feedback.

During the second part of the experiment, participants were exposed to 6 trials, consisting of the same trials presented in the first part, without the control conditions. For each A-AH trial, participants were asked to choose one of the following statements:

The presence of the haptic feedback in the second part of the walk:

has increased the realism of the simulation with respect to the first part of the walk.

has decreased the realism of the simulation with respect to the first part of the walk.

has not produced any difference in the realism of the simulation with respect to the first part of the walk.

Similarly, during the AH-A trials participants were asked to answer the same questions asked for the A-AH trials, considering this time the absence of the haptic feedback in the second part of the walk. In the case of an answer concerning an increase or a decrease of realism, participants were asked to evaluate to what extent it occurred, on a 9-point Likert scale (1 = very little, 9 = very much).

In both parts of the experiment the trials were presented once, in randomized order. Participants were instructed to walk also during the two seconds in which the interactive feedback was not generated.

During experiment 3, one experimenter was placed in the same room where the experiment was performed, in order to prevent the participants from falling because of possible balance losses due to the visual feedback provided through the HMD.

Our hypotheses were manifold. Concerning the first part of the experiments, we hypothesized that participants would notice the presence of the haptic feedback with higher percentages in experiment 1 than in the other two, with the lowest percentages in experiment 3. Secondly, we hypothesized a clear preference for the audio-haptic condition compared to the auditory one. As regards the second part of the experiments, we expected that the presence of the context (soundscapes in experiment 2 and soundscapes plus visual feedback in experiment 3) would lead to an increase in the evaluated enhancement of the realism of the audio-haptic condition compared to experiment 1.

4.1 Participants

Thirty-six participants were divided into 3 groups (n = 12) to perform the 3 between-subjects experiments. The 3 groups were composed respectively of 9 men and 3 women, aged between 20 and 30 (mean = 24.5,



Fig. 6. Waveforms (top) and the corresponding extracted GRF (bottom) of a typical footstep sound on a concrete floor while walking (left) and while running (right). It is possible to notice the different temporal lengths of the two steps, as well as the different times at which the heel and toe strikes occur. In addition, in the plots of both GRFs it is possible to notice the heel and toe sub-events.

standard deviation = 3), 5 men and 7 women, aged between 19 and 25 (mean = 21.5, standard deviation = 1.67), and 11 men and 4 women, aged between 19 and 25 (mean = 21.75, standard deviation = 1.91). All participants reported normal hearing conditions, normal or corrected-to-normal vision, and no locomotion problems. The sandals used were size 43 (EUR). In order for the sandal size not to affect performance, subjects wore shoe sizes from 41 to 45 (EUR).

4.2 Results

Results of the first part of the interactive feedback experiments are illustrated in Table 1. The first noticeable thing is that on average participants understood quite well that the difference between the two parts of the walk in each trial consisted of the haptic feedback. The significance of this result was confirmed by a binomial test (p < 0.0001, p = 0.0002, p = 0.006, for experiments 1, 2 and 3 respectively). Similarly, they were significantly accurate in the evaluations of the control conditions (p = 0.02, p = 0.006, p = 0.0002, for experiments 1, 2 and 3 respectively).

As concerns the preference between the two parts, it was calculated only for the subjects who correctly identified the difference. As can be noticed, on average participants preferred the audio-haptic condition rather than the auditory one, although the percentages of preference were higher for experiment 3 compared to experiments 1 and 2, and for experiment 2 compared to experiment 1. The binomial test revealed that there is a significant preference for the

audio-haptic condition only in experiments 2 and 3 (p = 0.003 and p < 0.0001 respectively).

When asked to motivate their preference, participants who preferred the presence of the haptic feedback answered that their experience interacting with the system seemed to them closer to real life and more fun, and they even felt it was easier to walk. In addition, some participants reported that the realism of the experience was increased by the haptic feedback, since they had the impression that sound came directly from their feet, while when the footstep sounds were conveyed alone they had a harder time localizing the sound source under their feet. As a matter of fact, an audible output is generated by the haptic shoes as a result of the activation of the actuators. However, such sounds have low amplitude and low quality, and are masked by the footstep sounds produced by the loudspeakers.

Conversely, participants who preferred the auditory feedback alone reported that without the haptic feedback the experience was more normal, more comfortable, and less weird. Some participants reported that they had the impression that the vibrations did not fit well with the provided sounds, while others said that they would have expected other types of vibrations.

This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication.

  • IEEE TRANSACTIONS ON HAPTICS, VOL. 6, NO. 1, JANUARY 2007 8

Table 2 shows the results of the second part of the three interactive feedback experiments. As can be seen, in all the experiments and in all the trials most participants evaluated the haptic feedback as enhancing the realism of the simulations. The binomial test revealed a significant preference for the increment of realism in presence of haptic feedback (p < 0.0001, p = 0.01, and p < 0.0001 for experiments 1, 2, and 3 respectively) and for the decrement of realism in its absence (p = 0.003, p = 0.01, and p < 0.0001 for experiments 1, 2, and 3 respectively). Concerning the evaluations of the increment of realism in presence of haptic feedback, results show that they are on average higher than 4.5 (i.e., toward the "very much" end of the scale), so the increase is not negligible. The same holds for the evaluations of the decrement of realism in absence of haptic feedback. In addition, comparing the evaluations across stimuli makes it possible to order the surface materials in terms of enhanced realism: in all experiments snow received the highest ratings, and sand received higher ratings than forest floor.

From a comparison between the three experiments it emerges that the average values of the increment of realism in trials A-AH, as well as of the decrement of realism in trials AH-A, are lower when the footstep sounds are presented alone than when they are accompanied by soundscapes, or by soundscapes plus visual feedback; the highest ratings occur in experiment 3. A statistical analysis was performed by means of a repeated-measures ANOVA over the three experiments for each of the two dependent variables (increment of realism in presence of haptic feedback, and decrement of realism in its absence). None of these differences turned out to be statistically significant.
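A one-way repeated-measures ANOVA of this kind partitions the total variability into a between-conditions part, a between-subjects part, and a residual, and tests the condition effect with F = MS_conditions / MS_error. A minimal sketch with invented ratings (the data and function name below are illustrative assumptions, not taken from the paper):

```python
def repeated_measures_anova_F(data):
    """One-way repeated-measures ANOVA F statistic.

    `data` is a list of rows, one per subject; each row holds that
    subject's rating under each condition (same order in every row).
    """
    n = len(data)          # subjects
    k = len(data[0])       # conditions
    grand = sum(sum(row) for row in data) / (n * k)
    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    subj_means = [sum(row) / k for row in data]

    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_error = ss_total - ss_cond - ss_subj   # residual after removing subjects

    df_cond = k - 1
    df_error = (k - 1) * (n - 1)
    return (ss_cond / df_cond) / (ss_error / df_error)

# Hypothetical ratings for three subjects under two conditions:
F = repeated_measures_anova_F([[1, 2], [2, 4], [3, 3]])
print(F)  # 3.0
```

The resulting F value is then compared against the F distribution with (k - 1, (k - 1)(n - 1)) degrees of freedom to obtain the p-value.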

5 NON-INTERACTIVE FEEDBACK EXPERIMENTS

Three between-subjects experiments were conducted with the goal of assessing the role of the haptic feedback in enhancing the realism of the footstep sounds presented by means of the non-interactive feedback system described in Section 3. Participants sat on a chair, passively receiving the audio-haptic simulations, which were presented in three conditions: alone (experiment 4), with soundscapes (experiment 5), and with soundscapes plus visual feedback (experiment 6). The experimental procedure was similar to the one utilized during the second part of the interactive feedback experiments; in particular, the structure of the trials, the surface materials, and the questions asked were the same. Participants were exposed to 12 trials, 6 simulating a walk and 6 simulating a run, each presented once in randomized order.

All experiments were carried out in an acoustically isolated laboratory where the setups were installed. Participants were asked to interact with a simple graphical user interface composed only of buttons to be pressed, and to record their answers on a sheet.

A part of the experiment similar to the first part of the interactive experiments was not conducted because participants had to wear the shoes without walking, so it would have been obvious for them to look at their feet to find differences between the two parts of the trials. In addition, the preference for the auditory or the audio-haptic feedback could have been deduced by participants from the questions presented. Furthermore, the experiments would have been too long, and participants could have answered randomly because of loss of attention. These aspects were assessed during a pilot test with 4 subjects in all three experiments.

Our hypotheses were that the majority of participants would prefer the audio-haptic condition in both the A-AH and AH-A trials, and that the enhancement of realism in the audio-haptic condition would be rated higher in experiments 2 and 3 than in experiment 1.

5.1 Participants

Thirty-six participants were divided into 3 groups (n = 12) to perform the 3 between-subjects experiments. The 3 groups were composed respectively of 8 men and 4 women aged between 19 and 27 (mean = 21.91, standard deviation = 2.42), 8 men and 4 women aged between 19 and 26 (mean = 22.08, standard deviation = 2.06), and 7 men and 5 women aged between 20 and 32 (mean = 23.75, standard deviation = 3.22). All participants reported normal hearing, normal or corrected-to-normal vision, and no locomotion problems. The sandals were the same adopted for the interactive experiments; therefore, also in this case, subjects wore shoe sizes from 41 to 45 (EUR).

5.2 Results

Tables 3 and 4 show the results of the three non-interactive experiments. The first noticeable finding is that, in all the experiments and in all the trials, most participants evaluated the haptic feedback as enhancing the realism of the simulations. The binomial test revealed a significant preference for the increment of realism in presence of haptic feedback (p < 0.0001 for all experiments) and for the decrement of realism in its absence (p < 0.0001 for all experiments). As regards the evaluations of the increase of realism in presence of haptic feedback, as well as the evaluations of the decrease of realism in its absence, results show that they are most of the time higher than 4.5 (i.e., toward the "very much" end of the scale). From the comparison of the evaluations across stimuli it is possible to notice that the surface materials can be ordered by enhanced realism: in all experiments snow presents the highest ratings, and sand presents higher ratings than forest floor.

In addition, from a comparison between the three experiments, both for the walk and for the run,


TABLE 1
Results of the first part of the interactive feedback experiments: percentage of correct answers, and percentage of preference for the auditory (A) and audio-haptic (AH) conditions.

                        Experiment 1              Experiment 2              Experiment 3
Trials                  Correct  A      AH        Correct  A      AH        Correct  A      AH
A-AH Snow               75       44.45  55.55     75       22.23  77.77     75       0      100
A-AH Sand               75       55.56  44.44     66.66    25     75        75       33.34  66.66
A-AH Forest floor       83.33    50     50        66.66    12.5   87.5      66.66    25     75
AH-A Snow               75       33.34  66.66     91.66    45.46  54.54     58.33    14.29  85.71
AH-A Sand               83.33    30     70        58.33    28.58  71.42     58.33    0      100
AH-A Forest floor       75       33.34  66.66     75       33.34  66.66     66.66    37.5   62.5
A-A Forest floor        75       -      -         75       -      -         91.67    -      -
AH-AH Forest floor      75       -      -         83.34    -      -         83.34    -      -

TABLE 2
Results of the second part of the interactive feedback experiments. For each stimulus the percentage of chosen answers and, where applicable, the corresponding evaluation, reported as mean (standard deviation), are shown. Answers: Increased / Decreased / No difference. Trials A-AH concern the presence of the haptic feedback; trials AH-A concern its absence.

Exp. 1
  Snow          A-AH: Increased 100% [6.16 (2.12)], Decreased 0%, No difference 0%
                AH-A: Increased 16.67% [5 (1.41)], Decreased 75% [5.88 (1.76)], No difference 8.33%
  Sand          A-AH: Increased 83.34% [4.9 (2.13)], Decreased 8.33% [6 (0)], No difference 8.33%
                AH-A: Increased 0%, Decreased 83.33% [4.7 (1.94)], No difference 16.66%
  Forest floor  A-AH: Increased 75% [3.44 (2)], Decreased 8.33% [4 (0)], No difference 16.67%
                AH-A: Increased 8.33% [4 (0)], Decreased 66.67% [3.12 (1.45)], No difference 25%

Exp. 2
  Snow          A-AH: Increased 75% [5.77 (2.33)], Decreased 25% [4 (1)], No difference 0%
                AH-A: Increased 25% [4.33 (2.51)], Decreased 66.67% [6.12 (2.1)], No difference 8.33%
  Sand          A-AH: Increased 75% [5 (2.06)], Decreased 16.67% [2.5 (2.12)], No difference 8.33%
                AH-A: Increased 16.66% [3.5 (3.53)], Decreased 83.34% [5.1 (1.72)], No difference 0%
  Forest floor  A-AH: Increased 66.66% [4.87 (2.69)], Decreased 16.67% [2 (1.14)], No difference 16.67%
                AH-A: Increased 16.67% [6 (1.41)], Decreased 66.66% [2.87 (1.95)], No difference 16.67%

Exp. 3
  Snow          A-AH: Increased 91.67% [6.9 (1.81)], Decreased 0%, No difference 8.33%
                AH-A: Increased 0%, Decreased 91.67% [6.54 (2.11)], No difference 8.33%
  Sand          A-AH: Increased 83.34% [6.2 (2.25)], Decreased 8.33% [6 (0)], No difference 8.33%
                AH-A: Increased 0%, Decreased 91.67% [6.36 (1.91)], No difference 8.33%
  Forest floor  A-AH: Increased 83.34% [5.9 (2.13)], Decreased 0%, No difference 16.66%
                AH-A: Increased 0%, Decreased 83.34% [5.3 (2)], No difference 16.66%

it emerges that the average values of the increment of realism in trials A-AH, as well as of the decrement of realism in trials AH-A, are lower when the footstep sounds are presented alone than when they are accompanied by soundscapes, or by soundscapes plus visual feedback; the highest ratings occur in experiment 6. However, the results of a repeated-measures ANOVA revealed that none of these differences is statistically significant.

6 DISCUSSION

From a comparison of the results of the first parts of the three interactive feedback experiments, it is possible to notice that, on average, participants did not notice the presence of the haptic feedback at significantly higher rates in experiment 1 than in the other two. A deeper analysis conducted at subject level revealed that subjects could be divided into three categories: those who understood the difference in all trials, those who understood it sometimes, and those who never understood it. The latter group was composed of 2 participants in experiments 1 and 2, and 3 participants in experiment 3. When at the end of the experiment they were told that the difference consisted in the haptic feedback, they reported that they had been so completely driven by the auditory feedback in experiments 1 and 2, and by the visual feedback in experiment 3, that they did not even perceive the


TABLE 3
Results of the non-interactive feedback experiments (walking trials). For each stimulus the percentage of chosen answers and, where applicable, the corresponding evaluation, reported as mean (standard deviation), are shown. Answers: Increased / Decreased / No difference. Trials A-AH concern the presence of the haptic feedback; trials AH-A concern its absence.

Exp. 4
  Snow          A-AH: Increased 83.34% [5.1 (3.24)], Decreased 16.66% [4 (1.14)], No difference 0%
                AH-A: Increased 16.67% [4 (0)], Decreased 75% [5.55 (2.87)], No difference 8.33%
  Sand          A-AH: Increased 75% [5 (2.06)], Decreased 25% [5.33 (3.78)], No difference 0%
                AH-A: Increased 0%, Decreased 91.67% [4.45 (1.43)], No difference 8.33%
  Forest floor  A-AH: Increased 83.34% [3.6 (2.06)], Decreased 16.66% [1.5 (0.7)], No difference 0%
                AH-A: Increased 8.33% [4 (0)], Decreased 83.34% [4 (2.58)], No difference 8.33%

Exp. 5
  Snow          A-AH: Increased 83.34% [6.9 (1.85)], Decreased 8.33% [5 (0)], No difference 8.33%
                AH-A: Increased 0%, Decreased 83.34% [6.86 (0.61)], No difference 16.66%
  Sand          A-AH: Increased 66.66% [5 (2.26)], Decreased 16.67% [4.5 (2.12)], No difference 16.67%
                AH-A: Increased 0%, Decreased 91.67% [5.27 (2.24)], No difference 8.33%
  Forest floor  A-AH: Increased 91.67% [4.18 (2.35)], Decreased 8.33% [7 (0)], No difference 0%
                AH-A: Increased 8.33% [8 (0)], Decreased 83.34% [4.6 (2.31)], No difference 8.33%

Exp. 6
  Snow          A-AH: Increased 91.67% [5.18 (2.27)], Decreased 0%, No difference 8.33%
                AH-A: Increased 16.66% [4 (2.82)], Decreased 83.34% [6.1 (0.99)], No difference 0%
  Sand          A-AH: Increased 91.67% [4.72 (1.42)], Decreased 0%, No difference 8.33%
                AH-A: Increased 8.33% [2 (0)], Decreased 91.67% [4.9 (1.7)], No difference 0%
  Forest floor  A-AH: Increased 91.67% [4.09 (1.51)], Decreased 0%, No difference 8.33%
                AH-A: Increased 8.33% [2 (0)], Decreased 83.34% [3.9 (1.44)], No difference 8.33%

TABLE 4
Results of the non-interactive feedback experiments (running trials). For each stimulus the percentage of chosen answers and, where applicable, the corresponding evaluation, reported as mean (standard deviation), are shown. Answers: Increased / Decreased / No difference. Trials A-AH concern the presence of the haptic feedback; trials AH-A concern its absence.

Exp. 4
  Snow          A-AH: Increased 66.67% [4.5 (3.29)], Decreased 33.33% [3.5 (2.38)], No difference 0%
                AH-A: Increased 16.66% [4.5 (3.53)], Decreased 83.34% [5.1 (2.18)], No difference 0%
  Sand          A-AH: Increased 100% [4.25 (1.76)], Decreased 0%, No difference 0%
                AH-A: Increased 8.33% [5 (0)], Decreased 91.67% [4.81 (2.04)], No difference 0%
  Forest floor  A-AH: Increased 58.34% [4.14 (1.67)], Decreased 33.33% [2.25 (0.95)], No difference 8.33%
                AH-A: Increased 8.33% [3 (0)], Decreased 91.67% [3.9 (2.58)], No difference 0%

Exp. 5
  Snow          A-AH: Increased 83.34% [6.2 (2.29)], Decreased 8.33% [5 (0)], No difference 8.33%
                AH-A: Increased 8.33% [3 (0)], Decreased 91.67% [6.18 (1.66)], No difference 0%
  Sand          A-AH: Increased 75% [5.77 (1.92)], Decreased 25% [3 (1)], No difference 0%
                AH-A: Increased 8.33% [2 (0)], Decreased 83.34% [5.9 (1.52)], No difference 8.33%
  Forest floor  A-AH: Increased 83.34% [4.7 (2.11)], Decreased 16.66% [4 (2.82)], No difference 0%
                AH-A: Increased 0%, Decreased 83.34% [5.4 (2.06)], No difference 16.66%

Exp. 6
  Snow          A-AH: Increased 83.34% [5.6 (2.79)], Decreased 8.33% [2 (0)], No difference 8.33%
                AH-A: Increased 0%, Decreased 100% [5.33 (1.82)], No difference 0%
  Sand          A-AH: Increased 91.67% [4.27 (2.45)], Decreased 8.33% [4 (0)], No difference 0%
                AH-A: Increased 0%, Decreased 100% [5 (1.65)], No difference 0%
  Forest floor  A-AH: Increased 91.67% [4.45 (2.42)], Decreased 0%, No difference 8.33%
                AH-A: Increased 0%, Decreased 83.34% [4.5 (1.71)], No difference 16.66%

presence of the haptic feedback. In addition, when in the second part of the experiment they were asked to focus on the vibrations provided by the shoes in order to evaluate the enhancement of realism produced by the haptic feedback, either their ratings were among the lowest, or they reported that the presence of the haptic feedback did not produce any difference in the realism of the simulation compared

to its absence. From this result it is possible to conclude that the perception of the interactive haptic feedback provided at feet level by the proposed system varies greatly from person to person.

Results of the second parts of the interactive feedback experiments, as well as of the non-interactive feedback experiments, confirmed our hypothesis that the audio-haptic condition would be preferred to the auditory one. Indeed, at both the interactive and the non-interactive level, on average more than 66% of participants reported that the presence of the haptic feedback increased the realism of the simulation compared to when it was not provided, and that its absence decreased the realism of the simulation compared to when it was present.

Nevertheless, in both the interactive and non-interactive feedback experiments some participants did not like the provided haptic feedback. The reasons for this are that, on the one hand, those participants felt more comfortable without the proposed vibrations and, on the other hand, the provided simulations did not match their expectations.

However, most of the participants were satisfied with the proposed simulations, as also emerged from the open comments they left at the end of the experience. In particular, 9 participants in the interactive experiments and 11 in the non-interactive ones reported that the simulation of snow was the most convincing one. This can also be noticed in the results illustrated in Tables 2, 3, and 4. In addition, it is possible to order the surface materials in terms of realism enhanced by the haptic feedback, with snow having the highest evaluations and sand presenting higher evaluations than forest floor.

From a comparison between Tables 3 and 4, it is possible to notice that there are no substantial differences between the evaluations expressed by participants on the simulated walk and those on the simulated run.

As regards the effect of context on the perceived realism induced by the haptic feedback, our hypothesis was confirmed. Indeed, results of the second part of the interactive feedback experiments, as well as of the non-interactive feedback experiments, show that the average ratings for the experiments in which the audio-haptic simulations were provided alone were lower than those in presence of soundscapes or soundscapes plus visual feedback, although not significantly so. However, the experimental design followed a between-subjects approach, and a confirmation of this result requires further investigation.

Overall, our results provide evidence of the importance of using the tactile channel to enhance the walking experience in both multimodal interactive and non-interactive contexts. This result is in accord with the findings reported in [11] and [15], which showed that plantar vibrotactile feedback plays a relevant role in the perception of both real and simulated ground surfaces during the act of walking.

However, the fact that not all participants preferred the proposed interactive and non-interactive haptic feedback might be due to the quality of the surface simulations. This consideration is supported by the results reported in our previous works [18], [17] on audio-haptic discrimination tasks with simulated ground surfaces, which revealed that audition was dominant over the haptic modality, while audio-haptic recognition tasks involving real materials suggested that the haptic modality is the dominant one [11]. Nevertheless, the different evaluations expressed by each participant may be linked to the individual propensity to become involved in the simulations [28].

7 CONCLUSION

In this paper we presented several experiments investigating the role of haptic feedback in enhancing the realism of a walking experience in multimodal environments, in both an interactive and a non-interactive configuration. Results show that haptic feedback delivered at feet level significantly enhances the perceived realism. However, the subjects' reactions were divided between those who appreciated the haptic feedback and those who found it annoying. This might be due, on the one hand, to the limits of the haptic simulation and, on the other hand, to the different individual propensity to become involved in the simulations.

Considering the overall simulated architecture, several subjects found it satisfactory. This is especially the case for the environment simulating snow, where high ratings were reported in all experiments.

The results reported here provide evidence that the use of the haptic channel can lead to more realistic experiences in both interactive and non-interactive configurations. Our findings can find application in the context of physical navigation in multimodal virtual environments, as well as in entertainment systems, for example to enhance the user experience of watching a movie or playing a video game.

ACKNOWLEDGMENTS

The research leading to these results has received funding from the European Community's Seventh Framework Programme under FET-Open grant agreement 222107, NIW - Natural Interactive Walking (www.niwproject.eu).

REFERENCES

[1] A. El Saddik, "The potential of haptics technologies," IEEE Instrumentation & Measurement, vol. 10, no. 1, pp. 10-17, 2007.
[2] K. Kuchenbecker, J. Fiene, and G. Niemeyer, "Improving contact realism through event-based haptic feedback," IEEE Transactions on Visualization and Computer Graphics, vol. 12, no. 2, pp. 219-230, 2006.
[3] E.-L. Sallnas, K. Rassmus-Grohn, and C. Sjostrom, "Supporting presence in collaborative environments by haptic force feedback," ACM Transactions on Computer-Human Interaction, vol. 7, no. 4, pp. 461-476, 2000.
[4] P. Lemmens, F. Crompvoets, D. Brokken, J. Van Den Eerenbeemd, and G.-J. De Vries, "A body-conforming tactile jacket to enrich movie viewing," World Haptics 2009, pp. 7-12, 2009.

6. www.niwproject.eu


[5] Y. Kim, J. Cha, I. Oakley, and J. Ryu, "Exploring tactile movies: An initial tactile glove design and concept evaluation," IEEE MultiMedia, pp. 119, 2009.
[6] E. Dijk and M. Weffers, "Breathe with the ocean: A system for relaxation using combined audio and haptic stimuli," in Proc. Special Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, 2010.
[7] D. Chang, "Haptics: gaming's new sensation," Computer, vol. 35, no. 8, pp. 84-86, 2002.
[8] S. Lederman et al., "Auditory texture perception," Perception, vol. 8, no. 1, pp. 93-103, 1979.
[9] D. DiFranco, G. Beauregard, and M. Srinivasan, "The effect of auditory cues on the haptic perception of stiffness in virtual environments," in Proceedings of the ASME Dynamic Systems and Control Division, vol. 61, 1997, pp. 17-22.
[10] F. Avanzini and P. Crosato, "Integrating physically based sound models in a multimodal rendering architecture," The Journal of Visualization and Computer Animation, vol. 17, no. 3-4, pp. 411-419, 2006.
[11] B. Giordano, Y. Visell, H.-Y. Yao, V. Hayward, J. Cooperstock, and S. McAdams, "Identification of walked-upon materials in auditory, kinesthetic, haptic and audio-haptic conditions," Journal of the Acoustical Society of America, 2012. Accepted.
[12] Y. Visell, A. Law, and J. R. Cooperstock, "Touch is everywhere: Floor surfaces as ambient haptic interfaces," IEEE Transactions on Haptics, vol. 2, pp. 148-159, 2009.
[13] L. Turchet, R. Nordahl, A. Berrezag, S. Dimitrov, V. Hayward, and S. Serafin, "Audio-haptic physically based simulation of walking on different grounds," in Proceedings of the IEEE International Workshop on Multimedia Signal Processing. IEEE Press, 2010, pp. 269-273.
[14] S. Papetti, F. Fontana, M. Civolani, A. Berrezag, and V. Hayward, "Audio-tactile display of ground properties using interactive shoes," in Haptic and Audio Interaction Design, ser. Lecture Notes in Computer Science, 2010, vol. 6306, pp. 117-128.
[15] Y. Visell, B. L. Giordano, G. Millet, and J. R. Cooperstock, "Vibration influences haptic perception of surface compliance during walking," PLoS ONE, vol. 6, no. 3, p. 11, 2011.
[16] L. Turchet, S. Serafin, S. Dimitrov, and R. Nordahl, "Physically based sound synthesis and control of footsteps sounds," in Proceedings of the Digital Audio Effects Conference, 2010, pp. 161-168.
[17] R. Nordahl, A. Berrezag, S. Dimitrov, L. Turchet, V. Hayward, and S. Serafin, "Preliminary experiment combining virtual reality haptic shoes and audio synthesis," in Haptics: Generating and Perceiving Tangible Sensations, ser. Lecture Notes in Computer Science. Springer Berlin / Heidelberg, 2010, vol. 6192, pp. 123-129.
[18] S. Serafin, L. Turchet, R. Nordahl, S. Dimitrov, A. Berrezag, and V. Hayward, "Identification of virtual grounds using virtual reality haptic shoes and sound synthesis," in Proceedings of the Eurohaptics Symposium on Haptic and Audio-Visual Stimuli: Enhancing Experiences and Interaction, 2010, pp. 61-70.
[19] S. Serafin, L. Turchet, N. Nilsson, and R. Nordahl, "A multimodal architecture for simulating natural interactive walking in virtual environments," PsychNology Journal, vol. 9, no. 3, pp. 245-268, 2012.
[20] H.-Y. Yao and V. Hayward, "Design and analysis of a recoil-type vibrotactile transducer," Journal of the Acoustical Society of America, vol. 128, no. 2, pp. 619-627, 2010.
[21] R. Nordahl, S. Serafin, and L. Turchet, "Sound synthesis and evaluation of interactive footsteps and environmental sounds rendering for virtual reality applications," IEEE Transactions on Visualization and Computer Graphics, vol. 17, no. 9, pp. 1234-1244, 2011.
[22] J. Schacher and M. Neukom, "Ambisonics spatialization tools for Max/MSP," in Proceedings of the International Computer Music Conference, 2006.
[23] L. Turchet, S. Serafin, S. Dimitrov, and R. Nordahl, "Conflicting audio-haptic feedback in physically based simulation of walking sounds," in Haptic and Audio Interaction Design, ser. Lecture Notes in Computer Science. Springer Berlin / Heidelberg, 2010, vol. 6306, pp. 97-106.
[24] L. Turchet and S. Serafin, "An investigation on temporal aspects in the audio-haptic simulation of footsteps," in Multidisciplinary Aspects of Time and Time Perception, ser. Lecture Notes in Computer Science. Springer Berlin / Heidelberg, 2011, vol. 6789, pp. 101-115.
[25] P. Cook, "Physically Informed Sonic Modeling (PhISM): Synthesis of percussive sounds," Computer Music Journal, vol. 21, no. 3, pp. 38-49, 1997.
[26] S. Corrazza, "Distributed markerless motion capture," Patent US 2010/0285877 A1, Nov. 11, 2010.
[27] Y. Visell, F. Fontana, B. Giordano, R. Nordahl, S. Serafin, and R. Bresin, "Sound design and perception in walking interactions," International Journal of Human-Computer Studies, vol. 67, no. 11, pp. 947-959, 2009.
[28] D. Scott, C. Stohler, C. Egnatuk, H. Wang, R. Koeppe, and J.-K. Zubieta, "Individual differences in reward responding explain placebo-induced expectations and effects," Neuron, vol. 55, no. 2, pp. 325-336, 2007.

Luca Turchet
Luca Turchet was born in Verona, Italy, where he received the Laurea degree (summa cum laude) in computer science from the University of Verona in 2006. At the same time, he studied classical guitar and composition at the Music Conservatory E. F. Dall'Abaco of Verona, receiving the Diploma degrees in 2007 and 2009 respectively (both summa cum laude). In 2006 he was a Visiting Scholar at the Music Technology Group in Barcelona, Spain, and in 2008 at the Cork Institute of Technology - School of Music in Cork, Ireland. In 2009 he started his PhD at the Medialogy Department of Aalborg University Copenhagen, working on the Natural Interactive Walking project. His main research interests are interactive sound design, haptic technology, multimedia and multimodal systems, sound spatialization, perception, and human-computer interaction.

Paolo Burelli
Paolo Burelli is a Ph.D. student at the IT University of Copenhagen in the Center for Computer Games Research, where his research focuses on automatic camera control in computer games. He previously worked at the University of Udine in the HCILab as a research fellow, and at Eurotech as an embedded Linux developer. Paolo has published articles on virtual camera control in AAAI and IEEE conferences on artificial intelligence and computer graphics, and he has been part of the organization committees of CIG 2010 and WICED 2012. His research interests include computer games, human-computer interaction, perception, and artificial intelligence, with a particular focus on virtual cinematography.

Stefania Serafin
Stefania Serafin is Associate Professor in sound modeling at Aalborg University in Copenhagen. She received a Ph.D. in Computer Based Music Theory and Acoustics from Stanford University in 2004, and a Master in Acoustics, Computer Science and Signal Processing Applied to Music from Ircam (Paris) in 1997. She has been a visiting professor at the University of Virginia (2003), and a visiting scholar at Stanford University (1999), Cambridge University (2002), and KTH Stockholm (2003). She is principal investigator of the EU-funded project Natural Interactive Walking, and Danish delegate for the EU COST Action on Sonic Interaction Design. Her main research interests are sound models for interactive systems, multimodal interfaces, and sonic interaction design.
