
AirPiano: Enhancing Music Playing Experience in Virtual Reality with Mid-Air Haptic Feedback

Inwook Hwang,1 Hyungki Son,2 and Jin Ryong Kim1

Abstract— We present AirPiano, an enhanced music playing system to provide touchable experiences in HMD-based virtual reality with mid-air haptic feedback. AirPiano allows users to enjoy enriched virtual piano-playing experiences with touchable keys in the air. We implement two haptic rendering schemes to mimic the resisting force of piano keys using ultrasonic vibrations. Constant Feedback is designed to provide short and intense feedback, whereas Adaptive Feedback is designed to follow the changes in feedback from a real keypress. A user study is conducted using three simple musical pieces. Results show that adding mid-air haptic rendering can significantly improve the user experience on the virtual piano. Adaptive Feedback further increases clarity, reality, enjoyment, and user satisfaction compared to conditions in which no haptic feedback is generated. Our design of AirPiano can shed light on the introduction of a touchable music playing system in virtual reality with mid-air haptic feedback.

I. INTRODUCTION

With recent advancements in 3D graphics, body tracking, and optical technologies, Head-Mounted Display (HMD)-based virtual reality is receiving renewed attention. Since it can create immersive visual experiences, various virtual reality products for games, medical simulation, military training, education, and sports have flooded the consumer market.

A typical example of a virtual reality experience that provides enriched multimodal interaction is playing a musical instrument. Considering that HMD-based virtual reality can create a realistic music playing experience with immersive visual and auditory feedback, people can enjoy playing music without having to obtain a real musical instrument.

Adding haptic feedback in virtual reality can significantly improve the music playing experience. In fact, several studies have proposed using force feedback equipment to provide haptic sensations to such users in virtual reality [1], [2]. However, kinesthetic force feedback devices usually entail complicated and strict design processes for compatible equipment, which may result in ineffectiveness and bulkiness when multi-finger interaction is introduced.

*This work was partly supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. B0117-16-1005) and a Cross-Ministry Giga Korea Project (GK16C0100; Development of Interactive and Realistic Massive Giga Content Technology) of the Ministry of Science, ICT and Future Planning, Korea.

1Inwook Hwang and Jin Ryong Kim are with Electronics and Telecommunications Research Institute (ETRI), 218 Gajeong-ro, Yuseong-gu, Daejeon, Republic of Korea 34129 inux, [email protected]

2Hyungki Son is with University of Science and Technology, 217 Gajeong-ro, Yuseong-gu, Daejeon, Republic of Korea 34113 [email protected]

Fig. 1. Virtual scene of AirPiano.

This can be hugely problematic in the virtual music playing scenario, because many popular musical instruments such as the piano, flute, clarinet, and guitar require multiple fingers to play them. A wearable device with tactile feedback, such as tactile gloves, may be one solution to this issue, as it can provide haptic feedback to each fingertip. However, a wearable device entails a major practical inconvenience in that it has to be physically worn by users, with complicated wires and circuits to drive the actuators.

In this paper, we present AirPiano, a multi-finger music playing system that provides an enriched piano playing experience with mid-air haptic feedback in an HMD-based virtual reality environment (Fig. 1). AirPiano delivers tactile sensations of the naturally resisting forces of piano keys to one or more fingertips simultaneously via an ultrasonic non-contact haptic display. We explore the effectiveness of AirPiano by conducting a set of piano playing tasks requiring one finger, two fingers from separate hands, and two fingers from the same hand.

We believe that our research makes the following contributions:

• An interaction design that naturally integrates visual, auditory, and tactile cues to improve the virtual reality experience

• Mimicking the mechanisms of piano keys through mid-air haptic response, resulting in higher user satisfaction and an enriched user experience

II. RELATED WORKS

A. Haptic Feedback of Musical Instruments

There exists a broad range of work centering on the reproduction of tactile feedback of musical instruments: the virtual drums and Cellomobo [3], the rehabilitation-oriented virtual piano that uses an exoskeleton [2], the virtual reed implemented with a kinesthetic device [4], and the virtual violin with a bow attachment that generates haptic effects [5].

A number of researchers have shown great interest in synthesizing haptic feedback for keyboard-based instruments. Lozada et al. [6] developed a physical key-type haptic interface that controlled the resisting forces in real time. Gillespie et al. [7] proposed a mechanical keypress model for kinesthetic feedback. They established a hybrid dynamical system comprising four modes: Acceleration, Free Flight, Rail Stop, and Repetition. There have also been a number of projects focused on tactile reproduction of the keypressing sensations of various musical instruments: MIKEY [8], [9] and Haptone [10]. The MIKEY project aimed to emulate the keys of a grand piano, harpsichord, and organ using voice-coil motors and a simplified dynamics simulator [8], [9]. Haptone succeeded in generating the percussive vibrations of steel and wood and the vibrations of string instruments [10]. In these previous studies, the primary focus of effectively producing realistic haptic responses was on developing a customized physical input mechanism. However, such interfaces often turned out to be inadequate in practicality or functionality, especially when multi-finger interaction was concerned.

B. Mid-air Haptic Feedback

Due to the inconvenience and ineffectiveness brought about by wearable haptic devices, other researchers shifted their focus towards delivering haptic feedback in mid-air. Several kinds of mid-air haptic feedback have been attempted in the past decade: air nozzles [11], a single air cannon [12], focused ultrasound [13], and laser radiation [14]. Among these methods, focused ultrasound has been actively researched due to its advantages of better scalability and a simple hardware setup. The use of focused ultrasound originated in clinical practice, as Gavrilov and Tsirulnikov [15] explored theoretical methods of generating tactile stimuli. Carter et al. [16] integrated an ultrasound display with a visual display by using an ultrasound-transparent cloth as a screen. Hasegawa and Shinoda [17] expanded the phasing scheme to arbitrary pressure distributions in space with nine arrays of transducers. Inoue et al. [18] developed a surrounding ultrasonic mid-air display for volumetric rendering. Sand et al. [19] attached a haptic display to the front of an HMD for tangible interaction in a virtual environment. In addition to these developments, Wilson et al. [20] measured the perceptual performance of an ultrasonic haptic display in terms of localization and apparent motion accuracy. An emotional analysis of mid-air tactile feedback was conducted by Obrist et al. [21]. For applications of mid-air visuo-haptic interfaces, Monnai et al. [22] suggested button click and musical keyboard applications with different tactile patterns. Yoshino and Shinoda [23] developed a button interface for blind people and showed about 95% accuracy with a mid-air tactile guide in their evaluation.

III. AIRPIANO

We developed an HMD-based multi-finger virtual piano that utilizes mid-air haptic feedback, ensuring users receive tactile responses while interacting with it. In this section, we describe how we designed and implemented AirPiano's touchable keys to deliver tangible sensations to the users' fingertips.

A. AirPiano Setup

We used an HMD (1080×1200 @ 90 Hz per eye, Oculus Rift CV1, https://www.oculus.com/) and a mid-air haptic display (Ultrahaptics UHEV1, https://www.ultrahaptics.com). The visual appearance of AirPiano and its surrounding environment were put together with the Unity3D game engine (Ver. 5.4, https://www.unity3d.com). AirPiano was created as a simplified, real-scale 3D piano model with ten white keys and seven black keys, each with a stroke depth of 13.3 mm. The number of keys was decided in consideration of the optimal workspace of the mid-air haptic display (about 0.4 m × 0.4 m × 0.4 m). In the virtual scene, AirPiano was placed on a desk inside a furnished virtual room, so that the user can naturally play the piano through the binocular displays in the HMD. The external infrared tracking camera of the HMD allows users to view the entire scene from any angle. Rigged 3D human hand meshes were also placed within the scene to mirror the movements of the user's hands visible through the HMD. The position and pose of the user's hands were tracked by a miniature infrared tracker (Leap Motion), which was placed on the near-side edge of the haptic display. The tracker's field of view is approximately one meter in any direction within its 150-degree viewing angle. The static position accuracy of the tracker is known to be about 0.5 mm, while the dynamic tracking accuracy is worse [24] and degrades depending on the shape or pose of the user's hands and the amount of occlusion.

The mid-air haptic display consists of a 16×16 planar array of transducers that emit 40 kHz ultrasound in the upward direction with a beam width of 80 degrees. The haptic display was placed on the physical table at a height matched to the position of AirPiano, so that it can deliver tactile feedback for keypress actions in virtual reality. The phase of the ultrasound signal was computed for each transducer in the haptic display to generate one or more focal points [16]. The 40 kHz ultrasound waves translate into a faint constant pressure (5 mN at maximum), and amplitude modulation of the ultrasonic signal can deliver stronger stimuli. Our initial observation showed that the perceived intensity is maximized when the modulation frequency is around 200 Hz. However, since a higher modulation frequency can generate audible noise that degrades the user experience, we set the modulation frequency to 140 Hz as a fair tradeoff between intensity and noise level.
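To make the focusing scheme concrete, the Python sketch below (purely illustrative; not the authors' firmware or the Ultrahaptics API) shows how per-transducer phase delays can be chosen so that the 40 kHz waves from a 16×16 array arrive in phase at a single focal point, and how a 140 Hz amplitude-modulation envelope is applied on top of the carrier. The array size, carrier frequency, and modulation frequency come from the text above; the transducer pitch, focal position, and sampling instant are assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature
CARRIER_HZ = 40_000.0    # transducer resonance frequency (from the paper)
MODULATION_HZ = 140.0    # modulation frequency chosen in the paper
PITCH = 0.0105           # assumed transducer spacing in meters

# 16 x 16 planar array centered at the origin, facing +z.
xs = (np.arange(16) - 7.5) * PITCH
grid_x, grid_y = np.meshgrid(xs, xs)
positions = np.stack([grid_x.ravel(), grid_y.ravel(), np.zeros(256)], axis=1)

def transducer_phases(focus_xyz):
    """Phase delay per transducer so all waves arrive in phase at the focus."""
    distances = np.linalg.norm(positions - np.asarray(focus_xyz), axis=1)
    wavelength = SPEED_OF_SOUND / CARRIER_HZ
    # Advancing each emission by (distance / wavelength) cycles aligns arrivals.
    return (2 * np.pi * distances / wavelength) % (2 * np.pi)

def drive_signal(t, focus_xyz, intensity=1.0):
    """Per-transducer drive samples at time t (s), with a 140 Hz AM envelope."""
    phases = transducer_phases(focus_xyz)
    envelope = intensity * 0.5 * (1 + np.sin(2 * np.pi * MODULATION_HZ * t))
    return envelope * np.sin(2 * np.pi * CARRIER_HZ * t + phases)

# Example: focus 30 cm above the array center, roughly where AirPiano's keys sit.
samples = drive_signal(t=0.001, focus_xyz=[0.0, 0.0, 0.30], intensity=0.8)
```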


Fig. 2. Haptic feedback intensities (relative intensity in %, from 0 to 100) of the two feedback methods, Constant and Adaptive, plotted over the key travel distance from initial contact to fully pressed; the 200 ms decay of the adaptive onset is marked. The adaptive method has different intensity profiles for pressing (rightward arrow) and releasing (leftward arrow) actions.

B. Designing Touchable Keys

Since performing on a piano is a multimodal musical activity, AirPiano is specifically designed to orchestrate visual, tactile, and auditory modalities during keypressing actions.

Collisions between the fingers of the hand model and AirPiano's virtual keys are detected through the Unity game engine at 120 Hz. Whenever a collision is detected on a key, the key moves down to remove the overlap between the finger and the key; collisions from the other sides of a key are ignored. Simultaneous contact between multiple fingers and keys is also handled in real time.
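As a minimal sketch of this per-frame overlap resolution (not the authors' Unity code), the Python fragment below pushes a key down just far enough to remove a top-side finger overlap and ignores other contacts. The 13.3 mm stroke comes from the setup description; the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass

KEY_STROKE = 0.0133  # m, maximum key travel stated in the paper

@dataclass
class Key:
    top_rest_y: float    # height of the key surface at rest
    depth: float = 0.0   # current travel, 0 .. KEY_STROKE

    def update(self, fingertip_y: float, fingertip_over_key: bool) -> None:
        """Resolve finger/key overlap once per physics step (120 Hz in the paper)."""
        if fingertip_over_key and fingertip_y < self.top_rest_y:
            # Fingertip penetrates from above: push the key down to the fingertip,
            # but never past the bottom of its stroke.
            self.depth = min(self.top_rest_y - fingertip_y, KEY_STROKE)
        else:
            # No top contact (includes side contacts): let the key return.
            self.depth = 0.0

    @property
    def at_bottom(self) -> bool:
        return self.depth >= KEY_STROKE
```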

We designed two methods of haptic rendering for keypress feedback: Constant Feedback and Adaptive Feedback. Fig. 2 shows the feedback intensities of the two methods for a keypress. Constant Feedback is designed to provide clear and confident feedback for short and intense keypressing actions: the haptic feedback is provided at its maximum intensity and is delivered during the entire time the key is being pressed. Adaptive Feedback is designed to follow the changes in feedback of real keypressing actions [7], thereby simulating the more realistic behavior of piano keys and resulting in an enriched experience. In the keypress mechanism of a real piano, the stiff surface of a key is felt when the fingertip starts to touch the key. During a keypressing action, the weight of the key creates upward kinesthetic resistive feedback. When the key reaches its limit, higher stiffness and vibration are felt together with the piano sound.

In Adaptive Feedback, a short decaying vibration is rendered immediately after a finger makes contact with a key. The initial intensity is selected from a range of 50–100% of the maximum intensity, in proportion to the speed at which the fingertip hits the key, and it decays linearly over 200 ms until it reaches a lower level of 50% of the maximum intensity. In this way, the user feels varying levels of vibration depending on the speed with which their fingers strike AirPiano's keyboard. During the key travel, the vibration intensity is fixed at 50%. When the key hits the bottom, the vibration intensity is increased to 100% to communicate higher stiffness. As the key moves back up to its rest position, the feedback intensity returns to the 50% level. The intensity level and the pressure at a focal point were in a roughly linear relationship with a small offset. Since the emitting power of the haptic display was fully utilized when there were one or two contact points, simultaneously pressing more than two keys degraded the individual haptic stimuli.
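The following Python sketch is one possible reading of the Adaptive and Constant intensity profiles described above and in Fig. 2; it is illustrative, not the authors' implementation. The 50–100% onset range, the 200 ms linear decay, the 50% travel level, and the 100% bottom level come from the text, while the press-speed normalization is an assumption.

```python
def adaptive_intensity(time_since_contact: float,
                       press_speed: float,
                       at_bottom: bool,
                       releasing: bool,
                       max_press_speed: float = 1.0) -> float:
    """Commanded vibration intensity in [0, 1] for the Adaptive Feedback scheme."""
    if at_bottom:
        return 1.0       # full intensity to suggest higher stiffness at the end stop
    if releasing:
        return 0.5       # constant 50% while the key returns to its rest position
    # Onset intensity scales with how fast the fingertip hit the key (50-100%).
    onset = 0.5 + 0.5 * min(press_speed / max_press_speed, 1.0)
    if time_since_contact < 0.2:
        # Linear decay from the onset level down to 50% over 200 ms.
        return onset + (0.5 - onset) * (time_since_contact / 0.2)
    return 0.5           # steady 50% during the remaining key travel


def constant_intensity(key_pressed: bool) -> float:
    """Constant Feedback: maximum intensity for the whole duration of a press."""
    return 1.0 if key_pressed else 0.0
```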

Auditory feedback in AirPiano was designed to match the visual and tactile cues of virtual keypressing actions. We applied the same mechanism that a real-world piano exhibits when its keys are pressed. Despite the continual changes in visual and tactile responses during a keypressing action, sounds in AirPiano are only produced when a key reaches its limit and hits the bottom above a certain speed. We used the grand piano sounds from SimplePiano (by Doug Cox) for AirPiano. The fade-out duration of a sound was set to be proportional to the length of time the key stays at the bottom. There was no noticeable delay or asynchrony among the visual, haptic, and auditory feedback in AirPiano.
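A minimal sketch of this triggering rule follows, under assumed values for the speed threshold and the fade-out scaling (neither is given numerically in the paper); the callbacks stand in for whatever audio engine is used.

```python
class KeySound:
    SPEED_THRESHOLD = 0.05      # m/s, assumed minimum press speed to trigger a note
    FADE_PER_SECOND_HELD = 0.5  # assumed fade-out seconds per second held at bottom

    def __init__(self, play_note, set_fade_out):
        self.play_note = play_note        # callback: start the piano sample
        self.set_fade_out = set_fade_out  # callback: set the fade-out duration (s)
        self.held_since = None

    def on_bottom_reached(self, press_speed: float, now: float) -> None:
        # A note sounds only when the key hits the bottom above a certain speed.
        if press_speed >= self.SPEED_THRESHOLD:
            self.play_note()
            self.held_since = now

    def on_release(self, now: float) -> None:
        # Fade-out grows with how long the key was held at the bottom.
        if self.held_since is not None:
            held = now - self.held_since
            self.set_fade_out(self.FADE_PER_SECOND_HELD * held)
            self.held_since = None
```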

IV. USER STUDY DESIGN

We designed our user study based on feedback from two piano teachers (one male and one female) at a local piano school. During the interview, they provided ample advice on finger combinations, musical pieces, and hand shape for playing the piano.

A. Subjects

Sixteen subjects (nine males and seven females; mean age 29.0) participated in the user study. Most participants reported that they were novices or had little piano playing experience. Since our study focused on the feasibility of AirPiano, we did not invite any expert pianists to the experiment. By self-report, participants were free of any illnesses or disabilities affecting their sensorimotor functions. Each participant was rewarded with a $3 coffee voucher for their participation.

B. Apparatus

Fig. 3 shows the hardware setup of AirPiano for the user study. AirPiano ran on an Intel i7 PC with 64 GB of RAM and an Nvidia GTX 980 GPU. An Oculus Rift HMD, a mid-air tactile display by Ultrahaptics, and a Leap Motion controller were connected to the PC. The haptic display was located on a table 45 cm high, and the virtual piano keys were placed 30 cm above the surface of the haptic display. Over-ear headphones were used to render clear piano sounds and block out exterior noise.

C. Experimental Design

A within-subjects experiment was conducted with three haptic feedback conditions of AirPiano: No Feedback, Constant Feedback, and Adaptive Feedback. There were also three playing conditions, each with its associated musical piece (16 participants × 3 haptic conditions × 3 pieces).

Fig. 3. Experimental hardware setup.

Taking into account that participants' piano playing skills would vary widely from novice to advanced, and that it is important to observe the usability of the piano experience with different finger combinations, we selected three music pieces: 'Twinkle, Twinkle, Little Star', which requires the right index finger (Little Star for short); 'Chopsticks', which requires both index fingers; and a piece involving a series of two-note chords, which requires the left thumb and index finger (Chords). As mentioned at the beginning of this section, these three pieces were also recommended by the two qualified teachers at the local piano school. We believe that performing these three pieces is a first step towards understanding the benefits of mid-air haptics in AirPiano, since they cover different finger combinations with one and both hands for melodies and chords.

Questionnaire sheets were prepared to measure the participants' responses to their keypress experiences in five areas: Clarity – I was able to perceive clear feedback; Reality – The feedback was similar to a real piano keypress; Enjoyment – I enjoyed playing the piano; Comfort – I was comfortable while playing the piano; and Overall Satisfaction – I was satisfied with my piano playing experience. Participants were asked to respond to each question by marking a check on a horizontal line (visual analog scale) with a label at each end: 'Strongly Disagree' and 'Strongly Agree.' The participants were also debriefed regarding their experiences with AirPiano.

D. Experimental Procedure

Instructions were given to each participant, along with a short training session to become familiar with AirPiano prior to the main experiment. In the training session, the participants learned how to play each of the songs through a color guiding system in which piano keys change color to signal the next key to be played.

The main experiment consisted of three sessions: Little Star, Chopsticks, and Chords. In each session, the three haptic feedback conditions (No Feedback, Constant Feedback, and Adaptive Feedback) were provided with two repetitions, and their presentation order was balanced within participants. The first cycle allowed the participants to become familiar with the corresponding music piece under the three haptic conditions. During the second cycle of each session, we asked the participants to answer the five questions after completing each feedback condition. A three-minute break was provided after completing each session. The entire experiment took around 40 minutes per participant.

Fig. 4. Averages and standard errors of the subjective ratings over all playing conditions. Statistical significance of differences between conditions is also indicated (?: p < 0.1, *: p < 0.05).

V. RESULTS AND DISCUSSION

Participants' continuous-scale ratings for the five measures were collected and linearly scaled from 0 to 100 (0: Strongly Disagree, 100: Strongly Agree). The averaged ratings and standard errors are plotted in Fig. 4. The Constant and Adaptive Feedback methods had higher scores in all five measures. In addition, Adaptive Feedback was rated as the most preferred method in all measures except clarity. We conducted a within-subject three-way analysis of variance (ANOVA). Playing condition and haptic feedback condition were considered fixed-effect factors and subject was treated as a random-effect factor (df = (2, 30)). The ANOVA showed significance (p < 0.05) of the haptic feedback condition in clarity, reality, and enjoyment, and marginal significance in satisfaction (p < 0.1).
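For readers who want to run this kind of analysis on their own data, the Python sketch below performs an analogous repeated-measures ANOVA with haptic condition and playing condition as within-subject factors (the repeated-measures formulation treats subject as a random factor). It is not the authors' analysis code, and the data frame is synthetic, only illustrating the expected 16 × 3 × 3 layout.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
haptic_levels = ["None", "Constant", "Adaptive"]
pieces = ["LittleStar", "Chopsticks", "Chords"]

# Synthetic placeholder ratings on the 0-100 visual-analog scale:
# one rating per subject x haptic x piece cell, as in the balanced design.
rows = [
    {"subject": s, "haptic": h, "piece": p, "clarity": rng.uniform(0, 100)}
    for s in range(16) for h in haptic_levels for p in pieces
]
ratings = pd.DataFrame(rows)

result = AnovaRM(ratings, depvar="clarity", subject="subject",
                 within=["haptic", "piece"]).fit()
print(result)
```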

A Student-Newman-Keuls (SNK) multiple comparison test was also conducted to reveal the differences between feedback conditions. The statistical group differences are represented in Fig. 4. The clarity and reality ratings of the haptic-enabled conditions were significantly higher than those of the no-haptic condition (p < 0.05). The Adaptive Feedback method also showed a higher score than the No Feedback condition in terms of enjoyment (p < 0.05) and satisfaction (p < 0.1). Overall, the presence of mid-air haptic feedback in AirPiano increased the sense of reality and improved the virtual playing experience. In addition, we found that our Adaptive Feedback method added more value to AirPiano in terms of user enjoyment and satisfaction. The Adaptive Feedback method showed the potential to replicate realistic feedback with aerial haptics, since it reached higher ratings than Constant Feedback in most of the subjective measures (i.e., reality, enjoyment, comfort, and satisfaction), although these differences were not significant.

The comments collected from participants during the experiments revealed how they felt about the various haptic feedback conditions. Most reported that the presence of haptic feedback in AirPiano was helpful for interacting with the virtual instrument and that they enjoyed the AirPiano experience with touchable keys (11 of 16). However, some participants reported that the mid-air vibrations felt uncomfortable rather than like believable piano keys (5 of 16). There were also some comments about the vibration intensity: four participants reported that the intensity in Constant Feedback was too strong, and two participants felt that the intensity in Adaptive Feedback was too weak. Participants may have felt this difference because Constant Feedback transfers greater energy than Adaptive Feedback as a pressed key moves.

Fig. 5. Averages and standard errors of the subjective ratings for each of the three playing conditions: (a) Little Star, (b) Chopsticks, and (c) Chords. Statistical significance of differences between conditions is also indicated (?: p < 0.1, *: p < 0.05).

We also investigated the evaluation results by playing condition. Fig. 5 shows the averages, standard errors, and statistical differences for each of the three playing conditions. The three playing conditions show a similar trend in their results, especially in clarity and reality. Two-way ANOVA tests showed the significance of haptic feedback in clarity and reality for all conditions (p < 0.05). Adaptive Feedback turned out to be a step ahead of No Feedback in enjoyment for Chopsticks (p < 0.05) and Chords (p < 0.1). In Chords, the two haptic-enabled conditions were marginally better than the haptic-absent condition in satisfaction (p < 0.1). Participants reported that the multi-finger playing of Chopsticks and Chords was more difficult than the single-finger playing of Little Star, especially when the finger tracking was poor, due to the limitations of the finger tracking technology in Leap Motion. Some participants said that the vibration was more helpful in confirming keypresses, especially in the multi-finger conditions.

Our study demonstrates the feasibility of mid-air multimodal interaction in virtual reality by naturally integrating visual, auditory, and tactile cues without any special equipment being worn. Unlike force feedback devices, which usually come with limitations when multiple fingers are engaged in virtual reality interaction, our touchable keys with mid-air haptic feedback allow simultaneous multi-finger interaction with a higher degree of freedom and favorable user satisfaction. In fact, most of the participants enjoyed and were satisfied with the touchable keys in the air during the experiment. AirPiano further shows the possibility of mimicking piano keys with mid-air haptic feedback. Although the results were not statistically significant, we observed trends of higher subjective responses in the Adaptive method than in the Constant method in most of the measurements (except for clarity). Overall, our study shows that AirPiano can enrich multimodal experiences with aerial haptic feedback in virtual reality, leading to greater perceptibility and user satisfaction.

VI. LIMITATIONS AND IMPROVEMENTS

A clear constraint of AirPiano is tracking multiple fingers to play more complicated pieces that require precise timing and polyphonic playing. This kind of task is difficult to achieve due to the limitations of current vision-based infrared tracking technology such as Leap Motion and Intel's RealSense. For this reason, we instead focused on the feasibility of AirPiano by conducting the user study with non-expert users playing very simple pieces that cover fundamental finger combinations for melodies and chords. However, a variety of finger tracking methods are being intensively researched in both academia and industry, and we believe that the virtual reality experience in AirPiano can be further improved by adopting such technologies in the near future.

VII. CONCLUSION

In this study, we presented a mid-air virtual piano with multimodal interaction of visual, auditory, and haptic cues, along with its user study. The system was built using a head-tracked HMD and an ultrasonic haptic device with a hand tracker. We designed the virtual piano with two kinds of tactile feedback: Constant Feedback and Adaptive Feedback. Constant Feedback was designed to provide short and intense feedback, whereas Adaptive Feedback was designed to follow the changes in tactile feedback of a real piano keypress. The user study revealed the effectiveness of our mid-air haptic feedback design. We were successful in creating a fully functional multimodal virtual music playing system that provides a richer multimodal virtual reality experience. This study can serve as a guideline for work involving mid-air haptic feedback in multimodal experiences or tactile substitution for force feedback. We expect the results of this study to contribute to the wider adoption of mid-air haptics in virtual reality. Our next goal will be enabling more natural playing in AirPiano with better multi-finger tracking and enhanced haptic feedback.

VIII. ACKNOWLEDGMENTS

The authors thank Samuel Jeong for designing the 3D model data and Hyunwoo Lim for implementing the virtual scene of AirPiano. We also thank Kyungmi Park and Jongmo Park of the Philhamonik Music Academy in Daejeon, Korea, for providing valuable feedback on the user study design.

REFERENCES

[1] K. Kolarov and D. Ruspini, "Graphical and haptic manipulation of 3D objects," in Proceedings of the First PHANToM Users Group Workshop, 1996.

[2] S. V. Adamovich, G. G. Fluet, A. Mathai, Q. Qiu, J. Lewis, and A. S. Merians, "Design of a complex virtual reality simulation to train finger motion for persons with hemiparesis: a proof of concept study," Journal of NeuroEngineering and Rehabilitation, vol. 6, no. 28, 2009.

[3] E. Berdahl, H.-C. Steiner, and C. Oldham, "Practical hardware and algorithms for creating haptic musical instruments," in Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), ser. NIME '08, 2008, pp. 61–66.

[4] T. Smyth, T. N. Smyth, and A. E. Kirkpatrick, "Exploring the virtual reed parameter space using haptic feedback," in Proceedings of the IEEE Workshop on Multimedia Signal Processing, 2006, pp. 45–49.

[5] C. Nichols, "The vBow: development of a virtual violin bow haptic human-computer interface," in Proceedings of the Conference on New Interfaces for Musical Expression (NIME), 2002, pp. 1–4.

[6] J. Lozada, M. Hafez, and X. Boutillon, "A novel haptic interface for musical keyboards," in Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2007, pp. 1–6.

[7] R. B. Gillespie, B. Yu, R. Grijalva, and S. Awtar, "Characterizing the feel of the piano action," Computer Music Journal, vol. 35, no. 1, pp. 43–57, 2011.

[8] R. Oboe and G. De Poli, "Multi-instrument virtual keyboard: The MIKEY project," in Proceedings of the Conference on New Interfaces for Musical Expression (NIME), ser. NIME '02. Singapore, Singapore: National University of Singapore, 2002, pp. 1–6. [Online]. Available: http://dl.acm.org/citation.cfm?id=1085171.1085178

[9] R. Oboe, "A multi-instrument, force-feedback keyboard," Computer Music Journal, vol. 30, no. 3, pp. 38–52, 2006.

[10] D. Ogawa, K. Tanabe, V. Yem, T. Hachisu, and H. Kajimoto, "Haptone: Haptic instrument for enriched musical play," in Proceedings of the ACM SIGGRAPH Emerging Technologies, ser. SIGGRAPH '16. New York, NY, USA: ACM, 2016, pp. 12:1–12:2. [Online]. Available: http://doi.acm.org/10.1145/2929464.2929477

[11] Y. Suzuki and M. Kobayashi, "Air jet driven force feedback in virtual reality," IEEE Computer Graphics and Applications, vol. 25, no. 1, pp. 44–47, 2005.

[12] R. Sodhi, I. Poupyrev, M. Glisson, and A. Israr, "Aireal: Interactive tactile experiences in free air," ACM Transactions on Graphics, vol. 32, no. 4, pp. 134:1–134:10, July 2013. [Online]. Available: http://doi.acm.org/10.1145/2461912.2462007

[13] T. Iwamoto, M. Tatezono, and H. Shinoda, "Non-contact method for producing tactile sensation using airborne ultrasound," in Proceedings of EuroHaptics, 2008, pp. 504–513.

[14] H. Lee, J. S. Kim, S. Choi, J. H. Jun, J. R. Park, A. H. Kim, H. B. Oh, H. S. Kim, and S. C. Chung, "Mid-air tactile stimulation using laser-induced thermoelastic effects: The first study for indirect radiation," in Proceedings of the IEEE World Haptics Conference (WHC), June 2015, pp. 374–380.

[15] L. R. Gavrilov and E. M. Tsirulnikov, "Focused ultrasound as a tool to input sensory information to humans (review)," Acoustical Physics, vol. 58, no. 1, pp. 1–21, 2012.

[16] T. Carter, S. A. Seah, B. Long, B. Drinkwater, and S. Subramanian, "Ultrahaptics: Multi-point mid-air haptic feedback for touch surfaces," in Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), ser. UIST '13. New York, NY, USA: ACM, 2013, pp. 505–514. [Online]. Available: http://doi.acm.org/10.1145/2501988.2502018

[17] K. Hasegawa and H. Shinoda, "A method for distribution control of aerial ultrasound radiation pressure for remote vibrotactile display," in Proceedings of the SICE Annual Conference 2013, 2013, pp. 223–228.

[18] S. Inoue, Y. Makino, and H. Shinoda, "Active touch perception produced by airborne ultrasonic haptic hologram," in Proceedings of the IEEE World Haptics Conference (WHC), ser. WHC 2015. IEEE, June 2015, pp. 362–367.

[19] A. Sand, I. Rakkolainen, P. Isokoski, J. Kangas, R. Raisamo, and K. Palovuori, "Head-mounted display with mid-air tactile feedback," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), ser. VRST '15. New York, NY, USA: ACM, 2015, pp. 51–58. [Online]. Available: http://doi.acm.org/10.1145/2821592.2821593

[20] G. Wilson, T. Carter, S. Subramanian, and S. A. Brewster, "Perception of ultrasonic haptic feedback on the hand: Localisation and apparent motion," in Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), ser. CHI '14. New York, NY, USA: ACM, 2014, pp. 1133–1142. [Online]. Available: http://doi.acm.org/10.1145/2556288.2557033

[21] M. Obrist, S. Subramanian, E. Gatti, B. Long, and T. Carter, "Emotions mediated through mid-air haptics," in Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI), ser. CHI '15. New York, NY, USA: ACM, 2015, pp. 2053–2062. [Online]. Available: http://doi.acm.org/10.1145/2702123.2702361

[22] Y. Monnai, K. Hasegawa, M. Fujiwara, K. Yoshino, S. Inoue, and H. Shinoda, "Haptomime: mid-air haptic interaction with a floating virtual screen," in Proceedings of the ACM Symposium on User Interface Software and Technology (UIST). ACM, 2014, pp. 663–667.

[23] K. Yoshino and H. Shinoda, "Contactless touch interface supporting blind touch interaction by aerial tactile stimulation," in Proceedings of the IEEE Haptics Symposium (HAPTICS), 2014, pp. 347–350.

[24] J. Guna, G. Jakus, M. Pogacnik, S. Tomazic, and J. Sodnik, "An analysis of the precision and reliability of the Leap Motion sensor and its suitability for static and dynamic tracking," Sensors, vol. 14, no. 2, pp. 3702–3720, 2014.
