Combining Wristband Display and Wearable Haptics for Augmented Reality

Gianluca Paolocci 1,2 *, Tommaso Lisini Baldi 1, Davide Barcelli 1, and Domenico Prattichizzo 1,2

1 Department of Information Engineering and Mathematics, University of Siena
2 Department of Advanced Robotics, Istituto Italiano di Tecnologia

ABSTRACT

Taking advantage of widely distributed hardware such as smart-phones and tablets, the Mobile Augmented Reality (MAR) market is rapidly growing. Major improvements can be envisioned in increasing the realism of virtual interaction and providing multimodal experiences. We propose a novel system prototype that locates the display on the forearm using a rigid support, avoiding the constraints of hand-holding, and is equipped with hand tracking and cutaneous feedback. The hand tracking enables the manipulation of virtual objects, while the haptic rendering enhances the user's perception of the virtual entities. The experimental setup was tested by ten participants, who expressed their impressions of the usability and functionality of the wrist-mounted system with respect to the traditional hand-held condition. Subjects' evaluations suggest that the AR experience provided by the wrist-based approach is more engaging and immersive.

Index Terms: Human-centered computing—Mixed / augmented reality;

1 INTRODUCTION

Augmented reality is expected to become an essential technology in the near future, because of its ability to enhance reality by adding informative or entertaining content. However, its potential is mostly unexploited due to the necessity of wearing additional hardware (e.g., during work or daily activities). Suitable prototypes for a massive deployment of AR devices can be envisioned as lightweight wearable interfaces, which do not isolate users from the context or limit their movement, but support their activities. Smart-phones meet many of these requirements and, since nowadays almost everyone owns one, we selected phones as the core technology of our system to reach the largest number of potential users. The lack of realism and of tactile interaction with virtual objects can be identified as the current major limitations of Mobile Augmented Reality (MAR), together with the need to hold the device with one hand. In fact, most industrial scenarios and daily activities require both hands to be available at the same time.

Few studies have succeeded in achieving hand-based interaction with virtual objects for Mobile Augmented Reality. Hand tracking through a Leap Motion was exploited in [7] to display an avatar superimposed on the real hand on the smart-phone screen. However, this approach lacked haptic feedback, and the setup was bulky because Leap Motion data could not be streamed directly to the phone and required additional hardware. Instrumenting the whole hand with a glove for sensing and rendering contact forces through vibrations was the idea in [4], while [3] featured a combination of touch-screen interaction and index finger tracking to address MAR issues.

*email: [email protected]

Figure 1: Wearable haptics for Mobile Augmented Reality. An IMU-based system tracks the motion of the index finger while the haptic thimble provides the user with cutaneous stimuli. The proposed device creates the sensation of making/breaking contact with virtual objects. An Arduino microcontroller, powered by a Li-Po battery, is in charge of collecting data from the sensors and controlling the haptic device. The smart-phone shows the augmented environment to the user, computes the hand posture, and evaluates the fingertip contact force.

2 ADVANTAGES OF THE WRIST-WORN SYSTEM

This work presents a proof-of-concept solution for MAR enabling hand-based interaction and tactile rendering of contact forces with virtual objects.

In order to manipulate virtual entities, the user's hand has to collide with the object meshes displayed on the phone screen. Instead of capturing the hand pose with a camera-based approach, we decided to leverage inertial measurements to reconstruct the hand posture. This way, we avoid the limitation of always keeping the real hand in the camera field of view, and we free the other hand from the burden of holding the device. The problem of locating the hand in 3D space is addressed by positioning the phone on a wrist-mounted support. In fact, the ARCore library used to develop the app provides the distance of the fiducial markers from the rear camera, which is offset from the hand by a fixed amount. Thus, the hand pose estimated from inertial measurements and its position in 3D space are combined into a virtual avatar projected on the screen, which can interact with virtual objects.
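The pose composition described above can be sketched in a few lines. Note that this is only an illustrative sketch, not the authors' implementation: the function names, the identity camera orientation, and the 15 cm camera-to-hand offset are all assumptions.

```python
# Sketch: combining the camera world pose (from marker tracking) with a
# fixed camera-to-hand offset to place the hand avatar in world coordinates.
# All names and numeric values here are illustrative assumptions.

def mat_vec(R, v):
    """Multiply a 3x3 rotation matrix (list of rows) by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def hand_position(camera_pos, camera_rot, wrist_offset):
    """World position of the hand avatar: camera position plus the fixed
    wrist-to-hand offset, rotated into world coordinates."""
    offset_world = mat_vec(camera_rot, wrist_offset)
    return [camera_pos[i] + offset_world[i] for i in range(3)]

# Example: camera 1 m above the marker, identity orientation,
# hand 15 cm beyond the camera along its forward axis (hypothetical values).
camera_pos = [0.0, 1.0, 0.0]
camera_rot = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
wrist_offset = [0.0, 0.0, -0.15]
print(hand_position(camera_pos, camera_rot, wrist_offset))  # [0.0, 1.0, -0.15]
```

In practice the camera pose would come from the ARCore marker-tracking estimate and the hand orientation from the inertial filter, but the composition step is the same.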

The prototype we developed for testing consists of two wearable pieces of equipment: a display located on the forearm, which removes the constraint of hand-holding the phone without affecting dexterity, and small fingertip interfaces to virtually map the hand and to generate haptic cues. The final implementation of our system will feature a flexible display worn on the wrist to minimize encumbrance, and small fingertip haptic interfaces that have minimal impact on the user's manipulation capabilities.

With the proposed system we aim at addressing the following limitations: (i) keeping at least one hand free, (ii) enhancing the experience with haptic rendering, (iii) providing an on-demand AR experience, and (iv) designing a low-budget wearable system based on a smart-phone.


3 SYSTEM DESCRIPTION

We present a prototype system composed of a wristband display and a wearable fingertip interface.

A smart-phone (P20 lite, Huawei Technologies Co. Ltd., CN) is the central processing unit, and also serves as visual input source and visual display. It is secured to the arm using a 3D-printed ABS support attached to a thermoplastic splint.

The fingertip interface is equipped with: i) an inertial measurement unit (MPU6050, InvenSense Corp., US) to estimate the index finger pose w.r.t. the palm, and ii) a servomotor (HS-35HD Ultra Nano, HITEC Inc., USA) to render interaction forces through a pulley mechanism. A clip system enables users to easily fasten the device on the finger. The servomotor can provide a maximum torque of 0.8 kg·cm. The cutaneous force is generated by the device considering the force produced by the servomotor and the resistance given by the three springs and the human skin. The interested reader is referred to [6] for further details on the force feedback generation.
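A minimal sketch of such a force balance follows, assuming a hypothetical pulley radius and a single lumped stiffness for the springs and skin; neither value is specified here, and the actual model is described in [6].

```python
# Sketch of a net fingertip force model: servo torque converted to a pulling
# force through the pulley, opposed by the springs and skin compliance.
# Pulley radius, stiffness, and displacement are hypothetical placeholders.

G = 9.81                        # m/s^2, to convert kg*cm into N*m
MAX_TORQUE = 0.8 * G * 0.01     # 0.8 kg*cm expressed in N*m (~0.0785 N*m)

def fingertip_force(torque_nm, pulley_radius_m=0.005,
                    stiffness_n_per_m=200.0, displacement_m=0.002):
    """Net cutaneous force: belt tension (torque / radius) minus the
    restoring force of the springs and skin (stiffness * displacement)."""
    tension = torque_nm / pulley_radius_m
    restoring = stiffness_n_per_m * displacement_m
    return max(tension - restoring, 0.0)

print(round(fingertip_force(MAX_TORQUE), 2))  # ~15.3 N at full torque
```

The pulley radius dominates the torque-to-force conversion, which is why small servomotors can still render noticeable contact forces at the fingertip.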

An IMU sensor board, containing a triaxial accelerometer/gyroscope and an I2C interface, is firmly attached on top of the servomotor. Concerning the finger tracking, since the aim of this work is providing a proof of concept rather than presenting a hand-tracking approach, we exploited the algorithm presented in [5]. To estimate the relative motion of the finger w.r.t. the hand, we placed an additional IMU on the back of the hand. We decided to use a simplified kinematic model of the hand, which requires a reduced number of sensors to estimate the finger pose. To reconstruct the finger pose, the developed Android app combines the orientations estimated by the two sensor boards with biomechanical constraints [1].
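The relative-orientation step can be sketched with unit quaternions: the finger pose w.r.t. the hand is the inverse of the hand orientation composed with the finger orientation. This is generic quaternion algebra, not the estimator of [5].

```python
# Sketch: relative orientation of the finger IMU w.r.t. the hand IMU,
# both expressed as unit quaternions (w, x, y, z) in a common world frame.
# Generic quaternion algebra; the actual estimator is the filter of [5].
import math

def q_conjugate(q):
    """Conjugate of a quaternion; equals its inverse for unit quaternions."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_multiply(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def relative_orientation(q_hand, q_finger):
    """Finger orientation expressed in the hand frame: q_hand^-1 * q_finger."""
    return q_multiply(q_conjugate(q_hand), q_finger)

# If both IMUs report the same orientation, the relative rotation is
# (numerically close to) the identity quaternion (1, 0, 0, 0).
q = (math.sqrt(0.5), 0.0, math.sqrt(0.5), 0.0)  # 90 deg about the y-axis
print(relative_orientation(q, q))
```

The biomechanical constraints of [1] would then restrict this relative rotation to anatomically plausible flexion/extension ranges.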

An ATMega 328p microcontroller (Arduino Pro Mini 3.3 V) is in charge of communicating with the smart-phone through a Bluetooth connection to control the servomotor and to transmit data coming from the inertial sensors.

The Android application is based on the ARCore library [2], which provides online estimation of the camera position and of the distance from fiducial markers, used to accurately locate virtual objects in the mixed environment. However, the real hand cannot be in the smart-phone camera field of view, because it would cover the scene below. Thus, we decided to relocate the display close to the wrist and place a virtual representation of the real hand at the bottom of the smart-phone screen.

4 EXPERIMENTAL EVALUATION

The prototype system was tested in a manipulation and exploration task to collect users' evaluations of functionality and usability. The device was tested in two layouts: wrist-mounted (WM) and hand-held (HH). Ten participants (8 males, age range 23-45, mean 32) were tasked to perform a weight evaluation on three virtual spheres, rendered on top of virtual ramps. The users were able to visualize the virtual environment superimposed on the real world through the smart-phone display, and to interact with the virtual objects to retrieve informative content. Visual and tactile information played complementary roles in the correct execution of the task, because the sphere mass could be estimated only by means of haptic feedback.

After the experiments, participants were asked to describe their personal impressions in a 15-question survey¹ regarding mental effort, manipulability, and perceived embodiment. Participants rated the system on a seven-point Likert scale (0 means that the participant "completely disagrees" with the sentence, 6 means "completely agrees").

Questionnaire answers (here reported as mean ± standard deviation) were analyzed by means of paired-sample t-tests. Statistical analysis showed that subjects' preference for the wrist-mounted display was significant for what concerns Embodiment (HH = 3.52 ± 0.92, WM = 4.67 ± 0.66), while the condition did

¹http://sirslab.dii.unisi.it/wristbandhaptics/


Figure 2: Representative trial: the user is tasked to test, determine, and order three spheres by their weight. At the beginning (a) all the spheres are yellow and placed on top of a virtual ramp. The user perceives each object's weight by pressing onto the levers, then orders the objects by increasing weight. If the order is correct the spheres turn green (b), otherwise red.

not have significant effects on the self-perception of Mental Workload (HH = 3.80 ± 0.80, WM = 4.16 ± 0.79) or on Manipulability (HH = 3.52 ± 0.77, WM = 4.20 ± 0.96). A statistically significant difference was also registered for the Overall score (HH = 3.69 ± 0.66, WM = 4.34 ± 0.68).
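A paired-sample t-test on such per-subject Likert ratings can be reproduced in a few lines. The two score lists below are synthetic placeholders for illustration, not the study's data.

```python
# Sketch: paired-sample t-test on per-subject Likert ratings.
# The two score lists are synthetic placeholders, not the study's data.
import math
import statistics

def paired_t(a, b):
    """t statistic and degrees of freedom for a paired-sample t-test."""
    diffs = [y - x for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)      # sample standard deviation
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1

hh = [3, 4, 3, 4, 3, 4, 4, 3, 4, 3]   # hand-held ratings (hypothetical)
wm = [5, 4, 5, 5, 4, 5, 4, 5, 5, 5]   # wrist-mounted ratings (hypothetical)
t, df = paired_t(hh, wm)
print(round(t, 2), df)  # t = 4.81 with 9 degrees of freedom
```

The p-value is then read from a Student's t distribution with n − 1 degrees of freedom; in practice `scipy.stats.ttest_rel` computes both the statistic and the p-value directly.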

5 CONCLUSIONS AND FUTURE WORK

The prototype developed to test the wristband display requires several improvements, mainly from the design and technology point of view. Reducing the form factor of the fingertip interface will lower its obtrusiveness during real-life activities, and a flexible display would turn the forearm device into an accessory. Increased computational power would open the way to multi-finger (and two-hand) virtual interaction. The delocalization of the virtual hand avatar may be exploited in embodiment studies, reinforced by the haptic feedback.

REFERENCES

[1] S. Cobos, M. Ferre, M. Sanchez-Uran, and J. Ortego. Constraints forRealistic Hand Manipulation. Proc. Presence, pp. 369–370, 2007.

[2] Google Inc. ARCore software development kit, 2019.

[3] W. Hurst and C. Van Wezel. Gesture-based interaction via finger tracking for mobile augmented reality. Multimedia Tools and Applications, 62(1):233–258, 2013.

[4] J. Y. Lee, G. W. Rhee, and D. W. Seo. Hand gesture-based tangible interactions for manipulating virtual objects in a mixed reality environment. The International Journal of Advanced Manufacturing Technology, 51(9-12):1069–1082, 2010.

[5] T. Lisini Baldi, F. Farina, A. Garulli, A. Giannitrapani, and D. Prattichizzo. Upper Body Pose Estimation Using Wearable Inertial Sensors and Multiplicative Kalman Filter. IEEE Sensors Journal, 2019.

[6] T. Lisini Baldi, S. Scheggi, L. Meli, M. Mohammadi, and D. Prattichizzo. GESTO: A Glove for Enhanced Sensing and Touching Based on Inertial and Magnetic Sensors for Hand Tracking and Cutaneous Feedback. IEEE Transactions on Human-Machine Systems, 47(6):1066–1076, Dec 2017. doi: 10.1109/THMS.2017.2720667

[7] J. Qian, J. Ma, X. Li, B. Attal, H. Lai, J. Tompkin, J. F. Hughes, and J. Huang. Portal-ble: Intuitive free-hand manipulation in unbounded smartphone-based augmented reality. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pp. 133–145. ACM, 2019.