Design and Evaluation of AHMED, an Ad-hoc Mixed-reality Exhibition Designer for Museums and Art Galleries

Krzysztof Pietroszek*, Carl Moore

American University Immersive Designs, Experiences, Applications and Stories Lab
School of Communication
American University, Washington DC

Email: *[email protected]

Abstract—We describe the design and implementation of a mixed reality exhibit prototype for the Museum of the Peace Corps Experience. The Museum's exhibits are reconstructed using photogrammetry and made interactive using real-time physics simulation and freehand interaction. We recorded volumetric captures of Returned Peace Corps Volunteers telling stories related to the artifacts. The prototype was evaluated in a pilot study with a positive response from the audience.

Index Terms—photogrammetry, heritage, museum, volumetric capture, mixed reality, augmented reality

INTRODUCTION

Since the Peace Corps was founded in 1961, approximately 220,000 Americans have served in over 140 nations, working side-by-side with counterparts. The half-century legacy of the Peace Corps has been substantial, yet ephemeral. Several initiatives are capturing stories and documents, but current collections of Peace Corps-related objects are scattered across the U.S. or held privately by former volunteers. The objects, and the stories that accompany them, are unique sources of cross-cultural understanding.

The Museum of the Peace Corps Experience is America's first-ever aggregate expression of the lessons, sacrifices, insights and experiences of Returned Peace Corps Volunteers (RPCVs) to be preserved through artifacts and stories. The purpose of the Museum is to expand the reach of the Peace Corps through physical and virtual exhibits that link artifacts with the personal stories of volunteers. Over the past few years, the underlying structure of the Museum has gradually taken shape and several hundred donated artifacts have been collected. Parts of the collection have been shown at conferences, schools, universities and other museums across the U.S.

The Peace Corps is a volunteer program run by the United States government since the 1960s. The Peace Corps mission is to "provide a service opportunity for motivated changemakers to immerse themselves in a community abroad, working side by side with local leaders to tackle the most pressing challenges of our generation" [1].

To address the need for preserving the memories of intercultural exchange, a group of Returned Peace Corps Volunteers founded an online Museum of Peace Corps Experiences. The museum's mission is to document the RPCV stories and preserve the artifacts related to their service. Despite the fact that to date over 200,000 volunteers have served in 141 countries, the Peace Corps does not have a permanent exhibition space for the souvenirs, artifacts and stories brought back by the volunteers.

In partnership with the Committee for a Museum of the Peace Corps Experience, we designed and implemented an interactive, mixed-reality educational experience for the Museum using Magic Leap augmented reality glasses. The application allows Museum visitors to interact with the artifacts and invite the "holograms" of the RPCVs into their own living rooms. To the best of our knowledge, this research presents the first attempt to bring the museum experience into a visitor's living room using mixed reality. Additionally, we introduce physics-simulated artifact deformation through freehand interaction with artifacts as a way to engage audiences. Finally, we propose a process of low-cost volumetric capture of RPCV storytellers to serve the function of museum guides. With the future proliferation of wearable mixed reality devices, we believe that this mode of exposure to museum artifacts may become an effective method of engaging new audiences.

RELATED STUDIES

A number of projects across the spectrum of the reality-virtuality continuum [2] have been previously designed and evaluated in the context of museum and exhibition environments. Recent examples include immersive experiences in the literary museum in Trieste, Italy [3], an augmented reality experience for a mine museum in the UK [4], and virtual representation of museum artifacts [5]. A critical examination of the use of immersive technologies in the museum context was provided by Stogner [6].

An illustrative example of the use of immersive technologies in the museum context is provided by the ARCO project [5]. The ARCO project introduced an efficient, low-cost photogrammetry process for modeling museum artifacts, as opposed to


Fig. 1. Gameplay. From left: A user interacting with the simulation of a carpet artifact. A volumetric recording of a Returned Peace Corps Volunteer. A user interacting with a simulation of the "alpargatas" sandal with an accompanying volumetric recording of an RPCV.

previously used laser-based scans. It also included a design and evaluation of the virtual representation of museum exhibitions for the World Wide Web and interactive kiosks [5]. Immersive media was found to offer creative solutions to problems common in travelling museum exhibitions, such as transportation, insurance and accessibility of the artifacts themselves. The researchers noted that, based on the challenges brought about by web-based immersive technologies, exhibitions must include a way for museum visitors to interact with the digitized content such that it reminds them of real life, an interaction philosophy known as a "natural user interface" [7].

The project presented in this paper addresses this interactivity in a familiar manner: visitors use hand gestures to move the digital artifacts for closer inspection, as they would with real objects.

Examination of gesture interaction was addressed by Brancati et al. [8], who described a system that used an interactive gesture-based interface for the augmentation of heritage sites, where meaningful cultural information was shared. The gesture-based interactivity consisted of selecting desired information with fingertip clicking motions. The interactive selections are connected to virtual points of interest that represent nearby places in the physical environment and that, when prompted, provide further information and guidance, such as directions to metro stations or tourist attractions. All the interactions are based on this fingertip motion, which moves a virtual pointer used to select the points of interest in the augmented reality. The freehand interaction design was examined in both indoor and outdoor conditions, with good results in both cases.

Immersive technologies have also been used in museum exhibits in a collaborative manner. Oh et al. [9] present a study designed to understand how multi-user, gesture-based interactions could enhance a playful, informal learning experience for museum visitors. The study showed how participants were able to have an individualized augmented reality experience that was unique and personal, and at the same time participate in a

shared, social environment. The experiment was carried out in an informal learning environment, which led to an experience that was immersive and self-directed when paired with glasses-based AR, where both hands were free to interact with the mixed media visualization. They found that the size and weight of the AR glasses took away from the overall experience. However, we believe that recent improvements in the ergonomics and design of AR glasses may have already eliminated this issue.

GAMEPLAY

After launching the Magic Leap application, the user is presented with a 3D menu that lines up the artifacts brought by a Returned Peace Corps Volunteer (RPCV), accompanied by a short description of the artifact, the name of the RPCV, and dates of service. Once the visitor selects an artifact, the scene presenting the story of the artifact is loaded.

Each scene consists of an interactive, three-dimensional "hologram" of the artifact. The player can touch and, in some cases, deform the object using the virtual hand interaction metaphor [10]. Each exhibited artifact is accompanied by a volumetric capture and narration performed by the RPCV. RPCVs tell the story of the artifact they contributed and its relationship to their experience during Peace Corps service.

The virtual position of an artifact depends on the real-time reconstruction of the location in which the immersive exhibition was launched. Using Magic Leap SDK mesh reconstruction, the application performs a partial reconstruction of the space in front of the user while searching for an empty, non-occluded area. Once a large enough area is identified, the exhibit is placed in the empty area.
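The empty-area search described above can be sketched as a scan over a 2D occupancy grid derived from the reconstructed mesh. This is an illustrative sketch, not the actual Magic Leap SDK implementation; the grid representation, function name, and block-scan strategy are all assumptions.

```python
# Illustrative sketch: search a 2D occupancy grid (True = occupied) derived
# from the reconstructed mesh for an empty block large enough for the exhibit.
def find_empty_area(grid, size):
    """Return (row, col) of the first size x size block of free cells, else None."""
    rows, cols = len(grid), len(grid[0])
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            if all(not grid[r + dr][c + dc]
                   for dr in range(size) for dc in range(size)):
                return (r, c)
    return None
```

In practice the scan would operate on the floor-plane projection of the reconstructed mesh, but the principle, finding the first sufficiently large unoccupied block, is the same.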

The player's interaction with the artifact is determined by the type of object. The application supports three types of artifacts: hard-body, soft-body, and cloth. Hard-body exhibits, for example a piece of pottery, a metal object, or a sculpture, rotate at the touch of the visitor; the position of the object remains unchanged.


For soft-body exhibits, for example a piece of traditional clothing, a carpet, or a hat, the application performs a soft-body deformation simulation. Finally, for cloth objects, for example a carpet, shawl, or sandal made of fabric, the application performs cloth simulation. For example, touching the "hologram" of the carpet results in the carpet's real-time deformation simulation shown in Figure 1.
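The three artifact categories and their touch responses can be summarized as a simple dispatch table. The type names and handler strings below are illustrative assumptions, not identifiers from the project's actual code.

```python
# Illustrative dispatch table for the three artifact types; the type names
# and handler strings are assumptions, not identifiers from the project code.
SIMULATION_MODE = {
    "hard_body": "rotate_on_touch",   # pottery, metal objects, sculptures
    "soft_body": "soft_body_deform",  # traditional clothing, carpets, hats
    "cloth":     "cloth_simulation",  # carpets, shawls, fabric sandals
}

def on_touch(artifact_type):
    # Default to the simplest behavior (rotation) for unknown types.
    return SIMULATION_MODE.get(artifact_type, "rotate_on_touch")
```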

DESIGN & IMPLEMENTATION

The prototype of the virtual exhibit for the Museum of Peace Corps Experiences was designed in Unity3D for the Magic Leap wearable augmented reality device. The implementation is cross-platform and can be adapted for the Hololens 2, the Meta 2, or other AR devices that support hand tracking and gesture-based freehand interaction.

During the design phase, we reconstructed the Museum's artifacts using the photogrammetry process, and recorded the stories told by Returned Peace Corps Volunteers using volumetric capture.

The Curator’s Experience

The Curator is a user who can manipulate the artifacts' position, rotation, and scale. The goal of the curator is to set up the artifacts' properties in the way deemed by the curator to provide the best experience for the visitor of the virtual museum. This functionality is implemented to aid public or semi-public mixed-reality exhibitions of the Museum of Peace Corps Experiences. By manipulating the artifacts' properties, the curator may, for example, re-position a virtual artifact onto a physical object, such as a table or a podium. The curator may also re-scale the object.

If the app is opened by the curator user, the device performs a full, as opposed to partial, reconstruction of the mesh of the location, identifying the geometry of furniture, the floor and walls. If the location has been previously scanned and did not change significantly (e.g. furniture was not moved), the device will recognize it and the scan can be skipped.

The Curator interacts with the exhibits using the virtual hand metaphor (VHM) [10], aided with freehand gesture recognition. No glove or tracker is used. Instead, the interaction relies on Magic Leap's built-in hand tracking and gesture recognition. In our experience, the hand tracking works reliably in varying lighting conditions, including dimmed living room lights and bright daylight in an interior location. We did not test the hand tracking at an exterior location, because Magic Leap holograms are not bright enough for exterior use.

For selection confirmation, grabbing, and manipulating the objects, we use gesture recognition. The Magic Leap SDK offers state-of-the-art machine-learning computer-vision gesture recognition. The recognizer implements six default gestures: Open Hand Back, Index Finger, Fist, Pinch, Thumb, and OK (Figure ??). If no hand is detected at the moment, the gesture recognizer reports a No Hand gesture.

In our experimentation with the gesture recognition SDK provided by Magic Leap, we find that the reliability of the recognition depends on the gesture. For example, in our

experimentation we got many false positives for the open hand gesture, and common confusion between the Pinch and OK gestures. Additionally, the gestures may switch quickly. For example, while the player is about to perform the pinch gesture, it may be recognized as an OK gesture during the fingers' re-positioning.

To partially compensate for false positives and ambiguous recognition, we do not issue an action immediately after recognition of a gesture. Instead, we collect 10 samples of gesture recognition. Only if the same gesture is continuously recognized for all 10 samples is the gesture considered reliable and the action associated with the gesture launched.
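This debouncing scheme can be sketched as follows; the class and method names are illustrative assumptions, not the project's actual code.

```python
from collections import deque

# Illustrative sketch of the 10-sample gesture debouncing described above.
class GestureDebouncer:
    def __init__(self, window=10):
        self.window = window
        self.samples = deque(maxlen=window)

    def update(self, gesture):
        """Record one recognizer sample; return the gesture only once the
        same gesture has filled the whole window, otherwise return None."""
        self.samples.append(gesture)
        if len(self.samples) == self.window and len(set(self.samples)) == 1:
            return gesture
        return None
```

The sliding window means a brief misrecognition (e.g. a transient OK while forming a Pinch) resets the count without triggering an action.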

The mapping of gestures to manipulation actions is novel and constitutes an additional contribution of this paper, with the exception of mapping rotation to the fist gesture, previously introduced by [?].

To perform manipulation, we first require the artifact to be in a "selected" state. To select the artifact, the player performs an Index Finger gesture, similar to a pointing metaphor [?]. To de-select the artifact, the player touches the artifact again while performing the Index Finger gesture.

Once the object is in the "selected" state, the player may move, scale or rotate it. To move the artifact, the player must perform the Fist gesture while keeping their hand inside the object and then move the hand to a desired location. To release the object at its new location, while keeping it in a selected state, the curator must stop performing the Fist gesture; recognition of any gesture other than Fist suffices to release the artifact. Once released, the artifact remains in a selected state until it is deselected. The object may accidentally become deselected if, during the release, the curator accidentally performs the Index Finger gesture on the selected artifact; however, we did not observe that issue in our pilot study.

To upscale the artifact, the player performs the Open Hand Back gesture outside of the selected artifact. Once the Open Hand Back gesture is recognized, the artifact starts scaling up linearly until the gesture is no longer performed. To downscale the artifact, the curator performs the Fist gesture, which is linearly mapped to scaling over time in the same manner. While in 2D touch interfaces the standard gesture for scaling is typically the pinch gesture, we find the calculation of the distance between the thumb and the index finger not reliable enough to map that distance into scale in a user-friendly manner.

To rotate the artifact, the curator performs a pinch gesture near the edge of the object. The point of contact between the thumb and index fingers and the artifact becomes an anchor of rotation. To perform the rotation, the curator moves the hand to a new position on an imaginary sphere around the object while holding the pinch gesture. To stop the rotation, the curator stops performing the pinch gesture.
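Taken together, the selection and manipulation mappings described above amount to a small state machine. The sketch below is an illustrative reconstruction under assumed names; the `inside_artifact` flag and the action strings are not from the actual implementation.

```python
# Illustrative reconstruction of the gesture-to-action mapping; the
# inside_artifact flag and the action strings are assumptions.
class ArtifactManipulator:
    def __init__(self):
        self.selected = False

    def handle(self, gesture, inside_artifact):
        if gesture == "IndexFinger" and inside_artifact:
            self.selected = not self.selected  # toggle select / deselect
            return "selected" if self.selected else "deselected"
        if not self.selected:
            return "idle"
        if gesture == "Fist":
            # Fist inside the artifact moves it; outside, it scales it down.
            return "move" if inside_artifact else "scale_down"
        if gesture == "OpenHandBack" and not inside_artifact:
            return "scale_up"      # linear upscaling while the gesture is held
        if gesture == "Pinch":
            return "rotate"        # anchor at the pinch point near the edge
        return "idle"
```

The same hand pose (Fist) maps to two actions depending on whether the hand is inside the artifact, which is why a spatial test accompanies every gesture sample.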

Reconstruction of the Museum Artifacts

Selected artifacts of the Museum were reconstructed using a photogrammetry process that has been previously used in the reconstruction of museum artifacts [11]. The reconstruction consisted of three stages. First, we took a varying number of


photos of each artifact in a well-lit green-screen studio. We used a Sony A7 camera with a 24-megapixel CMOS sensor, with the lens set to 24 mm, f-stop 11, shutter speed 1/250 s, and ISO 200. The number of photos taken depended on the size of the object and the complexity of its geometry. For example, to reconstruct a Turkmen carpet of size 5' x 3', we took 128 photos of both sides of the carpet. For the reconstruction of "Alpargatas" sandals from Colombia, we took 283 photos.

The photographic process included systematically moving the camera on an imaginary sphere around the object, taking a photo from each position. We changed the distance to the object without modifying the lens length to help the photogrammetric process reconstruct the geometry of the object while preserving the details of its structure. We used three distances: wide shot, close up, and extreme close up.
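The camera path can be sketched as a set of positions on nested spheres around the artifact, one sphere per shot distance. The ring and per-ring counts and the radii below are illustrative assumptions, not the values used in the actual shoot.

```python
import math

# Illustrative sketch: camera positions on nested spheres around the artifact,
# one sphere per shot distance. Ring counts and radii are assumptions.
def sphere_positions(radius, rings, per_ring):
    """Yield (x, y, z) camera positions on a sphere of the given radius."""
    for i in range(1, rings + 1):
        phi = math.pi * i / (rings + 1)          # polar angle, poles excluded
        for j in range(per_ring):
            theta = 2 * math.pi * j / per_ring   # azimuth
            yield (radius * math.sin(phi) * math.cos(theta),
                   radius * math.cos(phi),
                   radius * math.sin(phi) * math.sin(theta))

# Three passes: wide shot, close up, extreme close up (distances assumed).
positions = [p for r in (2.0, 1.0, 0.5) for p in sphere_positions(r, 4, 8)]
```

Varying only the radius while keeping the focal length fixed matches the procedure described above: the geometry is sampled at several scales without changing the lens parameters the alignment step relies on.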

To perform the photogrammetry, we used a beta version of the RealityCapture software for the alignment, point cloud calculation, colorizing and texturing of the artifacts. While many photogrammetry software options exist, RealityCapture is one of the few packages that offer a GPU-based implementation, which significantly speeds up the process. We ran the process on a Windows 10 PC workstation with 64 GB of RAM, an Intel i7 processor, and an NVIDIA GTX 1080 Ti. On average, the process took 2 hours per artifact, including photo alignment, point cloud computation, texturing, mesh simplification, and export. We exported each reconstructed artifact to .fbx, with the texture kept in a separate .png file.

The models created during the photogrammetry process often consisted of hundreds of millions of vertices and were not optimized for use in a game engine. Further, in our experience, the Magic Leap cannot handle more than a few hundred thousand vertices per scene without affecting the frame rate. Thus, we decimated the high-poly mesh of each model down to three thousand vertices using the incremental decimation algorithms implemented in MeshLab, an open-source 3D modelling software package. In the case of the Turkmen carpet, which is a structure with a simple, rectangular geometric shape, we re-topologized and re-textured the model. The low-poly mesh was further cleaned in ZBrush by 3D modeling artists, removing errors and filling holes in the 3D model resulting from the photogrammetry process. We used

Fig. 2. Photogrammetry reconstruction of “Alpargatas” sandal

xNormal and ZBrush to create Physically Based Rendering (PBR) textures, including a normal map, occlusion map, and roughness map. The resulting model had approximately 3,000 vertices. We generated 8192x8192 PBR textures, which is the maximum texture resolution currently supported by the Unity3D game engine.

Optimizing the number of faces of the reconstructed 3D models of the museum artifacts while preserving their aesthetics was important due in part to the Magic Leap device's limits on the displayable vertex number. More importantly, it also affected the performance of the cloth and soft-body simulation, which allows the viewer-user to deform the object with touch. For the simulation of soft-body and cloth deformation physics, we modified a commercially available implementation of a state-of-the-art soft-body and cloth deformation physics engine provided by the CPU-based Obi real-time physics framework, available as a Unity3D plugin. While the implementation of the Obi real-time physics framework does not support Magic Leap's Lumin Operating System (OS), we were able to reuse the libraries for the ARM processor and successfully run it on Lumin OS. In our experimentation, the particle and soft-body cloth solvers offered by the Obi real-time physics engine did not affect the frame rate on the Magic Leap if the number of vertices was less than 3,000. For 3D models with a higher vertex count, the frame rate dropped significantly. For example, in the case of the carpet model with 10,000 vertices, resulting in 10,000 particles in the cloth simulation, the frame rate dropped to 1 fps. Thus, our decision to optimize the number of vertices in the model was dictated by the performance limitations of the Magic Leap device.

Volumetric Capture of the RPCV

The presentations of the reconstructed interactive museum artifacts were supplemented with stories written by the Returned Peace Corps Volunteers. To present the performance of the stories by each RPCV, we performed a volumetric capture of the RPCVs. High-quality volumetric capture of an actor costs between 10,000 USD and 50,000 USD per minute of capture and is only available at a few specialized studios. Thus, to perform the volumetric capture, we used a low-cost, experimental volumetric capture software package for the Kinect v2. Because the Kinect v2 recording resulted in high depth error,

Fig. 3. Recording the Volumetric Capture of a RPCV


we designed a custom depth map filter to improve the depth position of the capture (Figure 1).
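The exact filter used in the project is not specified here; as an illustrative sketch under that caveat, a simple 3x3 median filter that ignores zero-depth dropout pixels, a common way to reduce Kinect depth noise, could look like this:

```python
# Illustrative sketch (the project's actual filter is not specified): a 3x3
# median filter over the depth map that ignores zero-depth dropout pixels,
# a common way to reduce Kinect v2 depth noise.
def median_filter_depth(depth):
    rows, cols = len(depth), len(depth[0])
    out = [row[:] for row in depth]          # borders copied unchanged
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = [depth[r + dr][c + dc]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            valid = sorted(v for v in window if v > 0)  # drop missing samples
            if valid:
                out[r][c] = valid[len(valid) // 2]
    return out
```

A median filter is a natural choice for this kind of data because Kinect depth error tends to appear as isolated spikes and zero-valued dropouts rather than smooth noise.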

PILOT STUDY

The prototype was tested in an informal setting that aided our design process. We presented an interactive version of the prototype to members of various populations, including children, expert users (UX designers, game designers, immersive system designers), and novice users (students, faculty, members of the general public). All participants responded positively to the immersive experience, providing qualitative, verbal feedback.

One participant observed that the application provides a novel way of interacting with museum exhibits, pointing out that visitors can interact with the virtual exhibits while typical museum exhibits cannot be touched. Another participant commented that the virtual exhibit is contextualized by the visitor's living room, which makes the experience more personal than it would be in a public setting. The users also appreciated that the visit to the virtual museum required no travel, thus broadening the potential audience, assuming a proliferation of wearable augmented reality devices in the future. One participant with previous experience with magic-window-based mobile augmented reality pointed to a significant positive difference: the hand not occupied with holding a smartphone or tablet can be used to interact with the artifact. Generally, all participants expressed awe and appreciation with regard to the interactivity of the artifacts.

The participants also found interaction with the artifacts reliable and easy to learn, regardless of their level of previous experience with mixed reality. Child participants found the interaction particularly appealing. Further, one expert user commented that a more fine-grained interaction using individual fingers instead of the palm of the hand would improve the realism of the interaction. We plan to incorporate this comment in future development of the project.

FUTURE WORK

Encouraged by the positive audience response to the prototype of the mixed reality exhibit for the Museum of the Peace Corps Experience, we plan to develop the project further in two directions. First, we will expand the application to include a curator mode, which will allow a museum curator to place objects in specific places in an interior location, thus allowing for quick, ad-hoc creation of virtual exhibitions. Second, we will explore the efficacy of the system in promoting the Peace Corps vision and mission. To do so, we will perform a formal, qualitative study comparing mixed-reality museums with other forms of engagement, such as online virtual museums.

ACKNOWLEDGMENT

We would like to thank the Museum of Peace Corps Experiences for the artifacts and stories they provided. We would also like to express our sincere appreciation to all the Returned Peace Corps Volunteers who traveled to our research lab and brought their artifacts to participate in the project.

Fig. 4. A visitor interacting with a museum exhibit in his living room

Finally, we would like to thank Akash Vasishtha for his help in improving the reconstructed 3D models of the artifacts, and Westley Thompson for providing his expertise with the Magic Leap SDK.

REFERENCES

[1] P. Corps. (2019) Changing lives the world over. [Online]. Available: https://www.peacecorps.gov/about/

[2] P. Milgram and F. Kishino, "A taxonomy of mixed reality visual displays," IEICE Transactions on Information and Systems, vol. 77, no. 12, pp. 1321–1329, 1994.

[3] C. Fenu and F. Pittarello, "Svevo tour: The design and the experimentation of an augmented reality application for engaging visitors of a literary museum," International Journal of Human-Computer Studies, vol. 114, pp. 20–35, 2018.

[4] T. Jung, M. C. tom Dieck, H. Lee, and N. Chung, "Effects of virtual reality and augmented reality on visitor experiences in museum," in Information and Communication Technologies in Tourism 2016. Springer, 2016, pp. 621–635.

[5] K. Walczak, W. Cellary, and M. White, "Virtual museum exhibitions," Computer, vol. 39, no. 3, pp. 93–95, 2006.

[6] M. B. Stogner, "The immersive cultural museum experience: creating context and story with new media technology," International Journal of the Inclusive Museum, vol. 3, no. 3, 2011.

[7] D. Wigdor and D. Wixon, Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Elsevier, 2011.

[8] N. Brancati, G. Caggianese, M. Frucci, L. Gallo, and P. Neroni, "Experiencing touchless interaction with augmented content on wearable head-mounted displays in cultural heritage applications," Personal and Ubiquitous Computing, vol. 21, no. 2, pp. 203–217, 2017.

[9] S. Oh, K. Park, S. Kwon, and H.-J. So, "Designing a multi-user interactive simulation using AR glasses," in Proceedings of TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction. ACM, 2016, pp. 539–544.

[10] K. Pietroszek, "Virtual hand metaphor in virtual reality," in Encyclopedia of Computer Graphics and Games, N. Lee, Ed. Cham: Springer International Publishing, 2018, pp. 1–3. [Online]. Available: https://doi.org/10.1007/978-3-319-08234-9_178-1

[11] C. Nicolae, E. Nocerino, F. Menna, and F. Remondino, "Photogrammetry applied to problematic artefacts," The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 40, no. 5, p. 451, 2014.