
Computer Science, Degree Project, Advanced Course, 15 Credits

Survey of Virtual and Augmented Reality Implementations for Development of Prototype for Practical Technician Training

Tobias Lindvall & Özgun Mirtchev

Computer Engineering Programme, 180 Credits

Örebro, Sweden, Spring 2017

Examiner: Franziska Klügl

Örebro University
School of Science and Technology
SE-701 82 Örebro, Sweden


Abstract

Virtual training is a vital way of educating technicians and preparing them for real maintenance work. Technicians who are trained to perform maintenance work on JAS 39 Gripen complete parts of their training through the application Virtual Maintenance Trainer (VMT), which can provide a detailed simulation of the aircraft during operation. The technicians can complete courses and lessons with specific procedures, such as debugging internal computers and parts of the aircraft and performing maintenance work to fix errors. However, the application is desktop-based, and to make the education even more effective there is a desire to explore the possibilities of virtual and augmented reality.

This report explores alternative education tools in virtual reality and augmented reality through a survey. The survey examines the advantages and disadvantages of current implementations in order to propose a system that could give technicians a realistic practical training simulation experience.

Based on the results of the survey, and by using the game engine Unity, a prototype application was built which can simulate technician training procedures on a model of JAS 39 Gripen. HTC Vive and Leap Motion were used to immerse the user in the simulation world and to enable realistic interaction. A technician may learn by completing different training procedures in the simulation, walking around and interacting with a full-scale Gripen aircraft.

Sammanfattning

Virtual training is an important way of educating technicians to prepare them for real-world maintenance work. Technicians who are trained to perform maintenance work on JAS 39 Gripen complete parts of their education through the application Virtual Maintenance Trainer (VMT), which can reproduce a detailed simulation of the aircraft during operation. The technicians can take part in lessons with specific tasks, such as troubleshooting internal computers and parts of the aircraft and performing maintenance work to correct faults. The application is, however, desktop-based, and to make the education more effective there is a desire to explore the possibilities of virtual and augmented reality.

This report investigates the alternatives for education tools in virtual reality and augmented reality through a theoretical survey. The survey weighs the advantages and disadvantages of current implementations in order to propose a system that can give technicians practical experience in a realistic training simulation.

Based on the results of the survey, and by using the game engine Unity, a prototype application was created that can simulate technician training on a model of JAS 39 Gripen. HTC Vive and Leap Motion were used to let the user step into the simulation world and to enable realistic interaction. A technician can learn to carry out maintenance actions by completing different training procedures in the simulation, walking around and interacting with a full-scale Gripen aircraft.


Preface

A big thank you to our supervisor at Saab, Johan Gustafsson, for all the help, encouragement and support during the thesis work. A special thanks to Linus Lindberg at Saab, who helped us by providing valuable technical knowledge regarding 3D modelling.

Many thanks to our supervisor at the university, Andrey Kiselev, for giving feedback on the writing of the report and for enabling us to use available hardware for the prototype. Thank you also to our examiner, Franziska Klügl, for help with writing the report.



Table of Contents

List of Figures

Abbreviations

1 Introduction
1.1 Background
1.1.1 Saab
1.1.2 Virtual Maintenance Trainer
1.2 Project
1.3 Objective
1.4 Requirements
1.5 Division of Labour

2 State-of-the-Art Survey
2.1 Introduction
2.2 Clarification of Virtual and Augmented Reality
2.3 Guidelines for Training in VR and AR
2.4 Architecture
2.5 Virtual Reality
2.5.1 Introduction
2.5.2 Hardware and Software
2.5.3 Related Studies and Implementations
2.6 Augmented Reality
2.6.1 Introduction
2.6.2 Hardware and Software
2.6.3 Related Studies and Implementations
2.7 Result
2.7.1 Multi-user Capabilities
2.7.2 Graphics and System Performance
2.7.3 User Interactions
2.7.4 Miscellaneous
2.8 Conclusions
2.8.1 Result Evaluation
2.8.2 Summary

3 Methods and Tools
3.1 Methods
3.1.1 Development Planning
3.1.2 Event-Driven Implementation
3.2 Tools
3.2.1 Hardware
3.2.2 Software
3.2.3 Development Frameworks
3.3 Other Resources

4 Prototype Development
4.1 Refuel Procedure Training
4.1.1 Ground Crew Panel
4.1.2 Refuelling Access Panel
4.1.3 Refuel Equipment
4.1.4 Instructions
4.2 Result
4.2.1 Virtual Model Representations
4.2.2 Interaction Design
4.3 Discussion of Implementation
4.3.1 Expenditure
4.3.2 Evaluation of Used Hardware
4.3.3 Development Potential and Future Developments

5 Discussion
5.1 Compliance with the Project Requirements
5.1.1 Summary of the Requirements
5.1.2 Fulfilled Requirements
5.2 Impact on Society
5.3 Project Development
5.4 Reflection on Own Learning
5.4.1 Knowledge and Comprehension
5.4.2 Proficiency and Ability

Bibliography

Appendix A Setting up Unity with HTC Vive and Leap Motion
Appendix B Prototype System Diagram
Appendix C Demonstration of Implementation
C.1 Menu Interaction
C.2 Refuel Procedure



List of Figures

1.1.1 An overview of the VMT system
2.2.1 The RV Continuum
2.4.1 Basic VR and AR System Overview
2.5.1 Oculus Rift with Touch
2.5.2 HTC Vive with controllers and Lighthouse base stations
2.5.3 Samsung Gear VR
2.5.4 Plugged Leap Motion controller
2.5.5 Microsoft Kinect V1 & V2
2.6.1 Microsoft HoloLens
2.6.2 Meta Company's Meta 2
2.6.3 Google Glass
2.6.4 Google Cardboard
2.7.1 Image illustrating the FoV of HoloLens for displaying virtual objects
3.0.1 A top-down view of the tracked area by using the Room Overview Window in SteamVR
3.2.1 Basic Prototype System Overview
3.2.2 HTC Vive with a front-attached Leap Motion
4.1.1 Panels
4.2.1 JAS 39C Gripen
4.2.2 Refuel equipment
4.2.3 Hand models
4.2.4 Picked up refuel hose nozzle
4.2.5 Rotating the refuel valve protection cap
4.2.6 Button interaction
4.2.7 Menu
4.2.8 Laser pointer
4.2.9 Navigation sequence
4.2.10 Refuel manual
4.2.11 X-Ray mode
4.2.12 Multiple users
A.1 View of finished project setup of VR camera and Leap Motion modules
B.1 Diagram showing an overview of the used devices and the process of the prototype



Abbreviations

AR Augmented Reality

AV Augmented Virtuality

CAVE Cave Automatic Virtual Environment

COTS Commercial off-the-shelf

DOF Degrees of Freedom

FOV Field of View

HMD Head-Mounted Display

IMU Inertial Measurement Unit

SDK Software Development Kit

VMT Virtual Maintenance Trainer

VR Virtual Reality



1 Introduction

Educating technicians in practical training procedures often requires the use of expensive hardware which might also have limited availability. A training procedure may require the technicians to perform certain practical tasks in order to learn about maintenance and inspection work on a particular piece of hardware, such as a vehicle or another machine.

With the swift development of virtual reality and augmented reality, a desire has emerged among many companies to find out how these technologies could be utilised for practical training procedures as accessible and cost-effective educational tools.

This thesis includes a survey of the state of the art in virtual and augmented reality technologies. The survey focused on existing implementations for practical training and was conducted using a qualitative method. Using keywords related to the topic, data was gathered and then compiled to extract important information by considering different parameters relating to the objective.

A prototype was created, based on the results of the survey, to evaluate the technical capabilities and to demonstrate the concept of practical procedure training in virtual or augmented reality.

1.1 Background

1.1.1 Saab

This thesis has surveyed the current technologies in virtual and augmented reality. Based on the results of the survey, a suggestion was made as to how Saab could develop their technician training tool.

Saab AB is a military vehicle manufacturer, founded in 1937 by the Swedish government with the purpose of securing the production of Swedish fighter aircraft. One fighter in particular, JAS 39 Gripen, one of the best-known fighters, entered production in 1988 and has since undergone four generations (A, B, C and D); the latest generation in development is the E.

Maintaining a Gripen fighter requires extensive knowledge about the aircraft, which educated technicians acquire through the specialised maintenance technician training application for Gripen, Virtual Maintenance Trainer (VMT) 39, developed by the business area Saab Aeronautics. It allows the training of technicians who will specialise in inspecting and maintaining the Gripen fighter.

1.1.2 Virtual Maintenance Trainer

VMT 39 is a desktop-based education application developed by Saab Aeronautics that runs on the VMT platform, which also supports VMT training applications for other vehicles. The platform, also developed by Saab, can run on the most common operating systems, such as Windows, Linux and UNIX. The VMT 39 application can be used to cost-effectively train Gripen technicians or pilots. It covers virtual maintenance training by offering education on the aircraft system, procedure training, system diagnostics and fault localisation. Tailored training sessions can be created using videos, sounds, charts, interactive simulations, 3D animations, etc. The system can be used partly by a technician student for self-education, but also by a teacher during an instructor-led training course. The system runs on a dedicated workstation with three screens and a joystick. A diagrammatic overview of the system can be seen in Figure 1.1.1.

Figure 1.1.1: An overview of the VMT system

1.1.2.1 Users

Trainees and instructors are the two types of users of VMT 39. Trainees connect to the application server using the Java-based client, where they are allowed to practise certain tasks. Instructors use the same client program but have the rights to create lesson setups for trainees to participate in.

1.1.2.2 Course Creation

The VMT platform uses the Shareable Content Object Reference Model (SCORM) [1] standard to make e-learning content and Learning Management Systems (LMS) work together. This enables the creation of learning material which can be used in other SCORM-conformant systems.

All courses created in an application are backed up to the server. When an instructor has created a course in a VMT application, it is stored on the server, enabling the selected trainees to access the same course through client workstations. In the next version of VMT, the Tin Can API [2] will be implemented instead of SCORM, which allows for more LMS functionality.

When using the VMT 39 application as an instructor, it is possible to create new courses and custom lessons. For each course, different learning modules can be created with different components.




Course Modules define a lesson as a training, examination or guidance type, etc. Each module type only shows the relevant components that can be added to that type of module.

Course Components are added to a chosen course module. Different course components can display cockpit panels in real time, show 3D models of the aircraft, or show diagrams and panels to monitor electric, hydraulic and fuel flows. From the components, the instructor can also decide whether any errors should occur during run-time for the trainee to handle.

1.2 Project

VMT uses computer displays to visualise the maintenance work during a procedure for a technician student. To make this part of the training more efficient and realistic, a specific education environment was needed to provide realistic interactions with virtual models. This was achieved by using available techniques in Virtual Reality (VR) and Augmented Reality (AR) as an auxiliary component to VMT, creating a practical virtual education environment.

This work included a survey of different approaches and implementations of already existing solutions for using VR and AR for educational purposes. The result of this survey would reveal whether VR or AR is appropriate to use as an educational tool with the current technology, through comparisons showing the respective advantages and disadvantages. Hardware recommendations are also proposed, and a prototype was developed showing the possibility of technician training using appropriate application development tools.

The application prototype would enable a user to execute a procedure on the aircraft by interacting with virtual models through the use of controllers or their hands. The kind of hardware and software to be used was dictated by the survey. This application will not be integrated with VMT but will instead be a proof of concept, showing that practical education in virtual or augmented reality could work as a complement to VMT. It was developed as a standalone application but could perhaps, in future VMT systems, be integrated as a separate lesson or configuration.

1.3 Objective

The objective of the project was to investigate the possibilities of providing a realistic practical procedure training concept for aircraft technicians. The investigation would cover VR and AR technology and be based on three different training situations:

a) A teacher should be able to demonstrate a procedure on a virtual engine while the students stand around this object, all wearing head-mounted displays. The teacher should be able to rotate and turn the virtual/superimposed object. The students will in turn be able to observe the model in more detail.

b) An inspection procedure should be carried out by the student alone, e.g. inspecting the aircraft before take-off, such as checking the air pressure in the tires, refuelling the aircraft or troubleshooting cockpit errors. Different parts of the aircraft should also react to interactions, such as a door opening in response to a gesture.




c) The system should provide the tools to complete a maintenance task. The student should receive a set of instructions from the teacher and fulfil the requirements to complete the task. This needs to be done by using the instruction manual together with virtual tools such as a screwdriver.

1.4 Requirements

The project was to complete an in-depth survey of the current state of VR and AR technologies, with documentation of their advantages and disadvantages. From the survey, the investigation should indicate which technology is appropriate for a practical technician training application.

A prototype application should be created to demonstrate the capabilities of the recommended technology concept and attempt to implement the three scenarios mentioned in Section 1.3 Objective.

1.5 Division of Labour

For the survey, the workload was shared equally: each author collected and summarised the relevant data on their own, which was then discussed jointly. The major part of the planning and implementation of the prototype was accomplished in an agile way using pair programming. Occasionally the labour was divided, with the authors sharing the work equally between manipulating 3D models and producing code for the prototype application.



2 State-of-the-Art Survey

2.1 Introduction

This survey used a qualitative research method to find appropriate data on the topic, by searching for related papers using backward reference searching. The keywords that were used related to virtual maintenance training. The backward reference search enabled expansion of the research field, bringing in new areas to analyse, such as the medical field and others.

This chapter starts with a brief introduction to the definitions of Virtual Reality (VR) and Augmented Reality (AR) in Section 2.2, followed by guidelines for implementing an application in both fields in Section 2.3. The survey is structured such that Sections 2.5 and 2.6 introduce virtual reality and augmented reality respectively, with an explanation of available hardware, before presenting available implementations regarding practical virtual training. It is recommended to read both of these sections in order to get a better view of the available technology in both fields before continuing to the results of the survey. Section 2.7 is devoted to the results of the presented implementations and how virtual or augmented reality may perform with regard to different factors in practical procedure training. Finally, the conclusions of the study are presented in Section 2.8.

2.2 Clarification of Virtual and Augmented Reality

The following section further describes the meaning of VR and AR and where each ends up in the Reality-Virtuality Continuum. The presented continuum (Figure 2.2.1) is a prescribed system to elucidate the meaning of the extremes it includes. Four classifications of environments and realities are shown, where mixed reality comprises the environments between the real and the virtual environments, called AR and Augmented Virtuality (AV) [3].

The real environment is the natural environment one can see with one's own eyes, consisting only of real objects, devoid of any artificial amplification. VR is described as an environment consisting entirely of virtual objects, which can be observed through a display [3].

Between the real and the virtual environments, there are environments where the virtual objects coalesce with the real objects, offering the ability to use, in real time, real objects as virtual objects and vice versa. These environments are called AR and AV.

Figure 2.2.1: The RV Continuum. Source: [4]




There is no definite agreement on the definitions of these two environments; however, three factors have been defined to enable a decisive explanation of how one may differentiate between them. The three factors are:

Extent of World Knowledge (EWK): the recognition of the world and complementary objects

Reproduction Fidelity (RF): the visual quality of the reproduced real and virtual objects

Extent of Presence Metaphor (EPM): the amount of presence or immersion the observer should feel

From this information, it is possible to define AR as a virtual environment for the observer that allows virtual objects to be displayed in a real environment, placing it closer to the real-environment side of the Reality-Virtuality continuum. In contrast, Augmented Virtuality enables the observation of objects or data from the real environment in a virtual environment setting. Depending on the weight of each of the previously mentioned factors, an environment lies closer either to the real environment or to the virtual environment [3].

While there are studies in specific areas of mixed reality for AV, this chapter will focus on VR and AR from a first-person perspective, exploring how the technologies work in industrial, educational and other similar contexts.

2.3 Guidelines for Training in VR and AR

Implementing a system in VR or AR requires several factors to be fulfilled to offer a realistic experience fit for technician training environments. Gavish et al. [5] have introduced four guidelines to follow when designing VR or AR applications for maintenance training. By following the guidelines it is possible to increase training efficiency, enhance skill acquisition and enable the users to develop a useful model of the task. The four guidelines are:

• Proper integration of observational learning

• Enactive learning by combining physical and cognitive fidelity

• Providing guidance aids

• Providing sufficient information about the task

Having these guidelines, as well as guidelines from Oculus [6] and Leap Motion [7], provides a proper way of analysing the different related implementations and helps with design decisions for the prototype.

2.4 Architecture

The VR and AR implementations observed here have a similar architecture, which takes input data and processes it in a simulation, producing output for the user. As an example, the user may provide input by moving their hands or changing their position. Based on this input, the simulation calculates the new state of the virtual entities and outputs a new representation of the virtual environment to the user. A diagrammatic overview of the inputs and outputs between a user and the devices (Figure 2.4.1) may give a better understanding of how a typical VR or AR system works.




Figure 2.4.1: Basic VR and AR System Overview
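To make the input-simulate-output cycle above concrete, the following minimal C# sketch models one pass through such a system. C# is chosen because the prototype described later in this report is built in Unity; all type and method names here (UserInput, ReadTrackers, UpdateWorld, Render) are illustrative placeholders rather than part of any real SDK.

using System;

struct UserInput { public float HeadYaw; public float HandX; }

class SimulationLoop
{
    // Stand-in for polling an HMD and hand tracker (illustrative only).
    static UserInput ReadTrackers() => new UserInput { HeadYaw = 0.1f, HandX = 0.5f };

    // Recompute the state of the virtual entities from the new input.
    static void UpdateWorld(UserInput input) =>
        Console.WriteLine($"simulating: yaw={input.HeadYaw}, hand={input.HandX}");

    // Output the new representation of the virtual environment to the user.
    static void Render() => Console.WriteLine("frame rendered");

    static void Main()
    {
        for (int frame = 0; frame < 3; frame++)   // real systems repeat this at e.g. 90 Hz
        {
            UserInput input = ReadTrackers();
            UpdateWorld(input);
            Render();
        }
    }
}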

2.5 Virtual Reality

2.5.1 Introduction

There have been multiple technological developments and experiments in recent years in the world of Virtual Reality (VR). New innovations are continuously emerging to push the development forward. VR technology has found its way into many areas of application, such as education, health care and maintenance training, to mention a few. Which specific technologies to use ultimately depends on the task at hand [8].

An important aspect of VR is the visual perception, enabled by different visualisation devices which may offer the user a visual experience of the virtual environment. Another aspect is user interaction, achieved by several devices today, providing different levels of immersion. For instance, using glove- or camera-based tracking techniques, the user is able to interact with a simulation by performing hand gestures or remotely controlling a virtual hand [9].

Haptic feedback is also used to increase immersion in a virtual reality setting. One way to achieve this is through built-in actuators in hand controllers that respond to certain events, or through special vests which can produce body-contact simulations by electrical impulses [10].

2.5.2 Hardware and Software

This section presents hardware and software used in the world of VR today. For hardware, descriptions of the most prevalent devices on the market are included, together with any tracking capabilities they may have. The software subsection describes the most used engines and development frameworks for creating VR applications.

2.5.2.1 Device definitions

Wired Head-Mounted Displays (HMDs) in VR available on the current market are dominated by the Oculus Rift and the HTC Vive. These HMDs are often used in a spacious room, connected to an external workstation, with camera trackers in the corners of the room. Positional tracking is enabled by the cameras, so that the user may walk around within artificial borders and experience six-degree-of-freedom positioning. The artificial borders keep the user from walking beyond the boundaries of the virtual environment [10].

Non-wired HMDs are the second type of HMD. Google Cardboard, Google Daydream and Samsung Gear VR are all non-wired HMDs and require a smartphone. Compared to wired devices, these devices (without additional external tracking hardware) only provide three Degrees of Freedom (DOF) of rotational tracking.

Tracking and input devices are essential in VR since they enable the virtual recreation of a user's physical actions. Technologies today enable tracking of the head, the hands or the entire body of the user to represent their actions in a responsive way. Besides tracking, there are other, more basic non-tracked input devices, which are mentioned towards the end of this subsection.

2.5.2.2 Hardware

Oculus Rift is the third-iteration HMD from Oculus and was released in 2016. It has a combined resolution of 2160x1200, a refresh rate of 90 Hz and a Field of View (FOV) of 110° [10].

A built-in gyroscope, magnetometer and accelerometer in the Oculus Rift HMD enable tracking of the user's head orientation in three DOF. Through a method called sensor fusion [11], which combines the inputs from the three mentioned sensors, the user's in-game head orientation (yaw, pitch and roll) is determined. Positional tracking of the user's head is achieved through the Constellation system developed by Oculus. This system uses micro-LEDs on board the HMD, which are tracked by at least one external IR camera (the Oculus Sensor), enabling optical tracking of the user in six DOF [6, 12].
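As an illustration of the sensor-fusion idea, the sketch below implements a complementary filter, one common textbook form of gyroscope/accelerometer fusion. Oculus' actual filter is proprietary and more sophisticated; this only shows the principle of blending a fast but drifting gyroscope estimate with a noisy but drift-free gravity reference, and the constant Alpha is an assumed tuning value.

using System;

class ComplementaryFilter
{
    const float Alpha = 0.98f;   // assumed blend factor: trust gyro short-term, accelerometer long-term
    float pitchDeg;

    // gyroRate: rotation rate in deg/s; accelY/accelZ: measured gravity components; dt: sample period in s.
    public float Update(float gyroRate, float accelY, float accelZ, float dt)
    {
        float gyroPitch  = pitchDeg + gyroRate * dt;                               // integrate the gyro (fast, but drifts)
        float accelPitch = (float)(Math.Atan2(accelY, accelZ) * 180.0 / Math.PI);  // gravity gives an absolute reference
        pitchDeg = Alpha * gyroPitch + (1f - Alpha) * accelPitch;                  // blend the two estimates
        return pitchDeg;
    }

    static void Main()
    {
        var filter = new ComplementaryFilter();
        for (int i = 0; i < 5; i++)   // five 90 Hz samples of a slow head tilt
            Console.WriteLine(filter.Update(10f, 0.17f, 0.98f, 1f / 90f));
    }
}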

In late 2016, Oculus released Oculus Touch, a native handheld input device for the Oculus Rift. In a virtual environment, Oculus Touch can represent the user's hands accurately and in an intuitive way. Positional tracking is achieved by the built-in LEDs and utilisation of the same Constellation system used for the Oculus Rift HMD; the Oculus Sensor can be used for both the Rift and the Touch. Rotational tracking for the Oculus Touch is performed using a built-in Inertial Measurement Unit (IMU), similar to the HMD, enabling tracking of the user's hands in six DOF. The device also has touch triggers, push buttons and an analogue stick enabling user interactions, e.g. gesture recognition, user orientation and grabbing virtual objects [13]. An image of the HMD and the Touch controllers can be seen in Figure 2.5.1.

Figure 2.5.1: Oculus Rift with Touch. Source: [14, 15]




HTC Vive was released in 2016 and is the main competitor to the Oculus Rift. Like the Oculus Rift, it has a combined resolution of 2160x1200 with a refresh rate of 90 Hz, and the FOV is 110° [16, 10]. The distinct difference from the Oculus Rift is the tracking technique used.

The HTC Vive HMD uses a positional tracking technique called 'Lighthouse', developed by Valve. The technique resembles maritime navigation, where ships near the shore navigate by timing the flashes from nearby lighthouses. Similarly, Lighthouse requires the use of so-called base stations (acting like lighthouses), which accompany the HTC Vive. The base stations, attached to corners of the walls of an open room, track peripherals by first performing an omnidirectional flash. Each base station then transmits alternating horizontal and vertical sweeps of IR lasers across the room, triggering the photo-sensors on the peripherals. By comparing the timing of when the sensors are activated, the exact position and orientation can be calculated with high accuracy and low latency [17, 18].
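The timing principle can be made concrete with a small calculation. Assuming the commonly cited sweep rate of 60 rotations per second (a figure not stated in the cited sources here), the angle between the sync flash and a laser hit on a photo-sensor follows directly from the elapsed time; the sketch below illustrates only this geometry, not Valve's implementation.

using System;

class LighthouseAngle
{
    const double SweepsPerSecond = 60.0;   // assumed rotation rate of the sweeping laser

    // Fraction of a full rotation completed since the sync flash, in degrees.
    static double AngleDegrees(double secondsSinceSync) => 360.0 * SweepsPerSecond * secondsSinceSync;

    static void Main()
    {
        // A photo-sensor hit 2.5 ms after the sync flash lies 54 degrees into the sweep.
        Console.WriteLine(AngleDegrees(0.0025));
        // One horizontal and one vertical sweep give two angles per sensor, i.e. a ray
        // from the base station; several sensors per device then fix position and orientation.
    }
}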

Communication with the workstation is done through a Linkbox, which powers the HMD and connects through USB and HDMI. The Linkbox also communicates with the base stations through Bluetooth to activate or deactivate them, since they only require power to operate.

Vive Tracker is an upcoming device which can be used to track any object of choice. It may be attached to a specific item or to the body and provides an easy way to integrate a full-body experience into the virtual environment. However, tracking the entire body requires multiple Vive Trackers to give an accurate estimation of the body in the virtual environment. It uses the same 'Lighthouse' technique for positional tracking as the HTC Vive HMD and controllers [19].

The HTC Vive controllers were released together with the HTC Vive HMD in 2016 and include 24 sensors, a multi-function trackpad, a dual-stage trigger and haptic feedback. They are used for interaction with the virtual environment using the hands, but give a visual representation of the controllers instead of the hands [20]. Figure 2.5.2 displays the Lighthouse base stations together with the HMD and the controllers.

A wireless implementation of the HTC Vive HMD has been developed in cooperation with Intel and will be released in 2018. This removes the tether to the workstation and opens up further possibilities for developing an efficient training system with this device.

Figure 2.5.2: HTC Vive with controllers and Lighthouse base stations. Source: [21]




Samsung Gear VR was developed in cooperation between Oculus and Samsung. It is a mobile, tetherless HMD that requires a connected Samsung smartphone to run. Only recent Samsung smartphones (Galaxy S6 or later) are supported. The HMD provides the user with a horizontal FOV of 101° [22].

By utilising the built-in gyroscope, accelerometer and proximity sensor of the connected smartphone, the Gear VR enables rotational tracking of the user's head (three DOF).

The cooperation between Samsung and Oculus has also resulted in a VR controller intended for the Gear VR. This controller uses the same tracking technique as the Oculus Rift, through sensor fusion. Additionally, the Gear VR has a built-in touchpad on the side for interacting with the virtual environment, and it communicates with the smartphone through Bluetooth. Figure 2.5.3 shows the Gear VR HMD.

Figure 2.5.3: Samsung Gear VR. Source: [23]

Leap Motion is a pocket-sized, rectangular, USB-connected device marketed as an accurate hand-tracking tool in a VR or desktop context (Figure 2.5.4). This device, in conjunction with the associated API, can determine the position of a user's hands and fingers in a Cartesian space in real time. The Leap Motion controller has three IR LEDs and two IR cameras providing stereo vision, which enables optical tracking in six DOF [24]. The captured data is a grayscale stereo image, which is streamed through the USB connector to the computer, where the Orion Software Development Kit (SDK) takes over and performs mathematical calculations with tracking algorithms to reconstruct a 3D representation of what the device sees [25].

Figure 2.5.4: Plugged Leap Motion controller. Source: [26]
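A hedged sketch of how an application might poll this data through the Leap Motion C# API is shown below. The class and property names (Controller, Frame, Hand, PalmPosition) follow the Orion SDK documentation, but exact names vary between SDK versions, so treat this as an assumed sketch rather than verified reference code.

using System;
using Leap;

class HandPolling
{
    static void Main()
    {
        var controller = new Controller();   // connects to the background Leap tracking service
        Frame frame = controller.Frame();    // most recent tracking frame
        foreach (Hand hand in frame.Hands)
        {
            // Palm position in the device's Cartesian coordinate system (millimetres).
            Console.WriteLine($"{(hand.IsLeft ? "left" : "right")} palm at {hand.PalmPosition}");
        }
    }
}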




Microsoft Kinect V1 enables tracking of the user's entire body through an RGB-D camera (Figure 2.5.5a). The captured data is converted into a depth map through structured light, from which the body position is inferred through machine learning. The later model, V2 (Figure 2.5.5b), uses a built-in Time-of-Flight camera, which emits light and detects the reflected light. The time between emission and detection is then measured to determine the position of predefined objects [27].

Figure 2.5.5: Microsoft Kinect V1 & V2. Source: [28, 29]
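The Time-of-Flight principle mentioned above reduces to a one-line formula: distance = (speed of light × round-trip time) / 2. The sketch below is a purely illustrative worked example of that formula (in practice the Kinect V2 realises it with modulated light rather than timing single pulses):

using System;

class TimeOfFlight
{
    const double SpeedOfLight = 299_792_458.0;   // metres per second

    // Halve the round trip: the light travels to the surface and back.
    static double DistanceMetres(double roundTripSeconds) => SpeedOfLight * roundTripSeconds / 2.0;

    static void Main()
    {
        // Light reflected by a surface about 2 m away returns after roughly 13.3 ns.
        Console.WriteLine(DistanceMetres(13.34e-9));   // prints ~2.0
    }
}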

2.5.2.3 Software and Development Frameworks

There are several software applications which are widely used when designing for VR. The prominent game engines today are Unity [30] and Unreal Engine 4 [31], which are cross-platform engines used for game development. Unity and Unreal Engine support importing plugins and packages that make it easy to create a VR application for different HMDs.

Software Development Kits (SDKs): short descriptions of a few selected SDKs and the APIs they use.

• OpenVR [32] is developed and controlled by Valve to let HTC Vive and other virtual reality headsets run the same VR application. Essentially, OpenVR consists of two APIs and acts as a mediator between other software and devices. The first API, the OpenVR API, allows communication between an application and the software which communicates with the drivers of an HMD. The second API, the OpenVR Device API, allows communication between the HMD software, such as SteamVR, and the drivers of the HMD. (A minimal initialisation sketch is given after this list.)

• Open Source Virtual Reality (OSVR) [33] is developed by Razer and Sensics and is similar to OpenVR in enabling different devices to work with the same game, with the main difference being that it is open source.

• Oculus SDK [34] is used to enable installation of and communication with the drivers for using the Oculus Rift. Oculus has one API, which communicates with the Oculus Runtime, which in turn communicates directly with the HMD and the tracking system.

• SteamVR [35] is closed-source software by Valve used to enable communication with the drivers of different HMDs, but primarily the HTC Vive, by using the OpenVR API. Communication between the HMD device, the HTC Vive, and SteamVR does not require the use of the OpenVR Device API, however, since the drivers for the HMD and tracking devices are already implemented in SteamVR.




The mentioned SDKs allow flexible integration with software such as Unity and Unreal Engine.
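As a minimal sketch of what communicating through the OpenVR API looks like in practice, the following C# fragment uses Valve's C# binding (openvr_api.cs, distributed with the SteamVR plugin) to initialise a session and query device poses. The call names follow that binding but may differ between versions, and engines such as Unity normally wrap these calls, so this is an assumed low-level illustration rather than recommended application code.

using System;
using Valve.VR;

class OpenVRInit
{
    static void Main()
    {
        // Start an OpenVR session; the runtime (e.g. SteamVR) must be installed.
        EVRInitError error = EVRInitError.None;
        CVRSystem system = OpenVR.Init(ref error, EVRApplicationType.VRApplication_Scene);
        if (error != EVRInitError.None)
        {
            Console.WriteLine($"OpenVR init failed: {error}");
            return;
        }

        // Ask the runtime for the current pose of every tracked device.
        var poses = new TrackedDevicePose_t[OpenVR.k_unMaxTrackedDeviceCount];
        system.GetDeviceToAbsoluteTrackingPose(ETrackingUniverseOrigin.TrackingUniverseStanding, 0f, poses);
        Console.WriteLine($"HMD connected: {poses[OpenVR.k_unTrackedDeviceIndex_Hmd].bDeviceIsConnected}");

        OpenVR.Shutdown();
    }
}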

2.5.3 Related Studies and Implementations

The following section describes several studies in the field of VR from recent years and the conclusions they reached with their implementations.

Before presenting the different studies, a clarification of the mentioned platforms may be useful. When talking about virtual systems, two types are often mentioned: Commercial off-the-shelf (COTS) and Cave Automatic Virtual Environment (CAVE). COTS platforms typically describe systems that can be purchased commercially at a low cost and are often used by only one user. CAVE platforms are projection-based systems where a virtual environment is projected on surrounding walls by projectors, and several users may experience the same virtual environment together. These acronyms will be used frequently throughout the text, but the focus will mostly be on COTS platforms.

2.5.3.1 Maintenance and Inspection Operations

A practical training environment called the Virtual Teaching and Training System (VTTS) was developed by Quan et al. [36] to teach aircraft maintenance. VTTS is computer-based, built on VR technology and real-time education, and is similar to a CAVE. This particular VTTS uses a projective display and an LCD display with workstations and an aircraft simulator. The VTTS configuration can realise numerous teaching contents and subjects. Similar to the Virtual Maintenance Trainer (VMT), it can implement a system with various routine operations, emergency situations and examinations. It is also possible to render a virtual scene similar to real-world operations and work conditions. Using this system, the authors conclude that this kind of maintenance training application reduces man-made errors, aircraft damage and accidents. VTTS makes it possible to use virtual objects instead of real aviation equipment, which lowers the cost while essentially maintaining the same study efficiency. This type of learning environment also increases study efficiency compared to traditional methods, e.g. text and multimedia.

For maintenance, Chang et al. [37] have suggested a solution for training in a virtual environment. In their implementation of a practical procedure training system for substation maintenance, they present a concept closely resembling VMT. Their system simulates models of equipment, failures that may occur, and procedures to be completed by the student. A training and examination management tool is built into the system to help the teacher keep track of each student. A mouse and keyboard are used as input devices for the platform. The platform communicates with a server, running on a Linux operating system, where all the computing is done. The scenario data (models, faults or tests) used by the server is stored in a database. The clients (teachers and students) use workstations with the Windows operating system.

The system provides many tools for the teacher to monitor training procedures and manage training archives and teaching plans. Compared to the teachers, students have an interface with limited access, but with the modules for training or examination of a procedure. A virtual toolbox (pliers, screwdrivers, duct tape etc.) is also provided during the simulation, which enables the student to use the appropriate tool during a certain procedure. In the current implementation, the information is relayed on a monitor, but it could very well be displayed on an HMD instead.

A recent study of immersive VR maintenance training conducted by McNamara et al. [38] used the Oculus Rift (Development Kit 2), the Kinect V2 and the game engine Unity. The study focuses on tracking a human body using the Kinect V2 and examining the positional accuracy and system delay during use. Positional accuracy enables realistic representations of corresponding user actions. System delay is the time between a physical user action and the moment the action is displayed on the HMD; it is an important factor in avoiding simulator sickness, preventing the user from becoming dizzy [39]. Understanding the problems with system delay may aid in understanding the limitations of a VR system.

Ten participants interacted with a virtual model depicting an arresting gear engine (an engine used to decelerate an aircraft when it lands). Interactions included walking around the model and trying different body positions to test the capabilities of the Kinect V2.

The results showed that the positional accuracy of the Kinect V2 was good enough for realistic movement in the virtual environment when tracking a single user facing the sensor. However, specific body positions, such as crouching, turning away from the sensor or blocking its view, would cause the Kinect to lose tracking. Furthermore, it was unable to detect small gestures, which made it unsuitable for training where high accuracy is required.

On the other hand, a considerable amount of system delay arose between an action performed by the user and the rendering of that action on the HMD. The Kinect V2 delivers data at 30 Hz, compared to the HMD's 75 Hz. As a consequence, user-performed actions in this type of system may take longer to update and cause disorientation, which may lead to simulator sickness [6]. A contributing factor is the stereoscopic rendering on the Oculus DK2, which requires the computer to render the scene twice per frame, placing excessive demand on the CPU and GPU. To alleviate this, the authors suggest using a computer with better specifications, with emphasis on a better-performing CPU, GPU and RAM. The time spent preparing the image for the HMD would then be reduced significantly. Additionally, the system performed better on a computer running Windows 8.1 with a dedicated graphics card, due to driver compatibility differences between the hardware at the time.

This type of training system may be suitable when the user needs to become familiar with a piece of equipment in a virtual environment by walking around it, or by interacting with it using large, distinct motions.

Continuing on the topic of maintenance, Borsci et al. [40] made a study of a train car service procedure with an implemented virtual-based training tool called LARTE-VBT. It is based on the HoloVis game engine (InMo) and enables users to visualise and interact with CAD models on different devices. The study compared three devices: a CAVE platform, the Oculus Rift and a tabletop display.

In the CAVE system there were four walls, each with two projectors which projected the virtual environment. Nine workstations were used to visualise the virtual world, and the user had to wear polarised glasses with built-in trackers for interaction. The tabletop display was a device named zSpace, a 24-inch LCD running at 120 Hz; a stylus was used as a controller and, as in the CAVE, polarised glasses were used as a tracking device. The Oculus Rift was used together with desktop controllers and joysticks. 60 participants were divided into three groups and given the task of performing random maintenance operations on parts of a train car. One group used the CAVE system, the second group the tabletop display and the third group the Oculus Rift.

The conclusion of the study presents all three devices as good learning experiences, with the largest difference coming down to price. While the CAVE is a very expensive alternative (>50k €), the tabletop display has a more reasonable price (<6k €) and the Oculus Rift with a workstation comes at the cheapest price point (<2k €). The authors also mention the importance of end-user acceptance when using VR systems for learning, since the success of such a system depends on how comparable the user believes the virtual training is to the real world.

Cabral et al. [41] present a simulator system for maintenance of power grid distribution lines which follows most of the guidelines in Section 2.3 well. The Oculus Rift is used to visualise the environment. Interaction with the environment is enabled by tracking real tools, which receive virtual depictions, to give the user a realistic feeling while learning. The tracking system is based on a highly accurate infrared camera system called OptiTrack. The body of the student is also tracked and depicted as an avatar in the virtual environment, allowing the teacher to assess whether the maintenance is done correctly. Both the teacher and the student have their own program interface.

Through their control of the training, the teacher can allow different situations to occur, creating different challenges in the maintenance lessons. Challenges may include doing the maintenance in different weather conditions, with structures on fire, with electrical arcs and short-circuits, or with other obstacles such as trees and cars. From that, the teacher is also able to see how the student reacts to and solves the problem.

The student performing the maintenance training uses the Oculus Rift to see the virtual world and carry out the required operations. Other students may watch what the training student is doing on an external screen, so that they can learn from each other's mistakes.

Dassault Systèmes is a worldwide aerospace company, producing both passenger and fighter jets. Recently they presented a virtual simulator developed for virtual maintenance training for their Falcon jet, using their own 3DExperience platform. It is designed to be an inexpensive alternative for aircraft maintenance training compared to their already implemented CAVE system. It uses CATIA CAD data to provide an immersive training experience for engineers and technicians.

The system uses the Oculus Rift HMD together with a front-attached Leap Motion to provide hand-tracking capabilities, enabling real-time interaction for the users. Constellation tracking is also used with the Oculus Rift to provide a six-DOF virtual experience. Multiple instructors or trainees are allowed in one session, each represented by an avatar in the virtual world. The Leap Motion tracker also allows users to see their own hands and the hands of others, which is useful when the teacher needs to point at a certain object, or when a trainee points at an object while asking questions about it.

Interaction in the world is enabled by a menu, spawned by facing the left palm towards the Leap Motion controller. From the menu, to mention only a few things, the user is able to (i) set the transparency of the virtual model, (ii) select different data to show, (iii) recolour models to make some parts more distinct from others, (iv) bring the trainees to the instructor's position, and (v) spawn a pointer steered by the HMD motion to point at or select an object further away from the hand. Movement in the virtual world is done with the arrow keys on a traditional keyboard [42].

2.5.3.2 Educational Benefits

A proposal for a virtual education environment was put forward by Crespo et al. [43], testing the benefits of a VR simulator. The simulator allows engineering students to perform interactive operations on an industrial robot. The hardware used was the Oculus Rift HMD and Razer Hydra joysticks. The application was designed using the game engine Unity. The Oculus SDK was used for configuring the Oculus Rift, while the Sixense SDK enabled programming six-DOF motion tracking of position and orientation with the Razer Hydra controllers. Students who tested the system executed a number of interaction tasks on the virtual robot with the controllers, moving the robot around in the virtual environment or making it perform certain tasks, such as moving items.

It was concluded that this system has promising possibilities of becoming a reliable engineering learning tool. The software and hardware used enable the system to be highly flexible, working with different platforms, operating systems and controllers. Furthermore, it allows mistakes to be made and makes it possible for the student to experience what will happen if something goes wrong. Through the immersive interaction, it enabled swift learning of specific routines and reduced the need to use real hardware.

Alhalabi [44] also made an educational study with 48 students, comparing different VR systems. The VR systems were provided by WorldViz, a worldwide provider of VR systems. They included the Corner Cave System (CCS, a projector-based VR system) and an Oculus Rift DK2. The CCS requires the users to wear special shutter glasses equipped with precision position-tracking 'eyes' to see and interact with the virtual world through the projected screens. The students were divided into groups with (1) no VR, (2) HMD with tracking, (3) HMD without tracking and (4) CCS with tracking. The tracking system was set up with eight cameras for user tracking in a spacious room. The students were asked to study different engineering materials and learn about new concepts. The results showed that learning efficiency increased significantly by using VR systems. The CCS would be an appropriate alternative for multiple users at a time; however, the HMD system is less expensive, with the additional advantage of providing a more immersive environment for the user.

Additional virtual training implementations have included the Oculus Rift together with Leap Motion (and Kinect V2) to provide an effective way of training for situations in construction and fire emergencies. [45] used Unreal Engine 4 in their implementation of an inexpensive virtual environment for an engineering and construction setting. [46] used OpenGL to implement a web-based training system for fire wardens, to offer a safer alternative for training the cognitive decisions made during a fire in a building. Both of the projects used Industry Foundation Classes (IFC) to define CAD-graphic data, related to architecture and construction, as 3D objects.

2.5.3.3 Input Practices

In terms of user control, Khundam [47] implemented a system using the Oculus Rift DK2 together with Leap Motion to provide a way of controlling movement inside a virtual environment with hand-palm gestures. Using Unity, an application was created to provide a virtual space for the user to move in freely. 11 participants tested the system, with two different scenes for each of the input methods. In the experiment, the subjects' mission was to move around the virtual world in a fixed order. The evaluation showed that, using the Leap Motion palm-control gesture, the users were able to complete the missions faster than with the gamepad, as the gesture enabled finer control of the movement speed. However, due to the smaller FOV of the Leap Motion, the users had to face the HMD towards their hands when performing the specific gestures.
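As a sketch of how such palm-gesture locomotion can be mapped to movement (the mapping below is an assumption for illustration, not Khundam's exact scheme), the tilt of the palm can scale the movement speed continuously:

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Hypothetical palm-controlled locomotion: tilting the palm forward
    // moves the rig in the gaze direction, with speed scaled by the tilt.
    public class PalmLocomotion : MonoBehaviour
    {
        public LeapProvider provider;
        public Transform rig;            // camera rig to move
        public float maxSpeed = 2f;      // metres per second

        void Update()
        {
            foreach (Hand hand in provider.CurrentFrame.Hands)
            {
                Vector3 normal = hand.PalmNormal.ToVector3();
                float tilt = Vector3.Dot(normal, Camera.main.transform.forward);
                if (tilt > 0.2f)   // dead zone so a flat palm does not move the user
                    rig.position += Camera.main.transform.forward
                                    * tilt * maxSpeed * Time.deltaTime;
            }
        }
    }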

Consequently, Alshaal et al. [48] present their own platform, SmartVR, as an alternative to the limitations of currently available input methods and to expensive wearable devices. In particular, the FOV limitations of the Leap Motion were mentioned as making it less effective. The SmartVR platform was used together with the Microsoft Band 2 wristband, a VR workstation and an Oculus Rift DK2. A Bluetooth transceiver was used for communication between the workstation and the wristband. Unity was used to develop an application of a virtual store, where the user can switch between two modes by performing a distinct gesture. The first is a selection mode, which lets the user select virtual items with a spawned cursor representing the hand, while the camera is static; a menu is provided to manipulate the selected object, such as changing its colour or size. The second is a navigation mode, where the user may perform a swiping gesture to move around, travelling in the direction of the swipe. Using a wearable device in this way, such as the Microsoft Band 2, enables the hands to be tracked regardless of the orientation of the HMD and allows the user to be less limited when performing the gestures.


2.6 Augmented Reality

2.6.1 Introduction

Augmented Reality (AR) is a technique which augments what the user can see, feel and hear. Usually, an AR application involves visualisation of virtual objects superimposed on the real world, mainly to supplement the real world with useful information [49, 50].

The occurrence of virtual objects displayed in an AR application, together with their pose and position, can be based on camera-recorded fiducial markers which operate as positional reference points [51, 50]. A marker may consist of a picture or pattern which is recognisable by the particular AR application, such as QR codes (Quick Response codes), BIDI codes (BIDimensional) or bar codes [52].

A virtual object's appearance can also be based on marker-less tracking. For instance, external stationary cameras can be used for recognition and tracking of real-world objects. Based on detected objects in the video recording, the occurrence, position and pose of a virtual object can be determined [51]. Marker-less positioning of virtual objects can also be achieved through the use of accelerometers, GPS, compasses, etc. [50].

User interaction in an AR application can be implemented in similar ways as in Virtual Reality (VR), such as through gesture tracking, voice commands, hand controllers and touch devices [53].

2.6.2 Hardware and Software

In this section, modern AR-related hardware and different development frameworks will be presented.

2.6.2.1 Device Definitions

A Head-Mounted Display (HMD) intended for AR can display the real world, augmented with additional virtual objects, for the user. There are both untethered models and models tethered for data transfer and/or power. An HMD can be classified as either an optical-see-through or a video-see-through device [53].

2.6.2.2 Hardware

Microsoft HoloLens is an optical-see-through device with a visor, on which virtual objects may be projected at a maximum resolution of 1268 × 720 pixels [54]. By utilising four built-in spatial-mapping cameras and a depth camera, a 3D model of the surrounding environment is created. The geometry of this 3D model is used to determine the position and pose of virtual objects (also called holograms in the HoloLens context) and for positional head tracking. Rotational head tracking can be achieved by using the built-in gyroscope, magnetometer and accelerometer. Together, these tracking techniques enable six Degrees of Freedom (DOF) movement. The horizontal Field of View (FOV), in which the 3D objects may be projected, is 30° wide. Figure 2.6.1 depicts the HoloLens HMD.


Figure 2.6.1: Microsoft HoloLens. Source: [55]

User interactions can be performed by looking in a particular direction while performing a certain finger gesture (recorded by the built-in camera) or through voice commands via Cortana, Microsoft Windows' voice-activated assistant [56, 57]. Microsoft's HoloLens is completely tetherless and probably the best known of the currently few available AR HMDs. HoloLens can be bought as a development kit, fit for an individual developer. For enterprises a similar kit can be purchased, but with a warranty and features like higher security [58]. HoloToolkit, a set of APIs towards the HoloLens utilities, enables development of applications. These APIs can be integrated into Unity.
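As an illustration of how such finger-gesture input reaches application code, Unity exposes a GestureRecognizer for the HoloLens (namespace UnityEngine.VR.WSA.Input in Unity 5.x, UnityEngine.XR.WSA.Input in later versions). The minimal sketch below simply logs each air-tap; the class name is illustrative.

    using UnityEngine;
    using UnityEngine.VR.WSA.Input;  // UnityEngine.XR.WSA.Input in later Unity versions

    // Minimal sketch: subscribe to the HoloLens air-tap gesture.
    public class TapHandler : MonoBehaviour
    {
        GestureRecognizer recognizer;

        void Start()
        {
            recognizer = new GestureRecognizer();
            recognizer.TappedEvent += (source, tapCount, headRay) =>
            {
                // React to the tap, e.g. select whatever the head ray hits.
                Debug.Log("Air-tap detected");
            };
            recognizer.StartCapturingGestures();
        }

        void OnDestroy()
        {
            recognizer.StopCapturingGestures();
            recognizer.Dispose();
        }
    }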

Meta 2 is similar to the HoloLens: an optical-see-through device with common properties and a maximum resolution of 2560 × 1440 pixels distributed over a 90° wide FOV [59]. One difference is that the Meta 2 is not tetherless when running and requires a connection to a workstation with Windows 8 or 10. The real-world environment features are captured using a built-in computer-vision camera. This camera enables virtual object positioning through a modified simultaneous localisation and mapping (SLAM) algorithm. The movement of the wearer is tracked by using a built-in inertial measurement unit. Together, this provides six DOF [60]. Figure 2.6.2 shows the Meta 2 HMD with the connection cable.

Figure 2.6.2: Meta Company's Meta 2. Source: [61]


Google Glass is an optical-see-through HMD that appears like a regular pair of glasses and can provide an image of virtual objects, displayed on the built-in 640 × 360 pixel prism-projector screen just above the user's natural line of sight (Figure 2.6.3) [62]. To the user, the built-in screen appears comparable to a 25-inch high-definition screen viewed from a distance of 2.4 meters [63]. Similar to Google Cardboard, Google Glass captures video in the direction the user is facing with a built-in camera. The recorded video can then be used for object recognition, video conversations, etc. User interactions can be performed by voice commands or through a built-in touchpad located on the side of the spectacle frame [53]. Using the Android API also enables rotational head tracking by utilising a built-in gyroscope, accelerometer and magnetometer [64].

Figure 2.6.3: Google Glass. Source: [65]

Google Cardboard is a primitive video-see-through HMD (Figure 2.6.4) that requires an attached smartphone to be used for AR. Video of the real world can be captured by the camera of the smartphone and displayed on the screen in real-time, augmented with additional virtual objects. The experienced horizontal FOV can vary between 90° and 120°, depending on the attached smartphone [53].

Figure 2.6.4: Google Cardboard. Source: [66]


Tablets and smartphones also offer the possibility to be used for AR applications. The user observes the screen, which displays a video of the environment captured by the built-in camera [67]. A tablet or smartphone device itself can be utilised for user interactions, which may be performed using the touchscreen or, by voice commands, the microphone, depending on the AR software used [53].

The Leap Motion controller can, as in VR, be used for achieving natural user interactions in AR applications [24, 9].

2.6.2.3 Software and Development Frameworks

Vuforia is a proprietary AR Software Development Kit (SDK) that uses image recognition through computer-vision technology for creating virtual objects. A Vuforia mobile app utilises the built-in camera to capture the environment. It can recognise a 2D image (normally a photo, drawing, label, etc.) or 3D objects like cylinders and cuboids by reaching a cloud database where predefined 2D or 3D data for recognition is stored. The program uses the data as a reference point to determine the position and pose of the virtual object that will be imposed on the real environment. Vuforia supports native development for iOS and Android but also enables integration with the Unity editor for targeting both platforms [50].
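In Unity, this recognition is typically consumed through a trackable event handler attached to the target. A minimal sketch of the common pattern (the field names are illustrative) toggles the superimposed content when the marker is found or lost:

    using UnityEngine;
    using Vuforia;

    // Sketch of the common Vuforia pattern: show or hide the superimposed
    // 3D content when the tracked marker is found or lost.
    public class TargetHandler : MonoBehaviour, ITrackableEventHandler
    {
        TrackableBehaviour trackable;
        public GameObject virtualContent;   // the 3D object anchored to the marker

        void Start()
        {
            trackable = GetComponent<TrackableBehaviour>();
            trackable.RegisterTrackableEventHandler(this);
        }

        public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                            TrackableBehaviour.Status newStatus)
        {
            bool tracked = newStatus == TrackableBehaviour.Status.DETECTED ||
                           newStatus == TrackableBehaviour.Status.TRACKED;
            virtualContent.SetActive(tracked);
        }
    }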

ARToolKit is a multi-platform, open-source (LGPLv3) library for development of marker-based AR applications. Similar to Vuforia, ARToolKit uses image recognition for determining the occurrence, position and pose of virtual objects, but can only recognise 2D images, not 3D objects. The library allows for integration into the Unity editor, providing a convenient way of developing AR applications for any platform [50].

D'Fusion is a proprietary, multi-platform AR engine created by Total Immersion. The engine supports both marker-less and marker-based tracking. One of its strengths is well-developed human face recognition and tracking. To develop D'Fusion-based applications, the GUI-based tool D'Fusion Studio is used [68, 50].

2.6.3 Related Studies and Implementations

2.6.3.1 Usages in Assembly, Maintenance and Inspection Work

Aircraft maintenance training activities are closely related to maintenance and inspection work tasks in e.g. the automotive or aviation industry. Ramakrishna et al. [53] propose a framework for inspection work using AR techniques. The framework consists of an HMD or tablet that streams video images to a back-end server over a UDP connection. The server runs the images through a Convolutional Neural Network (CNN) to detect certain objects for inspection. When an object is recognised by the server, modelling data is sent to the HMD/tablet, providing a guiding graphical overlay for the user.
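The client side of such a framework can be sketched in a few lines of Unity C#: each captured camera frame is JPEG-compressed and sent to the back-end server as a UDP datagram. The address, port and single-datagram framing below are assumptions for the example; the paper's own protocol details are not reproduced here.

    using System.Net.Sockets;
    using UnityEngine;

    // Illustrative client sketch: stream JPEG-compressed camera frames
    // over UDP to an inspection server running the object detector.
    public class FrameStreamer : MonoBehaviour
    {
        UdpClient client;
        WebCamTexture cam;

        void Start()
        {
            client = new UdpClient();
            client.Connect("192.168.0.10", 9000);   // hypothetical CNN server
            cam = new WebCamTexture();
            cam.Play();
        }

        void Update()
        {
            // Copy the current camera image and compress it before sending.
            Texture2D frame = new Texture2D(cam.width, cam.height);
            frame.SetPixels(cam.GetPixels());
            frame.Apply();
            byte[] jpeg = frame.EncodeToJPG(50);
            if (jpeg.Length < 65000)                 // keep within one UDP datagram
                client.Send(jpeg, jpeg.Length);
            Destroy(frame);
        }
    }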

An experimental study including this framework was performed by the same authors, where three different AR platforms were used: Google Glass, Google Cardboard and an Android tablet. The experiment was carried out with 20 subjects (engineers and research staff from an industry research lab). Their mission was to perform an inspection task on a 3D printer. To create an inspection-guiding graphic overlay, the user initially scanned a QR code, placed on the printer, which provided user-manual data for the specific printer model. The user then had to inspect different parts of the printer and answer a set of questions through platform-specific interactions. The built-in touchpad was used for Google Glass, voice commands for Google Cardboard and the touch screen for the Android tablet [69, 53].

Observations during this experiment show that stress was reduced when using AR with graphic overlays containing the needed information, compared to looking through extensive paper or PDF manuals to complete complex tasks. The Android tablet performed best of the three platforms, since the users found it the most intuitive. Google Cardboard, on the other hand, caused eye-sickness, gave the user a claustrophobic feeling, and its low graphics resolution aggravated the user's ability to read printed text on the printer parts. A low FOV gives the user a cramped vision, which can reduce awareness and be dangerous in rough industrial environments. Voice-command interactions also work poorly in crowded or noisy areas. The Google Glass was hard to fit over regular safety glasses and its swipe patterns were difficult to remember. It is a more expensive platform than the others in this experiment but could be convenient in situations where both hands are needed for work tasks. In summary, the tablet was the best candidate in this experiment due to its simplicity and measured inspection turnaround time [53].

Industrial assembly work, like the previously mentioned inspection work, can be related to maintenance training. Loch et al. [70] analyse a training tool that provides step-by-step guidance for assembly tasks in an AR setting. It consists of a computer screen, placed in front of the user, which displays a real-time camera recording of the workspace for the assembly task with an additional virtual graphic overlay for assembly guidance. The camera is mounted above the workspace, pointing downwards. A study was conducted to evaluate and compare AR-based and video-based guidance for assembly tasks. 17 undergraduate students from social-science and technical disciplines, without any previous experience of AR guidance, participated in the study. Before the experiment, preparation training was performed to make the participants more familiar with the AR-guidance system. The experiment involved a practical part, where the participants were to assemble two different models using Lego blocks.

Measurements in the study showed that the average time to complete an assembly task was longer with the video-based instructions than with the AR guidance, and the average number of errors per assembly task was over ten times higher. Evidently, using AR guidance increased accuracy and task performance, due to the lower number of errors compared to using the video-based instructions. The authors draw the conclusion that, for assembly tasks, experts in a certain area are the most effective guides, but that AR-based solutions work better in several important aspects than video-based instructions or printed manuals [70].

Boeing Research and Technology (BR&T) performed several comprehensive studies that explore the usage of AR solutions in satellite and aircraft part manufacturing. For developing virtual work scenarios, BR&T used D'Fusion Studio. The choice of visualisation AR platform fell upon high-end ruggedised PC tablets, snapped into hands-free holders to enable work with both hands simultaneously. A set of eight external Vicon RGB cameras, mounted to the inner roof of a dedicated room, was used for accurate marker-less tracking [50]. For passing the captured camera input to the D'Fusion application, a custom-made Lua script was developed. Microsoft Kinect was used for low-resolution 3D modelling of objects for tracking and recognition, but the system also allows existing CAD files to be utilised. The system was tested by simulating an assemblage of an aircraft wing.

According to the BR&T studies, utilising AR techniques in ideal, engineered environments can provide more effective manufacturing processes in terms of task completion time and failure rate, although more development will be needed for the technology to be extensively applicable in mainline manufacturing industries. BR&T pointed out that an average manufacturing work site today can suffer from camera line-of-sight problems when working in narrow spaces or behind occluding vehicle parts; due to this, external cameras or sensors should not be preferred as the tracking solution. For display, tablet PCs work but do not give the user a deep level of immersion. Other, more immersive display technologies that can provide a wide FOV and high safety would be preferred in manufacturing processes [51].

Caterpillar Inc. is the world's leading mining equipment manufacturer at the time of this report. During 2015, Caterpillar, in cooperation with ScopeAR, developed an AR application for maintenance and inspection work assistance, using marker-less tracking, that can run on a smartphone, tablet or AR glasses [71]. The application provides a graphic overlay which guides the user through a certain maintenance or inspection task. The overlay consists of arrows, floating text and 3D models of machine parts involved in the task. Work instructions are given step-by-step and the application can evaluate whether the user has completed a certain step correctly [72, 73, 74].

2.6.3.2 Educational Benefits

Ibáñez et al. [49] conducted an empirical study, including 40 ninth-grade students, for the purpose of exploring student behaviours, learning effectiveness and motivation while using a marker-based AR simulation tool called AR-SaBEr. The tool can be used for learning basic electrical principles and was developed using Vuforia and Unity [50]. The experiment used Android tablets and was divided into three topics: electric current, voltage and resistance. Each topic had one reading activity and two to three experimental activities where instructions were given step-by-step in a highly guided way called scaffolding. The students were given the task to build basic virtual electrical circuits from components like batteries, fans and switches, which were visualised on marked physical blocks. By moving the blocks into the right place, while observing the augmented scene on the tablet screen, a circuit could be constructed. AR-SaBEr and similar tools can be useful as interactive learning environments and are especially important for achieving desirable results in given tasks. To help students keep focus on their goal, it is necessary to provide free experimentation but also additional scaffolding for completing tasks.

Using AR-SaBEr, Ibáñez et al. [67] also made a case study investigating the attitude of learners towards education with AR as a technical device. The subjects in this study were physics undergraduate students with no previous experience of AR. Their task was to understand and solve an electromagnetic problem which included a visual 3D cube with a force to be determined and a simulation of the trajectory of a particle.

Two single-choice questions with four alternatives each were then answered by the subjects. Before the experiments, the subjects were given some AR preparation training with a smartphone application which lets the user determine the position of a point in 3D space. The study concluded that AR helped the students to visualise the electromagnetic problem. The subjects also enjoyed it, which positively affects the attitude towards AR learning and in turn can motivate students to perform learning activities.


2.6.3.3 Object Tracking Investigations

User interaction is a vital part of an educational AR application. Natural user interactions can bring immersion to the application, and depth sensors can be used for achieving this. For detecting body or hand gestures with depth sensors, the Microsoft Kinect and the Leap Motion controller are today's most used devices.

Kim et al. [9] analyse a set of approaches for user interaction in a partially marker-based AR application. The implementation was done with the Vuforia SDK and Unity [50]. 20 persons, whereof 11 had previous experience of augmented reality, participated in the experiment. A Samsung Galaxy Tab S 10.5 was used for running the application and a Leap Motion controller to capture user interactions. Two different approaches were analysed without using the Leap Motion controller. The first, vision-based approach records the user's hands with the built-in camera, where the program interprets the different hand gestures. The second approach uses the screen of the Android device for touch-based interactions. These cases were compared to an immersive approach where the Leap Motion controller is used for tracking the hands of the user, which are virtually recreated as 3D hands in the application for natural user interactions.

Users performed three experimental tasks using the mentioned interaction approaches. The first task consisted of transforming a virtual 3D bunny into a goal position and pose. The second task involved picking up and hanging a virtual 3D torus on a pin. The last task consisted of a simple assemblage of virtual engine parts; the user was to put a piston into a conrod, insert a lock pin into a hole and then screw the pin into the hole.

The mean measured completion time for the first task was 45% longer for the touch- and vision-based methods compared to the immersive method. No measurable differences were observed in failure rate. The second task did not produce any measurable differences in performance between the interaction methods, but a wide-angle lens attached to the camera helped to keep the fiducial markers in view while performing the task. In the final task, the average measured completion times for the touch-based, vision-based and immersive methods were 225, 190 and 120 seconds, and the mean failure rates were 55%, 40% and 5%, respectively.

The results show that the immersive implementation outperforms the touch- and vision-based methods. By using this or a similar method, the user can easily, in a natural and intuitive way, manipulate virtual 3D objects in six DOF.

Garon et al. [75] investigate the HoloLens and conclude that there are limitations in the device that cannot be neglected in situations where high precision is important. It was found that the shortage of high-quality depth data is one limitation that can result in poor tracking and thus bad positioning of virtual objects. When developing for the HoloLens, a programmer is only permitted to use Microsoft's own API and cannot reach the raw data from the built-in sensors. This prevents implementation of other, better-purposed tracking functions and algorithms. To bypass this restriction, Garon et al. mounted an external Intel RealSense RGBD (Red-Green-Blue-Depth) camera on a HoloLens. The camera captured video that was processed on a connected stick PC and transferred, via Wi-Fi, to the HoloLens. The processing on the stick PC consists of a program with a sophisticated object-detection implementation that detects and estimates the pose and position of a scanned object. When testing this custom-built system, a great improvement in the detection of objects measuring close to 20 × 20 × 20 cm could be observed.


2.7 Result

This section contains comparisons between different systems depending on the objectives. While Virtual Reality (VR) and Augmented Reality (AR) can both satisfy similar needs, there are both advantages and disadvantages to using the techniques of each concept.

From the given requirements, separate areas were extracted for evaluation. Since the requirements state that the teacher should be able to demonstrate a virtual model to a group of students using AR or VR technology, one requirement is a system which supports multiple users simultaneously. Both teacher and student should be able to interact with the virtual environment, making manipulation of virtual models necessary and requiring support for user interactions. To accurately observe a highly detailed virtual model, adequate graphics and performance are desirable for both VR and AR [6]. A high-performing system for this purpose should deliver low system delay, a sufficiently high frame rate and stability in the placement of virtual objects [76]. The following topics present the advantages and disadvantages of various VR and AR systems for each particular area.

2.7.1 Multi-user Capabilities

Multi-user in VR

There have been many VR systems utilising the Cave Automatic Virtual Environment (CAVE) platform. Its main advantages are its multi-user capabilities and its immersive, close-to-reality environment. Compared to VR Commercial off-the-shelf (COTS) systems, this platform can enable a wider Field of View (FOV) and allows multiple users to physically stand in the CAVE area and feel present within the virtual environment. However, it only allows one user to be tracked at a time to control the simulation.

From the related studies and implementations, there is only one implementation which uses a COTS platform with added multi-user capabilities. Dassault Systèmes used their 3DExperience platform to create a multi-user virtual environment where an instructor guides trainees around a CAD-based 3D model of a Falcon jet aircraft. Since this system uses the Oculus Rift, which is a tethered Head-Mounted Display (HMD), each user must have their own workstation. They use a network system where the trainees join a session guided by an instructor. Since this is a novel implementation, there are no studies regarding how well the multi-user usability of this system works and whether there are any possible drawbacks. However, there are videos [77, 78] which may give an indication of how the system works in general.

Though there is currently a limited number of presented solutions for multi-user environments in VR, multiple Software Development Kits (SDKs) and third-party applications are being continuously developed. Using Unity 5 also easily enables the creation of multiplayer environments through its networking API. Most SDKs include some form of networking capabilities; they are too many to mention all, but the most prominent ones include the built-in HLAPI of Unity and the Photon Engine [79]. The support is mainly for the more popular HMDs, such as the Oculus Rift and HTC Vive. The implementations are similar to how standard multiplayer networking works, for instance peer-to-peer, client-server, client-side prediction, etc. [80].
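With Unity 5's HLAPI, a minimal avatar synchronisation can be sketched as follows, assuming a NetworkManager spawns one avatar object per connected user and a NetworkTransformChild component replicates the head pose; the class and field names are illustrative, not from any of the cited systems.

    using UnityEngine;
    using UnityEngine.Networking;

    // Minimal multi-user sketch: the local user drives its own avatar head
    // from the tracked HMD pose; replication to other clients is handled
    // by a NetworkTransformChild configured on the avatar prefab.
    public class AvatarHead : NetworkBehaviour
    {
        public Transform head;          // avatar head, replicated over the network

        void Update()
        {
            if (!isLocalPlayer) return; // only the owning client drives its avatar
            // Camera.main is the HMD-tracked camera on the local workstation.
            head.position = Camera.main.transform.position;
            head.rotation = Camera.main.transform.rotation;
        }
    }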


Multi-user in AR

Microsoft HoloLens offers the ability to share virtual objects amongst multiple users through networking solutions. HoloToolkit includes a specific network library intended for implementing multi-user support in applications. Running such an application, a common object can appear with the same position and pose for all connected participants. Users can appear as avatars (virtual representations of the users) to each other. In a multi-user session, participants can be in the same or different physical rooms. To enable synchronisation of virtual objects and avatars between participants, a virtual anchor (a spatial reference point) is placed and shared between the users. This ability of real-time object-sharing can be used while multiple users collaborate, e.g. designing, manipulating or just observing 3D objects [81]. Video conversation programs like Skype may be used for communication, collaboration and guiding between HoloLens, PCs, smartphones and tablets [82]. In a similar way to the HoloLens, the Meta 2 HMD can also be used for collaboration [83].
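A sketch of the anchor-sharing step on the HoloLens side is shown below; the serialised anchor bytes must then be distributed over the session's own network layer (SendToPeers is a placeholder, not a HoloToolkit API, and the anchor identifier is arbitrary).

    using UnityEngine;
    using UnityEngine.VR.WSA;            // UnityEngine.XR.WSA in later Unity versions
    using UnityEngine.VR.WSA.Sharing;

    // Sketch: anchor an object in the room and export the anchor so that
    // other participants can import it and see the object in the same place.
    public class AnchorSharing : MonoBehaviour
    {
        public void ShareAnchor(GameObject anchoredObject)
        {
            WorldAnchor anchor = anchoredObject.AddComponent<WorldAnchor>();
            var batch = new WorldAnchorTransferBatch();
            batch.AddWorldAnchor("shared-model", anchor);
            WorldAnchorTransferBatch.ExportAsync(batch,
                data => SendToPeers(data),                    // stream chunks to peers
                reason => Debug.Log("Export finished: " + reason));
        }

        void SendToPeers(byte[] data) { /* application-specific networking */ }
    }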

Mobile devices can be utilised for AR as well, with multiple users in the same virtual environment. A marker-based application developed using Vuforia or ARToolKit may run on smartphones and/or tablets. Multiple users can observe the same virtual object by turning the tablet/smartphone camera towards the same marker. Pose and position transformations of the virtual object are performed by physically moving or rotating the marker. Other manipulations of and interactions with an object are possible by using one of the standard multiplayer network solutions, allowing connected users to see and interact with shared virtual objects in real-time [84].

The Google Glass does not allow the same type of flexibility in the positioning of virtual 3D objects, due to the fixed built-in display. Instead, a multi-user situation could consist of, e.g., a video conversation between a desktop user and a Glass user. In this case, the desktop user would receive a point-of-view video feed captured by the built-in camera of Google Glass. The Glass user is then able to simultaneously perform a certain work task while receiving guidance from the desktop user through audio and video. Google Glass can also be utilised for streaming a first-person point-of-view video while performing a work task, for instructional usage.

2.7.2 Graphics and System Performance

This section covers details about devices used in VR and AR and their graphical and system capabilities.

Performance of VR Devices

A CAVE system can consist of one or multiple walls, with each wall having two projectors displaying the projections. The user needs to wear 3D glasses, for displaying stereoscopic images, to experience VR. Stereographic rendering renders the same scene twice, meaning a great deal of computing power is required. Tracking is enabled by using different systems, such as camera-based tracking, magnetic tracking and optical tracking; each of these may work well depending on the situation but comes with its own problems. The graphics engine possibilities for a CAVE system are limited and usually carry expensive licence costs. There are other alternatives besides fully professional CAVEs, with lower prices [85] though with lower rendering quality. A system like this requires a high-grade graphics card and high-quality projectors, which require regular maintenance and calibration [86, 87].

In the area of COTS platforms, the most relevant HMDs are the Oculus Rift and the HTC Vive. The two are similar to each other but still have their internal differences. Both have a screen resolution of 2160 × 1200, a refresh rate of 90 Hz and a FOV of 110°, and offer six-DOF rotational and positional tracking, making them adequate to give a full VR experience and to reduce simulator sickness [6, 39]. One advantage the HTC Vive has over the Oculus Rift is a larger tracking area: 4.6 × 4.6 meters with Lighthouse compared to 1.5 × 3.3 meters with Constellation [10].

HTC Vive uses Lighthouse and Oculus Rift uses Constellation; both of these are known as positional tracking systems and formally called drift-correction systems. The positional and rotational tracking is managed by a gyroscope, an accelerometer and an on-board Inertial Measurement Unit (IMU). A critical issue with using an IMU for positional tracking is that it is prone to generate drift errors due to gravity and other noise [75]. Since the sampling rate of the IMU is up to 1000 Hz, the errors may add up quickly and cause the tracked object to move along undesired directions. However, the issue may be alleviated by using reference points. Lighthouse generates reference points by registering the order in which photodiodes are activated by the base-station lasers, providing 60 updates per second for each axis of the tracked object. Constellation generates its reference points by pulsing 10-bit patterns in the LEDs of the HMD, which are detected by the global shutter of the cameras, also providing 60 updates per second. The data of the IMU and the photodiodes are then combined to provide an accurate calculation of where the tracked object is in 3D space, which also helps with reducing latency [17, 88].
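The principle of such drift correction can be illustrated with a simple complementary filter: the high-rate IMU samples are integrated to predict the position, and each low-rate optical fix pulls the estimate back towards the measured position. The blend factor below is an arbitrary illustrative value, not a documented parameter of either tracking system.

    using UnityEngine;

    // Illustrative drift correction: IMU integration (up to 1000 Hz)
    // blended with optical reference points (~60 Hz).
    public class DriftCorrector
    {
        public Vector3 position;
        Vector3 velocity;

        // Called at the IMU rate: integrate acceleration (drifts over time).
        public void OnImuSample(Vector3 linearAcceleration, float dt)
        {
            velocity += linearAcceleration * dt;
            position += velocity * dt;
        }

        // Called at the optical rate: correct the accumulated drift.
        public void OnOpticalFix(Vector3 measuredPosition)
        {
            const float blend = 0.3f;   // assumed value for illustration
            position = Vector3.Lerp(position, measuredPosition, blend);
        }
    }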

Currently, the biggest problem with outside-in tracking, such as Constellation and Lighthouse, is occlusion. If something is blocking the sensor's view of the LEDs of the tracked object, it will not be detected, reducing the tracking capabilities for that object. This problem may be alleviated by using more sensors, but at the same time the price becomes higher, which makes it undesirable as a long-term solution. Oculus has tried to mitigate this problem by patching the tracking software, which has helped the majority of consumers, but the underlying problem with occlusion remains with these types of tracking devices.

Regarding the potential of the GearVR and its capability of a wireless virtual experience, it looks promising for different applications in the future. The HMD offers a FOV of 101° and tracks rotational orientation. The rest of the specifications depend on which smartphone is used; e.g., using it together with a Samsung Galaxy S7 provides a screen resolution of 2560 × 1440 and a refresh rate of 60 Hz. No studies were found to have used this device in maintenance or inspection work, probably because of the lack of positional tracking. However, if the rotational tracking is good enough to merely observe a virtual environment from a static position, there are options to develop custom applications for the GearVR by using the Oculus SDK. Common problems that affect this system are mostly caused by the smartphone used; these relate to limited battery time and overheating issues, which may be ameliorated by better-performing smartphones in the future.

Regarding optimal specifications of VR HMDs, a wide FOV, a high display refresh rate and low system delay give a better experience and immersion for the user. Failing to achieve optimal values for these requirements may give the user a bad experience and cause motion sickness. The recommended refresh rate for HMDs is 90 Hz, to reduce the motion-to-photon latency to tolerable levels; at 90 Hz, each frame must be delivered within roughly 11 ms. Motion-to-photon latency defines the time it takes from the movement of a user's head until the simulation is updated to represent the new position on the HMD. The motion-to-photon latency is affected by the input latency, the GPU processing time and the display latency. Failing to meet these requirements may cause the user to experience an unsynchronised 3D simulation, where the actions and physical movements of the user are not properly mirrored in the simulation environment.

Having an adequate FOV is also important in reducing motion-sickness-inducing factors and is a great factor for immersion, giving the user an effect of presence. For an immersive VR environment, a horizontal FOV of 120° and a vertical FOV of 135° are recommended, depending on the position of the viewer's eyes relative to the HMD's lenses. When using complex 3D models or graphics, it is even more important to make sure the system is capable of displaying the simulation environment at an adequate frame rate, which is possible to achieve by applying different optimisation techniques, such as occlusion culling or remeshing the complex models, to reduce the rendering load; this in turn may require a powerful workstation to handle the required computations [6, 39, 89, 90]. See Table 2.1 for an overview of the technical details.

Table 2.1: Technical details of selected VR HMDs

                              Oculus Rift               HTC Vive             Samsung Gear VR
    (Combined) Resolution     2160 × 1200               2160 × 1200          Smartphone-based
    Refresh Rate              90 Hz                     90 Hz                60 Hz
    FOV                       110°                      110°                 101°
    Positional Tracking       Yes                       Yes                  No
    Tracking System           Constellation (Optical)   Lighthouse           —
    Room-scale Boundaries     1.5 m × 3.3 m             4.6 m × 4.6 m        —
    DOF                       6                         6                    3
    Built-in Camera           No                        Yes                  Smartphone-based
    Wireless                  No                        No                   Yes
    Software                  Windows, Oculus           Windows, SteamVR     Android, Oculus
    Display Type              OLED                      AMOLED               AMOLED
    Price (inc. controllers)  $800                      $900                 $130

Performance of AR Devices

When using an AR HMD, the FOV is a vital part of giving visual awareness of the surrounding virtual objects. Microsoft advertises the HoloLens as being able to display all virtual objects within the user's eye-span. However, developers and hardware critics who have tested the system report that the FOV is relatively narrow, limiting the user to only a rectangular area for seeing virtual objects (see Figure 2.7.1) [91, 92, 93]. The geometric tracking in the HoloLens is more accurate than most of the competitors'; however, without any external aid it still has issues with superimposing virtual objects so that they fit properly with the real environment. The HoloLens is restricted to Microsoft's own API for utilising the on-board sensors, which limits the ways a developer can implement better tracking techniques [75]. Compared to the HoloLens, the Meta 2 HMD connected to an up-to-date computer offers a wider FOV for projecting virtual objects, but at a lower angular resolution, since its pixels are spread over a much wider FOV. However, as with the HoloLens, users have perceived the virtual object positioning of the Meta 2 as unstable and jittery [60, 94].

Figure 2.7.1: Image illustrating the FoV of HoloLens for displaying virtual objects. Source: [93]

There is also the possibility of using an immersive, primitive video-see-through device such as Google Cardboard. Compared to more sophisticated HMDs like the HTC Vive and Oculus Rift, this device has a low screen resolution (how low depends on the connected smartphone) and a narrow FOV. Studies that involve inspection work tasks have shown that these problems can impair the user's sight, which aggravates the work task and can even be dangerous in certain environments [53].

User performance was shown to be affected more severely by high system latency than by a lower refresh rate. To alleviate this, instead of increasing the refresh rate, it is advised to reduce system delay as much as possible through improvement of object target tracking [95]. See Table 2.2 for an overview of the technical details.

Table 2.2: Technical details of selected AR HMDs

                          HoloLens               Meta 2                Google Glass                Google Cardboard
    Resolution            1268 × 720             2560 × 1440           640 × 360                   Smartphone-based
    FOV                   30°                    90°                   —                           90° - 120°
    Positional Tracking   Yes                    Yes                   —                           No
    DOF                   6                      6                     —                           3
    Wireless              Yes                    No                    Yes                         Yes
    Software              Windows, HoloToolkit   Windows, Meta 2 SDK   Windows, Linux, Glass SDK   Android, iOS, Google VR SDK
    Price                 $3000 - $5000          $900                  $1500                       $15


2.7.3 User Interactions

Interaction with virtual models is a key property of an application developed to train aircraft technicians, and of maintenance applications in other fields. When using a VR or AR application there are many alternatives of input devices to choose from, and the option to combine them freely is available through the use of different frameworks. In the end, however, intuitive user interaction is essential for a realistic experience in the virtual environment.

2.7.3.1 Natural User Interactions

There are many alternatives when choosing a system for user interaction. An intuitive choice for interactivity in a virtual-reality setting is for it to be realistic and natural: using the hands without holding any hardware devices. A few devices exist today which enable this type of interaction; the prevalent devices on the market include the Leap Motion and the Microsoft Kinect.

In a recent study [96], user interaction was analysed to see whether the natural-user-interaction devices Leap Motion and Microsoft Kinect offer a desirable and interactive user experience. The users preferred the Leap Motion for its precision and the Kinect for larger movements. In similar research [38], the Kinect was likewise deemed unfit for precise interactions but more appropriate for tracking of distinct body motions.

The Leap Motion has been proven to possess sub-millimetre accuracy and can be applied as a robust hand-tracking device in both AR and VR settings [24]. If the controller is mounted to an HMD, it will also follow the direction the user is facing and keep track of the hands while the user uses the hands and turns the head, fulfilling the guideline for enactive interactions in section 2.3.

The Microsoft Kinect has been used in several related studies [51, 9, 38, 45, 46] and was considered by most to be an inexpensive body-tracking device, while the Leap Motion was considered more for precise, small-scale hand tracking. However, both the Leap Motion and the Microsoft Kinect have trouble with tracking caused by occlusion and by too much surrounding light, since they are based on infrared cameras.

Tracking the eyes of a user is a technology which might help not only in interaction but also in saving graphical processing power with foveated rendering: rendering the 3D scene in high quality in the area where the eye is centrally looking and reducing the render quality in the peripheral vision, to save computing power. In interaction specifically, eye tracking may enable the user to interact with the virtual environment by looking at objects directly and selecting them by pressing buttons or making gestures. Tobii is the leader in eye-tracking development and has implemented working systems in VR technology [97, 98].

2.7.3.2 Gestures

Gesture tracking is closely related to natural user interactions and is used both in AR and VR with different devices. In AR there is usually a machine-learning algorithm scanning the visual scene for recognisable gestures. For instance, the Microsoft HoloLens can pick up a predefined, distinct finger gesture to raise an event, activating an action. This is the reason gesture-based interactions are not counted as natural user interactions: they usually need to be quite distinct for the program to verify them as gestures, thereby making the interaction less realistic.


It has also been noted that it is difficult for users to understand how to properly perform a particular gesture in a way that is understandable by the system. In the case of Google Glass, swipe patterns were shown to be hard to remember without a proper briefing. Similar cases have been noted in VR, where the user performs gestures with virtual hands from the Leap Motion or Kinect [38, 47].

2.7.3.3 Voice

Voice commands can be problematic in crowded or noisy areas. Voice input for AR applications running on smartphones and tablets can be an intuitive way of user interaction, due to the extensive use of smartphones today. However, it is not as immersive as techniques that allow natural user interactions, e.g. the Leap Motion.

2.7.3.4 Controllers

Using controllers or other handheld devices does not provide the same immersion as natural interactions; however, it may give a more responsive and precise experience when handling virtual objects.

The Oculus Touch uses the same Constellation tracking as its HMD, the Oculus Rift. To work optimally it requires at least two cameras so as not to lose tracking, making it more fit for use in a room dedicated to this type of activity. The same applies to the HTC Vive SteamVR controllers, which use the Lighthouse tracking. Both controllers work well in gaming and have been applied to maintenance and inspection. Aided by software, certain virtual objects are able to attach to the virtual hands of the user, reducing the need for extreme precision and other minor problems that may occur during handling.

Attachable devices may give more freedom in handling virtual objects and come close to providing natural user interactions. The Microsoft Band was used in a study [48] to navigate around by using only the palms of the hands. The Vive Tracker is an upcoming input device which enables precise tracking of the hands by attaching it to the wrist and wearing special tracking gloves, enabling the user to interact with virtual objects with open palms, which is not possible with handheld controllers.

For the GearVR, a newly released VR controller will enable some interaction with the virtual environment when using the HMD, complementing the limited use of its touchpad. Other input methods include keyboards, mice and joysticks, as in the case of the training simulation made by Dassault, where the keyboard is used to navigate the position in the virtual 3D world.

2.7.4 Miscellaneous

Measurements from studies involving students from different areas show that guidance through an AR application, displaying superimposed objects on a workspace in real-time, can provide shorter task completion times and lower error rates compared to other instruction methods. Conducting a simple assembly task while getting instructions through a video feed was shown to produce 10.8 times more errors and took 22% more time than using a guiding AR application throughout the assemblage. AR guidance for industrial manufacturing tasks can perform very well but requires an extensively engineered environment. While tracking objects in an AR application, distractions can occur, not least in crowded manufacturing environments [70].

AR-based solutions have proven to be a useful technical tool for educational activities. The current novelty of the technique can interest and motivate students to engage themselves in AR-learning activities. Basic educational AR applications running on e.g. smartphones can help students to visualise a given problem [67, 49]. AR frameworks for inspection tasks, utilised in applications running on various platforms, can also reduce stress compared to using paper or PDF manuals while performing work [53].

Different companies have implemented VR training simulations using the HTC Vive and the Oculus Rift with Oculus Touch [99, 100, 101, 102]. The training takes place in a virtual environment specific to the chosen training, with objects reacting to different interactions by the user. Maintenance training has been implemented with training on an engine in an assembly workshop, where the user first watches a tutorial of how the procedure is done in a 3D-modelled animation and then performs the same procedure using virtual tools available in the workshop. Another maintenance task is done at a railway track, where the user is expected to repair a railroad switch in a realistic setting. The manufacturing training simulation lets the user operate a lathe and inspect products. Interaction with buttons and different parts is possible thanks to the precision of the Oculus Touch and HTC Vive controllers.

2.8 Conclusions

In this survey, several studies and experiments involving VR and AR solutions have been presented. Some of the presented implementations are applicable to the stated problems in the requirements of this report. The result section was divided into different subsections to give an overview of how particular implementations perform in the most important aspects of a virtual training application.

2.8.1 Result Evaluation

For the multi-user aspect, compared to AR, VR offers better support for real-time simulations with multiple users that can interact with the virtual environment. VR CAVE platforms enable multi-user environments, however with a limit to the number of active users and typically with only one user allowed to be tracked at a time. COTS platforms include a plethora of tools to enable different HMDs for multi-user collaboration and allow as many users as desired, depending on hardware capabilities. The same principles are found in AR, which also offers multi-user and object-sharing capabilities through the Microsoft HoloLens, Meta 2 and other smartphone or tablet implementations; however, programmers are restricted in the choice of development frameworks compared to VR.

Concerning the graphics and system performance aspect, AR offers tetherless implementations with up to six DOF, which benefits user mobility. However, wireless solutions limit the available computing power and can therefore not provide the best graphics. Furthermore, most AR HMDs have a restricted FOV (compared to VR HMDs) and a minimum distance at which virtual objects may appear. This impairs the use of AR in this aspect, especially because of the importance of being able to fully observe high-detail models. Additionally, AR solutions often show instabilities in the appearance of virtual objects because of unstable tracking methods.


In VR, it is recommended to use one of the more developed tethered HMDs connected to a powerful workstation to get the best VR experience and to avoid motion sickness. To allow positional tracking, it is important to understand how the tracking system used works, in order to avoid potential problems and limitations such as occlusion. CAVE platforms often require a lot of space for the specific components and powerful workstations to run smoothly, making the quality of the system highly dependent on available capital. Thus, it is advocated to use COTS hardware platforms instead to develop interactive training simulations.

As for user interactions, AR and VR implementations offer various types of user input options, and most of the mentioned stand-alone devices may be used in both environments. To fulfil the requirements, selecting a favourable input device depends on convenience, intuitiveness and stability. AR generally utilises touch, voice commands and predefined hand gestures, while certain VR solutions include controllers which imitate the motion and properties of the human hands, but also enable the use of more traditional video game controllers, keyboards and pointing devices. Devices that come closer to natural user interactions or provide virtual hands contribute to a sense of responsive control and a more realistic experience. Naturally, this sort of interaction device is recommended for virtual training systems.

2.8.2 Summary

To determine whether an AR- or VR-based implementation is suited for a technician training application, it is of great importance to examine the existing technological advances for both concepts, which was done in this survey.

It was found that AR requires further development before it can be properly used for a virtual technician training simulation. Currently, it pertains more to guidance by graphical overlays in maintenance, inspection or assembly work tasks than to the virtual technician training scenario.

At the current stage, VR has seen more implementations than AR in maintenance training and similar applications. In the context of COTS platforms, besides the low implementation cost, there are also many options to customise different setups with existing frameworks, making it very flexible. VR also allows any environment to be simulated and, using the right tools, enables developers to conveniently create an immersive virtual learning platform for practical training purposes.


3 Methods and Tools

From the survey, it was concluded that VR was the route to take for the implementation of practical technician training. While VR has some limitations compared to currently available technology in AR, technologies in VR had far more advantages in terms of current tracking technology and the capabilities of display and interaction devices.

It was noted that most of the implementations in VR used similar hardware and development frameworks. Even though the Oculus Rift HMD was used more than the HTC Vive, the HTC Vive was chosen for the prototype implementation because it provides a larger movement area than what the Constellation technology of the Oculus Rift offers. A larger movement area gives a better training environment for the trainee by allowing them to physically walk around more freely, which makes the training more realistic. Figure 3.0.1 illustrates a top-down view of the tracked room that was used for the operation.

As an interaction and input device, the Leap Motion was chosen as it performed well in the studies that used it and because it would provide better practical learning for the technician during training. Because of its highly accurate tracking of the fingers and hands, it gives a realistic experience when the technician enacts the tasks physically using their own hands.

The considered game engines were Unity and Unreal Engine; Unity was the ultimate choice because of its good platform and external hardware compatibility, especially with the HTC Vive and Leap Motion. The HTC Vive requires the SteamVR API to work, which contains the OpenVR SDK and thereby supports other HMDs as well. The Leap Motion requires the Orion SDK to work and has different modules which work with Unity to enable interaction with the virtual models. The Unity project, together with the mentioned packages, was set up according to the instructions in Appendix A, and a more detailed diagram of the system can be found in Appendix B.

3.1 Methods

3.1.1 Development Planning

The prototype was developed using extreme programming (XP). The planning of which feature should be implemented each week was done in a separate document. To learn about developing in a 3D game engine such as Unity, the FAQ on their web page was helpful.

3.1.2 Event-Driven Implementation

The simulation built in Unity was developed in an event-driven way, where events such as button clicks and user hand gestures trigger listener function calls. This enables communication between two or more objects without each object needing a reference to every other object.
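A minimal sketch of this pattern in Unity C# (the event and class names are illustrative, not the prototype's actual code) could look as follows: a publisher raises a C# event and any number of listeners subscribe, so publisher and listeners stay decoupled.

    using System;
    using UnityEngine;

    // Publisher: raises an event when a training step is completed.
    public class ProcedureEvents : MonoBehaviour
    {
        public static event Action<string> StepCompleted;

        public static void RaiseStepCompleted(string stepId)
        {
            if (StepCompleted != null) StepCompleted(stepId);
        }
    }

    // Listener: reacts to the event without referencing the publisher's caller.
    public class ProgressPanel : MonoBehaviour
    {
        void OnEnable()  { ProcedureEvents.StepCompleted += OnStep; }
        void OnDisable() { ProcedureEvents.StepCompleted -= OnStep; }

        void OnStep(string stepId)
        {
            Debug.Log("Completed training step: " + stepId);
        }
    }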


Figure 3.0.1: A top-down view of the tracked area using the Room Overview Window in SteamVR. On the top-right and bottom-left are the base stations (Lighthouses), which scan the room in arcs of 120°, both horizontally and vertically. They were placed no more than 5 meters apart from each other, at a high position, pointing downwards at a 30° - 45° angle towards a focal point of the movement area, marked by '+' in the figure. The real-time tracked HMD is displayed slightly above the centre. The surrounding blue lines are the boundary marks (called the Chaperone boundary), which are set up by the user to limit tracking when outside the FoV of the Lighthouses. The light blue rectangle is created automatically within the Chaperone boundary and is the play movement area for the user. When nearing the boundaries of this rectangle, the user is warned, to prevent moving outside the safe area for movement and activity. The Chaperone boundary in the figure measures approximately 3.6 m × 4.6 m, creating a play area averaging 3.2 m × 4.0 m. Further setup details can be found in [103].


Figure 3.2.1: Basic Prototype System Overview

3.2 Tools

The selected tools may be seen diagrammatically in Figure 3.2.1 and will be further described in the following subsections.

3.2.1 Hardware

The HTC Vive HMD and the Leap Motion controller have been described earlier in section 2.5.2.2. The Leap Motion was attached to the front of the HTC Vive HMD so that the hands of the user are tracked in front of the head (see Figure 3.2.2). The Leap Motion and the HTC Vive were connected through USB and HDMI to a workstation running the simulation in Unity on Windows 10. The technical specifications of this workstation included an eight-core AMD FX-8320 CPU, 32 GB RAM and an AMD Radeon R9 380 GPU.

Figure 3.2.2: HTC Vive with a front-attached Leap Motion


3.2.2 Software

Development of a prototype application for demonstrating the concept requires specific tools, such as development frameworks and libraries, to make the hardware work together.

3.2.2.1 Unity

Unity was used to develop and build the training simulation for the user. It is a multi-platform game engine developed by Unity Technologies. Mainly, it is used for developing games for personal computers, consoles, mobile devices and web pages. The development environment that comes with Unity enables a visual representation of the project. Game logic is scripted using C# or JavaScript. There is a broad set of asset packages that are used for extending a Unity project. Asset packages can contain 3D models, APIs for certain hardware, algorithms etc. [104].

Unity is also widely used in VR application development because of its available packages, which make it easy to handle 3D models and to implement interaction between them and the controllers. For this reason, it was used for this prototype. Even though more recent versions exist, Unity 5.5.0 was utilised due to compatibility with the currently available Leap Motion packages.

3.2.2.2 Blender

Blender is an open-source 3D modelling suite that was used in the project to manipulate the geometry of the imported models used in the simulation. The imported models usually did not have the correct scale or all the needed attributes, but this was solved by using the available set of tools in Blender [105].

3.2.3 Development Frameworks

3.2.3.1 SteamVR

To enable communication with the HTC Vive, the SteamVR software was needed. The prototype application communicated with SteamVR through the OpenVR API, and SteamVR communicated with the HTC Vive.

3.2.3.2 Orion SDK and Leap Motion Modules

Implementing interaction with the Leap Motion required the Orion SDK, which provided different modules for Unity. The Leap Motion Core Assets module provided 3D models of the hands. The Leap Motion Attachments Module enabled attaching game objects to the hands, so users could pick things up. The Leap Motion UI Input Module included buttons, toggles and other UI elements that can be interacted with using the hands.


3.3 Other Resources

• The aircraft manual for the Gripen fighter plane was provided to show in detail how a procedure is performed by a technician. A procedure was then selected to be implemented in the prototype application.

• The virtual models provided by Saab were CATIA 3D models, and not all parts for a full aircraft model were available. Instead, a free Gripen model from Flight Simulator X, representing the outer shell of the aircraft, was retrieved from a freeware site [106] providing aircraft models used in Flight Simulator. Other models in the simulation were retrieved from websites providing free 3D models for general use.


4 Prototype Development

4.1 Refuel Procedure Training

From the aircraft manual, a procedure was selected that was considered appropriate as an initial practical training implementation for the prototype. The considered factors included the remaining time of the project and how well the interactions would fit with Leap Motion. The procedure involves refuelling the aircraft by operating the Ground Crew Panel and the Refuelling Access Panel, both located on the starboard side of the hull at the centre of the aircraft. Other needed equipment includes a refuel hose, a ground cable and a pressure refuelling unit.

4.1.1 Ground Crew Panel

The Ground Crew Panel (GCP) is located on the starboard side of the hull at the centre of the aircraft. It is a relatively small panel concealed by a hatch. The hatch can be opened by pressing two quick-release lock buttons located on the bottom of the hatch. When the hatch is unlocked, the technician opens it by pulling it towards themselves. The hatch is hinged onto the hull and rotates until it is perpendicular to its closed position.

The panel contains a battery switch, which provides power for the panel and its internal computers, and several control buttons that are used to run various tests on the aircraft before and after aircraft operation. From this panel, the technician is also able to communicate with the pilot in the cockpit by attaching a pair of dedicated headphones. For the refuelling procedure there is a dedicated fuel section on the panel, enabling the technician to run diagnostic tests and display the current fuel level on a fuel indicator.

The fuel indicator displays three digits, showing either the fuel level or diagnostic test errors. If a test has completed without any errors, the fuel indicator will show three eights ("888") while the test fuel button is pushed. Relatively close to the test fuel button is a fuel selector, where the technician chooses the level of fuel that the refuel equipment will fill the aircraft with.

The real Ground Crew Panel on Gripen has many more components than what is described here; further disclosure is prevented because the information is classified. For this reason, the GCP in this prototype is a simplified implementation that only includes the buttons relevant to the refuel procedure (Figure 4.1.1a).

4.1.2 Refuelling Access Panel

The Refuelling Access Panel (RAP) is similar to the Ground Crew Panel (GCP) and is located a few centimetres below the GCP hatch. The RAP hatch operates similarly to the GCP hatch, using the same kind of lock buttons to open; however, it needs to be pulled downwards towards oneself instead of upwards, and it may be opened to a larger angle than 90° from the closed state. On the panel there is a refuel valve with a protection cap, which needs to be removed before the refuelling


(a) Open ground crew panel (b) Open refuel access panel

Figure 4.1.1: Panels

operation commences. The hatch needs to be opened as much as possible so that the refuel hose, between the refuel nozzle and the pressure refuelling unit, does not weigh on the hatch. The simplified implementation of this panel only includes the refuelling valve, without any detailed parts (Figure 4.1.1b).

4.1.3 Refuel Equipment

The Pressure Refuelling Unit (PRC) and the ground cable are needed to refuel the aircraft. The PRC is similar to a compressor and provides sufficient pressure for the specific chemical fuel during the automatic refuelling of the aircraft. One hose is connected between the fuel tank and the PRC, and another hose runs from the PRC to the refuel valve on the aircraft. The most basic functionality of this device is to turn the pressure coupling to enable a flow of fuel at the appropriate pressure.

The ground cable is connected between the ground point of the aircraft, located to the right of the GCP hatch, and the PRC.

The virtual representations of these devices may be seen in Figure 4.2.2.

4.1.4 Instructions

The exact steps cannot be outlined publicly because of confidential information in the manual; however, the instructions (Table 4.1) have been broken down into steps that are manageable for a technician student and for interaction through the Leap Motion controller, while still remaining close to the real procedure.


Table 4.1: Simplified Refuel Procedure Instructions from the Aircraft Manual

Step  Instruction
1     Connect the ground cable to the aircraft ground point
2     Open the GCP and RAP panels. On GCP, switch the battery to ON
3     Remove the cap from the refuel valve and attach the refuel hose to the refuelling valve
4     Select the quantity of fuel to be filled and run a test by pressing the test button until the fuel indicator flashes between test complete ("888") and the selected fuel quantity mode (e.g. "001" for 40%) → (888-001-888-001)
5     Set PRC to ON and open the pressure coupling
6     Refuel until it automatically stops and the fuel indicator displays the chosen fuel quantity
7     Set PRC to OFF and remove the coupling
8     Install the protective cap on the refuel valve and close the RAP hatch
9     Disconnect the ground cable from the aircraft
10    On GCP, set the battery switch to OFF and close the hatch

4.2 Result

4.2.1 Virtual Model Representations

When developing a maintenance practice scenario, using models with a certain level of detail was important.

The Gripen model was downloaded from a website [106] offering free Flight Simulator X models. This particular model came in the MDL file format. Since this format is not fully supported by the majority of 3D applications, a third-party model converter called ModelConverterX [107] was utilised. This application converts objects between different file formats; in this case, the model was converted to OBJ to enable import into Blender for further customisation. The model was found to have a very large number of detailed parts, which had to be organised for easier selection.

The landing gear of the aircraft was retracted, so to give the user a more realistic experience when standing in front of the aircraft, the gear had to be manually pulled out of the model using Blender. Access panels for the chosen procedure were also added to the aircraft hull. Finally, an extra texture created in Paint.NET was applied to the hull to give it more roughness and a more natural appearance (Figure 4.2.1).

Refuel equipment models were mainly downloaded from the Unity Asset Store, although some parts were retrieved from [108], most commonly in the SKP file format. Blender was used to handle the conversion from SKP to FBX, which is Unity-compatible. The Pressure Refuelling Unit (Figure 4.2.2a) has a mounted handle that needed to be movable; to achieve this, the parts were separated using Blender. The ground plug (Figure 4.2.2b) and refuel hose nozzle (Figure 4.2.2c) models were originally designed as rocket engines but closely match the look of the desired objects. The refuel wagon (Figure 4.2.2d) is a graphical representation of the fuel source and has no practical function.


Figure 4.2.1: JAS 39C Gripen

(a) Pressure refueling unit (PRC) (b) Ground plug

(c) Refueling hose nozzle (d) Refuel wagon

Figure 4.2.2: Refuel equipment

Backdrops, skybox and asphalt assets were downloaded from the Unity Asset Store and the website 3D Warehouse [108].


Leap Motion Hand Models

The Capsule hand models are the default hand models one starts with after installing Leap Motion for the first time. The fingers appear as cylinders attached to the joints of the hands and fingers, which appear as spheres. The model also has attached arms (Figure 4.2.3a). This model was used for most of the early development, until more human-like hand models were sought at a later stage to test whether they would work better. While human-like hands were found, they neither felt good when interacting with objects nor represented the real hands very well.

The Poly hand models were found in the Leap Motion Hand Assets package and look slightly different, with five thin, rectangular fingers (Figure 4.2.3b). The Poly hand models felt best when interacting with the virtual objects, probably because of the good fit between the graphical and physical representations. Even though they have no attached arms, it was decided to leave them that way, since the arms could cause inconvenience when interacting with objects: they sometimes dislocated themselves when the hands were affected by bad tracking conditions.

(a) Capsule hands

(b) Poly hands

Figure 4.2.3: Hand models


4.2.2 Interaction Design

Several factors of importance for implementing interaction were considered. The interactions most prevalent in the prototype implementation of procedures would be removing objects from and attaching them to other objects, opening and closing hatches, activating buttons and turning handles.

To make the virtual operation close to how the real-world operation is executed, it was important to represent the real-world actions as faithfully as possible with Leap Motion. Doing this may give the trainee a sense of how a procedure is done in real life and opens up possibilities for muscle-memory training; however, to confirm the effectiveness of this possibility, further research with Leap Motion would be needed.

4.2.2.1 User Interfaces

Traditional first-person games often offer user interfaces where text and other communicative media are rendered two-dimensionally on the view camera; this type of user interface representation is called non-diegetic. The opposite representation is called diegetic and is rendered in the game world itself instead of being shown as an overlay [109]. To avoid the risk of breaking immersion for the user, it is best to use diegetic representations of information.

In the case of the GCP, when pressing the battery button, the fuel indicator shows whether the battery is on or not, which is a form of diegetic representation.

4.2.2.2 Distinct Procedure Actions

Picking Up Objects The ability to pick up certain objects using the user's right hand was implemented with help from the Leap Motion Attachments package and some custom scripting. By detecting whether the ring and middle fingers are sufficiently close to the palm, it is determined whether the hand is in a gripping pose or not. It is also determined whether a graspable object is close to the palm. If the object is close enough to the right hand, the object locks to the hand for carrying and placing for as long as the user's hand remains in the gripping pose. Figure 4.2.4 depicts an example of the user picking up the refuel nozzle to bring it to the refuelling valve on the refuel panel; a sketch of the detection logic follows the figure.

Figure 4.2.4: Picked up refuel hose nozzle
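The following is a minimal C# sketch of this grip-detection idea, assuming the Leap Motion Orion Unity assets (Leap.Hand and the ToVector3() extension) are installed; the thresholds and the class name GrabDetector are illustrative assumptions, not the actual project code.

```csharp
// Illustrative sketch of the gripping-pose and pickup logic described
// above; thresholds and structure are assumptions, not the real scripts.
using Leap;
using Leap.Unity;
using UnityEngine;

public class GrabDetector : MonoBehaviour
{
    public float curlThreshold = 0.05f; // max fingertip-to-palm distance (m) for a grip
    public float graspRadius = 0.08f;   // max object-to-palm distance (m) to lock on

    // A hand counts as gripping when the middle and ring fingertips
    // are sufficiently close to the palm.
    public bool IsGripping(Hand hand)
    {
        Vector3 palm = hand.PalmPosition.ToVector3();
        Vector3 middleTip = hand.Fingers[2].TipPosition.ToVector3();
        Vector3 ringTip = hand.Fingers[3].TipPosition.ToVector3();
        return Vector3.Distance(middleTip, palm) < curlThreshold &&
               Vector3.Distance(ringTip, palm) < curlThreshold;
    }

    // Lock a graspable object to this hand while the grip is held.
    public void UpdateGrab(Hand hand, Transform graspable)
    {
        bool inReach = Vector3.Distance(graspable.position,
                           hand.PalmPosition.ToVector3()) < graspRadius;

        if (IsGripping(hand) && inReach)
            graspable.SetParent(transform);  // carried by the hand object
        else if (!IsGripping(hand))
            graspable.SetParent(null);       // released when the grip opens
    }
}
```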


Rotating objects The ability to loosen certain objects through rotation was implemented by utilising the Leap Motion Attachments package and custom scripting. The cap that covers the refuel valve on the Refuelling Access Panel could this way be removed by rotating the right hand while it is in a gripping state (Figure 4.2.5).

Figure 4.2.5: Rotating the refuel valve protection cap
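One possible way to map hand roll onto an unscrewing motion is sketched below in plain Unity C#; the palm is assumed to be exposed as a Transform, and the class name, thresholds and the idea of gating this on the grip state are assumptions rather than the project's real implementation.

```csharp
// Hypothetical sketch: rotate the cap with the palm's roll while gripped.
// In a real script, Update would only run while the hand grips the cap.
using UnityEngine;

public class CapUnscrew : MonoBehaviour
{
    public Transform palm;            // palm of the gripping hand (set externally)
    public float unscrewAngle = 360f; // accumulated rotation before the cap is loose

    private Vector3 previousUp;
    private float accumulated;

    void Start()
    {
        previousUp = Vector3.ProjectOnPlane(palm.up, transform.forward);
    }

    void Update()
    {
        // Palm roll measured in the plane of the cap (perpendicular to its axis).
        Vector3 currentUp = Vector3.ProjectOnPlane(palm.up, transform.forward);
        float delta = Vector3.Angle(previousUp, currentUp);
        float sign = Mathf.Sign(Vector3.Dot(
            Vector3.Cross(previousUp, currentUp), transform.forward));
        previousUp = currentUp;

        // Turn the cap with the hand and accumulate progress.
        transform.Rotate(transform.forward, sign * delta, Space.World);
        accumulated += delta;

        if (accumulated >= unscrewAngle)
            transform.SetParent(null); // loose: the cap can now be lifted away
    }
}
```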

Pressing Buttons To give the user a realistic impression of pressing a button, it was important to provide feedback when executing the action. For the buttons, both visual and audio feedback was provided using the Unity UI library. When the user pushes a button with a Leap Motion finger, a visual cue is given by fading the button to a darker colour, while at the same time an audio clip resembling an initial click is played. When the finger is released from the button, the button colour reverts to its original, lighter shade and another audio clip is played, resembling a retracting button click. Figure 4.2.6 depicts a button interaction with the fuel test button on the Ground Crew Panel.

(a) Pressing button (b) Releasing button

Figure 4.2.6: Button interaction
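A minimal sketch of such feedback is given below, assuming the Leap Motion UI Input module drives standard Unity pointer events so the handlers fire on finger presses; the class name and fields are illustrative, not the actual project code.

```csharp
// Illustrative sketch of the visual and audio button feedback described
// above, built on Unity's UI event interfaces. Names are assumptions.
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

public class FeedbackButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public Image background;        // the button graphic to tint
    public AudioSource audioSource;
    public AudioClip pressClip;     // "initial click" sound
    public AudioClip releaseClip;   // "retracting click" sound
    public Color pressedColor = new Color(0.5f, 0.5f, 0.5f);

    private Color normalColor;

    void Awake() { normalColor = background.color; }

    public void OnPointerDown(PointerEventData e)
    {
        background.color = pressedColor;     // fade to a darker shade
        audioSource.PlayOneShot(pressClip);
    }

    public void OnPointerUp(PointerEventData e)
    {
        background.color = normalColor;      // revert to the original shade
        audioSource.PlayOneShot(releaseClip);
    }
}
```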


Figure 4.2.7: Menu

4.2.2.3 Menu Design

Using the basic Unity UI assets in combination with the Leap Motion UI Input module, a hand-operated menu (Figure 4.2.7) was built to enable four different menu choices:

• Usage of tools

• Navigation in close proximity around the Gripen model

• A readable manual with task instructions

• An X-ray mode, which adds transparency to certain parts of the Gripen model

When the user directs the left palm towards the face, a detection algorithm from the Leap Motion Core Assets detects the orientation of the hand and spawns a menu to the right of the open palm.
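One way to express this palm-facing check is sketched below, assuming the Leap Motion Orion Unity assets; the threshold, the side offset and the class name PalmMenu are illustrative assumptions rather than the project's real detection code.

```csharp
// Hypothetical sketch of the palm-facing menu trigger described above.
using Leap;
using Leap.Unity;
using UnityEngine;

public class PalmMenu : MonoBehaviour
{
    public GameObject menu;          // menu canvas, hidden by default
    public Transform headCamera;     // HMD camera transform
    public float facingThreshold = 0.8f;
    public float sideOffset = 0.15f; // metres to the right of the palm

    // Call once per frame with the tracked left hand.
    public void UpdateMenu(Hand leftHand)
    {
        Vector3 palmPos = leftHand.PalmPosition.ToVector3();
        Vector3 palmNormal = leftHand.PalmNormal.ToVector3();
        Vector3 toHead = (headCamera.position - palmPos).normalized;

        // The palm normal pointing at the face means "show the menu".
        bool facing = Vector3.Dot(palmNormal, toHead) > facingThreshold;
        menu.SetActive(facing);
        if (facing)
            menu.transform.position = palmPos + headCamera.right * sideOffset;
    }
}
```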

Tools In the current implementation there is only one tool, but any kind of tool can easily be added to the list. The only existing tool at the moment is a laser pointer. When selected, this tool is placed on the right index finger of the user's Leap Motion hands and can be used to point at different parts of the aircraft to get a small description with the name of the aimed object. Figure 4.2.8 shows an example where the aimed object has changed colour to a green shade to give a representation of its shape.

Figure 4.2.8: Laser pointer
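The core of such a tool can be a simple raycast from the fingertip, as in the plain Unity C# sketch below; the field names are hypothetical, and the LineRenderer is assumed to be configured with two positions in the Inspector.

```csharp
// Illustrative sketch of a fingertip laser pointer that names the aimed
// part. Field names are assumptions, not the project's actual script.
using UnityEngine;

public class LaserPointer : MonoBehaviour
{
    public Transform indexFingertip; // attachment point on the right index finger
    public LineRenderer beam;        // beam visual with two positions
    public TextMesh label;           // small in-world text for the part name
    public float maxDistance = 20f;

    void Update()
    {
        Ray ray = new Ray(indexFingertip.position, indexFingertip.forward);
        beam.SetPosition(0, ray.origin);

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxDistance))
        {
            beam.SetPosition(1, hit.point);
            label.text = hit.collider.name; // show the name of the aimed part
        }
        else
        {
            beam.SetPosition(1, ray.origin + ray.direction * maxDistance);
            label.text = "";
        }
    }
}
```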


Navigation User locomotion in the virtual world was solved by translocating the user. The research recommended against movement that could induce simulator sickness, i.e. movement not initiated by the player, which is why translocation works well in limited cases. It is important to let the user feel as if they are in a realistic world, which led to the decision that positional tracking would be enough for physically walking around small sectors of the aircraft, while translocation would be used to move to a different sector. The navigation menu has four buttons (right, left, front and rear) from the Unity UI library, each translocating the user to the corresponding side of the aircraft.

Figure 4.2.9 demonstrates an image sequence of the navigation process, where the user is standing at the right side of the aircraft and presses the front button of the navigation menu. A fading transition is then enacted and the user is translocated to the front of the aircraft. The fading transition was implemented to reduce the risk of disorientation when the entire environment changes instantly for the user.

(a) Pressing navigate front (b) Fading transition (c) Translocation completed

Figure 4.2.9: Navigation sequence
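A minimal sketch of such a translocation with a fade is shown below in plain Unity C#; the camera rig root and a full-screen black UI Image are assumed, and all names are illustrative rather than the project's actual scripts.

```csharp
// Hypothetical sketch: fade to black, move the play area, fade back in.
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class Translocator : MonoBehaviour
{
    public Transform cameraRig;   // root of the tracked play area
    public Image fadeImage;       // full-screen black overlay on a camera canvas
    public float fadeTime = 0.5f;

    // Hooked to the four navigation buttons, each with its own target anchor.
    public void MoveTo(Transform target)
    {
        StartCoroutine(FadeAndMove(target));
    }

    IEnumerator FadeAndMove(Transform target)
    {
        yield return Fade(0f, 1f);             // fade to black
        cameraRig.position = target.position;  // instant translocation
        cameraRig.rotation = target.rotation;
        yield return Fade(1f, 0f);             // fade back in
    }

    IEnumerator Fade(float from, float to)
    {
        for (float t = 0f; t < fadeTime; t += Time.deltaTime)
        {
            SetAlpha(Mathf.Lerp(from, to, t / fadeTime));
            yield return null;
        }
        SetAlpha(to);
    }

    void SetAlpha(float a)
    {
        Color c = fadeImage.color;
        c.a = a;
        fadeImage.color = c;
    }
}
```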

Manual The instructions were created by applying text to a Canvas, a UI element from the Unity UI library. The overall planning included the precondition that the user knows which steps to execute during the simulation, by reading the manual beforehand or during the simulation. In the prototype, the manual only consists of the refuel procedure instructions. Since the real manual is vastly more complicated, this would not be a desirable solution if the entire manual were to be implemented in the simulation; instead, some sort of PDF renderer would need to be created. Because of limited time and other reasons involving areas out of the scope of this project, this option was not implemented in the prototype. Providing a manual inside the simulation gives a high sense of realism and also reduces the risk of simulator sickness.

Figure 4.2.10: Refuel manual


X-Ray A transparency manipulation mode was implemented to set the transparency level of the aircraft. Changing the transparency enables the user to see electrical wires and the internal parts of the aircraft. This could also be developed further to change the transparency of selected parts only. A UI slider from the Leap Motion UI Input package was added and programmed to alter the transparency level (Figure 4.2.11).

(a) Normal opaque mode (b) Transparent mode

Figure 4.2.11: X-Ray mode
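The slider-to-alpha mapping can be as simple as the sketch below, which assumes the hull materials use a shader with a _Color property and transparent rendering; the class and field names are illustrative assumptions.

```csharp
// Hypothetical sketch of the X-ray mode: a slider value drives the alpha
// of the aircraft shell materials.
using UnityEngine;

public class XRayMode : MonoBehaviour
{
    public Renderer[] hullRenderers; // renderers of the aircraft shell

    // Hooked to the slider's OnValueChanged (0 = fully transparent, 1 = opaque).
    public void SetOpacity(float value)
    {
        foreach (Renderer r in hullRenderers)
        {
            foreach (Material m in r.materials)
            {
                Color c = m.color; // requires a shader exposing _Color
                c.a = value;
                m.color = c;
            }
        }
    }
}
```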

4.2.2.4 Multi-user Capabilities

At the time of this report, Leap Motion does not provide any packages for Unity with complete multi-user networking functionality. Due to this, multi-user support was not fully implemented in the prototype. Instead, another stand-alone Unity application for desktop was developed using Unity's built-in networking High Level API (HLAPI), the Leap Motion Core Assets and Attachments packages, and supplementary self-written scripts. This enabled remote transformation of hand representations over the network.

Figure 4.2.12 shows an example of two clients connected together. The perspective is a first-person view, and the user's right hand is shown at the bottom of the figure. The hand of the other user is near the engine, pointing at different parts for demonstration. The hands are represented by a sphere imitating the palm and five cuboids representing the fingers. This could of course be extended to include the different joints, until Leap Motion provides its own solution.

Figure 4.2.12: Multiple users
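In the spirit of the stand-alone application described above, the following is a minimal HLAPI sketch for replicating a palm pose; it assumes the object is spawned as part of a player prefab, and the class and field names are illustrative, not the project's actual scripts.

```csharp
// Hypothetical sketch: stream a simplified hand pose over Unity's HLAPI.
using UnityEngine;
using UnityEngine.Networking;

public class NetworkedHand : NetworkBehaviour
{
    [SyncVar] Vector3 palmPosition;    // replicated to all clients
    [SyncVar] Quaternion palmRotation;

    public Transform localPalm;        // tracked Leap Motion palm on the owning client

    void Update()
    {
        if (isLocalPlayer)
        {
            // The owner samples its tracked palm and sends it to the server.
            CmdUpdatePose(localPalm.position, localPalm.rotation);
        }
        else
        {
            // Remote copies simply follow the replicated pose.
            transform.position = palmPosition;
            transform.rotation = palmRotation;
        }
    }

    [Command]
    void CmdUpdatePose(Vector3 pos, Quaternion rot)
    {
        palmPosition = pos;  // SyncVars propagate to every client
        palmRotation = rot;
    }
}
```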


4.3 Discussion of Implementation

Starting from the beginning with setting up the project and getting used to Unity and the other tools, the implemented procedure took about 200 hours of work, divided over two persons, to complete. Implementing another procedure with steps similar to the refuelling procedure would probably take half of that time. The reasons are the possibility to reuse functions that took time to implement, and that the gained experience helps to outline the steps needed to solve a problem.

It was challenging at first to implement this prototype, yet it was very educational for us to solve compatibility issues and to combine different packages and libraries. Adding interaction to objects in Unity was not as challenging, but rather required more time to polish the interaction with Leap Motion. While there are some minor immersion-breaking situations, the simulation still gives the user a general feeling of actually standing in front of a real aircraft. Links to videos showing the refuel procedure operation and the menu interaction can be found in Appendix C.

4.3.1 Expenditure

The Örebro University robotics lab had an HTC Vive HMD and a Leap Motion controller, which we used for debugging. However, since we also had to be able to debug from home or at Saab on our private laptops, we bought our own Leap Motion controller, which cost about $80.

Total costs for the hardware components needed to run the Unity application at the time of the report:

Workstation    $1000
HTC Vive       $900
Leap Motion    $80
Total          $1980

Developing Unity projects commercially comes at a cost of $125 per month if the revenue is more than $100,000 a year; otherwise Unity can be used for free.

4.3.2 Evaluation of Used Hardware

Developing for the HTC Vive together with SteamVR was surprisingly easy. The HTC Vive performed well and was very comfortable to wear during simulation. If it did not fit well enough, there were some issues with light leakage, but this was alleviated by fitting it more properly to the head.

At times the simulation was not able to keep up with the required rendering, which caused it to stutter and lag. While this was a minor nuisance and could cause simulator sickness, it can be alleviated by simply acquiring a more powerful workstation.

Leap Motion worked well during simulation; however, it could lose detection of the fingers when there was too much light in the room, or when one hand crossed the other, causing occlusion. This was one of the important reasons the menu was not placed directly on the palm.

Another thing that has been mentioned before, and can be confirmed by the prototype implementation, is that Leap Motion requires the user to keep their hands within the FoV of the Leap Motion


controller. While the FoV is wide, it is not possible to track the hands once they are beyond the tracking limits of the controller.

The native HTC Vive controllers would probably also be a plausible tool for interaction in a simulation similar to this prototype.

4.3.3 Development Potential and Future Developments

The potential for development is great. As this is only a prototype, there are many things that could be improved to provide a better experience, such as better models for stronger immersion; in particular, the aircraft model could provide more detailed objects for the user.

A better menu with more options would also be a welcome feature; the manual in particular could be improved, as could the number of tools available to use with the aircraft. Another improvement for tools could be to attach a Vive Tracker to a real tool, which is then depicted in the virtual environment, so that the trainee may train with real tools in the virtual training world. Oculus is working on the same idea by adding the possibility to track a third Touch controller. Further research will have to show whether there is a significant difference in learning between using real tools in virtual training and using only virtual tools.

Multi-user capabilities are already at a point where it is possible to add several users to the application; with more time, the functionality would have been implemented.

Further development of the interface representations, and better feedback when interacting with the objects and the menu, would be a necessary addition to make the application even more interactive and immersive. At this point, the user needs to select menu options by looking at the menu and touching it with a Leap Motion finger. To improve the interactions, it would be beneficial to follow the VR design guidelines made by Leap Motion [110]. Since Leap Motion is a rapidly developing technology, the software surrounding the device might change, and continued support would be required to keep the application up to date. At the time of this report, some packages used in the prototype have recently become deprecated and been replaced by one combined package.

In the future, eye tracking may remove the need to interact with a menu physically at all, and instead let the user select with their eyes: translocating by just looking at a specific area, bringing up information by looking at objects, and so on. However, it comes down to implementation decisions whether that is the correct path to take for the developed application.

The development possibilities go as far as our imagination. Hopefully, further technological developments will overcome the current limitations of the technology. Development of VR applications will become even easier and allow for more possibilities than what exists today, giving a truly immersive and vivid alternate world.


5 Discussion

5.1 Compliance with the Project Requirements

5.1.1 Summary of the Requirements

The primary requirements, as stated in section 1.4, indicate that an in-depth survey of VR and AR had to be completed to decide on an appropriate technology for a practical technician training application. A prototype application would then be developed to show the capabilities of the particular technology, with three secondary requirements attached. The following list gives an overview of the requirements:

• An in-depth survey of VR and AR

– Will produce a documentation of the advantages and disadvantages

– Investigations will indicate which technology is suggested for developing a practical technician training

• A prototype application should be developed to show some capabilities of the particular technology. With three secondary requirements:

– A teacher should be able to demonstrate a part of the aircraft for the trainee

– A trainee should be able to do a procedure training by themselves

– A trainee should be able to do a procedure training with the aid of teacher guidance

5.1.2 Fulfilled Requirements

A substantial part of this report consists of the survey of VR and AR, containing advantages and disadvantages of each field, and a conclusion section arguing which technology is best to use for practical technician training. In this case, it was concluded that VR, at the current time, is the most appropriate technology to use.

✓ An in-depth survey of VR and AR

✓ Will produce a documentation of the advantages and disadvantages

✓ Investigations will indicate which technology is suggested for developing a practical technician training

To fulfil the second primary requirement, a prototype application was implemented to show the capabilities of VR, and guidelines from the survey were followed to ensure an implementation that would give a realistic experience for the trainee. Two of the three secondary requirements were fulfilled by the prototype implementation. The remaining unfulfilled requirement was the multi-user feature, where the teacher would have been able to demonstrate an engine or a part of the aircraft to the


trainee in the same virtual environment. However, a stand-alone application was created to prove that this is possible to do with a bit more time at hand.

• A prototype application should be developed to show some capabilities of the particular technology. With three secondary requirements:

– A teacher should be able to demonstrate a part of the aircraft for the trainee

✓ A trainee should be able to do a procedure training by themselves

✓ A trainee should be able to do a procedure training with the aid of teacher guidance

5.2 Impact on Society

The development of VR applications for practical training can provide realistic training environments for many areas, such as healthcare, car manufacturing etc. Enabling this type of training requires investments in the form of VR hardware and development of tailored applications. In some contexts, assets for the applications already exist, such as 3D models and CAD drawings, which can alleviate the application development process.

If more companies were to adopt this way of training, it could in some ways benefit society, since the training is based on virtual content. Using virtual content means that less hardware has to be manufactured just for the sake of training, which could reduce the use of natural resources for this purpose.

If Leap Motion is used more in training, it may drive the development of this technology and hopefully further increase the tracking accuracy of the user's hands, without requiring any devices attached to the hands.

While VR and AR technology offers a wide range of advantages in the educational aspect, it requires a lot of time to develop a well-functioning application or device. This may not be attractive for enterprises that wish to invest in this technology, since time is considered money in the business world. This makes the balance between making realistic applications and production time a bit difficult to define. While consumers want a very realistic environment to spend their time in, producers do not wish to spend several years creating an application, and also do not want to force users to buy expensive, powerful computers. Still, as computers become more powerful and cheaper, and the development of VR and AR further increases in the coming years, the technology will become accessible to more users, making these kinds of applications invaluable for education.

Once AR technology has developed a bit more, it may be possible to train people while at the same time letting them perform the work. This would make it easier to get people into work and may reduce the unemployment rate. Of course, it depends on which kind of work one talks about, but as an example, perhaps a business needs a cashier to work one evening. AR technology may in that case help by showing overlays on the cash register for a person who has no previous experience.

Using this technology opens up a lot of doors to experience things one cannot do otherwise. It may help people with simulating the future, it may help with healthcare, and it may overall make people smarter. Simulating the future may help people who would like to train their ability to talk in front of others, get a visual representation of a future project, and more. In healthcare, VR and AR


may help doctors with diagnosing patients by visually representing data about the patient, such as which medications they take or which kind of illnesses they have. It may also help with therapy by letting a patient live through their fears, or by aiding in surgery. Using this technology in education will make people generally smarter. Instead of reading a book about a historic event, one may be able to relive the event by using VR or AR. Multi-sensory learning has been proven to increase the effectiveness of learning, and by using technology that enables us to do just that, it will revolutionise the education system.

5.3 Project Development

The survey in this project was based purely on articles and papers. This could have been supplemented with input from end users (technicians and instructors) through questionnaires. The prototype demonstrates some of the important abilities of a VR application intended for practical procedure training. As the authors had barely any previous experience with Unity, this implementation shows how convenient it is to work with a tool such as Unity, rather than building the foundation of a VR application design from scratch.

If the project were to evolve any further, developers could consider using this prototype as a pointer while creating a new application project using well-established development patterns. If the program were to include more authentic and advanced 3D models, the program structure would become an important part of the design. Besides the increase in the size of the application, it would be difficult to navigate to desired model parts of the aircraft unless an appropriate interface were used.

5.4 Reflection on Own Learning

5.4.1 Knowledge and Comprehension

Completing the survey has been a significant knowledge booster in the fields of VR and AR. We feel that we know a lot more about where the technology in both fields stands at the current stage and which problems each field needs to solve to provide a better solution than what exists today. The implementation work has increased our understanding of development using a tool such as Unity. We learned more about how to exploit the freedom of combining different code libraries to create the desired VR application.

5.4.2 Proficiency and Ability

By doing extensive research on the particular fields, we have learned to define and extract problems from the requirements. Using different methods has aided us in processing and analysing the results of technical and scientific studies. From the studies, we have been able to produce a report that is well structured and relevant to the project. We have also presented the project orally to our fellow course students and Saab Aeronautics employees through the use of illustrations and videos of the implementation.


Bibliography

[1] Rustici Software. SCORM explained, 2017. URL https://scorm.com/scorm-explained/. Accessed: 2017-05-22.

[2] Rustici Software. Tin Can API tech overview, 2017. URL http://tincanapi.com/overview/. Accessed: 2017-05-22.

[3] Paul Milgram, Haruo Takemura, Akira Utsumi, and Fumio Kishino. Augmented reality: A class of displays on the reality-virtuality continuum. In Photonics for Industrial Applications, pages 282–292. International Society for Optics and Photonics, 1995.

[4] Wikimedia Commons (Russel Freeman). Virtuality continuum, 2007. URL https://en.wikipedia.org/wiki/File:Virtuality_Continuum_2.jpg. [Digital image] File: Virtuality_Continuum_2.jpg. Retrieved: 2017-05-22.

[5] Nirit Gavish, Teresa Gutierrez, Sabine Webel, Jorge Rodriguez, and Franco Tecchia. Design guidelines for the development of virtual reality and augmented reality training systems for maintenance and assembly tasks. BIO Web of Conferences, 1:00029, 2011. doi: 10.1051/bioconf/20110100029.

[6] Oculus. Oculus best practices, 2015. URL http://static.oculus.com/documentation/pdfs/intro-vr/latest/bp.pdf. Accessed: 2017-04-12.

[7] Leap Motion. VR best practices guidelines, June 2015. URL https://developer-archive.leapmotion.com/assets/Leap%20Motion%20VR%20Best%20Practices%20Guidelines.pdf. Accessed: 2017-06-06.

[8] Tom DeFanti. VR: past, present, and future [virtual reality]. In Proceedings of the 1998 International Conference on Image Processing (ICIP 98). IEEE, 1998.

[9] Minseok Kim and Jae Yeol Lee. Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimedia Tools and Applications, feb 2016. doi: 10.1007/s11042-016-3355-9.

[10] Christoph Anthes, Rubén Jesús García-Hernández, Markus Wiedemann, and Dieter Kranzlmüller. State of the art of virtual reality technology. In Aerospace Conference, 2016 IEEE, pages 1–19. IEEE, 2016.

[11] Gabriele Bleser and Didier Stricker. Advanced tracking through efficient image processing and visual–inertial sensor fusion. Computers & Graphics, 33(1):59–72, 2009.

[12] XinReality. Constellation, 2016. URL https://xinreality.com/wiki/Constellation. Accessed: 2017-04-15.

[13] XinReality. Oculus Touch, 2016. URL https://xinreality.com/wiki/Oculus_Touch. Accessed: 2017-04-15.


[14] Wikimedia Commons (Samwalton9). The Oculus Rift CV1 headset, 2017. URL https://commons.wikimedia.org/wiki/File:Oculus_Consumer_Version_1.jpg. [Digital image] File: Oculus_Consumer_Version_1.jpg. Retrieved: 2017-05-24.

[15] Wikimedia Commons (Samwalton9). Oculus Touch controllers for the Oculus Rift CV1, 2017. URL https://commons.wikimedia.org/wiki/File:Oculus_Touch_Controllers.jpg. [Digital image] File: Oculus_Touch_Controllers.jpg. Retrieved: 2017-05-24.

[16] P. Dempsey. The teardown: HTC Vive virtual reality headset. Engineering & Technology, 11(7):80–81, aug 2016. doi: 10.1049/et.2016.0731.

[17] Oliver Kreylos. Lighthouse tracking examined, 2016. URL http://doc-ok.org/?p=1478. Accessed: 2017-04-15.

[18] rvdm88. HTC Vive Lighthouse Chaperone tracking system explained, August 2015. URL https://www.youtube.com/watch?v=J54dotTt7k0. YouTube [Video file, 1:26]. Accessed: 2017-04-27.

[19] Vive. Vive Tracker, 2017. URL https://www.vive.com/us/vive-tracker/. Accessed: 2017-04-20.

[20] Vive. Vive controller, 2017. URL https://www.vive.com/us/accessory/controller/. Accessed: 2017-04-20.

[21] flickr. HTC Vive now up for pre-order, 2016. URL https://www.flickr.com/photos/bagogames/25845851080/. [Digital image]. Retrieved: 2017-05-24.

[22] Samsung. Samsung Gear VR specifications, 2017. URL http://www.samsung.com/global/galaxy/gear-vr/specs/. Accessed: 2017-04-20.

[23] Wikimedia Commons (CS104group72015). Image of the Samsung Gear VR final version, 2015. URL https://commons.wikimedia.org/wiki/File:Samsung_Gear_VR_V1.png. [Digital image] File: Samsung_Gear_VR_V1.png. Retrieved: 2017-05-24.

[24] Frank Weichert, Daniel Bachmann, Bartholomäus Rudak, and Denis Fisseler. Analysis of the accuracy and robustness of the Leap Motion controller. Sensors, 13(5):6380–6393, 2013.

[25] How does the Leap Motion controller work?, 2014. URL http://blog.leapmotion.com/hardware-to-software-how-does-the-leap-motion-controller-work/. Accessed: 2017-05-28.

[26] Wikimedia Commons (SkywalkerPL). Leap Motion Orion controller connected, 2016. URL https://commons.wikimedia.org/wiki/File:Leap_Motion_Orion_Controller_Plugged.jpg. [Digital image] File: Leap_Motion_Orion_Controller_Plugged.jpg. Retrieved: 2017-05-24.

[27] Oliver Wasenmüller and Didier Stricker. Comparison of Kinect v1 and v2 depth images in terms of accuracy and precision. In Asian Conference on Computer Vision Workshop (ACCV Workshop). Springer, 2016.


[28] Wikimedia Commons (litheon). Kinect sensor as shown at the 2010 Electronic Entertainment Expo, 2010. URL https://commons.wikimedia.org/wiki/File:KinectSensor.png. [Digital image] File: KinectSensor.png. Retrieved: 2017-05-24.

[29] Wikimedia Commons (Evan-Amos). The Xbox One's Kinect, 2014. URL https://commons.wikimedia.org/wiki/File:Xbox-One-Kinect.jpg. [Digital image] File: Xbox-One-Kinect.jpg. Retrieved: 2017-05-24.

[30] Unity. Unity 3D, 2017. URL https://unity3d.com/. Accessed: 2017-04-15.

[31] Unreal Engine. Unreal Engine, 2017. URL https://www.unrealengine.com/. Accessed: 2017-04-20.

[32] Valve. OpenVR, 2017. URL https://github.com/ValveSoftware/openvr. Accessed: 2017-04-20.

[33] What is OSVR?, 2016. URL http://www.osvr.org/what-is-osvr.html. Accessed: 2017-05-21.

[34] Oculus. Oculus SDK, 2017. URL https://developer.oculus.com/licenses/sdk-3.4.1/. Accessed: 2017-04-20.

[35] Valve. SteamVR, 2017. URL http://store.steampowered.com/steamvr. Accessed: 2017-04-20.

[36] Xiaolin Quan, Feng Feng, Limin Qiao, Shaochun Zhong, and Shusen Shan. The research and application of aircraft maintenance virtual teaching training system. In 2011 International Conference on Mechatronic Science, Electric Engineering and Computer (MEC). IEEE, aug 2011. doi: 10.1109/mec.2011.6025387.

[37] Zhengwei Chang, Yu Fang, Yun Zhang, and Can Hu. A training simulation system for substation equipments maintenance. In 2010 International Conference on Machine Vision and Human-machine Interface. IEEE, 2010. doi: 10.1109/mvhi.2010.31.

[38] Courtney McNamara, Matthew Proetsch, and Nelson Lerma. Investigating low-cost virtual reality technologies in the context of an immersive maintenance training application. In Lecture Notes in Computer Science, pages 621–632. Springer International Publishing, 2016. doi: 10.1007/978-3-319-39907-2_59.

[39] David B. Kaber, Yingjie Li, Michael Clamann, and Yuan-Shin Lee. Investigating human performance in a virtual reality haptic simulator as influenced by fidelity and system latency. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 42(6):1562–1566, nov 2012. doi: 10.1109/tsmca.2012.2201466.

[40] Simone Borsci, Glyn Lawson, Bhavna Jha, Mark Burges, and Davide Salanitri. Effectiveness of a multidevice 3D virtual environment application to train car service maintenance procedures. Virtual Reality, 20(1):41–55, jan 2016. doi: 10.1007/s10055-015-0281-5.

[41] Marcio Cabral, Olavo Belloc, Andre Montes, Eduardo Zilles Borba, and Marcelo Knorich Zuffo. VR THOR – virtual reality training with hotstick on operations risks. In 2016 IEEE Virtual Reality (VR). IEEE, mar 2016. doi: 10.1109/vr.2016.7504786.


[42] Dassault Systems. Immersive maintenance training, 2017. URL http://www.dassaultfalcon.com/en/MediaCenter/Newsd/Pages/PR%202017/Immersive-maintenance-training.aspx. Accessed: 2017-04-06.

[43] Raul Crespo, Rene Garcia, and Samuel Quiroz. Virtual reality simulator for robotics learning. In 2015 International Conference on Interactive Collaborative and Blended Learning (ICBL). IEEE, dec 2015. doi: 10.1109/icbl.2015.7387635.

[44] Wadee S. Alhalabi. Virtual reality systems enhance students' achievements in engineering education. Behaviour & Information Technology, 35(11):919–925, jul 2016. doi: 10.1080/0144929x.2016.1212931.

[45] Thomas Hilfert and Markus König. Low-cost virtual reality environment for engineering and construction. Visualization in Engineering, 4(1), jan 2016. doi: 10.1186/s40327-015-0031-5.

[46] Helen V. Diez, Sara García, Andoni Mujika, Aitor Moreno, and David Oyarzun. Virtual training of fire wardens through immersive 3D environments. In Proceedings of the 21st International Conference on Web3D Technology - Web3D '16. ACM Press, 2016. doi: 10.1145/2945292.2945296.

[47] Chaowanan Khundam. First person movement control with palm normal and hand gesture interaction in virtual reality. In 2015 12th International Joint Conference on Computer Science and Software Engineering (JCSSE). IEEE, jul 2015. doi: 10.1109/jcsse.2015.7219818.

[48] Salah Eddin Alshaal, Stylianos Michael, Andreas Pamporis, Herodotos Herodotou, George Samaras, and Panayiotis Andreou. Enhancing virtual reality systems with smart wearable devices. In 2016 17th IEEE International Conference on Mobile Data Management (MDM). IEEE, jun 2016. doi: 10.1109/mdm.2016.60.

[49] Maria-Blanca Ibanez, Angela Di-Serio, Diego Villaran-Molina, and Carlos Delgado-Kloos. Augmented reality-based simulators as discovery learning tools: An empirical study. IEEE Transactions on Education, 58(3):208–213, aug 2015. doi: 10.1109/te.2014.2379712.

[50] Dhiraj Amin and Sharvari Govilkar. Comparative study of augmented reality SDKs. International Journal on Computational Science & Applications, 5(1):11–26, feb 2015. doi: 10.5121/ijcsa.2015.5102.

[51] Paul Davies and David Lee. Augmented reality in manufacturing at the Boeing company - lessons learned and future directions, Dec 2014. URL http://thearea.org/wpfb-file/augmented_reality_at_boeing_-_lessons_learned-pdf/.

[52] David Fonseca, Nuria Martí, Isidro Navarro, Ernest Redondo, and Albert Sanchez. Using augmented reality and education platform in architectural visualization: Evaluation of usability and student's level of satisfaction. In Computers in Education (SIIE), 2012 International Symposium on, pages 1–6. IEEE, 2012.

[53] Perla Ramakrishna, Ehtesham Hassan, Ramya Hebbalaguppe, Monika Sharma, Gaurav Gupta, Lovekesh Vig, Geetika Sharma, and Gautam Shroff. An AR inspection framework: Feasibility study with multiple AR devices. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct). IEEE, sep 2016. doi: 10.1109/ismar-adjunct.2016.0080.


[54] Microsoft. Mixed-reality rendering, 2017. URL https://developer.microsoft.com/en-us/windows/mixed-reality/rendering. Accessed: 2017-04-16.

[55] Wikimedia Commons (Ramadhanakbr). Microsoft HoloLens, 2016. URL https://commons.wikimedia.org/wiki/File:Ramahololens.jpg. [Digital image] File: Ramahololens.jpg. Retrieved: 2017-05-25.

[56] Rod Furlan. The future of augmented reality: HoloLens - Microsoft's AR headset shines despite rough edges [resources_tools and toys]. IEEE Spectrum, 53(6):21–21, jun 2016. doi: 10.1109/mspec.2016.7473143.

[57] Microsoft. Cortana, 2017. URL https://www.microsoft.com/en-us/mobile/experiences/cortana/. Accessed: 2017-04-16.

[58] Microsoft. HoloLens purchase, 2017. URL https://www.microsoft.com/en-us/hololens/buy. Accessed: 2017-04-16.

[59] MetaVision. Meta 2 specifications, 2017. URL https://buy.metavision.com/products/meta2. Accessed: 2017-04-16.

[60] Patrick Renner and Thies Pfeiffer. Attention guiding techniques using peripheral vision and eye tracking for feedback in augmented-reality-based assistance systems. In 2017 IEEE Symposium on 3D User Interfaces (3DUI). IEEE, 2017. doi: 10.1109/3dui.2017.7893338.

[61] Wikimedia Commons (MetaMarket). Meta 2 headset, 2016. URL https://commons.wikimedia.org/wiki/File:Meta_2.jpg. [Digital image] File: Meta_2.jpg. Retrieved: 2017-05-25.

[62] A. Ho, C. Maritan, J. Sullivan, E. Cheng, and S. Cao. Measuring glance legibility of wearable heads-up display interfaces using an adaptive staircase procedure: A study with Google Glass. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1):2073–2077, sep 2016. doi: 10.1177/1541931213601470.

[63] Johnny Yau Cheung Chang, Lok Yee Tsui, Keith Siu Kay Yeung, Stefanie Wai Ying Yip, and Gilberto Ka Kit Leung. Surgical vision. Surgical Innovation, 23(4):422–426, aug 2016. doi: 10.1177/1553350616646477.

[64] Google. Google Glass locations and sensors, 2015. URL https://developers.google.com/glass/develop/gdk/location-sensors. Accessed: 2017-04-16.

[65] Wikimedia Commons (Tim.Reckmann). Google Glass - view of the mini-computer, 2014. URL https://commons.wikimedia.org/wiki/File:Google_Glass_Main.jpg. [Digital image] File: Google_Glass_Main.jpg. Retrieved: 2017-05-25.

[66] Wikimedia Commons (Evan-Amos). Google Cardboard headset, 2015. URL https://commons.wikimedia.org/wiki/File:Google-Cardboard.jpg. [Digital image] File: Google-Cardboard.jpg. Retrieved: 2017-05-25.

[67] Maria Blanca Ibanez, Angela Di Serio, Diego Villaran, and Carlos Delgado-Kloos. The acceptance of learning augmented reality environments: A case study. In 2016 IEEE 16th International Conference on Advanced Learning Technologies (ICALT). IEEE, jul 2016. doi: 10.1109/icalt.2016.124.


[68] Total Immersion. D'Fusion Studio, 2015. URL http://www.t-immersion.com/products/dfusion-suite/dfusion-studio. Accessed: 2017-04-17.

[69] Ramakrishna Perla, Gaurav Gupta, Ramya Hebbalaguppe, and Ehtesham Hassan. InspectAR: An augmented reality inspection framework for industry. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct). IEEE, sep 2016. doi: 10.1109/ismar-adjunct.2016.0119.

[70] Frieder Loch, Fabian Quint, and Iuliia Brishtel. Comparing video and augmented reality assistance in manual assembly. In 2016 12th International Conference on Intelligent Environments (IE). IEEE, sep 2016. doi: 10.1109/ie.2016.31.

[71] ScopeAR. ScopeAR: An augmented reality company, 2017. URL http://www.scopear.com/about/. Accessed: 2017-04-18.

[72] Caterpillar. A whole new reality, 2015. URL http://www.caterpillar.com/en/news/caterpillarNews/innovation/a-whole-new-reality.html. Accessed: 2017-04-18.

[73] PR Newswire. Scope AR and Caterpillar deliver first augmented reality-based remote support platform for heavy industry, 2016. URL http://www.prnewswire.com/news-releases/scope-ar-and-caterpillar-deliver-first-augmented-reality-based-remote-support-platform-for-heavy-industry-300359664.html. Accessed: 2017-04-18.

[74] EquipmentWorld. Caterpillar augmented reality inspection demo, October 2015. URL https://www.youtube.com/watch?v=S8jMgBimuxg. YouTube [Video file, 5:22]. Accessed: 2017-04-18.

[75] Mathieu Garon, Pierre-Olivier Boulet, Jean-Philippe Doironz, Luc Beaulieu, and Jean-Francois Lalonde. Real-time high resolution 3D data on the HoloLens. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct). IEEE, sep 2016. doi: 10.1109/ismar-adjunct.2016.0073.

[76] David Kanter. Graphics processing requirements for enabling immersive VR. AMD White Paper, 2015.

[77] Dassault Falcon. 3D maintenance magic - virtual reality for immersive training, February 2017. URL https://www.youtube.com/watch?v=LzqR7JsVVoY. YouTube [Video file, 4:16]. Accessed: 2017-04-20.

[78] Dassault Falcon. How virtual reality tools train Dassault Falcon aircraft mechanics - AINtv, February 2017. URL https://www.youtube.com/watch?v=Yb4-ASQX1AQ. YouTube [Video file, 9:28]. Accessed: 2017-04-20.

[79] Photon. We make multiplayer simple, 2017. URL https://www.photonengine.com/en-US/Photon. Accessed: 2017-05-23.

[80] Gaffer On Games. What every programmer needs to know about game networking, 2010. URL http://gafferongames.com/networking-for-game-programmers/what-every-programmer-needs-to-know-about-game-networking/. Accessed: 2017-04-21.


[81] Microsoft. Shared holographic experiences, 2017. URL https://developer.microsoft.com/en-us/windows/mixed-reality/shared_holographic_experiences. Acessed:2017-04-24.

[82] Henry Chen, Austin S. Lee, Mark Swift, and John C. Tang. 3d collaboration method overHoloLens™ and skype™ end points. In Proceedings of the 3rd International Workshop onImmersive Media Experiences - ImmersiveME '15. ACM Press, 2015. doi: 10.1145/2814347.2814350.

[83] Road to VR. Meta 2 ar glasses - collaboration demonstration, March 2016. URL https://www.youtube.com/watch?v=7PZN-r_zGek. YouTube [Video file, 9:28] Accessed: 2017-04-24.

[84] Danakorn Nincarean Eh Phon, Mohamad Bilal Ali, and Noor Dayana Abd Halim. Collaborativeaugmented reality in education: A review. In 2014 International Conference on Teaching andLearning in Computing and Engineering. IEEE, apr 2014. doi: 10.1109/latice.2014.23.

[85] Samuel A. Miller, Noah J. Misch, and Aaron J. Dalton. Low-cost, portable, multi-wall virtualreality. In Proceedings of the 11th Eurographics Conference on Virtual Environments, EGVE’05,pages 9–14, Aire-la-Ville, Switzerland, Switzerland, 2005. Eurographics Association. ISBN3-905673-21-5. doi: 10.2312/EGVE/IPT_EGVE2005/009-014.

[86] Muhanna A. Muhanna. Virtual reality and the CAVE: Taxonomy, interaction challenges and research directions. Journal of King Saud University - Computer and Information Sciences, 27(3):344–361, jul 2015. doi: 10.1016/j.jksuci.2014.03.023.

[87] Siddhesh Manjrekar, Shubhrika Sandilya, Deesha Bhosale, Sravanthi Kanchi, Adwait Pitkar, and Mayur Gondhalekar. CAVE: An emerging immersive technology – a review. In 2014 UKSim-AMSS 16th International Conference on Computer Modelling and Simulation. IEEE, mar 2014. doi: 10.1109/uksim.2014.20.

[88] Oliver Kreylos. Hacking the oculus rift dk2, part ii, 2014. URL http://doc-ok.org/?p=1124. Accessed: 2017-04-25.

[89] AMD Advanced Micro Devices. A path to truly immersive virtual reality (vr), July 2015. URL http://developer.amd.com/wordpress/media/2015/07/A-path-to-truly-immersive-VR_Final_legally_approved.jpg. Accessed: 2017-06-06.

[90] Oliver Kreylos. Optical properties of current vr hmds, 2016. URL http://doc-ok.org/?p=1414. Accessed: 2017-06-16.

[91] Techradar. This is how microsoft’s hololens will address its biggest flaw, 2016. URL http://www.techradar.com/news/wearables/this-is-how-microsoft-s-hololens-will-address-its-biggest-flaw-1322596. Accessed: 2017-04-24.

[92] VBandi. Hololens vs meta 2, 2016. URL https://vbandi.net/2016/03/04/hololens-vs-meta-2/. Accessed: 2017-04-24.

[93] Oliver Kreylos. Hololens and field of view in augmented reality, 2015. URL http://doc-ok.org/?p=1274. Accessed: 2017-04-26.


[94] Wareable. Meta 2 first impressions: Ar feels closer than ever, 2016. URL https://www.wareable.com/ar/meta-2-review. Accessed: 2017-04-24.

[95] Ming Li, Katrin Arning, Luisa Vervier, Martina Ziefle, and Leif Kobbelt. Influence of temporal delay and display update rate in an augmented reality application scenario. In Proceedings of the 14th International Conference on Mobile and Ubiquitous Multimedia - MUM '15. ACM Press, 2015. doi: 10.1145/2836041.2836070.

[96] Takayuki Miura, Akihito Yoshii, and Tatsuo Nakajima. Designing affordances for virtual reality-based services with natural user interaction. In Design, User Experience, and Usability: Technological Contexts, pages 266–277. Springer International Publishing, 2016. doi: 10.1007/978-3-319-40406-6_25.

[97] Tobii Technology. Towards immersive virtual reality - why eye tracking is the natural next step for vr, 2016. URL https://www.tobii.com/siteassets/tobii-tech/vr/tobii-whitepaper-eye-tracking-next-natural-step-for-vr.pdf/?v=1. Accessed: 2017-06-09.

[98] Tobii Technology. Tobii eyetracking - an introduction to eye tracking and tobii eye trackers, 2010. URL http://www.acuity-ets.com/downloads/Tobii%20Eye%20Tracking%20Introduction%20Whitepaper.pdf. Accessed: 2017-06-09.

[99] Heartwood3d. Virtual reality (vr) operations & maintenance training, November 2016. URL https://www.youtube.com/watch?v=4NCo9rRUniU. YouTube [Video file, 3:41]. Accessed: 2017-04-21.

[100] MechaTraining LLC. Virtual reality lathe by mechatraining llc, March 2017. URL https://www.youtube.com/watch?v=HRBuSwkiv7o. YouTube [Video file, 3:10]. Accessed: 2017-04-22.

[101] MechaTraining LLC. Virtual reality motor maintenance by mechatraining llc, February 2017. URL https://www.youtube.com/watch?v=dq2RSlslQcU. YouTube [Video file, 4:18]. Accessed: 2017-04-22.

[102] Tengo Interactive. Tengo interactive vr training solutions for russian railways, October 2016. URL https://www.youtube.com/watch?v=_10mU0NES5U. YouTube [Video file, 3:23]. Accessed: 2017-05-21.

[103] Hewlett Packard (HP). Vr room-scale setup – htc vive, 2017. URL http://www8.hp.com/h20195/v2/GetPDF.aspx/4AA6-9648ENW.pdf. Accessed: 2017-06-16.

[104] Unity. Unity user manual (5.5), 2017. URL https://docs.unity3d.com/550/Documentation/Manual/UnityManual.html. Accessed: 2017-05-22.

[105] Blender, 2017. URL https://www.blender.org/. Accessed: 2017-06-02.

[106] fs-freeware.net, 2017. URL https://www.fs-freeware.net/. Accessed: 2017-05-25.

[107] SceneryDesign.org. Model converter x, 2017. URL https://www.scenerydesign.org/modelconverterx/. Accessed: 2017-05-25.


[108] 3DWarehouse. 3dwarehouse webpage, 2017. URL https://3dwarehouse.sketchup.com/. Accessed: 2017-05-25.

[109] Erik Fagerholt and Magnus Lorentzon. Beyond the hud - user interfaces for increased player immersion in fps games. Master's thesis, Chalmers University of Technology, 2009.

[110] Explorations in vr, 2017. URL https://developer.leapmotion.com/explorations#110.Accessed: 2017-06-10.


A Setting up Unity with HTC Vive and Leap Motion

Windows

1. Download

• Unity 5.5.0 from http://unity3d.com/get-unity/download/archive

• Leap Motion Orion from http://developer.leapmotion.com/get-started

• Leap Motion Unity Core Assets from http://developer.leapmotion.com/unity

• Leap Motion Attachments Module

• Leap Motion UI Input Module

2. Install Unity and Leap Motion Orion

3. Start Unity, create a new 3D project

4. Import the downloaded Leap Motion packages

5. In Asset Store, download and import SteamVR plugin

6. In Project window, navigate to Assets/LeapMotion/ and drag Leap_Hands_Demo_VR into the scene

7. Delete the default Untitled Scene

8. In the scene, select object LMHeadMountedRig → CenterEyeAnchor, go to the Inspector window and set the following component values:

• Camera component

– Clipping Planes: (Near: 0.05 - Far: 1000)

– Depth: -1

– Field of View: 60

9. Additionally, on the CenterEyeAnchor object, add the SteamVR_Camera script

10. In the scene, select LMHeadMountedRig → HandModels. In Project window, navigate to Assets/LeapMotionModules/Attachments/Prefabs/

11. Drag the HandAttachments prefabs to the HandModels object in the scene

12. In the scene, select LMHeadMountedRig → CenterEyeAnchor → LeapSpace → LeapHandController and, in the Inspector window, increase the Hand Pool Model size by 1

13. Change the new Group Name that appeared to "Attachments_Hands" or something else more appropriate

14. Additionally, assign the new HandAttachments hands into the Left and Right model references

15. In Project window, navigate to Assets/LeapMotionModules/UIInput/Prefabs/ and drag LeapEventSystem into the scene


16. In the scene, select LeapEventSystem and, in the Inspector window, under the Leap Input Module component, assign the only LeapHandController that exists in the project

17. Save the scene as a new scene and the setup is done. The project should now look like figure A.1 shows. (A scripted alternative to steps 8 and 9 is sketched after the figure.)

Figure A.1: View of finished project setup of VR camera and Leap Motion modules
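
The Inspector settings in steps 8 and 9 can also be applied from code. The following C# sketch is a hypothetical helper (the class name CenterEyeAnchorSetup is our own and is not part of any of the downloaded packages) that could be attached to the CenterEyeAnchor object. It assumes Unity 5.5 with the SteamVR plugin imported, so that the SteamVR_Camera component is available:

using UnityEngine;

// Hypothetical helper script: applies the Camera values from step 8 and
// adds the SteamVR_Camera script from step 9 when the scene starts.
public class CenterEyeAnchorSetup : MonoBehaviour
{
    void Awake()
    {
        // Step 8: Camera component values on CenterEyeAnchor
        Camera cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.05f;  // Clipping Planes: Near
        cam.farClipPlane = 1000f;   // Clipping Planes: Far
        cam.depth = -1f;
        cam.fieldOfView = 60f;

        // Step 9: add the SteamVR camera script if it is not already present
        if (GetComponent<SteamVR_Camera>() == null)
        {
            gameObject.AddComponent<SteamVR_Camera>();
        }
    }
}

Either way, the resulting Inspector values should match those listed in step 8; the script is only a convenience for recreating the setup in new scenes.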


B Prototype System Diagram

Figure B.1: Diagram showing an overview of the used devices and the process of the prototype


C Demonstration of Implementation

C.1 Menu Interaction

Clicking on the image in this section will open a web browser playing a video. The video shows the interaction with the menu in real time while using the HTC Vive and Leap Motion. To the top-left of the video is another video showing one of the authors performing the actions. Further information about each feature is described in section 4.2.2.3 on page 45. (Keep in mind when viewing the video that watching on a flat screen gives a different experience than using an HMD.)

C.2 Refuel Procedure

The image in this section opens a web browser playing a video of the refuel procedure training using the HTC Vive and Leap Motion. The steps are described in detail in Table 4.1 on page 40. To the top-left of the video, one of the authors performs the actions needed for the steps, and to the top-right there is a top-down view of the room and the tracked object in it, the HMD, showing how it moves around as the author does. (Keep in mind when viewing the video that watching on a flat screen gives a different experience than using an HMD.)
