
Modelling a Real-Time Multi-Sensor Fusion-Based Navigation

System for Indoor Human-Following Companion Robots

Mark Tee Kit Tsun

A thesis submitted in fulfilment of the requirements for the degree of

Doctor of Philosophy

Performed at

Swinburne University of Technology 2018


ABSTRACT

Assistive robotics today is involved in a wide range of applications, including studies that aid children with cognitive disabilities, disabled patients and the elderly. However, it is difficult to acquire any commercially available robotic solution that universally caters to the Assistive Technology needs of domestic households. People burdened with the care of their disabled family members have limited access to Commercial Off-the-Shelf (COTS) or easily developed companion robots that can assist in carrying out their responsibilities while allowing caregivers to perform Activities of Daily Living (ADL) in the meantime. This research investigated this problem and attempted to formulate a solution by proposing a companion robot planning template that emphasizes Object-Oriented mapping of functionalities to standalone solution modules built using COTS components. However, the findings indicated an absence of readily available, homogeneous human-following capability for companion robots. This problem is further refined into two navigational challenges: the need for an effective autonomous indoor navigation method and a reliable human tracking method. The research proceeded to explore existing works that help influence the formation of a possible solution to these challenges. The resultant solution is a robot navigation model based on an adapted Potential Field Method and multi-sensor fusion (motion capture, raw depth and proximity array), providing range-free pathfinding decisions that consider the primary target's relative position, immediate surroundings and mid-to-long-range depth profiles of the environment. This model was implemented as a robot control system using Microsoft Robotics Developer Studio (MRDS) and tested via the Visual Simulation Environment (VSE). A total of seven functional tests, a local minima recreation, and three performance benchmark scenarios were created to observe the effectiveness of this system. The results show that the system satisfactorily passed all functional tests and exceeded the performance of the projected benchmark studies by 28.85%. This indicates that the solution model presented in this research has clear potential to contribute towards making indoor companion robots a common item in future households.


PUBLICATIONS PRODUCED THROUGHOUT THE RESEARCH

Tee, MKT, Lau, BT & Siswoyo, HJ 2018b, ‘An Improved Indoor Robot Human-Following Navigation Model Using Depth Camera, Active IR Marker and Proximity Sensors Fusion’, Robotics, vol. 7, no. 1, Multidisciplinary Digital Publishing Institute, p. 4, viewed 24 February 2018, <http://www.mdpi.com/2218-6581/7/1/4>.

Tee, MKT, Lau, BT & Siswoyo, HJ 2018c, ‘Exploring the Performance of a Sensor-Fusion-based Navigation System for Human Following Companion Robots’, International Journal of Mechanical Engineering and Robotics Research (IJMERR).

Tee, MKT, Lau, BT & Siswoyo, HJ 2014, ‘Exploring the Possibility of Companion Robots for Injury Prevention for People with Disabilities’, The 19th International Conference on Transformative Science & Engineering, Business & Social Innovation (SDPS 2014), Kuching, Sarawak, Malaysia, pp. 199–210.

Tee, MKT, Lau, BT & Siswoyo, HJ 2017, ‘Pathfinding decision-making using proximity sensors, depth camera and active IR marker tracking data fusion for human following companion robot’, ACM International Conference Proceeding Series.

Tee, MKT, Lau, BT, Siswoyo, HJ & Lau, SL 2015, ‘A Human Orientation Tracking System using Template Matching and Active Infrared Marker’, 2015 International Conference on Smart Sensors and Application (ICSSA 2015), Kuala Lumpur, Malaysia.

Tee, MKT, Lau, BT, Siswoyo, HJ & Lau, SL 2016, ‘Potential of Human Tracking in Assistive Technologies for Children with Cognitive Disabilities’, Supporting the Education of Children with Autism Spectrum Disorders, IGI Global, pp. 245–247, viewed <https://books.google.com/books?hl=en&lr=&id=hxwRDQAAQBAJ&oi=fnd&pg=PA245&dq=potential+of+human+tracking+in+assistive+technologies+for+children+with+cognitive+disabilities&ots=Tns3PcOKbt&sig=Ndby64oOnd7mKypAVSskjXiurhs>.

Tee, MKT, Lau, BT, Siswoyo, HJ & Then, PHH 2015, ‘Robotics for Assisting Children with Physical and Cognitive Disabilities’, in LB Theng (ed.), Assistive Technologies for Physical and Cognitive Disabilities, IGI Global, pp. 78–120, viewed 20 February 2015, <http://www.igi-global.com/chapter/robotics-for-assisting-children-with-physical-and-cognitive-disabilities/122905>.

Tee, MKT, Lau, BT, Siswoyo, HJ & Wong, DML 2016, ‘Integrating Visual Gestures for Activity Tracking in the Injury Mitigation Strategy using CARMI’, RESKO Technical Conference 2016: The 2nd Asian Meeting on Rehabilitation Engineering and Assistive Technology (AMoRE AT), Rehabilitation Engineering & Assistive Technology Society of Korea (RESKO), Goyang, Korea, pp. 61–62.

Tee, MKT, Lau, BT, Siswoyo Jo, H & Lau, SL 2016, ‘Proposing a Sensor Fusion Technique Utilizing Depth and Ranging Sensors for Combined Human Following and Indoor Robot Navigation’, Proceedings of the Fifth International Conference on Network, Communication and Computing (ICNCC 2016), ACM Press, New York, USA, pp. 331–335, viewed 20 June 2017, <http://dl.acm.org/citation.cfm?doid=3033288.3033345>.


ACKNOWLEDGEMENT

The author owes his deepest gratitude to Associate Professors Dr. Lau Bee Theng and

Dr. Lau Sian Lun as well as Dr. Hudyjaya Siswoyo Jo for their vigilant guidance and

support, without whom this research would not have been possible.

Special thanks go to Dr. Riady Siswoyo Jo for his unparalleled advice and suggestions

during inception of the solution model.

The author is indebted to the Research & Consultancy Office, Faculty of Engineering,

Computing & Science, his fellow postgraduate candidate colleagues, lab technicians and

staff at Swinburne University of Technology Sarawak (SUTS) for their countless

assistance and moral support.

Finally, the author wishes to thank his friends and family for their company throughout

this journey.

This work is dedicated to the members of the Robotics & Automation Club and IEEE

Student Chapter at SUTS in hopes that they never forget to reach out towards the sky.


DECLARATION

This thesis contains no material which has been accepted for the award of any other

degree or diploma in any university, and to the best of my knowledge contains no material

previously published or written by another person, except where due reference is made

in the text of the thesis. Work based on joint research or publications in this thesis fully

acknowledges the relative contributions of the respective authors or workers.

Signature : ____________________

Name : Mark Tee Kit Tsun

Date : 31st May 2018


TABLE OF CONTENTS

Abstract ............................................................................................................................ii

Publications Produced Throughout The Research .......................................................... iii

Acknowledgement ........................................................................................................... v

Declaration ......................................................................................................................vi

Table of Contents ........................................................................................................... vii

List of Figures ..................................................................................................................xi

List of Tables ..................................................................................................................xv

Nomenclature ............................................................................................................... xvii

Chapter 1: Introduction .............................................................................................. 1

1.1 Research Background ........................................................................................ 1

1.2 Research Problems ............................................................................................ 5

1.3 Research Aim and Objectives ............................................................................ 6

1.4 Research Scope ................................................................................................. 7

1.5 Thesis Organization ........................................................................................... 9

Chapter 2: Assistive Companion Robotics and their Challenges ............................. 11

2.1 Assistive Robotics in Preventing Injuries .......................................................... 11

2.2 Assistive Robotics in Physical Rehabilitation for Cerebral Palsy ...................... 15

2.3 Assistive Robotics in Social Interaction Rehabilitation ..................................... 17

2.3.1 Affinity to Robotic Interaction Companions ................................................ 18

2.3.2 Robot-Mediated Social Communication Treatment ................................... 20

2.3.3 Robot-Mediated Active Play ...................................................................... 25

2.3.4 Robotics in Preventive Intervention ........................................................... 28

2.4 Life-Long Assistive Robotics for Cognitively Disabled Children ....................... 30


2.5 Identifying Companion Robots and Their Challenges ...................................... 33

2.6 Proposed Indoor Companion Robot Planning Template .................................. 36

2.7 Case Study: CARMI ......................................................................................... 39

2.8 Chapter Summary ............................................................................................ 43

Chapter 3: Indoor Robot Navigation and Human Following ..................................... 46

3.1 Challenge 1: Indoor Navigation ........................................................................ 46

3.1.1 Location-Specific Issues ............................................................................ 47

3.1.2 Autonomous Wayfinding Dilemmas ........................................................... 48

3.1.3 Sensory and Actuation Hardware Dilemmas ............................................. 51

3.1.4 Overview of Possible Autonomous Navigation Solutions .......................... 54

3.2 Challenge 2: Human Tracking .......................................................................... 58

3.2.2 Self-Localization ........................................................................................ 59

3.2.3 Body Tracking ............................................................................................ 65

3.2.4 Biomonitoring ............................................................................................. 69

3.2.5 Examples of Human Tracking Technologies Fusion in Existing Research 71

3.3 Proposed Combined Human Tracking and Indoor Navigation Solution ........... 73

3.4 Chapter Summary ............................................................................................ 75

Chapter 4: Design and Prototyping of the Multi-Sensor Fusion-Based Navigation Model ........ 77

4.1 Introduction ............................................................................................... 77

4.2 Human Orientation Tracking using an Active InfraRed (IR) Marker ................. 81

4.3 Sensor-Fusion Based Robot Navigation Model................................................ 86

4.3.1 Identification and Locking of Primary Subject ............................................ 87

4.3.2 Pathfinding and Obstacle Avoidance ......................................................... 94


4.4 Example Model Application ............................................................................ 113

4.5 Robot Control Prototype Implementation ....................................................... 117

4.5.1 Use Case Model ...................................................................................... 118

4.5.2 Activity Model .......................................................................................... 123

4.5.3 Software Structure Model ........................................................................ 126

4.5.4 Development of the RobotBase Prototype Platform ................................ 128

4.5.5 Development of the Human Activity Tracking System ............................. 131

4.5.6 Navigation System Implementation using MRDS .................................... 134

4.5.7 CARMI Navigation System State Machine .............................................. 136

4.6 Chapter Summary .......................................................................................... 140

Chapter 5: Testing and Benchmarking Results ...................................................... 142

5.1 Introduction .................................................................................................... 142

5.2 Functional Testing Plan .................................................................................. 143

5.3 Functional Testing Simulation Results ........................................................... 147

5.3.1 Scenario: Single Uniform Obstruction ...................................................... 148

5.3.2 Scenario: Uniform Obstruction with Scattered Obstacles on the Left ...... 151

5.3.3 Scenario: Uniform Obstruction with Scattered Obstacles on the Right .... 153

5.3.4 Scenario: Single Non-Uniform Obstruction .............................................. 156

5.3.5 Scenario: Non-Uniform Obstruction with Scattered Obstacles on the Left ......... 159

5.3.6 Scenario: Non-Uniform Obstruction with Scattered Obstacles on the Right ....... 162

5.3.7 Functional Testing Simulation Findings and Discussion .......................... 167

5.4 Existing Indoor Robot Navigation Studies and Benchmark Scenarios Selection ...... 171

5.4.1 Benchmark Study 1 ................................................................................. 171


5.4.2 Benchmark Study 2 ................................................................................. 173

5.4.3 Benchmark Study 3 ................................................................................. 174

5.5 Performance Benchmark Scenario Design and Results ................................ 175

5.5.1 Benchmark 1 Simulation Results ............................................................. 177

5.5.2 Benchmark 2 Simulation Results ............................................................. 179

5.5.3 Benchmark 3 Simulation Results ............................................................. 182

5.6 Chapter Summary .......................................................................................... 186

Chapter 6: Conclusion ........................................................................................... 187

6.1 Introduction .................................................................................................... 187

6.2 Contributions .................................................................................................. 187

6.2.1 Identification of Navigational Challenges for Indoor Companion Robots . 187

6.2.2 Design of a Novel Indoor Robot Navigation Model to Perform Real-time

Human-following and Obstacle Avoidance ........................................................... 190

6.2.3 Evaluation of the Effectiveness of the Proposed Navigation Model in Indoor

Human-following and Obstacle Avoidance ........................................................... 196

6.3 Limitations and Future Work .......................................................................... 198

6.4 Research Summary ....................................................................................... 201

References .................................................................................................................. 204

Appendices ................................................................................................................. 219

Appendix A – Use Case Modelling .......................................................................... 219

Appendix B – Activity Model Flow Charts ................................................................ 224

Appendix C - Custom Structural Schematics ........................................................... 228

Appendix D – Wiring Diagram ................................................................................. 236

Appendix E – Functional Testing Scenarios Simulation Results .............................. 237

Appendix F – Benchmark Scenarios Simulation Results ......................................... 311


LIST OF FIGURES

Figure 2-1: G-EO System by Reha Technology AG (Reha Technology AG 2012). ...... 17

Figure 2-2: The HapticMaster end effector device (Delft Haptics Lab 2018). ................ 17

Figure 2-3: Triadic Interactions model (Colton, Ricks & Goodrich 2009). ...................... 19

Figure 2-4: KASPAR (Wood et al. 2013). ...................................................................... 19

Figure 2-5: Tito (Michaud et al. 2006). .......................................................................... 23

Figure 2-6: NAO H25 features diagram (Aldebaran Robotics 2014). ............................ 23

Figure 2-7: Scenario 3 transpiring between both patients and the facilitating robot (Costa

et al. 2010). ................................................................................................................... 25

Figure 2-8: Neuronics Katana 6M180 (Trevor, Howard & Kemp 2009). ........................ 26

Figure 2-9: The IROMEC (Ferrari, Robins & Dautenhahn 2009). .................................. 28

Figure 2-10: Roball (Trevor, Howard & Kemp 2009). .................................................... 28

Figure 2-11: The MATS robot (Balaguer & Gimenez 2006). ......................................... 35

Figure 2-12: Companion robot Planning Template intended for use with cognitively

disabled children. .......................................................................................................... 38

Figure 2-13: An early prototype of the CARMI robot as the planning template's proof of

concept. ......................................................................................................................... 40

Figure 2-14: The Robot-Based Injury Prevention Strategy. ........................................... 41

Figure 4-1: Microsoft Kinect documentation of hardware limitation. (Microsoft Corporation

2013) ............................................................................................................................. 81

Figure 4-2: First prototype of the active IR marker. The vest is equipped with hook & loop

strips that allow the IR modules to be mounted in a variety of patterns. An example of a

pattern as perceived by the camera is shown. (Tee, Lau, Siswoyo Jo & Lau 2015) ..... 82

Figure 4-3: IR Active Marker preconditioning process. (Tee, Lau, Siswoyo Jo & Lau 2015)

...................................................................................................................................... 83


Figure 4-4: Example of an orientation pattern data set. (Tee, Lau, Siswoyo Jo & Lau 2015)

...................................................................................................................................... 83

Figure 4-5: Calibration rig for the active IR marker and camera. (Tee, Lau, Siswoyo Jo &

Lau 2015) ...................................................................................................................... 83

Figure 4-6: Illustration of the hardware detection zone performance, relative to orientation

tracking. (Tee, Lau, Siswoyo Jo & Lau 2015) ................................................................ 86

Figure 4-7: The depth sensor's camera space illustration. ............................................ 88

Figure 4-8: Illustration of the Active Marker IR Camera's view space. .......................... 89

Figure 4-9: Wandering Standpoint Algorithm. ............................................................... 95

Figure 4-10: Detection zones for an array of ultrasonic sensors on a robot. ................. 96

Figure 4-11: Simple visualization of the ranging sensor array, S. ................................. 97

Figure 4-12: Template of a depth map. ......................................................................... 99

Figure 4-13: Example of a depth image frame. ............................................................. 99

Figure 4-14: Illustration of the transformation problem from the Vertical to Horizontal

Plane. .......................................................................................................................... 101

Figure 4-15: Illustration of the Potential Field Method. (a) An overhead depiction of a

potential field. (b) The same field reimagined as a contoured slope. (Bräunl 2006) .... 102

Figure 4-16: Only the obstructions and target within field of view is considered when

deciding which direction to take. (Tee, Lau, Siswoyo Jo & Lau 2016) ......................... 104

Figure 4-17: Illustration of the transformed depth map into horizontal form. ............... 105

Figure 4-18: Revisited depth sensor camera view with top and bottom trims. ............. 107

Figure 4-19: Example of populating the Target Location array. ................................... 108

Figure 4-20: Example of the Bias and Sensor arrays alignment process. ................... 111

Figure 4-21: Example Model Application 1 – No Obstruction. ..................................... 114

Figure 4-22: Example Model Application 2 – Single Primary Obstruction. .................. 115

Figure 4-23: Example Model Application 3 – Obstruction and Clutter Encounter. ....... 116

Figure 4-24: Use Case model illustrating the general functions of the required robot avatar.

.................................................................................................................................... 118

Figure 4-25: The interaction model for the Injury Prevention Telepresence System. .. 119

Figure 4-26: Main execution loop of the Injury Prevention Telepresence System. ...... 123

Figure 4-27: Expression of the "Monitor Activity" subsystem. ..................................... 124


Figure 4-28: ANIMA System Structure. ....................................................................... 127

Figure 4-29: RobotBase development montage. (a) Drive and Controller assembly. (b)

Reworked mounting and custom power distribution. (c) Completed ultrasonic sensor array.

(d) 3D printed housings for the robot head. (e) Completed actuated turn-table mechanism.

.................................................................................................................................... 128

Figure 4-30: Electronics component block diagram. ................................................... 129

Figure 4-31: The completed version 1 of the RobotBase. ........................................... 130

Figure 4-32: Example of Visual Gestures sampling for identifying injurious actions. (Tee,

Lau, Siswoyo Jo & Wong 2016) .................................................................................. 132

Figure 4-33: Functionality exhibition during PECIPTA 2015.(Borneo Post Online 2016)

.................................................................................................................................... 133

Figure 4-34: Illustration of a standard telepresence robot services structure in MRDS.

(Microsoft 2012b) ........................................................................................................ 135

Figure 4-35: The CARMI navigation system state machine. ....................................... 138

Figure 4-36: Overview of the custom-build services for CARMI to be simulated using

MRDS and VSE. ......................................................................................................... 139

Figure 4-37: Example of the simulated CARMI and Child entities in VSE. .................. 140

Figure 5-1: Typical indoor environment simulated with VSE (Microsoft Corporation 2012).

.................................................................................................................................... 143

Figure 5-2: Classification of indoor obstacles. (Tee, Lau, Siswoyo Jo & Lau 2016) .... 144

Figure 5-3: A baseline and six obstacle scenarios. (Tee, Lau & Siswoyo Jo 2018a) .. 145

Figure 5-4: Baseline Scenario Sample. Unit for X-Z world coordinates in meters (m). 146

Figure 5-5: Combined CARMI paths during single uniform obstruction test. ............... 148

Figure 5-6: Combined motion graph and sample dataset for Uniform Obstruction with

Leftward Scatter. Unit for X-Z world coordinates in meters (m). .................................. 152

Figure 5-7: Combined motion graph and sample dataset for Uniform Obstruction with

Rightward Scatter. Unit for X-Z world coordinates in meters (m). ............................... 155

Figure 5-8: Combined CARMI paths during the single non-uniform obstruction scenario,

along with plot digitization of the alternate route and their approximated travel distance

comparisons. ............................................................................................................... 156


Figure 5-9: Combined CARMI paths during the non-uniform obstruction scenario with left

scatter field, along with plot digitization of the alternate route and their approximated travel

distance comparisons. ................................................................................................. 160

Figure 5-10: Combined CARMI paths during the non-uniform obstruction scenario with

right scatter field, along with plot digitization of the alternate route and their approximated

travel distance comparisons. ....................................................................................... 163

Figure 5-11: CARMI motion path logged during the Local Minima Problem recreation

scenario. ...................................................................................................................... 170

Figure 5-12: Graphical test results of the Multimodal Person-Following System. (Pang,

Seet & Yao 2013) ........................................................................................................ 172

Figure 5-13: Experimental pathfinding test results of the Meemo robot for tracking a

person in a gathering. (Harada et al. 2017) ................................................................. 173

Figure 5-14: Experimental pathfinding results of the fuzzy logic controller's performance

simulation. (Montaner & Ramirez-Serrano 1998) ........................................................ 175

Figure 5-15: Attempted replication of benchmark testing environments for performance

measurement. ............................................................................................................. 176

Figure 5-16: Plot digitization of the Human motion path from the Multimodal Telepresence

Robot project. (Pang, Seet & Yao 2013) .................................................................... 177

Figure 5-17: A graphical path result from a selected runtime sample of Benchmark 1

performance tests........................................................................................................ 178

Figure 5-18: Plot digitization of the Robot motion path from the Human-Following in

Crowded Environment project. (Harada et al. 2017) ................................................... 180

Figure 5-19: A graphical path result from a selected runtime sample of Benchmark 2

performance tests........................................................................................................ 181

Figure 5-20 Plot digitization of the Robot motion path from the Fuzzy Knowledge-based

Controller project. (Montaner & Ramirez-Serrano 1998) ............................................. 183

Figure 5-21 A graphical path result from a selected runtime sample of Benchmark 3

performance tests........................................................................................................ 184


LIST OF TABLES

Table 4-1: Initial hardware plan for CARMI (Tee, Lau, Siswoyo Jo & Then 2015) ........ 78

Table 4-2: Initial software plan for CARMI (Tee, Lau, Siswoyo Jo & Then 2015) .......... 79

Table 4-3: Injurious gesture detection performance (Tee, Lau, Siswoyo Jo & Wong 2016).

...................................................................................................................................... 80

Table 4-4: Performance results of the human-orientation tracking system. (Tee, Lau,

Siswoyo Jo & Lau 2015) ................................................................................................ 84

Table 4-5: Example of raw distance input from ultrasonic sensors array. ..................... 96

Table 4-6: Adjusted example of ultrasonic sensors feedback. ...................................... 96

Table 4-7: Summary of generated Use Case stories................................................... 120

Table 4-8: Use Case story encompassing the autonomous monitoring function. ........ 120

Table 5-1: Excerpt from the combined logs of Uniform-Obstruction-Clear Sample 1. Unit

for X-Z world coordinates in meters (m). ..................................................................... 149

Table 5-2: Elapsed time for each sample in Uniform-Obstruction-Clear. .................... 150

Table 5-3: Elapsed time for each sample in Uniform-Obstruction-with-Left-Scatter. ... 153

Table 5-4: Elapsed time for each sample in Uniform-Obstruction-with-Right-Scatter. . 154

Table 5-5: A runtime sample excerpt from the combined logs of the Single Non-Uniform-

Obstruction scenario. Unit for X-Z world coordinates in meters (m). ........................... 157

Table 5-6: Elapsed time for each sample in Single Non-Uniform-Obstruction scenario.

.................................................................................................................................... 158

Table 5-7: A runtime sample excerpt from the combined logs of the Non-Uniform-

Obstruction-with-Left-Scatter-Field scenario. Unit for X-Z world coordinates in meters (m).

.................................................................................................................................... 161

Table 5-8: Elapsed time for each sample in the Non-Uniform-Obstruction-with-Left-

Scatter-Field scenario. ................................................................................................ 162

Table 5-9: A runtime sample excerpt from the combined logs of the Non-Uniform-

Obstruction-with-Right-Scatter-Field scenario. Unit for X-Z world coordinates in meters

(m). .............................................................................................................................. 164


Table 5-10: Elapsed time for each sample in the Non-Uniform-Obstruction-with-Right-

Scatter-Field scenario. ................................................................................................ 167

Table 5-11: Plot digitized travel distances for both Robot and Child entities for all samples

of Benchmark 1 performance tests. Travel distances are measured in meters (m). ... 179

Table 5-12: Tabulated calculations of Robot-To-Human travel distances for both

Benchmark 1 study and simulation results. ................................................................. 179

Table 5-13: Plot digitized travel distances for both Robot and Child entities for all samples

of Benchmark 2 performance tests.............................................................................. 182

Table 5-14: Tabulated calculations of Robot-To-Human travel distances for both

Benchmark 2 study and simulation results. ................................................................. 182

Table 5-15: Plot digitized travel distances for both Robot and Child entities for all samples

of Benchmark 3 performance tests.............................................................................. 185

Table 5-16: Tabulated calculations of Robot-To-Human travel distances for both

Benchmark 3 study and simulation results. ................................................................. 185


NOMENCLATURE

° Degrees

AAL Ambient Assisted Living

ADL Activities of Daily Living

AIRMT Active IR Marker Tracking

ANIMA Autonomous Injury Mitigation Avatar

API Application Programming Interfaces

ASD Autism Spectrum Disorder

CARMI Companion Avatar Robot for Mitigation of Injuries

CCR Concurrency and Coordination Runtime

COTS Commercial Off-The-Shelf

CP Cerebral Palsy

CWA Collaborative Wheelchair Assistant

DOF Degrees of Freedom

DSS Decentralized Software Services

EEG Electroencephalography

EMG Electromyography

EOG Electrooculography

ESM Experience Sampling Method

FOV Field of View

GPS Global Positioning System

HAL Hybrid Assistive Limb

IMU Inertial Measurement Unit

IoT Internet of Things

IQ Intelligence Quotient

IR InfraRed

IROMEC Interactive Robotic Social Mediators as Companions

LED Light Emitting Diode

LiDAR Light Detection and Ranging

LOS Line of Sight


LRF Laser Range Finder

m Meters

MATS Mechatronic Assistive Technology System

MEMS Micro-Electro-Mechanical Systems

MMDB Multimodal Dyadic Behaviour

MRDS Microsoft Robotics Developer Studio

PFM Potential Field Method

PIR Pyroelectric InfraRed

QOL Quality of Life

QR Quick Response

RF Radio Frequency

RGB-D Red Green Blue Depth

ROI Region of Interest

RSS Received Signal Strength

SDK Software Development Kit

SLAM Simultaneous Localization and Mapping

TB Turning Bias

TEBRA Teeth Brushing Assistance

VFH Virtual Force Histogram

VPL Visual Programming Language

VR Virtual Reality

VSE Visual Simulation Environment

WSA Wandering Standpoint Algorithm


Chapter 1: INTRODUCTION

This chapter introduces the area of application that the research intends to contribute to. It encompasses an initial literature survey of Assistive Robotics used for long-term accompaniment of people and seeks to identify the problems that prevent this technology from being widely available for consumer-level use. The identification of these problems leads to the establishment of the research questions and objectives, forming the aim of this study.

1.1 RESEARCH BACKGROUND

Assistive Technology is a field that has long been in service of the elderly, children and people with disabilities (Tee, Lau & Siswoyo 2014). One of its branches of study is Assistive Robotics, which has made significant strides in aiding children with cognitive disabilities. Originally predominant in augmenting therapies for children with Cerebral Palsy (CP) and Autism Spectrum Disorder (ASD), Assistive Robotics has seen a rising prevalence of companion robots designed to accompany and interact with these children (Jones, Trapp & Jones 2011; Shamsuddin et al. 2012, 2013). Their functions vary from reinforced therapeutic exercises and induced play to aiding parental monitoring and injury prevention (Cabibihan et al. 2013; Tee, Lau, Siswoyo & Then 2015). A large portion of the children's time is spent indoors, which presents navigational constraints to an autonomous robot that are commonly compensated for using costly and resource-consuming technologies such as Simultaneous Localization and Mapping (SLAM), Light Detection and Ranging (LiDAR), and more. This usually results in companion robots being out of reach for most middle- to low-income families.

There is yet to be a consumer-accessible companion robot platform that has widespread

acceptance. Over the last 40 years, robots have been widely circulated in consumer

markets from toys and industrial assemblers to intelligent household appliances. The

exponential growth of the Internet since the 1990s has also instigated the introduction of


robots as interactive telecommunications platforms, telepresence avatars in medical

practice and elaborate Internet-of-Things (IoT) applications (Pang, Seet & Yao 2013).

There has also been a multitude of experimental applications for companion robots in

elderly care centers and various augmented therapies for cognitively impaired children.

Studies have shown that children with impaired social interaction skills have higher affinity

towards robots, preferring a non-human partner for therapy sessions. Because of this,

therapists have used robots as puppet surrogates to better elicit responses from their

child patients. There is also a variety of experimental companion robots that are

designed to interact and play with the children as a means of disguising reinforced

exercises between therapy sessions. These studies are explored as part of this

research’s literature review.

Despite the large number of assistive robot applications for people with disabilities, there has yet to be any widely accepted robot platform that is accessible to the consumer market. Widespread accessibility can only be achieved if the platform is easily acquired or scratch-built from common components, those components can be locally procured in most parts of the world, and development resources are freely available.

There are a few possible reasons for this, including the fact that most assistive robot systems are developed for specific applications. Many of these systems serve to aid in a single therapeutic function (e.g. reinforcing social interaction exercises, acting as a voice-controlled avatar, etc.) and do not share common platform features (e.g. mobility between rooms or floors, interfaces with external networks, unsupervised operation, etc.).

Also, most of these robots are not developed beyond their prototyping stages because their purpose is to fulfil the requirements of their specific studies. The high hardware and computational costs of a full-featured human-following companion robot are not feasible for most research projects. Thus, the lack of common functionality, the absence of incentive for development beyond prototyping and the high costs of companion robots result in little demand for the creation of a platform that could potentially be refined for affordable consumer use.

This research aims to contribute towards improving the current state of companion robot

accessibility. To do this, an initial study to identify the nuances of developing and

implementing companion robots was carried out. The findings show that one of the major

challenges in this endeavor is autonomous navigation.

As a human companion, a robot has several criteria to fulfil. Its principal requirement is to perform human-following, an action that comprises tracking a human target and then repositioning itself in response. The simplest example of this action can be observed by physically tethering the robot to a human target using a rope. The tension on the rope acts as a leash, indicating the direction in which the robot must turn so that it continues to face its human target. Although this example is crude, it does help visualize the navigational challenges that the robot must face.

The first challenge is to know which direction to turn to so that the target remains in front

of the robot. A physical leash can provide this information in the form of vectored tension,

while a virtual tether will need to employ some form of vision or wireless solution to

achieve this.

The second challenge is to maintain a specific distance between the robot and the human

target. A physical tether solves this by introducing rope tension to indicate distance, but

the challenge becomes more difficult for a virtual one. If the robot moves too fast or too near

its target, a collision and injury could occur. Likewise, a robot that is too far away will lose

sight of the target and fail its escort function.

The third challenge is the act of reorientation and repositioning of the robot itself. Its

motion must be controlled, either via closed-loop control, visual servoing or equivalent

methods. The robot’s build and form must be considered during actuation to compensate


for motion deviations such as overshooting and drifts. These deviations impede the

human-following performance.

The final challenge is the ability to steer around obstacles in the operating environment.

The example of using a physical leash will most likely fail in this challenge, as the direct

tethering will cause the robot to turn and collide with objects between itself and the human target. A virtual leash allows the robot the freedom to employ a variety of maneuvering algorithms and methods, but the selection depends on what information and provisions are available to it.
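To make the first two challenges concrete, the following is a minimal sketch of a "virtual leash" control step for a differential-drive robot. It is illustrative only and not the implementation developed in this thesis; the sensor feed, function name and gains are assumptions made for the example.

```python
# Minimal "virtual leash" follow step (illustrative sketch, not the thesis system).
# Assumes a hypothetical sensor that reports the target's bearing (radians,
# 0 = straight ahead) and range (metres) each control cycle.

FOLLOW_DISTANCE = 1.5   # desired robot-to-target separation (m)
K_TURN = 1.2            # proportional gain on bearing error
K_DRIVE = 0.8           # proportional gain on range error
MAX_SPEED = 0.5         # speed cap (m/s) to avoid running into the target

def follow_step(bearing, distance):
    """Return (linear_velocity, angular_velocity) for one control cycle."""
    # Challenge 1: keep the target in front by turning towards its bearing.
    angular = K_TURN * bearing
    # Challenge 2: hold the follow distance; slow, stop or reverse when too close.
    linear = K_DRIVE * (distance - FOLLOW_DISTANCE)
    linear = max(-MAX_SPEED, min(MAX_SPEED, linear))
    return linear, angular
```

The sketch deliberately ignores the third and fourth challenges: it assumes the drive hardware executes the commanded velocities without deviation and that no obstacles lie between the robot and its target.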

For a companion robot to be effective, it must be able to overcome these challenges while simultaneously performing its specific function autonomously. While there may be suites of technologies to achieve this, their implementation costs may be infeasible, as explored further below.

While a physically leashed robot offers a basic visualization of how human-following is

performed, almost all companion robots operate wirelessly. The foundation of robot

navigation is in localization, which is the ability to identify and track the location of itself

and the target within an operating environment. This is accomplished through a

combination of embedded environments, smart wearables and/or vision-based

technologies. However, due to the severity of the compound challenges of indoor human-

following, these systems often incur high hardware and computational resource costs.

Embedded environments involve installation and establishment of sensor networks within

the operating environment. These sensors help supply location information to the robot,

in addition to other miscellaneous environmental data. Unfortunately, embedded

environments are costly to set up and are not portable.

Smart wearables are sensors and devices mounted on the human target that serve as

beacons or part of virtual tethers that a robot can be “leashed” to. Variations of this method


are portable and cost less than embedded environments but are mostly effective at

reporting the proximity of the target, rather than actual position. Wearable technologies

are best employed in combination with other sensory techniques.

Vision-based sensors emulate human perception, by taking photographic snapshots of

the environment and using image processing to identify entities and obstructions. This

branch of technology is currently the most popular method for target identification and

tracking, as well as robot navigation. Sensors such as Light Detection and Ranging

(LiDAR) return 360° depth maps of the robot’s surroundings, enabling easy localization.

The drawback of vision-based technologies is their susceptibility to environmental lighting

conditions and image quality of their sensor output. Better performance can only be

acquired from high-cost options.

Effective human-following benefits from the ability of self-localization to enable navigation

in a known environment. This is commonly achieved via established mapping techniques

such as Simultaneous Localization and Mapping (SLAM), which require the use of LiDAR,

embedded environments or their equivalent. Mapping of the operating environment is also

a prerequisite for most of the existing navigation methods, including the A* algorithm, the Potential Field Method (PFM) and the Wandering Standpoint Algorithm (WSA). The drawback

is that these methods require significant computational resources to accommodate path

searching and learning.
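For reference, the sketch below shows one step of a textbook Potential Field Method acting on known goal and obstacle positions: the robot is attracted towards the goal and repelled by nearby obstacles, and the resultant vector steers its next move. This generic form is shown for illustration only; it is not the adapted, range-free variant proposed later in this thesis, and all names and gains are assumptions.

```python
# One step of a textbook Potential Field Method (illustrative sketch only).
import math

K_ATTRACT = 1.0   # gain of the attractive force towards the goal
K_REPULSE = 0.5   # gain of the repulsive force from obstacles
INFLUENCE = 1.0   # obstacles farther than this (m) exert no repulsive force

def pfm_step(robot, goal, obstacles):
    """robot, goal: (x, y) positions; obstacles: list of (x, y). Returns a force vector."""
    # Attractive component: proportional to the vector from robot to goal.
    fx = K_ATTRACT * (goal[0] - robot[0])
    fy = K_ATTRACT * (goal[1] - robot[1])
    # Repulsive component: grows steeply as an obstacle gets closer.
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < INFLUENCE:
            push = K_REPULSE * (1.0 / d - 1.0 / INFLUENCE) / (d * d)
            fx += push * dx / d
            fy += push * dy / d
    return fx, fy  # steer the robot along this resultant vector
```

Even this simple formulation assumes that the goal and every obstacle are already expressed in a common coordinate frame, which is precisely what mapping or localization must supply, and the field must be re-evaluated every control cycle.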

To summarize, one of the main factors hampering widespread companion assistive robot accessibility is the complexity of overcoming autonomous navigation challenges, together with the high component costs of its current solutions.

1.2 RESEARCH PROBLEMS

The initial literature survey and review have resulted in formalizing the navigation issues

into two main Research Problems (RP):


RP1: Complex Autonomous Navigation Challenge

The ability to independently locate and move around a dynamic environment has been

the subject of intense research which resulted in surprisingly few universal solutions.

Machines lack the organic adaptability of animals and humans to easily maneuver

multiple kinds of terrain, thus making their development process very difficult and

complicated. This research focuses on indoor navigation because most companion robots operate indoors while accompanying their human users. There is a need for a simpler but homogeneous solution to general indoor pathfinding and simultaneous

human-following so that the burden of custom robot navigation can be lifted from the

implementation effort.

RP2: Human-Following Robot Techniques Require Costly Sensor Technologies

Current indoor localization technologies such as SLAM and LiDAR require the use of

rotary rangefinder lasers, advanced environmental imaging devices and embedded

sensor networks that are either expensive, unfeasible or unwieldy for a standalone mobile

robot. In addition, pathfinding algorithms and machine learning that can help in optimized

human-following and obstacle avoidance require substantial computational resources to

cope with the volume of real-time calculations. A fusion of low-cost or simple sensor

solutions with acceptable performance limitations may be viable for lowering the difficulty

bar for indie developers of companion assistive robots.

1.3 RESEARCH AIM AND OBJECTIVES

The investigation of the presented research problems is carried out through exploring the

following Research Questions (RQ):

RQ1: What are the navigational challenges for indoor companion robots?

RQ2: What available technologies are viable for solving the identified navigational challenges?

RQ3: How can a solution to the navigational challenges in indoor human-following and obstacle avoidance be modelled?


The aim of this research is to model a robot navigation solution that relies on multi-sensor

fusion for real-time human-following and obstacle avoidance and gauge the effectiveness

of the model. Therefore, this aim is achieved through the following objectives:

RO1: To identify the challenges in navigation systems for indoor companion robots.

The obstacles and difficulties faced in autonomous indoor navigation need to be studied and encapsulated into an environment model with which navigation systems can be tested. This research aims to use this study to identify any possibility of abstracting the

perception-circumvention process to reduce the complexity of RP1.

RO2: To design a novel indoor robot navigation model to perform real-time human-following and obstacle avoidance.

By abstracting the operating environment, a navigation strategy needs to be formulated that capitalizes on the perceived environmental attributes. The resultant model will then provide a blueprint for implementing a robot control solution that emphasizes alternative sensory solutions that can overcome the high costs identified in RP2.

RO3: To evaluate the effectiveness of the proposed navigation model in indoor human following and obstacle avoidance.

The robot control solution can then be gauged for validity by means of simulations against a set of isolated challenges identified in RO1. System performance can be measured

by comparing it to existing robot navigation studies with reproducible test data.

1.4 RESEARCH SCOPE

This research focuses on creating a novel navigation solution for companion robots that

operate in indoor environments. This is because most existing assistive

robot systems interact with their users in healthcare facilities and within the confines of

their homes. Also, autonomous robots in outdoor environments benefit from increasingly

accurate heading and localization systems such as GPS technology, which are

unavailable for their indoor counterparts. Hence, there is a significantly larger need for

indoor autonomous navigation solutions. This research will scope itself around improving


the operation of companion robots that are intended for typical indoor living environments

amongst human users.

Part of the research problem pertains to the high computational and hardware costs of

current companion robot implementations. This research aims to design and develop a

viable solution that does not involve complete environmental mapping, extensive external

sensor networks and extreme changes to the environment. This leaves computational

and hardware headroom for the specific functions an implemented robot may be tailored

to have, in addition to operating as an all-in-one portable package. The resultant robot

control system will be implemented to run on standalone robot operating systems, like typical consumer variants.

An object-oriented approach is adopted for the design of the solution, highlighting its ability to utilize Commercial Off-The-Shelf (COTS) sensor hardware in combination with existing navigation algorithms to complete the navigation system. This helps minimize the difficulty of implementing companion robots tailored for aiding disabled people in low-income families. The implementation process will emphasize the use of consumer-procurable components and products to make the template available for use throughout the world.

This research aims to present a novel robot navigation model that optimizes the

application of existing real-time pathfinding algorithms using perception data from multi-sensor fusion. Thus, the design of the model emphasizes generic algorithm interfaces that can suit any vision-based, non-mapped pathfinding method. This research does not

include developing another pathfinding method because there is already a myriad of

existing indoor pathfinding algorithms which can be selected based on varying operating

environments and specialized robot purposes. The navigation model will serve as an

augmentation to any choice of existing pathfinding algorithms.
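As an illustration of this design intent, below is a hypothetical sketch of such a generic algorithm interface. The class names and the fused-perception fields are assumptions made for this example; they do not reproduce the actual system described in Chapter 4.

```python
# Hypothetical sketch of a generic, swappable pathfinding interface fed by fused
# perception data; names and fields are illustrative assumptions only.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class FusedPerception:
    target_bearing: float                                      # e.g. from a marker tracker
    depth_profile: List[float] = field(default_factory=list)   # mid-to-long-range depth slices
    proximity: List[float] = field(default_factory=list)       # perimeter ranging sensor readings

class PathfindingAlgorithm(ABC):
    """Contract that any drop-in pathfinding method must satisfy."""
    @abstractmethod
    def decide(self, perception: FusedPerception) -> Tuple[float, float]:
        """Return (linear_velocity, angular_velocity) for the next control cycle."""

class TurnAwayWhenBlocked(PathfindingAlgorithm):
    """Trivial placeholder method, included only to show the interface in use."""
    def decide(self, perception: FusedPerception) -> Tuple[float, float]:
        blocked = any(d < 0.4 for d in perception.proximity)
        return (0.0, 0.6) if blocked else (0.3, 0.5 * perception.target_bearing)
```

Under such an arrangement, the navigation model supplies the fused perception data while the chosen pathfinding method remains interchangeable behind the common interface.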

This research entails the design and development of a robot navigation model which does

not include the development of a companion robot. This is because the model must be


devised as a generic autonomous robot system that can be applied to any companion robot that satisfies the required design template. Testing and performance benchmarking will be done via software simulation. However, a baseline companion robot project is selected to serve as a platform for implementing the model for simulation purposes.

1.5 THESIS ORGANIZATION

This thesis documents the literature findings, proposed model, developed system and simulation results of this research within the following chapters:

Chapter 1: Introduction

Provides an overview of the research problems, and how they are decomposed into research

questions that can be answered through the study’s aim and objectives. The overall

methodology used to organize the research is also explained here, in addition to a

summary of each thesis chapter’s content.

Chapter 2: Survey of assistive robotics

The scope of this research covers companion robots for children with cognitive disabilities

such as Cerebral Palsy (CP) and Autism Spectrum Disorder (ASD). This chapter

documents the state-of-the-art of assistive robotics in service of these disabled children

and discusses the common traits or functions that can be found amongst the individual

robot prototypes. This chapter concludes by composing a generic robot template that can

be used to guide the design of a low-cost companion robot. This template was utilized in

creating an injury mitigation robot (CARMI) as an example platform for implementing this

study’s navigation system. In addition, this chapter also helped uncover the key

challenges that cannot be simply mitigated using that template’s COTS-centric approach.

Chapter 3: Survey of human tracking technologies

This chapter attempts to delve into the nuances of the identified challenges in implementing indoor companion robots: the need for an autonomous navigation system that can traverse common human-populated rooms and a reliable method of tracking the


human subject’s position so that the companion robot can perform effective human

following. The spectrum of each problem is explored by reviewing related existing

research works, along with their proposed solutions. These publications help provide

clues to how a suitable fusion solution can be formulated to simultaneously tackle both

challenges. This chapter ends by proposing a joint human-following and indoor navigation

solution.

Chapter 4: Design and Development of the Multi-Sensor Fusion-Based Navigation Model

To minimize the drawbacks of depth camera body detection, an Active InfraRed Marker

tracking system was created and documented. For the model’s Phase 1, this system is

used to reduce false detections and lock onto the body profile of the target human for

consistent activity tracking. Phase 2 involves transformation of the raw depth frame,

perimeter proximity sensor feedback and Phase 1 position data to help the robot decide

which direction around an obstruction to begin avoidance maneuvering. Lastly, this

chapter covers the design and implementation process of this model into a Microsoft

Robotics Developer Studio (MRDS) system that can be ported for both simulation and

robot hardware.

Chapter 5: Testing and Evaluation

This chapter documents the design and development of simulation scenarios that act as the navigation system's functional tests and performance benchmarks. The results are presented and discussed with regard to the system's performance and viability.

Chapter 6: Conclusion
This chapter presents the overall progress of the research effort by mapping the research

findings to the objectives and questions, before concluding with a discussion of possible

improvements and future work direction.

Chapter 2: ASSISTIVE COMPANION ROBOTICS AND THEIR CHALLENGES

Robotics has had a long history of application in a variety of industries since its infancy in early Greek, Chinese and Egyptian civilizations. First introduced as automata solely for entertainment, robotics has been used to relieve humankind of menial and repetitive tasks since the mid-1900s. The technology was confined to purely industrial work with little to no direct human contact until the beginnings of rehabilitation robotics.

Unlike its industrial predecessors, which center upon swift and precise execution of actions using heavy equipment, rehabilitation robotics focuses on lightening the workload of therapists and caregivers for disabled children and the elderly. In these cases, human factors, safety, intelligent adaptation of treatments according to previous progress, and the provision of Quality of Life (QOL) services are emphasized for user wellbeing and assisting therapeutic processes.

This chapter introduces the state of the art for Assistive Robotics and how companion robots are part of this collection of technologies. Companion robots that are aimed at long-term close contact with their patients face a host of unique challenges in both development and operation, which are also discussed.

Since the spectrum of assistive robot applications is incredibly wide, this research focuses

on the specific area of Assistive Robotics employed for aiding children with cognitive

disabilities such as Autism Spectrum Disorder (ASD) and Cerebral Palsy (CP).

2.1 ASSISTIVE ROBOTICS IN PREVENTING INJURIES

There has been a multitude of Assistive Robotics research efforts dedicated to aiding children afflicted with ASD and CP, involving the active participation of Information Communications Technology and Robotics as Assistive Technologies. Due to stereotypy or impaired motor function, children with cognitive disabilities are particularly exposed to the primary

cause of most injuries – falls. Falls have been found to be the biggest threat to elderly and disabled patients, indicating that the detection and handling of falls should be of utmost importance. Falls have been categorized into three forms, based on whether the victim fell from a sleeping, standing or sitting position (Yu 2008). Thus, to detect falls, current assistive technologies come in the form of wearable devices, embedded sensors and vision-based solutions (Mubashir, Shao & Seed 2013).

Wearable devices are usually sensor packages that are attached to the human body for

gauging and detecting the conditions which describe an injury occurrence. Some devices

are also equipped with health monitoring sensors that relay real-time health signatures of the patient to healthcare personnel remotely. It is by far the most common approach to

injury detection and prevention, without requiring extensive modifications of the living

environment. However portable, wearable devices require strapping equipment and electronics to the human body, imposing significant intrusion on the subject's performance of daily routines. The user needs to be disciplined and trained to consistently wear the sensors in the same correct manner, as well as to carry out daily routines as normally as possible. In most cases, live tests could not be carried out consistently due to the test subjects' lack of will and commitment to the devices (Tee,

Lau & Siswoyo Jo 2014). One good example of a wearable device for fall detection was presented in Singapore in 2008; it uses sensors to examine the posture and motion of the human body in real time. If a combination of readings matches those found in fall templates, a healthcare crew is notified of a possible fall event (Yu 2008).

Likewise, another study attempted to integrate a piezoresistive sole into running shoes

for monitoring the pressure spots on the bottom of the feet during ambulation. The shoes

were developed to assist gait training by providing timely feedback on how the patient

walks. The system can be used to detect abnormal pressure signatures such as loss of

footing, which indicates an impending fall (Canavese et al. 2014). The wide availability of smartphones integrated with accelerometers was capitalized on by another study, which utilized these sensors in tandem with sensor-embedded shoes to anticipate an abnormal walking gait. In the event of a fall, the system would automatically notify caregivers for immediate assistance (Majumder et al. 2014).
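The template-matching idea behind these wearable detectors can be illustrated with a minimal sketch. The Python fragment below is purely illustrative and is not drawn from any of the cited systems; the threshold value, template fields and function names are hypothetical placeholders.

```python
import math

# Hypothetical impact threshold and fall templates; real systems tune these per user.
IMPACT_THRESHOLD_G = 2.5
FALL_TEMPLATES = {
    "fall_from_standing": {"pre_impact_tilt_deg": 70, "post_impact_motion": "still"},
    "fall_from_sitting":  {"pre_impact_tilt_deg": 45, "post_impact_motion": "still"},
}

def accel_magnitude(ax, ay, az):
    """Resultant acceleration in g from a body-worn accelerometer sample."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def classify_event(sample, trunk_tilt_deg, post_impact_motion):
    """Return the name of the matching fall template, or None if no fall is detected."""
    if accel_magnitude(*sample) < IMPACT_THRESHOLD_G:
        return None                      # no impact-like spike in the readings
    for name, template in FALL_TEMPLATES.items():
        if (trunk_tilt_deg >= template["pre_impact_tilt_deg"]
                and post_impact_motion == template["post_impact_motion"]):
            return name                  # readings match this fall template
    return None

# Example: a ~3 g spike with a large trunk tilt followed by stillness
# would trigger a caregiver notification in such a system.
event = classify_event((0.4, 3.0, 0.6), trunk_tilt_deg=80, post_impact_motion="still")
if event:
    print(f"Possible fall detected ({event}); notifying healthcare personnel.")
```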

Embedded sensors, or the ambient approach, aim to allay the intrusive nature of wearable devices by embedding sensors within the confines of the living environment. Therefore, instead of placing bulky appendages onto clothing or the body, sensors are integrated into flooring, walls, ceilings, furniture and appliances to gauge and monitor the health signatures of the patient. Embedded environments also partially serve as an enabler for smart homes and automated living spaces. Even though the ambient approach completely removes the need for intrusive wearable devices, it requires an immense amount of cost and work to acquire, embed and network sensors throughout the living environment. Also, the system

would only work when the patient is within the confines of the living environment, unlike

wearable devices which follow the patient around wherever he or she goes (Tee, Lau &

Siswoyo Jo 2014). A typical example of an ambient approach can take the form of a home

with flooring that was fitted with arrays of pressure and vibration sensors. While this

implementation is electronically simple, it provides a reliable method of indoor tracking as

well as fall detection. Pressure concentrations caused by feet would help locate anyone within the premises, while any sudden impact would signal a possible fall event.

Installation of such a system would unfortunately require a major overhaul of the flooring

(Yu 2008). Yet another simple ambient implementation is a bed that is fitted with bladders

that are filled with measurable liquid. By tracking the changing distribution of fluid within

the bladders, the system can identify and record how the patient shifts sleeping postures,

or whether a fall from the sleeping position may occur (Yu 2008). Ambient Assisted Living (AAL) describes the dimensions for intelligent living spaces that combine both ambient and wearable devices to implement telemetric health monitoring, indoor personal localization, automated emergency alerts and more. In addition to falls, AAL is also geared to detect other dangerous situations that can cause injuries, such as fires, as well as other critical health conditions (Spasova & Iliev 2014). A similar system to AAL integrates autonomous suites of triggers, sensors and alarms in order to monitor the health state of the patient and adjust the environmental conditions to stabilize him or her until automatically notified health personnel arrive (Taraporewalla 2014).

The third approach to injury detection comes in the form of vision-based systems that

utilize a myriad of camera hardware and computer vision instead of micro-electro-

mechanical-systems (MEMS) that use discrete sensors as discussed previously. Vision-

based systems utilize various technologies such as Infrared imaging, stereoscopic

cameras and combinations of depth sensors to detect the location and orientation

of people who are within their view. The computer vision software will then process the

data feed and discern the orientation, distance, height, posture, and limb positions of the

subject (Mubashir, Shao & Seed 2013). When aimed at detecting falls and injury-causing

activities, vision-based systems provide a middle-ground between wearable devices and

the ambient approach, both in terms of intrusiveness and complexity in environmental

setup (Tee, Lau & Siswoyo Jo 2014). Vision-based systems have traditionally been accomplished using high frame-rate cameras and motion-capture harnesses, but since 2009, motion tracking sensors such as Microsoft's Kinect have been able to track human position and skeletal orientation without the need for any wearable markers (Pham 2009).

Other alternatives to the Infrared-based depth sensor are available such as the Asus

Xtion Pro (ASUSTeK Computer Inc. 2014). Alternative technologies in the form of fixed

stereoscopic camera arrays can also be found in products such as Sony’s PlayStation

Camera (Sony Computer Entertainment Inc 2013) and Intel’s RealSense Camera

(Creative Technology Ltd. 2014). Both technologies are viable for 3D human body tracking, albeit with differences in their response to environmental lighting conditions and in tracking accuracy. The Microsoft Kinect sensor has been heavily utilized in human tracking efforts, such as the implementation of a human-escorting telepresence robot that moves in stride with a human conversation partner (Cosgun, Florencio & Christensen 2013). Another Kinect-based project utilizes multiple such sensors to realize an assembly station robot that examines the posture and motion of its human partner and then calculates the best motion paths that prevent collision with him or her (Morato et al. 2014).

A pivotal study was carried out in 2013 using a Microsoft Kinect sensor for human

activity tracking and detection of possible injuries. This system tracks the human subject

and attempts to match the input with a database of template activities for discerning what

the subject is currently doing. The system can detect 14 different activities with varying degrees of success, ranging from standing and walking to drinking and eating (Ann & Lau 2013). This approach would seem to be the best lead for non-contact activity monitoring and injury prevention for children with cognitive disabilities. Although vision-based systems may seem to be the bulletproof solution to the difficulties faced by wearables and the ambient approach, they still suffer from a hardware-limited Field of View (FOV), false readings, interference from environmental lighting conditions, as well as the complexity of implementing machine vision solutions.
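The template-matching approach used by such skeletal activity recognizers can be approximated as a nearest-template classifier over joint positions. The sketch below is an illustrative simplification only; the joint set, templates and distance threshold are hypothetical and do not reflect the cited implementation.

```python
import numpy as np

# Hypothetical activity templates: each is a flattened vector of normalized
# 3D joint positions (e.g. 20 joints x 3 coordinates from a depth sensor).
ACTIVITY_TEMPLATES = {
    "standing": np.random.rand(60),   # placeholder vectors for illustration
    "sitting":  np.random.rand(60),
    "drinking": np.random.rand(60),
}
MATCH_THRESHOLD = 2.0  # hypothetical maximum distance for a confident match

def classify_pose(skeleton_frame: np.ndarray) -> str:
    """Match one skeleton frame against the template bank by Euclidean distance."""
    best_label, best_dist = "unknown", float("inf")
    for label, template in ACTIVITY_TEMPLATES.items():
        dist = np.linalg.norm(skeleton_frame - template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= MATCH_THRESHOLD else "unknown"

# A live system would call classify_pose() on every tracked frame and
# smooth the per-frame labels over time before raising an alert.
print(classify_pose(np.random.rand(60)))
```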

2.2 ASSISTIVE ROBOTICS IN PHYSICAL REHABILITATION FOR CEREBRAL PALSY

Robotics for physical rehabilitation largely serves people who are diagnosed with Cerebral Palsy (CP). There are three classes of CP cases, namely spastic, non-spastic (dystonic or ataxic) or a combination of both types (McMurrough et al. 2012). These types differentiate the level of damage to the Pyramidal Tract that the patient is experiencing, caused by irregular or fluctuating muscle development, which leads to posture and balance difficulties. This condition complicates the patient's mobility, ability to interact with the environment and ability to communicate through body language, inevitably interfering with personal growth and the capability to perform daily routines independently.

Children with Cerebral Palsy suffer from muscle decay, motor control difficulties and spasms, which can be minimized via repetitive stretching and physiotherapy. By 2012, Assistive Technologies for the rehabilitation of children with CP came in three forms: Assistive

Devices, Interactive Games and Robotics. “Assistive Devices” is a category of tools

fashioned from household and care items that have been augmented for ease of use by

people with physical impairments. Examples of these tools include motorized wheelchairs,

intelligent spoons and the Ankle Foot Orthosis (McMurrough et al. 2012). As Assistive

Devices become increasingly intelligent and connected to the web, their distinction from Assistive Robotics begins to diminish. Interactive Games have been steadily progressing,

partially due to the resurgent interest in consumer Virtual Reality (VR) facilities. These

games present interesting virtual environments that the patients can interact with, via a

series of active and passive exercises.

Assistive Robotics for rehabilitation consists of intelligent devices that act as components

in Robot Mediated Therapy. These devices offer mechatronic assistance in both sensing

the performance of the patient as well as active actuation of stretching exercises. While

a majority of these technologies were originally developed for aiding stroke patients and the physically disabled, their application can be adjusted to suit the needs of children with CP.

Assistive Robotics help in physical treatments for impaired lower and upper extremities,

as well as walking gait (Tee, Lau, Siswoyo Jo & Then 2015).

One prototype mechatronic apparatus combines both passive stretching and active

movement rehabilitation in a compact form factor. While concentrating on only the lower

extremity, the machine was targeted towards children with spastic CP, which often causes involuntary muscle spasms. These spasms often cause imbalance between both legs, thus impeding normal ambulation. The apparatus can be strapped to the child and automatically commences passive stretching routines according to preset treatments (Wu et al. 2011). A commercial-scale robotic treadmill was developed by Reha Technology AG to assist in rehabilitating ambulation capability.

The G-EO System, shown in Figure 2-1, is an end effector device which suspends the

patient’s body and simulates ambulation via actuated footpads. An end effector device is

basically a robotically actuated appendage that is grasped or connected to a patient’s

limb, simulating exerted forces typically experienced in a specific exercise routine. The

machine can simulate up to 3 Degrees of Freedom (DOF) for each foot and is extensively

used for extended sessions of repetitive walking exercises. The G-EO System is also

reprogrammable to support various custom routines for strengthening the patient’s

walking gait and control (Reha Technology AG 2012).

Figure 2-1: G-EO System by Reha

Technology AG (Reha Technology AG 2012).

Figure 2-2: The HapticMaster end effector

device (Delft Haptics Lab 2018).

Yet another example of an end effector device used for physical therapy would be the

HapticMaster robot arm (Figure 2-2), used in tandem with the NJIT-RAVR system in a

study that combines virtual reality and an adaptive robot for treating negative muscle

responses. A child with cognitive disabilities who suffers from negative muscle responses

can use this system to enter a simulated environment, with the HapticMaster acting as

the object to be grasped, manipulated or maneuvered according to the running program

(Fluet et al. 2009).

2.3 ASSISTIVE ROBOTICS IN SOCIAL INTERACTION REHABILITATION

Children with social and communication impairments require treatment of a mental and

interactive nature. While conventional rehabilitation seeks to establish physical aptitude in patients in order to accomplish daily tasks, socio-communicative function rehabilitation is not as straightforward (Tee, Lau, Siswoyo Jo & Then 2015). In this respect, children with ASD face three major difficulties: beginning and maintaining social interactions, participating in active play, and stereotypy (Ricks & Colton 2010).

One lead that has spearheaded robotics as one of the prime assistive technologies for the rehabilitation and supervision of Autistic children is the observation that these children are far less intimidated by interactions with robots than with fellow human beings (Michaud et al. 2006). A study hypothesized that children with social and communication impairments find robots more appealing because robots lack the confusing and complex mixture of gestures, expressions and words from human beings that causes a sensation akin to sensory overload (Ricks & Colton 2010). Robots are designed with clear and consistent representations of interaction cues, simulating correct and repeatable expressions to suit individually tailored exercises (Michaud et al. 2006).

2.3.1 Affinity to Robotic Interaction Companions

Children suffering from Autism Spectrum Disorder (ASD) have often been described as feeling confused, frightened and frustrated by trying to understand social communication concepts such as facial expressions, body language and social cues. Being detached from the normal learning curve for social interaction, they experience sensory overload when facing the simultaneous tasks of maintaining eye contact, scanning facial expressions and managing other social signals (Ricks & Colton 2010). Repeated bouts of sensory overload from early childhood lead to growing fear and negative attachment to the presence of other humans, making Autistic children seem lonely, aloof and displaced in

social environments. Eventually, this fear of people will develop into significant

communication impairments that would disrupt the child’s ability to live independently in

society. This condition also makes rehabilitation increasingly difficult for therapists as the

child gets older.

An interesting discovery was made when children with ASD were observed interacting

with animated robot constructs. It appears that most of them demonstrate a more visible response and less intimidation when interacting with anthropomorphic robots than they

do with human peers. One possible reason for this is that robots can be constructed to

show specific facial expressions clearly, as opposed to humans who are capable of

displaying complex combinations of expressions (Michaud et al. 2006). Also, the

expressions can be replicated clearly and consistently, easing the learning curve of the child in recognizing which facial expression corresponds to a specific emotion before

transitioning to a live human partner at a later stage.

Another pilot study was conducted to investigate the responses of children with lower

Intelligence Quotient (IQ) classifications towards a humanoid robot. Lower IQ is a

common characteristic of cognitively impaired children who have negative reactions

towards human interaction. The study found that 5 of 6 children showed largely positive

reactions, indicating that there is a possible advantage in using child-robot interaction as

a means of encouraging active social responses from children with communication

impairments (Shamsuddin et al. 2013).

A robot companion can also act as a non-human social partner, gradually acclimatizing

the child to initiating active play, participating in joint imitation exercises and practicing basic

social cues. The robot may also function as an introducer to a human peer as part of the

social interaction exercise. Figure 2-3 shows an example model of robots used as

mediators in social rehabilitation of disabled children.

Figure 2-3: Triadic Interactions model (Colton, Ricks &

Goodrich 2009).

Figure 2-4: KASPAR (Wood et al. 2013).

Called the Triadic Interactions model, it ties the Autistic child with social impairments to a human companion with the robot acting as a mediator, and a Wizard of Oz entity who

acts as a teleoperator of the robot and indirect accomplice to the companion (Colton,

Ricks & Goodrich 2009). In a typical session, a child is left alone with the robot, with the

Wizard of Oz controlling it from behind the scenes. Exercises ranging from basic

conversation to joint imitation games can be performed using the robot. Once the child

has become accustomed to the robot, it can then be used to introduce a human

companion into the room. The robot can be used to lead activities and conversations

between all three entities until the child can acceptably interact with the human

companion entirely without the robot’s presence.

One particularly successful application of mediatory robotics according to the Triadic Interactions model is the use of the humanoid robot called KASPAR, shown in Figure 2-4. KASPAR was developed by the Adaptive Systems Research Group at the University of Hertfordshire to facilitate robot-mediated social sessions between an Autistic child and a human companion. The child-like robot has actuated facial features and upper extremity limbs for expressing emotions during conversations. In one application, KASPAR was successfully utilized to favorably elicit information from children during high-stress interviews. This could prove invaluable for future law enforcement, healthcare and social

service uses (Wood et al. 2013).

Though the impaired children’s affinity towards robots is a positive sign towards the

progression of Assistive Robotics, some studies suggest that this affinity may only be

applicable to children with ASD or specific socio-communication conditions. One such

study showed that normal-functioning people empathize more with robots that closely resemble humans (Riek et al. 2009). These findings hint that children with ASD are more receptive to anthropomorphic robots because they do not perceive the need for

the magnitude of empathy that human companions seem to demand.

2.3.2 Robot-Mediated Social Communication Treatment

A child with ASD or other conditions that share similar characteristics is treated by

establishing basic components of communication skills, which include joint attention,

imitation, active play, taking turns and recognition of emotions (Ricks & Colton 2010).

These therapy sessions are rigorously repeated to ensure iterative reinforcement of

lessons, like developing muscle memory. Initial sessions are conducted between the child

and a therapist. After extensive lessons, the child can be introduced into sessions that

include other facilitators or peers. Thus, the therapy process for a single child is grueling

and time-consuming for both parents and therapist. The amount of time and effort required depends on how quickly the child can acclimatize to the presence of another human being. This means that the success of group sessions depends highly on the level of each member's rehabilitation progress.

The child will have to be formally diagnosed with ASD. Autistic symptoms in children can

be visibly identified by trained therapists from the age of 3 years and up (Ricks & Colton

2010). The most likely advice upon positive diagnosis would be to begin rehabilitative

therapy sessions as soon as possible. For a higher chance of rehabilitation success, the child will have to undergo continuous and consistent sessions to reinforce the learning. This can be incredibly difficult for both child and parents because the sessions are foreign and intimidating. An Autistic child, or one who suffers from language impairments, often feels inadequate in front of other people, especially therapists and caregivers. It

is in these particular moments that robots can lend a hand; their toy-like design and

motions can help coax little children to initiate active interaction and play (Colton, Ricks

& Goodrich 2009). Once a child is familiar with the robot's presence and with interacting with it, the robot can then begin to act as a mediator between her and the therapist. All the while, an operator (referred to as the Wizard of Oz in the Triadic Interactions model) continues to control the robot's actions and conversations while hidden from view. It is imperative that the

robot does not establish itself as the focal point of interaction. Rather, it should be used

to introduce interactions with the human companion (the therapist) as soon as the child

is able. Doing this should prevent the robot from becoming an emotional crutch for the

child.

The first therapeutic exercise for building social interaction skills is the ability to imitate.

This attracts the child’s attention to detail and acts as a precursor to active play. When

done by a therapist, it is usually difficult to convince a child to begin participation. By

introducing the activity using a robot medium, the stigma of human presence is negated.

The presence of a human companion in the activity is gradually introduced by the robot.

Honda Research Institute developed an advanced humanoid robot platform called ASIMO, which has been used to experiment with robot-mediated imitation games for disabled children (Colton, Ricks & Goodrich 2009). Though the experiment was largely a success, the immense cost of the robot means that the technology is still out of

reach for most households.

A research project was undertaken to develop similar but simpler mediator robots for facilitating therapy sessions which include imitation games, as shown in Figure 2-5. TITO was designed to have a friendlier, cartoonish look, with two actuated arms in addition to a rotating and nodding head complete with an LED matrix for a mouth. It was made to appeal to young patients and is adequate for aiding in the simple recognition of facial expressions, body language and basic imitation. However, its physical limitations constrain its accuracy for imitation games, although this would be the focus of its future

improvements (Michaud et al. 2006).

Another robot called CHARLIE was developed with a similar build and purpose to TITO, being equipped with hand and face tracking using an integrated camera in addition to its actuated arms and head. CHARLIE uses the vision-based tracking system to implement more accurate and involving imitation games. It is also capable of operating in a two-player configuration for turn-taking imitation play (Boccanfuso & O'Kane 2011).

Figure 2-5: Tito (Michaud et al.

2006).

Figure 2-6: NAO H25 features diagram (Aldebaran

Robotics 2014).

Subsequent communication skills such as active play, joint attention and recognition of

emotions require repetitive exercises to reinforce a child’s understanding and ability to

overcome her condition’s impediments. In most parts of the world, there are shortages of

trained therapists. It is often difficult for parents to schedule consistent sessions with

therapists. The Interactive Robotic Social Mediators as Companions project (IROMEC)

set out to develop a personalized companion robot that would hopefully assist in

facilitating reinforced lessons for Autistic children. The IROMEC was designed to be

programmable with a host of interactive games. Using its graphical user interface and

mobile robot build, it aims to coax children to initiate active play (Ferrari, Robins &

Dautenhahn 2009). While the robot is not designed to be humanoid, it is constructed to

withstand the wear and tear of domestic use, encouraging the future development of consumer-obtainable Assistive Robotics.

Aldebaran Robotics developed a stand-alone humanoid robot called the NAO in 2004,

which is of a smaller scale when compared to Honda’s ASIMO. The NAO is fully

articulated with functioning upper and lower limb manipulation, as depicted in Figure 2-6.

The robot has been put to task with a multitude of robotics challenges including robot

soccer matches, humanoid robot locomotion and child rehabilitation. The NAO can be reprogrammed to suit a multitude of interactive storytelling, game-playing and mediatory functions, utilizing its suite of sensors, cameras and voice recognition facilities. One study made use of the NAO for imitation exercises in an attempt to document the behavioral

profiles of children with ASD (Tapus et al. 2012). Another study used the NAO’s HRI

modules that harnessed the robot’s LED eyes, audio speech, music and full-range

motions to successfully engage and hold the attention of 4 out of 5 children in its control

group (Shamsuddin et al. 2012).

Once the child has achieved a level of social interaction acuity with a single person without

the need for continuous robot company, she is ready to begin therapy sessions with

multiple peers. Again, the challenges of introducing the presence of other human beings

into the child’s social environment are great. While the interaction between the child and

a therapist may be stable, leaving two or more similarly afflicted children alone is

unpredictable. In this scenario, a robot can help introduce group interaction gradually, as

each child would have been accustomed to robot-based incremental social introductions

by that point. A child can first be coaxed into participating in a social activity using a robot,

before additional partners are individually added. One study performed this using a robot

constructed out of Lego, as shown in Figure 2-7. This robot mediated between two

cognitively impaired adolescents. The robot begins by passing a ball between itself and

a patient. Then, the game is altered by adding a pair of red and green cards. As a cognitive

task, the patient will figure out that displaying the green card signals the robot to kick the

ball back to him. The third scenario involves both patients taking turns to display the card

and pass the ball back to the robot. Finally, the robot is removed from the environment,

leaving both patients playing the game with each other. There were initial challenges in

acclimatizing the patients, but once the session was under way, both had to be stopped by the therapists. The experiment was a positive indication of the potential of robots as a non-human medium for easing group social therapy sessions (Costa et al. 2010).

Figure 2-7: Scenario 3 transpiring between both patients and the facilitating robot (Costa et al.

2010).

Children with less severe social impairments can also benefit from mediatory robots in

group interactions. Group interactions can present these children with increased risk of

sensory overload and anxiety attacks. Instead of having a companion robot, Assistive

Robotics can also come in the form of a shared activity, promoting group work as a way

of allaying fear of others. A study of this was conducted in 2009 where a Lego robot-

building class was organized to examine its impact on fostering teamwork between

groups of children with ASD. The children were placed in groups of three and were tasked with gradually building a simple robot. Their creations would then compete within an arena, colliding with each other and attempting simple challenges. The groups were observed for individual proximities,

communication, joint attention and emotional responses. The study reported positive

findings, stating progressive group collaboration within a shorter period of time compared

to normal therapy sessions (Wainer et al. 2010).

2.3.3 Robot-Mediated Active Play

Throughout early childhood, active play serves as the primary means for a child to develop her senses and interactions with the living environment. As an activity, playing helps to develop her motor skills in addition to exercising creative expression and enthusiasm with group dynamics. The International Classification of Functioning and

Disability states that the act of playing is an important factor in assessing a child’s Quality

of Life (Marti, Pollini & Rullo 2009). By default, normally-functioning children are expected

to automatically begin playing when presented with a toy. However, observation has shown that this is not always the case (Trevor, Howard & Kemp 2009). It can be expected that children with CP or ASD have an even lower inclination to initiate play, possibly due to lack of interest or physical strain.

The previously discussed mediatory robot KASPAR was used in an experiment to determine if it could help spur the interest of a teenager with ASD. The robot featured a head with 8 Degrees of Freedom (DOF) and arms with 3 DOF, expressing human features only minimally so as to avoid the rejection that comes with being too human-like. The 16-year-old teenager was reported to have little tolerance of other peers during play. Once allowed supervised operation of KASPAR, the teenager expressed unexpected fascination with the controls. This outcome had not previously been possible with other therapy attempts. Eventually, the teenager gradually learned to participate in group imitation

games with other peers, using KASPAR as his avatar (Robins, Dautenhahn & Dickerson

2009).

Figure 2-8: Neuronics Katana 6M180 (Trevor, Howard & Kemp 2009).

The ROBOSKIN project was undertaken to add a synthetic skin that could facilitate

tactile feedback through KASPAR (Amirabdollahian et al. 2011). This additional feature

could offer deeper insights into the child’s degree of motor control over body contact

during robot-mediated play sessions (e.g. gauging grip strength during handshakes and hi-fives). Continued development of mediatory robots such as KASPAR could lead

to the development of Assistive Robots as robotic playmates instead of robotic toys.

While direct control of KASPAR would technically render it as an intelligent toy, the

addition of autonomous operation (which turns it into a robotic playmate) may eventually

enable it to independently interact with and accompany children, relieving parents and

caregivers for short moments. In the pursuit of demonstrating the application of a robotic

playmate, a system was developed using a Neuronics Katana 6M180 Robot Arm, shown

in Figure 2-8. The robot’s end effector is fitted with a camera for identifying colored blocks.

The robot system was used for imitation exercises using the blocks (Trevor, Howard &

Kemp 2009).

A robotic playmate was developed in the form of a piano-playing robot. The system was

programmed using the Boardmaker Plus educational software. The software was

designed to be a simplified graphical programming environment that disabled children

could be taught to use. With this, the children were able to manipulate the robot to hit the

keys of the toy piano (Jones, Trapp & Jones 2011). This form of assisted play was developed with a similar aim to the Katana robot project discussed previously.

The previously discussed IROMEC (Figure 2-9) is not merely limited to facilitating specific reinforced interactive exercises for children with ASD. It was essentially

intended to be a robotic companion for disabled children, which includes assisted play.

As the IROMEC is reprogrammable, it is possible to equip it with limited autonomous play-

assisting routines. To an extent, the IROMEC can also adapt to the child’s performance

in terms of motor control and cognitive skills, adjusting games and interactions for a

personalized learning curve (Ferrari, Robins & Dautenhahn 2009).

Figure 2-9: The IROMEC (Ferrari, Robins &

Dautenhahn 2009).

Figure 2-10: Roball (Trevor, Howard

& Kemp 2009).

Non-autonomous robots may yet contribute to assisted play as intelligent toys. The Yale

Child Study Centre attempted to demonstrate the applicability of intelligent toys in helping

to diagnose early signs of Autism (Trevor, Howard & Kemp 2009). The series of toys

contains passive sensors that track the motion of, and forces acting upon, the toys as they are played with by the children. One such toy, the Roball (shown in Figure 2-10), comes equipped with voice feedback to interact with the child. The ball can identify whenever it is left unattended, carried or engaged in active play. The Roball reacts to its various states using a bank of audio responses, eliciting more play responses from cognitively disabled children (Michaud et al. 2006). A modified version of the Roball was

fitted with more obvious audio cues aimed at children with impaired vision acuity. Called

the I-Ball, the intelligent ball is intended for facilitating group soccer games for visually-

impaired children (Stephanidis & Antona 2013).

2.3.4 Robotics in Preventive Intervention

As previously discussed, the characteristics of cognitive disabilities can be identified from

the age of 3 years and above. This is especially true for Autism Spectrum Disorder (ASD)

as there are standard tests available that can be administered by trained therapists (Ricks

& Colton 2010). Early detection is vital as preventive measures become increasingly

difficult to apply as the child matures. In addition, early participation in rehabilitation

sessions would minimize the condition’s impact on learning and social development.

Treatment for cognitive disabilities is costly in terms of time and effort for both the parents and the therapists. Due to a global shortage of trained therapists, access to early

diagnosis of ASD and CP may not be possible for every family. The diagnosis process is

not easily replicable through artificial means because each child is assessed individually,

with differing tolerances for both interactive and cognitive processes. Therapists depend

on their experience to adjust the parameters of the tests according to each child. One

study attempted to implement a physiology-based robotic solution that gauges the child’s

level of enthusiasm. Through an affect-inference mechanism, the system matches the

enthusiasm level with an appropriate level of play for assessing the child for cognitive

impairment. The mechanism is a stepping stone towards developing Assistive Robotics

with the ability to perceive and understand the morphing states of its various users (Liu

et al. 2008). Future development of this system may eventually lead to the creation of

intelligent systems that relieves therapists of the sensitive process of diagnosing for early

signs of cognitive disabilities.

Mediatory robots could also be used to facilitate diagnostics of early social interaction

impairments. Since most children taking the tests are considerably younger than the

target age bracket for conventional therapy sessions, the robot will most likely be

designed to be much friendlier and simpler than constructs such as KASPAR and the

NAO. One mediatory robot was designed specifically for early intervention, consisting of

a durable robot with actuated eyes, eyelids, head and wings. The robot can be operated

semi-autonomously, following the orientation of the child’s face for joint attention

exercises. Otherwise, it is teleoperated by the therapist for imitation games and

recognition of facial expressions (Dickstein-Fischer et al. 2011). The robot is linked wirelessly to the operator's controls, opening the door to possible teleoperation via the

Internet and uncoupling therapists from geographical constraints.

LEGO has been featured twice in this chapter for providing the building blocks in the

construction of a mediatory robot as well as the vehicle for facilitating group activity. In a

similar fashion, the LEGO line of products presents a safe and flexible means for children

to exercise creative development and active play. Routine play with LEGO system products

in a group setting may help disabled children deviate from stereotypical reclusive behavior

– one of the first challenges to defeat via early intervention (Costa et al. 2011).

2.4 LIFE-LONG ASSISTIVE ROBOTICS FOR COGNITIVELY DISABLED CHILDREN

So far, it appears that the most successful social-aid implementation of Assistive Robotics has been as components in robot-mediated therapy sessions. Physical rehabilitation with robotics consisted largely of augmentation solutions for existing physiotherapy functions. Both facets share the common mode of operation of being directly controlled by therapists. The

systems had to be tethered to a remote-control solution that still depends on trained

operators, making current Assistive Robotics unsuitable for prolonged independent use.

The goal of life-long assistance refers to long-term independent augmentation of a patient.

Achieving this goal would help free up the time and effort of therapists to care for more impaired children. This can be accomplished through the development of a

host of assistive technologies that range from a general non-human play companion and

intelligent toys that continue reinforcing lessons through games, to physical

augmentations that help with mobility and performing daily routines. As these

technologies are meant to operate without prolonged supervision, there are a host of

ethical and safety issues that need to be addressed so that these implements do not

inadvertently impede or endanger their users.

As it was with earlier applications of Assistive Robotics in therapy sessions, life-long

assistance is applicable for both social and physical support. The most iconic

representation of life-long robotics assistance comes in the form of companions.

Companion robots can help entertain, practice communication and initiate active play with

the children, reinforcing social rehabilitation lessons between formal sessions with

therapists. Granted, there is no artificial counterpart that can supersede the company of

a human therapist or parent, but Assistive Robotics aim to augment the interactive

experience of the disabled child, picking up where therapy and parenting leave off to

ensure continuous reinforced learning. A good example of a companion robot prototype

is the IROMEC, which is designed to adapt to the user’s changing play characteristics

and dynamically adjusts the tolerances of its interactive games while consistently holding

the child’s attention (Ferrari, Robins & Dautenhahn 2009). The assistive robot system

utilizing the Neuronics Katana serves a similar function, presenting disabled children with a companion that does not tire from repetitive and prolonged sessions of previously restrictive cognitive play (Trevor, Howard & Kemp 2009). The same format of Assistive Robotics can be applied to aid the performance of daily routines which demand greater reach, strength and flexibility than the user is able to provide alone.

Artificial assistance for performing daily routines constitutes yet another avenue for life-long Assistive Robotics. These routines are assumed to be activities that are vital components of day-to-day living. For instance, there has been extensive research in the

field of augmented mobility for disabled users. While mostly catering to the elderly and

patients suffering from grievous injuries, these Assistive Technologies can also be utilized

to aid children with Cerebral Palsy (CP). However, not all technologies are directly applicable without some form of modification. A child with CP would struggle with

conventional user interfaces because of inadequate motor control, making tools such as

motorized wheelchairs a possible health hazard. With the addition of a control system

consisting of a proximity sensor suite and a touch-capable graphical user interface that is

easily operated, a powered wheelchair can be made intelligent enough for a disabled

child to operate (Montesano et al. 2010). The wheelchair would operate according to its

user’s control but have safety routines that intervene whenever it is about to perform a

dangerous maneuver. Another attempt to augment the powered wheelchair comes in the

form of the Collaborative Wheelchair Assistant (CWA). The control system has a built-in

guidance program that recognizes, then charts preprogrammed waypoints in addition to

sensor-based avoidance routines. The wheelchair can adhere to preprogrammed paths

whenever a control mistake is perceived. The course correction can also be made to

dampen instead of completely overriding the user's command, enabling the child to

slowly learn to grasp and perfect the control of the wheelchair (Zeng, Burdet & Teo 2009).
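The idea of dampening rather than overriding the user's command, shared by the intelligent powered wheelchair and the CWA, can be expressed as a simple weighted blend. The sketch below is a generic illustration under assumed names and gains; it is not the control law used in either cited system.

```python
def blend_command(user_cmd, safe_cmd, hazard_level):
    """
    Blend the user's drive command with a safety correction.

    user_cmd, safe_cmd: (linear_velocity, angular_velocity) tuples.
    hazard_level: 0.0 (clear path) .. 1.0 (imminent collision), e.g. derived
    from proximity sensors or deviation from a preprogrammed path.
    """
    alpha = min(max(hazard_level, 0.0), 1.0)   # clamp the blending weight
    v = (1.0 - alpha) * user_cmd[0] + alpha * safe_cmd[0]
    w = (1.0 - alpha) * user_cmd[1] + alpha * safe_cmd[1]
    return v, w

# With a low hazard level the child's input dominates; as obstacles close in,
# the corrective command gradually takes over instead of abruptly replacing it.
print(blend_command((0.8, 0.0), (0.0, 0.5), hazard_level=0.3))
```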

Augmentation of both motor control and mobility can be provided via development of

various powered exoskeletons. These machines are essentially Assistive Robots that are

attached to the human body to amplify the user’s actions. Most current exoskeleton

projects are developed for military applications, but some companies, such as Cyberdyne in Japan, have been developing prototypes of lower extremity exoskeletons that help restore mobility to aging citizens. Called the Hybrid Assistive Limb (HAL), the prototype was available for rental in 2014. Newer exoskeletons also account for upper extremity augmentation, helping with the lifting of boxes and reaching for items. It would only be a matter

of time before future exoskeletons are developed to accommodate children suffering from

cognitively impaired motor control.

Control schema is one area of major concern for Assistive Robotics that augments the physical capabilities of cognitively impaired children. The balance between manual

control and programmed intervention needs to be struck well so that the implementation

does not impede the user’s actions, while ensuring that control mistakes due to poor

motor control are mitigated. The possibility and repercussions of programmed

interventions hindering intended actions have been studied by Demiris & Carlson (Demiris

2009). The study aimed to assess the capability of an Assistive Technology system that

dynamically balances manual control and programmed intervention, presenting

a foundation for designing adaptive control that can one day strike that balance for safe

and comfortable use by children with CP and ASD.

The issue of children with cognitive disabilities preferring the presence of robots over

humans must also be considered. One study observed the empathy between normal-

functioning human beings and an anthropomorphic robot having differed designs. The

study found that normal humans tend to react more favorably towards robots that have

human-like features (Riek et al. 2009). A likely explanation for this finding is that normal-

functioning humans tend to adhere to societal bias, preferring beings with human-likeness

over foreign creatures. If true, then this may present some doubt over the affinity that

cognitively disabled children have towards non-humanoid robot companions. It was

assumed that the reason for their affinity was that non-humanoid robots do not present

the complex combinations of expressions that spark sensory overload. In this case, robots

can be used to train them to become comfortable with engaging the components for social

interaction, before slowly transitioning to human peers. However, their affinity could also

be a fixation on a non-human companion as a way of escaping contact with human peers.

This regressive outcome can be avoided if the robot’s mediatory role is enforced from the

onset of exposure – only utilizing it as an introductory tool for ushering interactions with

human companions.

The development of Assistive Technologies and Assistive Robotics for rehabilitating

cognitively disabled children is heavily influenced by ethical issues over safety and

psychological impacts. The effects of using these technologies need to be considered in

terms of the child, parent, caregivers and therapists, especially if they are intended for

long-term use. Adherence to ethical safeguards can be designed based on medical core

principles, which state that a system must uphold the autonomy, beneficence, justice and non-maleficence of its intended aid (David Feil-seifer 2011). When applied to Assistive Robotics, a robot should be created primarily to be a fairly distributed aid that serves the benefit of the child's health, does not breach the user's privacy and never becomes the

cause of harm. While there have been numerous attempts at drawing ethical design

guidelines, a definitive standard has yet to be globally accepted and adhered to.

2.5 IDENTIFYING COMPANION ROBOTS AND THEIR CHALLENGES

Most of the surveyed projects only employ periodic mechatronic aid from the robots

involved. The more iconic systems such as KASPAR and NAO are specifically developed

for sessional use – always with some form of oversight from a therapist or operator.

However, examples such as the IROMEC were intended to be interacted with and used

by the subject for long periods of time, without full-time observation by caregivers. These

robots are more prevalent in elderly care and augmentations for disabled patients.

One example of these standalone robots is an animal-like mechatronic toy named

Paro. Built to resemble a plush baby seal, Paro was used in social interaction sessions

at eldercare homes. Its animated fins and audio output respond to voice and touch, eliciting active play and bringing about emotional attachment amongst its users (Kidd,

Taggart & Turkle 2006). While not able to directly assist its elderly users in daily routines,

Paro was instrumental in uplifting morale and inducing social interactions between

patients – yet another important factor in QOL. Paro stands out from the rest of the

previously discussed projects because its operational use extends far beyond individual

therapy sessions. It can accompany its users well throughout the day so long as its

internal batteries are able to sustain its actuators and automated audio feedback. It was

also designed to be plush, so its users are unlikely to be hurt by it even without a caregiver

around.

Other examples such as the ASTROMOBILE platform (Cavallo et al. 2013) and Matilda

(Khosla, Chu & Nguyen 2013) provide more direct assistance via voice recognition. These

service robots can initiate audio-visual calls, handle reminders and call for caregiver attention, in addition to being a constant physical presence to accompany elderly patients.

With suitable programming, robots like Matilda can facilitate conversations by suggesting

topics to groups of patients. These machines can be considered as close to commercial

companion robots as currently possible. However, because they are intended to be

purpose-built automated tools for nursing homes, their construction is boldly industrial,

lacking the anthropomorphic and softer qualities of safer robots such as the IROMEC and

Paro. Also, these machines are not intended for operation by non-adult patients without caregivers close at hand. Otherwise, these heavy-duty machines can cause bodily harm

if misused.

On the opposite end of the companion robot usability spectrum are systems that are

entirely purpose-built for long-term dedicated aid in performing daily routines on behalf of

disabled users (Tee, Lau & Siswoyo Jo 2014). A chief example is the Mechatronic

Assistive Technology System (MATS) which looks like an industrial assembly robot, but

with actuated attachment points on both ends (Figure 2-11). MATS is a multipurpose

manipulator robot specially designed to attach itself to wheelchairs and to several rail-type attachment points installed throughout the living environment. This helps partially immobile patients to climb stairs, reach higher areas and grasp objects typically out of

reach (Balaguer et al. 2006). Other similar purpose-built machines that accompany

disabled people include intelligent escorting wheelchairs, smart walking sticks and a host

of augmented implements (Tee, Lau & Siswoyo 2014).

Figure 2-11: The MATS robot (Balaguer & Gimenez 2006).

Surveying the myriad of assistive robot projects, a conclusion can be drawn describing an assistive companion robot's general attributes:

a) The ability to operate autonomously for the most part, without the need of a full-

time observer.

b) Safe for unattended interaction by its intended user.

c) Able to operate for periods longer than most therapeutic sessions.

d) Sufficiently maneuverable for navigating the environment.

e) Able to maintain consistent proximity between itself and the primary user.

However, considering the polarized themes and variation in form factor between all the

discussed projects, it is not difficult to see the issues that prevent widespread

commercialization or mass production of companion robots at present. The most glaring

issue is the specific purpose-built nature of each robot. The prototypes encountered thus

far are each developed for individual assistive goals varying from mediating social

interactions between human attendees, playing therapy-reinforcing games, participating

in active play, facilitating communication services, directly extending physical reach and

more. Some implementations require a large mobile platform that is unwieldy in confined

indoor spaces and presents a physical hazard when used unattended.

Combined with the fact that robots are inherently complex mechatronic systems,

developing a multifunction companion robot platform becomes a monumental task.

Environmental factors such as terrain, use in indoor versus outdoor settings, obstacle

classifications and level of mobility must be considered. Again, accommodating these

issues in addition to the form factor requirements impedes the design effort for user safety and human factors.

Finally, there is the issue of aesthetic design that impacts the user’s acceptance over

prolonged use. Extreme industrial builds such as the initial prototype MATS manipulator

and mechanized wheelchairs appear both imposing and unapproachable to less-informed patients and younger children. However, synthetic reproductions of natural entities, such as lifelike androids and models of little children, spark a sense of apprehension in people of all ages due to the effect of the Uncanny Valley. Thus, a balance

between both ends of the spectrum must be achieved before a machine may qualify as a

companion robot.

2.6 PROPOSED INDOOR COMPANION ROBOT PLANNING TEMPLATE

The outcome of the literature survey was the realization that a general-purpose

companion robot is difficult to implement. The reasons for this finding are that the platform must have provisions for a myriad of specific uses, be sufficiently maneuverable in the

intended operating environment, be aesthetically designed for optimal user acceptance and, most importantly, not become a source of injury by itself. This research recognizes that it is not feasible to strive towards an all-purpose build, so the scope of possible improvements to companion robots is narrowed down to the following:

a) Intended for operation in entirely indoor environments.

b) Limited to operation within a living room setting and its common contents.

c) Primarily applied for injury prevention in cognitively disabled children with provision

for other functionalities.

d) Sized for use with children and other users of similar stature.

With these focused scope attributes, a build template can be formalized as the baseline

to help guide how a companion robot is designed and constructed. Each of the explored

challenges can be dealt with using the principles surrounding this template (shown in

Figure 2-12). First proposed in 2014, this template was originally intended for initial

inception of injury prevention companion robots targeting cognitively disabled children

(Tee, Lau, Siswoyo & Then 2015). The prototype was to be created with as little

budget and development time as possible while still providing ample space for fitting

interchangeable equipment for facilitating different study requirements.


Figure 2-12: Companion Robot Planning Template intended for use with cognitively disabled

children.


Foremost, the template chiefly employs an Object-Oriented Approach to organize how

companion robot implementations are managed, arraying them as encapsulated

hardware and software modules. The key to enabling rapid prototyping is exploiting

available technologies in the form of Commercial Off-the-Shelf (COTS) products.

Hardware modules range from sensors, actuators, structural builds and Human

Interface Devices, each enveloped in a self-sustaining package that can be ported

between robot projects according to requirements. Meanwhile, software modules

champion the encapsulation of programmable functionalities, easing the process of

swapping routines according to dynamic therapeutic or monitoring needs. The Object-

Oriented Approach is fully involved throughout the software development process.

An overarching design language emphasizing COTS components, likeable

aesthetics and safety concerns should permeate the entire template’s

application when it is used to plan the concept of a companion robot.

2.7 CASE STUDY: CARMI

In the attempt to better understand the nuances and challenges of a multi-purpose injury-

preventing companion robot, an example prototype was developed using the proposed

Robot Planning Template in Figure 2-12 as a separate auxiliary project. The prototype is

intended to be a hands-on example accompanying the literature review findings as well as

a testbed for this research to develop a navigation solution. The summary of its

inception and development details can be found at the beginning of Chapter 4.

Dubbed CARMI, the Companion Robot Avatar for Mitigation of Injuries operates similarly

to telepresence robots such as the Double (Double Robotics 2014), but was supposed to

autonomously follow a child from a distance while continuously monitoring her for possibly

dangerous actions. If such an action is detected, a caregiver is alerted and can then start

a video call via the robot to survey the situation and address the child. This was to fulfil

the requirements of a Robot-Based Injury Prevention Strategy that was based on the

triadic interaction model (Colton, Ricks & Goodrich 2009). Figure 2-13 shows the initial


CARMI prototype as it was intended for vision-based autonomous detection of possibly

injurious activities, while serving as an on-demand telepresence robot for facilitating

audio-visual calls to a mobile device on the same Local Area Network.

Figure 2-13: An early prototype of the CARMI robot as the planning template's proof of concept.

The Robot-Based Injury Prevention Strategy (Figure 2-14) requires the active

involvement of the caregiver who is free to handle their own daily routines while the robot

watches over the child autonomously. In the event of an alert, it is the caregiver’s

responsibility to survey the situation and speak to the child as a means of intervening

before the injury occurs. Within five months, the research effort was successful at

planning and implementing a roughly working prototype that demonstrated rudimentary

position tracking and vision-based gesture recognition.

The main structure of the CARMI prototype consists of an indoor mobile robot

development platform equipped with a variable drive system and basic bump switches.

Rapid implementation of multiple redundant proximity sensors and an actuated turntable

was carried out, relying entirely on the Arduino microcontroller development kit. A


Microsoft Kinect sensor unit was adapted into the activity tracking system, utilizing the

software development kit’s gesture recognition facilities as a means of identifying

matching body motion profiles of possibly injurious activities such as jumping, falling,

punching and pushing. The early prototype and its concept were showcased during

PECIPTA 2015, garnering positive responses from the public and receiving the bronze

award (Borneo Post Online 2016).

Figure 2-14: The Robot-Based Injury Prevention Strategy.

Its success during the research expo demonstrated the template’s effectiveness

at rapid planning as well as implementation of most of the subsystems required for


fulfilling CARMI’s role within the Robot-Based Injury Prevention Strategy. The robot

platform was commercially acquired, having been built with mounting provisions for a

variety of sensors and components needed for most general-purpose operations in indoor

environments. The variable drives were included with the platform, so there was little

effort needed for calibrating mobility. The turntable had to be custom built, but it consists of

laser-cut acrylic sheets, a servo drive system and a power supply assembled from locally bought parts.

Finally, the Kinect unit was easily acquired because it existed as a common product for

adding vision-based motion capture to a popular video game console platform called

Microsoft Xbox (Microsoft 2014).

While most of the necessary hardware components were easily acquired commercially,

there are some aspects of the companion robot that must be built from the ground up.

These are subsystems that operate specifically according to how the robot was built and

its operational conditions. From examining the performance of the initial CARMI

conceptual prototype, the following findings were drawn:

a) The human activity tracking system frequently lost track of the user, either because

the primary target moved outside its Field of View (FOV) too quickly, or the

crowded FOV caused the sensor to switch targets.

b) Due to limited orientation data from the Kinect-based tracking system, the turntable

mechanism was not able to maintain the primary target within the sensor’s FOV for

continuous tracking.

c) The rudimentary navigation system was only able to steer the robot towards the

current target by wall-following algorithms using proximity data of the

immediate surroundings. This means that CARMI often begins circumventing

furniture in the wrong general direction, leading it away from the user.


d) The combined inability to maintain visual lock on the user and the unreliable

rudimentary obstacle avoidance method resulted in the robot constantly falling

behind during escort routines.

These findings present critical flaws that prevent the current CARMI build from being an

effective companion robot. Since most indoor companion robot projects involve a similar

form factor and indoor escort requirements, it can be assumed that they also share similar

problems with their implementation efforts. Thus, the findings are distilled into two major

challenges:

a) A more reliable human activity tracking system is needed, preferably one that can

also be implemented using COTS components.

b) An indoor navigation method or strategy that goes beyond simply considering the

immediate surroundings. This can help route the robot so that it maintains a set

escort distance while circumventing obstacles with the least effort required.

The next objective of this research is to explore available options to meet these

challenges under the same COTS and design language requirements as defined by the

Companion Robot Planning Template. For continued application and testing of proposed

solutions to these challenges, the CARMI robot is used as an ideal case study for the

remainder of the research.

2.8 CHAPTER SUMMARY

Over the years, robots have come a long way to assert themselves as a viable assistive

technology in the service of the elderly, disabled patients and children with developmental

disabilities. Their applications span a broad range from direct augmentation of therapeutic

functions to robot-mediated social sessions, active play and long-term QOL assistance.

There has been an unclear divide over which applications are deemed companion

robots from amongst the multitude of assistive robotic projects.


This literature survey of the assistive robotics state of the art helped form an

understanding of a set of characteristics exclusive to companion robots. The robot must

be able to maneuver within the confines of typical indoor living spaces, operate for periods

longer than therapy sessions, carry out its duties autonomously without a full-time

operator, and consistently maintain a suitable proximity between itself and the user while

ensuring that it does not present any form of harm during runtime.

This research narrowed its scope to strictly indoor, living-room-equivalent

environments, exclusive use by cognitively disabled children and a focus on injury

prevention as its paramount goal. The first step in attempting to suggest an improvement

upon companion robots within this scope definition was to create a roadmap to help plan

and guide their development. This Companion Robot Planning Template achieves this by

setting a few principles in place: an Object-Oriented Approach to planning structure and

software design, encapsulated solutions for each requirement made entirely from

standalone COTS components and a design language that is concerned with human

factors, safety and user acceptance.

The effectiveness of this template was gauged by applying it in creating CARMI, a

companion robot that acts as an integral part in a Robot-Based Injury Prevention Strategy.

Its ability to visually identify a target user’s actions and scan them for any matching

dangerous profiles earned it public acceptance during PECIPTA 2015.

Despite positive feedback, the conceptual experiment revealed several flaws that could

not be mitigated by the planning template alone. Chief among these is the frequency of

the system losing sight of the intended user. On its own, the commercially available

human tracking device could not adapt to changing lighting conditions, crowds and erratic

user movement. In addition, the robot platform could not maintain consistent escort

distances with the user, as its rudimentary pathfinding system was not capable of finding

a least impeded route around obstacles between itself and the target. It was assumed


that other similar companion robot projects suffer similar problems in attempting to create

a multi-purpose companion robot platform for indoor use.

Despite the introduction of a companion robot planning template and a development strategy

based on COTS and OOA, the experimental implementation’s findings were distilled into

two remaining challenges that cannot be easily overcome with existing homogenous

solutions: the need for effective human tracking, and a robust indoor robot navigation

method. The solutions for both challenges combine into a novel autonomous human

following system that is still scalable to robots built according to the planning template.


Chapter 3: INDOOR ROBOT NAVIGATION AND HUMAN FOLLOWING

The previous literature survey identified two major challenges faced by developers of

indoor companion robots meant for affordable multipurpose use. The first is a need for a

navigation method for traversing the layout of indoor environments while avoiding dynamic

obstacles. The second is a reliable human tracking system that can help lock onto the

intended target user to be followed by the robot. Unlike the remainder of the robot

characteristics outlined in the companion robot planning template in Chapter 2, these two

challenges could not be solved via homogenous Commercial Off-The-Shelf (COTS)

solutions because current effective solutions had to be tailored according to the

combination of robot platform (hardware and software) characteristics and operating

variables introduced by different environments and specific applications.

This chapter continues to expound on the findings by elaborating on the specific nuances

of these two challenges, through surveying existing works that encounter similar

predicaments. Following this, the research proceeded to explore current technologies that

may be useful for formulating a solution to both challenges.

3.1 CHALLENGE 1: INDOOR NAVIGATION

Indoor autonomous navigation for mobile robots remains a mature and broad area of

study today. The scope of challenges in this field is matched only by the multitude of

attempts at implementing possible solutions. However, exclusive coverage of navigation

problems in general is not commonly researched and published. Most related research

works present novel methods to address specific maneuvering or sensory problems.

While this helps contribute to the pool of possible technological solutions, there is little

guidance in how they can be used for solving combinations of problems faced by an

autonomous companion robot in an indoor human-populated environment. At the very

least, almost every published work agrees that robotic pathfinding and obstacle

avoidance is a complex problem (Budiharto, Purwanto & Jazidie 2011) exacerbated by the


fact that their intended working environment is constantly affected by an amalgamation

of dynamic motion, clutter and sensor noise. After reviewing a collection of related robot

wayfinding research works, common themes of navigation problems could be found.

These are grouped into the following categories:

a) Location-Specific issues

b) Autonomous wayfinding dilemmas

c) Sensory and actuation hardware dilemmas

The following subsections help summarize the literature survey findings for each of these

categories, as well as provide an overview of possible navigation methods that can offer hints to

a workable solution for this research effort.

3.1.1 Location-Specific Issues

One of the most common issues mentioned in robot navigation research work is the

structural and logistical complexities involved when dealing with real-world human-

populated environments. At the macro level, the human indoor environment can consist

of multiple levels, each containing a combination of rooms with different obstacle layouts

depending on what they are purposed for. Each level is connected via stairwells and

structural cavities, while rooms lead from one to another via doors and windows of varying

dimensions. Moving between these obstacles, rooms, levels and orifices is extremely

challenging for ground-based mobile robots as well as airborne ones (Shen, Michael &

Kumar 2013).

Observing each room, static arrangements of furniture and dynamic obstacles such as

toys and people create an organic environment that constantly morphs operating

variables. Passages between obstacles close while new ones are formed, and ambulation

over uneven surfaces causes oscillating sensor readings and noise, leading to unreliable

perception of the immediate vicinity (Budiharto, Purwanto & Jazidie


2011). This shows that even moving through a single room can have varying degrees of

difficulty, before scaling up when the robot attempts to traverse across rooms and levels.

Traditional pathfinding is done with knowledge of the present topography in hand, referred

to as “a priori”. While parts of the environment can be modeled, such models are impossible to maintain

in the real-world effectively for long because those aspects would undergo changes over

time (Thrun 1998). Due to the dynamically changing operating variables in human

populated environments, the previously complete a priori knowledge becomes rapidly

outdated and invalid (Sgorbissa & Zaccaria 2012).

The other method is to plot a course towards a goal position via reflex behavior, sensing

the immediate surroundings and finding an immediate opening when it presents itself.

This may be workable, but is prone to deviating from known paths and getting into

deadlock situations (Sgorbissa & Zaccaria 2012).

Even the basic characteristics of the environment such as presence of sunlight, indoor

lighting zones, sound absorption in obstacles, reflective surfaces, and uneven floor

surfaces can cause an unknown distribution of noise (Thrun 1998) and sensor misreads.

Static calibration values are not viable since the dynamic environmental characteristic

changes will call for a shift in sensory threshold. Some form of sensor redundancy to

combat noise, or a machine-learning solution to actively adjust the sensor thresholds in

real-time is needed.

3.1.2 Autonomous Wayfinding Dilemmas

There are two ways of approaching the classical problems of pathfinding and obstacle

avoidance: global and local methods (Mohammad Khansari-Zadeh & Billard 2012).

Global methods cover algorithms that best represent the iconic area of computational

search optimizations. They require complete or near-complete a priori knowledge of the

operating environment. This top-down snapshot of the environment is then parsed using

any select flavor of search algorithms such as Brute Force, Breadth-First-Search, A*,


Bayesian Search or the like, finding a best-fit or shortest path between the robot and a

goal if at least one exists. Most global methods involve some form of search optimization,

Fuzzy Logic or Neural Network to adapt for incrementally faster path searches. One

popular method represents the operating environment as a potential field where obstacles

are sources of repulsive force (Khatib 1986). Only the goal position exists as an attractive

force. Thus, the layout of the potential field presents a workable path as a flow of attraction

shaped by the repulsive forces around them.
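
To make this idea concrete, the following Python sketch illustrates the classical attractive-repulsive formulation in its simplest planar form. It is an illustrative example only: the gains k_att and k_rep and the influence radius are assumed values, not parameters taken from Khatib (1986) or from the model developed later in this thesis.

    import math

    def attractive_force(robot, goal, k_att=1.0):
        # Attractive potential gradient: pull proportional to the displacement to the goal.
        return (k_att * (goal[0] - robot[0]), k_att * (goal[1] - robot[1]))

    def repulsive_force(robot, obstacles, k_rep=0.5, influence=1.5):
        # Each obstacle within the influence radius pushes the robot away,
        # growing sharply as the gap closes (Khatib-style 1/d formulation).
        fx, fy = 0.0, 0.0
        for ox, oy in obstacles:
            dx, dy = robot[0] - ox, robot[1] - oy
            d = math.hypot(dx, dy)
            if 0.0 < d <= influence:
                mag = k_rep * (1.0 / d - 1.0 / influence) / (d ** 2)
                fx += mag * dx / d
                fy += mag * dy / d
        return (fx, fy)

    def steering_vector(robot, goal, obstacles):
        # The summed vector points along the local "flow" of the potential field.
        fa = attractive_force(robot, goal)
        fr = repulsive_force(robot, obstacles)
        return (fa[0] + fr[0], fa[1] + fr[1])

    # Example: robot at the origin, goal 4 m ahead, one obstacle slightly off the direct path.
    print(steering_vector((0.0, 0.0), (4.0, 0.0), [(1.0, 0.3)]))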

Unfortunately, global wayfinding is difficult to implement due to a multitude of reasons.

Firstly, a priori knowledge is hard to acquire without the use of expensive scanning

hardware (Budiharto, Purwanto & Jazidie 2011) such as Light Detection and Ranging

(LIDAR) lasers, and is computationally intensive to process (Lapierre & Zapata 2012;

Mohammad Khansari-Zadeh & Billard 2012). Also, the previous sub-section had also

discussed how a priori knowledge alone will not be effective in a dynamically changing

environment that companion robots are often expected to operate in.

The local method is the other extreme end of the wayfinding spectrum. All environmental

perceptions are limited to the immediate surroundings of the robot. This immediately

garners the advantage over global methods in terms of usable sensor selections. This

can range from rudimentary InfraRed and Ultrasonic ranging modules to laser range

finders, and even LIDAR. Since the coverage of the snapshot is much smaller than a

priori floorplans, wayfinding between obstacles towards the goal position is carried out

using simple search algorithms that aim for the next available passageways between

obstructions. Other local methods exist including Bug’s Algorithm, Vector Field Histogram

and Curvature-Velocity method that help make local pathfinding decisions in the event of

dynamic changes to the immediate vicinity, called perturbations (Mohammad Khansari-

Zadeh & Billard 2012). This reflex behavior (Lapierre & Zapata 2012) in path selection is

often modeled after insects and biological creatures exploring the environment using only

current knowledge of their immediate surroundings as opposed to preplanning using

global methods.


In return for lower computational overhead and flexible sensor selections, local wayfinding

methods do suffer several severe disadvantages. First, they lack any internal source of

localization (Shen, Michael & Kumar 2013). Localization is of paramount importance to

autonomous robots operating in both indoor and outdoor environments. This allows it to

identify its current location and orientation so that it can determine the heading towards

its goal position. Outdoor systems have the benefit of the satellite network powered

Global Positioning System (GPS) but this does not work indoors. Under overhead cover,

GPS-centric robots switch to Inertia Measurement Units (IMU) that measure forces from

robot motions to approximate its deviation since the last point of GPS contact. This

method, called Dead Reckoning, suffers from inaccuracies due to incrementing errors

over time. Fully indoor systems must rely on alternate ways to identify their location and

heading, with limited degrees of success. Outside of installing localization beacons

throughout the environment, there is currently no onboard-only approach to enabling

indoor robots to self-localize.

Secondly, the local approach suffers from not having access to the wider scope of

topography granted by global methods. This means that purely reflex choices of

wayfinding between obstacles will frequently lead to dead ends and longer routes. Also,

there is no prior knowledge of whether a viable path between the robot and the goal is even

available. In worst cases, the robot will iterate in an infinite loop around its general vicinity

as there is no available path. Local adaptations of the Potential Field method or other

biologically inspired algorithms may face the dilemma of sensory plateaus where pursuing

the chosen paths leads the robot to circle within the same area without recognizing that it is

another form of deadlock condition. This situation has been commonly referred to as the

local minima problem (Budiharto, Purwanto & Jazidie 2011; Lapierre & Zapata 2012;

Sgorbissa & Zaccaria 2012).

Similar to the outcome on Location-Specific issues, it is necessary for an autonomous

robot’s wayfinding system to possess the ability to adapt to perturbations in its operating

environment (Mohammad Khansari-Zadeh & Billard 2012). Given that both global and

local methods have their disadvantages in terms of coverage scope, computational


requirements and differing sensor needs, perhaps the solution to this comes in the form

of a combination between both.

On the other hand, there are also issues exclusively related to obstacle detection and

avoidance – an integral part of wayfinding. At the very base of autonomous behavior, a

mobile robot should be able to sense an impending obstacle and attempt to avoid it. Early

works involve using simple bump switches or proximity sensors to trigger an action that

simulates reversing away from a wall or object. Today, obstacle avoidance is usually

found as an important part of any robot navigation system, dodging obstructions while

making its way towards a goal position. The effectiveness of an avoidance system heavily

depends on the quality of feedback from the sensors and how it perceives them. Most

obstacle avoidance algorithms assume obstructions as entities possessing circular

profiles for ease of representation in simulations (Alsaab & Bicker 2014). However,

obstacles in the real-world are not always circular. Various furniture such as sofas and

shelves have possibly irregular rectangular shapes with sharp protruding corners that will

collide with a robot if the programmed maneuvers exclusively register them as circular.

In this case, a popular way to mitigate this problem is by relying on depth data rather than

just an array of ranging modules. Depth data offers more resolution to adequately profile

the shape of obstructions, thus making a detailed maneuver around it possible after

building a collision cone (Alsaab & Bicker 2014). However, depth data requires more

computing power to process and may suffer from input latency compared to reading

bursts of data from ranging sensor arrays.

3.1.3 Sensory and Actuation Hardware Dilemmas

Hardware is yet another avenue for developmental issues when it comes to creating

indoor autonomous robots. As most of the requirements for multifunction indoor

companion robots can be met using COTS components, this section focuses on hardware

dilemmas specific to sensors and actuator implementations for robot navigation and

obstacle avoidance.


One of the most common problems with sensors for navigation is the inherent hardware

limitations that each possess. For instance, vision-based sensors such as RGB cameras

and depth cameras suffer from limited Field-of-View (FOV), lens occlusions, susceptibility

to changing lighting conditions, and input latency (Budiharto, Purwanto & Jazidie 2011).

Hardware centering on optical sensors have limited FOV because of the way light-

sensitive modules are arranged in standard manufactured packages. Limited FOV is a

major problem especially when the robot must keep track of human targets while being

in motion. The combined erratic motions of human subjects and the opposing movements

of the robot’s platform will cause the target to be truncated or occluded frequently (Choi,

Pantofaru & Savarese 2011).

This FOV can be modified by selecting different degrees of wide-angle lenses. The

drawback of choosing a lens that offers the widest viewing angle is that the image will

become distorted. In robot navigation, some extent of image distortion may be acceptable

if the image is usable for developing collision cones and for visually identifying targets. A

balance must be struck between having an optical sensor that has the widest viewing

angle possible, while maintaining acceptable clarity of the objects that it is intended to

view. However, there has yet to be an optical sensor suite that can rival the consistent

effectiveness of the human eye in both adjustable focus width and image clarity. The

problem of limited FOV also means that the robot will have to roam the environment more

actively than a typical human being would, just so that it can profile a room or search for

a target for human following (Thrun 1998).

Another hardware dilemma is the lack of context garnered from the raw data fed from

current sensors (Thrun 1998). Low level RGB and depth cameras could only return raw

frames that need to be parsed manually before a captured object could be identified and

treated as an entity in navigation algorithms. Ranging sensor arrays consisting of

ultrasonic modules or IR ranging LEDs return raw converted analog-to-digital values that

must be calibrated before they indicate proximity of obstacles correctly. This additional

layer of operations adds overhead to autonomous robots, especially companion variants


that have to handle all processing on-board in a standalone unit (Shen, Michael & Kumar

2013). Fortunately, there are COTS sensor modules available today that come as a

combination of camera and image processing unit. One successful example is the

CMUcam (Carnegie Mellon University 2018), which can be programmed to modulate the

image and perform object detection on its own, sending high-level contextual data for the

robot host’s use in navigation. There are also depth camera suites that come bundled

with software developer kits that handle the necessary (and proprietary) optimizations in

human detection and tracking on behalf of the robot (Microsoft 2014; ASUSTeK Computer

Inc. 2014; Creative Technology Ltd. 2014). While there are more areas in sensor

hardware technology that are yet to be upgraded with contextual data output, steps in

the right direction have already been taken. These existing systems can be considered for

formulating the sensory hardware component of the solution for this research.

Another avenue of robot hardware dilemma is the issue of actuation. There is a multitude

of research in different ways a mobile robot can ambulate. These range from basic

wheeled or tracked ground vehicles and bipedal humanoid platforms to Unmanned

Amphibious Vehicles and multirotor copter drones. While it is tempting to adopt a

humanoid (A. Jung Moon 2014) platform for its greater degrees of freedom, this available

range of motions scales proportionally to development costs. Most robot navigation

projects are planted upon rudimentary dual-drive robot platforms such as Microsoft

Robotics Developer Studio Reference Platform (Microsoft 2012b). While these platforms

can perform basic motions to move around a room, they cannot simply perform sidestep

motions or tilt their bodies in case they are stuck in a confined place (Shen, Michael & Kumar

2013). One study specifically identifies the problem of under-actuation, where a robot has

fewer degrees of freedom than necessary to perform the maneuvers its intended use requires

(Lapierre & Zapata 2012). On the other hand, over-actuation (oversaturation of actuators)

could result in unintended coupling of both longitudinal and rotational velocities in motion.

Both conditions are undesired, so enough thought must be put into the design of actuation

during robot implementation.


Throughout the literature survey, it can be observed that a majority of robot navigation

and obstacle avoidance projects base their performance findings on successful simulation

runs. While this is the primary method of validating their system’s functionalities, most

companion robots are intended for operation under long periods of use. It is expected that

a robot be fully utilized in a typical session for at least over the duration of the patient’s

waking hours (typically 12 hours a day). This feat is difficult to accomplish because of the

presence of sensor drift and error accumulation in the robot controller over time

(Sgorbissa & Zaccaria 2012). It is not rare for an autonomous robot to experience

performance degradation due to compounded errors from multiple sources, as was

explored in the previous discussions. In all cases, such degradation becomes severe over

time, and can be mitigated simply by a watchdog timer that periodically resets the system.

The downside of doing this is that the reset procedure requires a calibration step that may

not be possible without the assistance of an operator. Other hands-off methods of allaying

the issue include using machine learning or adaptive integral algorithms to offset the

sensor drift by studying the robot’s real-time behavior.

Many of the robot hardware issues stem from the constraints of having all sensor and

mobility systems situated in a standalone body (Shen, Michael & Kumar 2013). The

reason for such a restriction is that companion robots are meant to be portable and

applicable in whichever location the intended users are in. Thus, all navigation and human

following operations must be carried out by relying solely on compact, self-contained

computation, sensors suite and actuated platforms while being unencumbered enough to

maneuver around domestic indoor environment conditions.

3.1.4 Overview of Possible Autonomous Navigation Solutions

It is paramount to consider the nuances of autonomous wayfinding in an indoor human-

populated environment when attempting to frame the solution brainstorming process.

The operating environment is assumed to be cluttered with a collection of walls (acting as

borders), furniture, and miscellaneous objects of various sizes. To limit the scope of the

research, the intended companion robot is confined within a single room. There may be


more than a single person in the room at any one time, but there will always be one

primary target. Hence, an ideal navigation system is one that can help a suitably

maneuverable robot traverse between these obstructions (furniture, objects and people)

to stay within a set proximity from the primary target. The obstructions are expected to

move dynamically, fitting the description of environmental perturbations (Mohammad

Khansari-Zadeh & Billard 2012).

The issue of maneuvering between obstructions can be tackled using either local or global

methods. Global methods are predominantly more effective in charting shortest paths

given that an a priori snapshot of the environment is made available to the robot. This can

be done using rotating imaging sensors such as LIDAR, but this equipment can be

expensive and unwieldy on a small robot frame. A popular group of global pathfinding

systems are based on the Potential Field Method (Khatib 1986), which assigns repulsive

forces to perceived obstructions, and uses the flow of attractive forces between the robot

entity and a goal position as the travel path. However, global methods like these are

susceptible to the local minima problem (Lapierre & Zapata 2012), and environmental

perturbations will invalidate snapshots and generated paths unless there is little latency

between renewed a priori datasets (which in turn, requires more expensive hardware and

computational power).

Local methods may be a better choice in dealing with dynamically shifting environmental

conditions. The simplest version is to adopt a purely reflex behavior: sampling the

immediate environment with a proximity or ranging sensors array, then selecting a

direction that is least obstructed. This will most likely lead to a dead end or a seemingly

random direction that will lead away from the primary target. One study proposed

improving reflex obstacle avoidance by adding a Bayes Estimator to weigh the probability

of a selected path against presence of noise and sensor glitches (Budiharto, Purwanto &

Jazidie 2011). However, the major disadvantage of using a fully local method is the limited

topographic knowledge for localization making it necessary for the robot to perform more

vigorous roaming (Thrun 1998).


A possible alternative between the global-local method debate is to adopt a global-style

potential field paradigm and apply it in a local method fashion. The Vector Field

Histogram (VFH) assigns magnitudes of repulsion to data fed from ranging sensor

arrays. This way, a direction with least repulsion will indicate a possible maneuver path

between the robot and the goal. However, this does not allay the inability of local

wayfinding to consider mid-to-long range topographies. Also, applications such as VFH

are greatly affected by the problem of relying on static sensor thresholding in an environment

full of perturbations (Budiharto, Purwanto & Jazidie 2011).
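
The polar-histogram idea behind VFH can be sketched briefly in Python: each ranging reading contributes repulsion to an angular sector, and the robot steers towards a low-repulsion sector that deviates least from the goal bearing. The sector count, weighting and free-space threshold below are assumptions made for illustration, not the parameters of the original VFH formulation.

    def vfh_steering(readings, goal_bearing_deg, sensor_angles_deg, max_range=2.0):
        # Build a coarse polar histogram: each ranging reading adds repulsion to its
        # sector, with closer obstacles contributing more (illustrative weighting).
        sectors = [0.0] * 12                      # 12 sectors of 30 degrees each
        for angle, dist in zip(sensor_angles_deg, readings):
            idx = int((angle % 360) // 30)
            sectors[idx] += max(0.0, max_range - min(dist, max_range))

        # Choose a sector with low repulsion that deviates least from the goal bearing.
        threshold = 0.3                           # assumed "free enough" level
        candidates = [i for i, h in enumerate(sectors) if h <= threshold] or \
                     [min(range(12), key=lambda i: sectors[i])]
        best = min(candidates,
                   key=lambda i: abs(((i * 30 + 15) - goal_bearing_deg + 180) % 360 - 180))
        return best * 30 + 15                     # heading (deg) at the sector centre

    # Example: eight sonar readings around the robot, goal straight ahead (0 degrees),
    # with the path dead ahead blocked by a nearby obstacle.
    angles = [0, 45, 90, 135, 180, 225, 270, 315]
    dists  = [0.4, 1.8, 2.0, 2.0, 2.0, 2.0, 2.0, 1.9]
    print(vfh_steering(dists, 0, angles))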

Partial knowledge of the topography or localization may help in reducing the margin of

error inherent in local and global wayfinding. A standalone version of area imaging

can be accomplished via Simultaneous Localization and Mapping (SLAM) (Atia et al.

2015). LIDAR is used to constantly scan the immediate visible surroundings of the robot,

which provides a partial map that can be applied with search algorithms to chart a best-

fit path. Alternatively, Radio-Frequency (RF) emitting beacons can be mounted

throughout the environment to act as waypoints (Atia et al. 2015). Once a robot is trained

to self-localize via sampling Received Signal Strength (RSS), it can be coupled with a

rudimentary obstacle avoidance routine while being somewhat aware of the correct

heading that will lead it towards the primary target. The downside of opting for partial

knowledge is the high cost of imaging equipment or embedding the environment with

active beacons, weighed against a wayfinding reliability that sits somewhere between fully global

and local methods.

Focusing on obstacle avoidance, it is important that the issue of representing obstruction

profiles be carried out thoroughly but within conservative computational costs. One study

suggested the use of depth maps, which provide higher resolution to better define visible

obstructions as collision cones to maneuver around. While formatting raw depth maps is

computationally intensive at high frame rates, there exist motion tracking cameras and

software development kits such as Microsoft Kinect (Mankoff & Russo 2013) that can

output high-level body tracking and processed low-level depth frames simultaneously.

Selecting this option affords the possibility of performing both robot navigation and human


tracking at the same time using the same hardware. The processed depth map can be

used to extract relative shapes using significantly less computing resources (Alsaab &

Bicker 2014).
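
To illustrate how lightweight such profiling can be, the following sketch collapses one row of a processed depth frame into per-sector nearest-obstacle distances, the kind of compact obstruction profile that a collision-cone construction could be built on. The sector count, depth limit and toy input row are assumed values, not the parameters of any specific sensor or of the cited study.

    def depth_row_to_profile(depth_row_mm, sectors=9, max_depth_mm=4000):
        # Collapse one row of a depth frame into per-sector nearest distances.
        # Pixels reading 0 (no return) or beyond max_depth_mm are treated as free space.
        width = len(depth_row_mm)
        per_sector = width // sectors
        profile = []
        for s in range(sectors):
            chunk = depth_row_mm[s * per_sector:(s + 1) * per_sector]
            valid = [d for d in chunk if 0 < d <= max_depth_mm]
            profile.append(min(valid) if valid else max_depth_mm)
        # Each entry now approximates the closest obstruction within one angular slice of the FOV.
        return profile

    # Example with a toy 18-pixel row (an actual Kinect depth row has 640 pixels):
    row = [0, 3900, 3800, 1200, 1150, 1100, 0, 0, 3500,
           3600, 900, 850, 870, 3900, 4000, 0, 3700, 3650]
    print(depth_row_to_profile(row, sectors=3))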

The issue of under or over-actuation is beyond the scope of this research. It involves

detailed study of structural concerns, drive systems and traction design that balances

between development cost and the appropriate degrees of freedom to navigate

around the living environment. For this research, a standard dual-drive holonomic robot

package will be used as the baseline platform for indoor mobility. This also allows this

research to gauge performance between an implemented solution against existing

projects that adopt the same platform.

Finally, there are a multitude of existing robot navigation works that suggest various

degrees of integration between technologies and techniques. Mechatronics employ

combinations of redundant components to sample or produce an effect using different

hardware so that the drawbacks of each component are overcome by the others. One

example study proposed the use of accurate but complex grid-based maps but organized

and accessed using a topologically-inspired overlay (Thrun 1998). Another study

attempted to bridge the gap between global and local wayfinding methods. This was

done by first establishing a priori mapping to define the primary travel paths throughout a

hall. Then when the robot encounters an obstruction, it switches to an offline obstacle

avoidance mode which deviates from its original path until it has passed the obstruction

(Sgorbissa & Zaccaria 2012). One of the most interesting integration attempts was

presented as a fusion between an environment embedded with RF beacons and onboard

LIDAR for combined indoor self-localization (Atia et al. 2015). Hardware fusion between

human tracking systems is also possible, such as the combination of motion capture

and face-voice recognition (Hu et al. 2010) for overcoming the problem of subject

occlusion and clipping (Choi, Pantofaru & Savarese 2011). Perhaps it is entirely possible

to perform indoor robot wayfinding simultaneously with human following using the same

hardware via tracking system fusion.


3.2 CHALLENGE 2: HUMAN TRACKING

The second challenge identified in Chapter 2 points to the need for a more reliable method

to detect, identify and track a specific human target as the intended user for the

companion robot to follow. Combined with autonomous navigation, a companion robot

can perform its primary function of human following. This can be carried out by using

motion tracking systems that were fundamentally developed for studying how a body

naturally moves. They achieve this by identifying the individual limbs and joints that a

body consists of. Using marker suits, worn beacons or a standalone software shape-

identifier algorithm, the characteristic articulation and motion profile can be captured

directly and rapidly processed for rigging animations or physically reproduced using

Mechatronics.

Assistive Technologies today have the benefit of consumer access to sophisticated motion

capture devices that used to be astronomically costly and only employed for industrial

movie making and ergonomics testing. Gaming and Human Interface Device products

such as Microsoft Kinect (Microsoft 2014), Asus Xtion controller (ASUSTeK Computer Inc.

2014), and Intel RealSense-based devices (Creative Technology Ltd. 2014) are example

systems that can be easily acquired and retrofitted for tracking human bodies and their

positions. These products come packaged with complete proprietary motion tracking

capabilities encapsulated within software development kits (SDK) which can be

harnessed for a variety of applications that depend on tracking the human body and its

gestures.

Having access to these technologies at hand may invite the assumption that human

tracking can be easily done via COTS. However, this is untrue because the field of

machine vision has continued to struggle with the complexities and nuances of this

problem despite advances in both related hardware and software. The difficulties faced

here are similar in magnitude to indoor navigation problems, as there are seemingly

infinite environmental variables which dynamically affect the performance of tracking

systems.


One study discussed how a single human tracking hardware solution is not effective

because it is limited by hardware specifications, how human subjects move erratically as

well as contending with shifting environmental conditions (Choi, Pantofaru & Savarese

2011). Another study suggested that a more viable alternative was to complement motion

tracking devices with other systems that may assist in reducing false detections and in coping with multiple

decoys and cluttered environments. It proposed a joint subject tracking system that

separately recognizes the person’s face, tracks the upper body profile, and confirms the

position of the subject using an omnidirectional microphone array (Hu et al. 2010). The

findings revealed varying degrees of success at overcoming the FOV limitations and subject

occlusion. It is possible that instead of combining multiples of same-natured tracking

systems as redundancy, more success can be found by fusing different human tracking

methods (Tee, Lau, Siswoyo Jo & Sian Lun 2016).

This research pursues this line of inquiry by continued literature survey into various facets

of human tracking technology research, grouped into the following categories:

a) Self-Localization

b) Body Tracking

c) Biomonitoring

This subsection also covers some cross-category fusion examples as applied in some

Assistive Technology projects.

3.2.2 Self-Localization

Localization is a distinct icon of tracking technologies commonly paraded in Sci-Fi and

industrial robot use. Applications in areas such as security, healthcare, construction and

safety have always valued the ability to locate a target without manual human labor;

however, the methods to make this happen are rarely simple to implement. The most basic


of these methods is localization via Line of Sight (LOS). For instance, an intruder must be

physically within the viewing angle and viewable distance threshold of a security guard

before an alarm can be raised.

This method has since been augmented today via developments in optical sensor

technologies and computer vision processing algorithms that help format captured frames,

identify body profiles of potential trespassers, and send alert notifications to warn security

staff of the situation. Networks of cameras and sensors can work together to monitor the

location of workers in high risk workplaces to issue warnings that prevent anyone from

inadvertently entering danger zones.

Some technologies can even provide approximate indoor localization without the need

for LOS, such as Radio Frequency (RF). RF beacons can be installed throughout the

environment to broadcast their position-tagged signals to be picked up by onboard

receivers on mobile robots. These robots examine the tag and the Received Signal

Strength (RSS) to assess its distance from the signal’s source location. Using

triangulation, a robot could theoretically calculate its indoor position by examining three

or more received signals per cycle.

Localization can be carried out in one of two ways: via range-free or range-based methods

(Chen et al. 2008). Range-based systems work using fixed installations of nodes that are

programmed with their individual location information. The RF example discussed

earlier is one example of a range-based system. The robot itself is considered as a blind

node, which is a node that lacks position data. It acquires this information by sampling

the broadcast signals from all contactable fixed nodes and attaches their RSS to their

addresses. Thus, the blind node’s position is calculated using the acquired signals via

algorithms such as Triangulation and Trilateration (Chen et al. 2011). Other related

studies explore possible improvements using alternate implementations of nodes and RF

technologies, or via algorithm optimizations.
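
For illustration, the sketch below performs planar trilateration in its simplest linearized form: given three fixed nodes and a distance estimate to each (derived, for example, from RSS), the blind node's position is recovered by solving a 2-by-2 linear system. The anchor coordinates and distances are hypothetical values, not data from the cited studies.

    def trilaterate(anchors, distances):
        # Planar trilateration: given three fixed nodes (x, y) and estimated
        # distances to each, solve the linearized system for the blind node's position.
        (x1, y1), (x2, y2), (x3, y3) = anchors
        r1, r2, r3 = distances
        # Subtracting the circle equations pairwise yields two linear equations A * p = b.
        a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
        a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
        b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
        b2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-9:
            raise ValueError("anchors are collinear; position is ambiguous")
        x = (b1 * a22 - b2 * a12) / det
        y = (a11 * b2 - a21 * b1) / det
        return x, y

    # Example: three beacons at room corners and distance estimates to a point at (2, 1).
    print(trilaterate([(0, 0), (5, 0), (0, 4)], [2.236, 3.162, 3.606]))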


The range-free method examines the characteristics of mostly wireless sensor networks,

relying on relative distances between anchored and non-anchored nodes. These nodes

are not fixed with position data, but the system helps estimate the relative position of a

node within proximity of other nodes. In the case of the RF example, raw RSS feedback

from contacted nodes can be applied with Angle of Arrival or Time of Arrival calculations

to assess both the angle and estimated distance between nodes (Chen et al. 2008).

One study presented a unified approach that combines the best elements from both

range-based and range-free approaches to localization. It begins by performing location

estimation with range-based sampling of location data from all contactable nodes. Then,

it improves and refines that estimation using the range-free assessments of the wireless

connection characteristics between nodes (Quattrone, Kulik & Tanin 2015).

Outdoor localization is dominated by reliance on the Global Positioning System (GPS), a

satellite network that acts as range-based fixed nodes across the Earth. GPS-enabled

devices receive broadcast signals from at least four visible satellites and examine each

one for Time of Arrival. The period between transmission and receipt times for each signal

helps indicate the distance between the node and the originating satellite. By repeating

this process with at least four sources, the node’s longitude and latitude on Earth can be

estimated. As the number of visible satellites increases, the accuracy of the estimated

position is further improved. Today, fitness trackers such as the Polar M400 (Polar Electro

2015) and Garmin Vivoactive (Garmin Ltd. 2015) products have built-in GPS modules

that help gather location history data. Almost every smartphone manufacturer including

Samsung, Apple, Xiaomi, and Nokia have similar GPS functionality integrated as standard

issue for their products, enabling consumer access to location-based services.

Unfortunately, there are some inherent limitations of GPS that need to be addressed

when considering this technology for human tracking. The most glaring problem is that

GPS requires a direct LOS between the satellites and the receiver module to function. Various factors such

as cloud cover, weather conditions, and overhead structural coverage will adversely affect

the performance of this position tracking technology, resulting in miscalculations known

as GPS glitches, or total loss of service.


There is a way to enable momentary GPS position tracking if the receiver is temporarily

indoors or the LOS is negatively affected for some reason. The

localization system can be complemented with an Inertial Measurement Unit (IMU) which

is a suite of accelerometers and gyroscopes that measures the changes in experienced

forces when the host is moving (Seo et al. 2012). These measured changes can be

translated into estimated vector deviations from the last known location pinpointed via

GPS, using a method called Dead Reckoning. The downside of this system is that the

measured values only provide a rough position estimation. The error in this method

accumulates over time, making the localization data rapidly decay while the target

remains outside of GPS reach.
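
The compounding nature of this error can be shown with a short sketch that integrates heading and speed samples forward from the last known fix; even a small constant gyro bias makes the estimate veer progressively further from the true path. The bias, speed and sampling interval are assumed values used only for illustration.

    import math

    def dead_reckon(start, headings, speeds, dt=0.1):
        # Integrate heading (rad) and speed (m/s) samples forward from the last
        # known fix; any per-sample error compounds into position drift over time.
        x, y = start
        for heading, speed in zip(headings, speeds):
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
        return x, y

    # Example: walking straight east at 1 m/s for 60 s. A small constant gyro bias
    # (0.005 rad/s) makes the integrated heading veer, so the estimate drifts several
    # metres away from the true endpoint (60, 0) even though each step error is tiny.
    bias = 0.005
    headings = [bias * (i * 0.1) for i in range(600)]   # integrated heading error
    speeds = [1.0] * 600
    print(dead_reckon((0.0, 0.0), headings, speeds))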

One study attempted to rely solely on the IMU system without GPS for a car localization

system that is integrated with WiFi and GSM connectivity found in most smartphones.

Called Dejavu, it uses the same Dead Reckoning method to approximate the location of

the car as it travels along roads forming a multi-modal sensor database (Aly & Youssef

2013). The databases from all users are combined via crowd-ware to create a composite

representation of roads and potential choke points or potholes while helping the system

collect performance data to reduce drift errors resulting from prolonged Dead Reckoning.

Another major issue of GPS is that most receiver modules consume high amounts of

energy, hence they are rarely included in wireless Internet of Things (IoT) applications.

A study presented an energy efficient localization method that combines GSM and WiFi

networks as an alternative to GPS (Oguejiofor et al. 2013). The method also incorporates

RSSI of signals from cell towers to improve its sensor database, thus resulting in over

347% improvement to battery performance over the use of GPS.

As reliable and commonly used as GPS is, this technology is severely limited in indoor

environments that do not have viewable access to the sky. Hence, the area of indoor

localization experienced more technological diversity in the absence of an easily

accessible range-based node network. The two most predominant types of solutions

applied for indoor localization are RF-based and vision-based systems. Each type comes


with its own advantages and disadvantages that must be addressed on a case-by-case

basis.

Continuing where the RF example left off, it should be noted that RF localization involves

active broadcasting of multiple nodes over long periods of time which translates to high

energy consumption. A study carried out at Carnegie Mellon University Qatar attempts to

mitigate this disadvantage by tracking localized clusters rather than individual devices

(Neishaboori & Harras 2013). Instead of custom-programmed RF hardware, the research

relies on WiFi, which is a wireless networking technology that can be implemented using

consumer accessible infrastructure components based on IEEE802.11 and Bluetooth.

Instead of performing multiple RSS examinations and triangulation techniques for each

receiver node, the study proposes an umbrella approach to tagging nearby devices as a

single cluster. The standard RF localization process is only performed on a per-cluster

basis, significantly reducing total energy consumption. This method is most effective for

heavy-traffic areas where people tend to move in clusters, such as train stations and malls.

Another RF related issue is the need for calibrating RSS to distance estimations. This is

carried out as an online training phase where each node communicates with the tracked

receiver and records the results as corresponding to the physical gap between them.

Thus, every possible target position is recorded as a combination of RSS reads from

every node into what is called a Radio Map. However, this Radio Map can be invalidated

by dynamic presence of people, shifting obstructions and multiple receiver devices. One

study presented the idea of including Line of Sight (LOS) into defining the Radio Map,

thus reducing the effect of multipath signal distortions (Guo et al. 2014).
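
The RSS-to-distance calibration that underlies such a Radio Map is often approximated with a log-distance path loss relation; the sketch below shows that conversion for a single reading. The reference RSS at one metre and the path loss exponent are assumed values that would normally be obtained during the online training phase described above, not figures from the cited work.

    def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exponent=2.5):
        # Log-distance path loss model: RSS(d) = RSS(d0) - 10 * n * log10(d / d0), d0 = 1 m.
        # Solving for d gives an estimated node-to-receiver distance from one reading.
        return 10 ** ((rss_at_1m - rss_dbm) / (10.0 * path_loss_exponent))

    # Example: a -65 dBm reading maps to roughly 10 m under these assumed parameters.
    print(round(rss_to_distance(-65.0), 1))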

The other popular alternative is vision-based localization that relies on optical sensors. This

form of solution is more appropriate in the case of companion robots as they do not rely

on embedding the environment with wireless nodes. One of the most effective forms of

vision-based localization is carried out via Simultaneous Localization and Mapping

(SLAM). SLAM involves consistent imaging of the surrounding environment and then

processing the image for viable paths, identifying obstructions, matching it to a priori maps

to self-localize and more (Kim & Lee 2013). The imaging is often done using a rotating


LIDAR, depth cameras or similar optical sensor array that collects 2D depth maps of the

surroundings.

A drawback of SLAM is that there is a notable mismatch between the periodic mapping

and the actual position of the host. This can be caused by hardware nuances,

irregularities of the environment surface, and the orientation of the imaging devices on either

the robot or the target user, which culminates in accumulating drift errors. A study

conducted by the University of Massachusetts attempted to solve this problem by

complementing a SLAM localization system with an overhead facing camera that tracks

and identifies Quick Response (QR) markers pasted on the ceiling throughout the

obstacle course (McCann et al. 2013). The overhead camera identifies and examines the

angle of the viewed QR marker to determine the severity and vector of the robot’s path

deviation. This information is used to skew the SLAM map and reduce the accumulated

drift error.

Combinations of RF and vision-based technologies for self-localization are also a viable

option for simultaneously mitigating the weaknesses of both types. One example involves

the fusion of RF and Pyroelectric Infrared (PIR) sensors for tracking the indoor location

of a patient. On its own, the PIR module is mounted overhead and is triggered whenever

it detects motion within its zone of detection. However, these devices are inaccurate

because they are susceptible to detecting changes in lighting conditions as movement.

By attaching a ZigBee IEEE 802.15.4 receiver to each PIR node and placing an RF

transmitter on the patient, the system’s reliability is elevated dramatically. As the patient

approaches the node, the PIR module triggers and prompts the receiver to sample the

RSS of the emitted signal. This will help filter out false PIR detections when the patient is

nowhere nearby.

While there are numerous examples and advances in self-localization technologies, it is

observed that there is currently no portable solution that can be worn by the human target for

human tracking without incurring significant obtrusiveness. However, these techniques

may be viable when applied to the companion robot for determining its own location within


the operating environment’s complex. In addition, the use of QR markers as a corrective

measure to complement the robot’s SLAM localization could be applicable for tracking

the orientation of a human target relative to the companion robot in a range-free fashion.

3.2.3 Body Tracking

The motion capture technology discussed earlier in this chapter is one of the key

components in the field of human body tracking. Over the last three decades, there have

been great strides in the effort to record and study the natural motion of humans and living

creatures. The applications of this study are vast, ranging from creating believable

animations, reproducing natural gaits in humanoid and bioinspired robotics, to gait

corrective therapies, motion-activated video games and interactive telecommunications.

In the past, motion capture could only be done via cameras capable of high framerate

videography, bodysuits with marker points and processing software banking on blob

detection. The recorded videos were processed to remove everything except for the

marker points that are registered as blobs. These blobs correspond to the joints and limbs

of a tracked person which can be rigged to an animated body to visualize the natural

motion and gestures that a human subject is capable of. The entire operation used to be

time consuming and the actors had to wear uncomfortable marker bodysuits throughout

the recording process while being expected to move and behave naturally.

Today’s motion capture devices rely on simulating depth perception using a variety of

techniques. The captured frames are depth maps, with each pixel representing a distance

between the sensor and the physical point in the real-world. These depth maps offer a

3D representation of the world within the device’s view-space, including people, obstacles

and boundaries. These shapes can then be compared for shape profiles matching human

bodies, which then have a virtual skeleton rigged to them. Thus, subsequent motion

capture and study can be done by tracking and recording how the rigged skeletons move.

This markerless method is far less obtrusive than the use of bodysuits, making them a

popular technology to be adapted for entertainment purposes.


One of the most iconic consumer-grade motion capture devices available off the shelf is

the Microsoft Kinect (Microsoft Corporation 2013). It was primarily marketed as a next

generation video game controller for the Xbox console, capable of markerless body

gesture tracking for up to 4-6 people (Mankoff & Russo 2013). When applied for research

use, an accompanying Windows development kit has made it possible for researchers to

acquire an affordable depth camera capable of visual mapping. The applications for this

technology span from SLAM to rapidly imaging real-world objects into 3D models.

Other alternative motion capture products are also available, such as the Intel RealSense

(Creative Technology Ltd. 2014) which was intended to be a hardware template for PC

manufacturers to implement their own desktop productivity motion capture. Integrated

into consumer computers and smartphones, RealSense enables gesture recognition as

a supplementary Human Interface Device alongside mice, keyboards, webcams and

microphones to facilitate a whole host of possible productivity functions. The RealSense

hardware template indicates the use of stereoscopic cameras to emulate depth

perception, as opposed to the IR-based depth camera that the Microsoft Kinect

implements.

Another COTS motion capture device was produced by Sony for their PlayStation series

of video game consoles. Called the PlayStation Eye, the device is basically a single RGB

camera with accompanying image processing suite that conditions the captured frames

and extracts shapes that correspond to human profiles before attempting to find

matching gestures. Unfortunately, this device is highly susceptible to environmental

lighting and sensor noise. An updated version has since been released, implementing an

RGB-D depth camera similar to Microsoft Kinect (Sony Computer Entertainment Inc

2013). Although single camera depth perception performs tenuously at best, it may be a

viable option for applications that demand some degree of motion capture but under

limited mounting space. One such study attempted to examine the performance of a

monocular camera setup after modifying it to vary the focus of the captured frames so

that the depth component of a viewed object can be approximated. This method of depth

perception from a single optical sensor is referred to as the Depth from Focus technique


(Gaspar et al. 2015). The system was able to approximate the motion of a small blimp as

it floats in 3D space. The success of this study shows that there are viable alternatives

aside from the stereoscopic and IR array depth cameras whose performance depends

largely on triangulation strategies but suffers from ‘blind spot’ situations where a tracked

target is within the FOV of one camera but not in the other.

The rising availability of these devices has been of great benefit to vision-based robot

navigation, machine vision, physiotherapy, and assistive technologies. Augmented with

existing microelectromechanical sensors such as the Inertial Measurement Units and

biomonitoring sensors, these systems have helped elevate explorations into previously

unknown avenues for studying human behavior.

To gauge the performance between consumer grade motion capture devices like

Microsoft Kinect and industrial standard offerings, a study was carried out by applying

both systems to assess gait and standing balance (Yang et al. 2014). At the time of writing,

pioneering consumer grade motion capture devices such as the Kinect had already been

in commercial circulation for a few years and had been subjected to a multitude of firsthand evaluations of their body tracking capabilities. The industrial benchmark selected for the study

was the NDI Optotrak Certus provided by Northern Digital (Northern Digital Inc. 2015). To

carry out the experiment, subjects were instructed to assume three different standing

poses for each system. Posture and balance are determined by factoring variables such

as the person’s center of mass via kinematic data output from software. Traditionally, this

test is carried out manually by a physician and it demands full attention and prior

assessment history to produce a viable assessment. The study showed that the industrial

benchmark reported higher kinematic readings, but similar variance in measurements

between all posture sets was reported by both systems. This shows that even though the

more expensive Certus system produced more accurate readings, the Kinect proved to

be equally as effective for body posture assessments.

RGB-D depth cameras had also been involved with application experiments for

augmenting physiotherapy sessions held in home environments. One example used the


device as part of a personal rehabilitation tool to help facilitate reinforced exercises at

home without requiring the presence of a physician (Su 2013). The exercises were

carried out in a series of repetitive strength-building challenges that help to gradually

increase the user’s motor capabilities. Performance data for each session helps measure

the patient’s progress and actively adjust the exercise tolerances for future attempts. The

motion capture device helps record the patient’s posture and motion throughout the

sessions, and then supply the performance data to a machine learning component that

determines the settings for progressive attempts. The goal of the machine was to offer

rehabilitative exercises that continually challenge the user without resulting in injury.

However, this study has also reported that conventional motion capture devices suffer

from lens occlusions, insufficient adaptability to changing lighting conditions and glitches

inherent to the body tracking software. These drawbacks had unfortunately limited the

potential of the prototype.

When applied to specific body tracking and activity monitoring needs, conventional depth

cameras had resulted in significant successes. A study was carried out to examine the

applicability of a Kinect device for upscaling an existing test which evaluates the upper

extremity motion in children diagnosed with Cerebral Palsy (Rammer et al. 2014). The

Shriners Hospitals for Children Upper Extremity Evaluation is a comprehensive test for

measuring a child’s capability in carrying out Activities of Daily Living (ADL). Once the

motion capture device locates the child’s body and assigns the virtual skeleton to it,

individual limbs can be examined for degrees of alignment and limb control. Meanwhile,

the hands can be tracked for grasping and releasing actions. The markerless vision-

based device had proven to be of great help in providing quantifiable measurements for

quality of motor control and physical action executions.

This subsection has presented some useful options for developing the human tracking

portion of this research’s solution. While depth cameras are susceptible to hardware

limitations, the ability to perform vision-based markerless body tracking and motion

capture may enable a companion robot to perform range-free localization between itself

and the target human. The availability of consumer grade motion capture devices such


as the Microsoft Kinect that perform comparably with industry-standard versions also means that there is little sacrifice of performance over price. In addition, this move also

supports the companion robot planning template’s objective of relying on COTS

components.

3.2.4 Biomonitoring

This third category of human tracking technologies is a collection of miscellaneous

biometrics and health monitoring systems that do not directly contribute towards

localization or motion capture. Even so, some of these projects may offer indirect

metadata that could complement existing human tracking systems. Having biofeedback

alongside markerless body tracking may eliminate false detections. For example, having

a smart wearable periodically transmit heart rate readings to the robot can prevent it from

inadvertently switching focus to another body that is currently engaged in an activity that

does not match the current readings. Existing biomonitoring technologies enable the

possibility for long term tracking of health signatures such as heart rate, body temperature,

respiratory rate, galvanic skin response and more (Singleton, Warren & Piersel 2014).

Recent trends seem to indicate a rising prevalence of research surrounding the

applications of Electroencephalography (EEG), biofeedback and popularity of consumer

grade smart wearables.

In 2014, a study was carried out to explore the link between positive health experiences

and spending time in outdoor natural environments. The proposed system relies on a

Blackberry smartphone which is carried by a test subject throughout the day. Whenever

the subject spends time at a park, the GPS module logs the occurrence and triggers data

collection from onboard accelerometers to gauge the vigorousness of the outdoor activity.

Meanwhile, Experience Sampling Methods (ESM) is used to elicit feedback from the

subject on his current emotion and state of mind (Doherty, Lemieux & Canally 2014).

While the experiment did not intend to attempt quantification of positive wellbeing, it did

demonstrate how sensor fusion and consumer devices help open new avenues for

exploring human biomonitoring.


EEG is a neurodevelopmental monitoring tool that offers non-invasive tracking of

electrical activity at the surface of the brain. It is important to note that the device is only

sensitive enough to detect the electrical activities that are closest to the scalp as that is

where the electrodes are placed. While there have not been concrete findings over the

possibility of mapping EEG readings to human behavior, the technology does allow

researchers to observe repetitive signatures whenever the subject attempts repeated

emotions and thought exercises. For instance, a study was conducted to observe the

emotional responses from Autistic children when they interact with a robot that showed

pet-like behavior (Goulart et al. 2014). Due to impaired social interaction skills, it was

difficult to elicit feedback from these children. Using a head harness to collect EEG

readings from them as they interacted with the robot, it was possible to roughly model the change in emotions, as the recorded patterns corresponded to shifting emotional states.

Another interesting biomonitoring research was carried out using Electrooculography

(EOG) to study the micromovements of the eye in children suffering from Cerebral Palsy.

These children had faced difficulty in using the computer mouse due to inadequate motor

control, so a head-mounted system called EagleEyes uses EOG to track the eye motions

and blinks as an alternative Human Interface Device (McMurrough et al. 2012).

Unfortunately, the headgear was reported to be too bulky and

cumbersome for extended use.

Biofeedback and haptics are electromechanical means of providing physical feedback to

a user via the sense of touch. One example use of biofeedback is for naturally alerting a

user about the internal state of the body that is not actively monitored. Children with

Cerebral Palsy were commonly observed to be favoring a stronger limb while neglecting

to exercise the other. The current approach to rehabilitating this behavior is by constricting

the stronger limb, forcing the child to frequently use the other limb for ADL. A biofeedback system was developed that prompts the user whenever the stronger limb is used

subconsciously (Bloom, Przekop & Sanger 2010). An armband embedded with

Electromyography (EMG) electrodes is strapped to the stronger limb and detects active


nerve pulses whenever it is about to be used for an activity. The controller will then issue

an obvious vibration, alerting the user and reminding her to use the other limb instead.

Due to the behavioral plasticity of the human brain and the need to prevent motor detriment to the stronger limb, the device was used in 5-hour periods each day.

As initially mentioned, there is no direct benefit of biomonitoring on discerning the position

of a human target within a human following strategy. However, exploring its various

avenues of application shows that there is sizeable potential in involving some form of smart wearable that complements a human tracking system, for the purpose of reducing or eliminating performance defects caused by vision-based hardware.

3.2.5 Examples of Human Tracking Technologies Fusion in Existing Research

The following subsection will discuss several projects in Assistive Technologies that

demonstrate sensor fusion and technique combinations between localization, human

tracking and biomonitoring. Exploring how these integrations were implemented and used

may offer insights on how fusion can be used for solving the human tracking challenge

that this research is focused on.

A prototype system was built to help cognitively disabled people overcome their struggle

with task sequencing, which is the ability to mentally dismantle a task into a sequence of

steps. Oftentimes, they had to be fully supervised so that they do not inadvertently

become distracted and deviate from their original tasks. Called the Teeth Brushing

Assistance (TEBRA) system, the machine was made to guide the person in the task of

brushing teeth. It was expected that the user would lose concentration, deviate from the

sequence or possibly injure herself. Thus, the machine had to be able to monitor the

accomplishment of each sequence while offering context sensitive instructions to the

person to mitigate any deviation. TEBRA was constructed around a washbasin set,

augmented with an LCD display “wash mirror”, speakers and proximity sensors. An IMU module is integrated into the toothbrush to monitor active use. A pair of cameras, one

overhead and the other facing the user, are used for body tracking motion capture. The


tap is attached with a flow sensor as well. As the user enters the washbasin set, step-by-

step interactive instructions are presented to guide the accomplishment of teeth brushing

without the need for a supervisor. Deviations are detected by the combination of motion

capture and biomonitoring sensors while the appropriate corrective instructions are

determined via machine learning (Peters et al. 2014). While the TEBRA system was

applied with some success, it could not stop the user from outwardly ignoring the audio-

visual prompts or exiting the washbasin set entirely. Ultimately, TEBRA was entirely

purpose-built to guide accomplishments of a single task. It would not be feasible to

produce a version for the multitudes of ADL.

Another study demonstrated that any technology can be repurposed for tracking humans

if its features coincide with the intended characteristic for logging. The experiment aimed

to study behavioral differences between baseline children and those diagnosed with

Autism Spectrum Disorder (ASD) when placed in a group play environment. The

monitoring was done using a Noldus EthoVision XT, an outdoor camera nest that is

originally intended for wildlife filming and study of repetitive behavior in the absence of

humans. This system was mounted overhead and set to monitor the children who wore

color-coded clothes for easy identification. Through video processing, the Region of

Interest (ROI) and Turning Bias (TB) could be examined in each child. The findings show

that children with ASD tended to stay away from large groups of other children (Cohen et

al. 2014). Similar forms of repurposing motion capture devices for the use of human

following may be possible.

Georgia Institute of Technology introduced the term “Behavioral Imaging” by exploring

the potential of merging behavioral science and computational technology. The study of

natural behavior has traditionally been carried out manually and is limited by the available

fieldwork time span for human researchers. Today, the process of observation and

building behavior portraits can be automated using multi-modal sensor networks. These

networks can be outfitted with a variety of sensors that take respiratory, cardiovascular

and electrodermal readings. This allows collection of long-term observation data into

Multimodal Dyadic Behavior (MMDB) unified data-sets which can be gradually improved


by other study sessions that involve the same subject. A pilot study applied this system

to the Child Study Lab, sampling body orientation, actions, and speech from children with

ASD while they were engaged in social interaction exercises. These were done using a

combination of motion capture devices, smart wearables and microphones (Regh et al.

2014). Continued use can help enable investigations in a variety of facets surrounding

ASD and behavioral study.

These existing applications in Assistive Technologies have shown that it is a viable practice

to integrate tracking technologies of various natures to circumvent individual hardware

limitations and in some cases, exceed the functionalities inherent in separate systems

when used in fusion. It is also possible to repurpose tracking system hardware to suit

another human tracking strategy, if its raw functionality can be adopted as part of that

strategy.

3.3 PROPOSED COMBINED HUMAN TRACKING AND INDOOR NAVIGATION SOLUTION

After reviewing the surveyed current works on indoor robot navigation systems and human tracking techniques, the garnered findings have helped shape a potential solution

to the two challenges identified in this research. Working backwards, the solution

formation begins with solving the problem of reliable human tracking while catering to the

COTS requirement of the robot planning template in Chapter 2.

The literature survey has indicated, on more than a few occasions, the potential and

applicability of consumer grade motion capture devices such as Microsoft Kinect. These

vision-based sensor systems can identify and track the body profiles of human subjects

without the need for bodysuits or expensive studio setups of old. However, the drawbacks

to these systems include susceptibility to lens occlusions, latency and false detections

that are caused by limited hardware FOV, sensor refresh rates, erratic human subject

motions, cluttered environments and dynamic ambient lighting. Augmenting standalone

motion capture devices with similar redundant systems may help reduce the frequency of


false detections. However, the use of QR code markers in a surveyed study helped inspire

the idea of a candidate redundant system that tracks active markers worn by the primary

target. By matching the tracked position of bodies and the target’s markers, false

detections and miscellaneous bodies can be eliminated, singling out the primary target.

The fusion of RGB-D based motion capture with a combination of vision-based marker tracking and smart wearables could result in greater reliability and performance.

With the ability to keep track of the primary subject’s position, this information can be

made available as a component in tackling the indoor robot navigation problem. Instead

of expending resources in implementing embedded environments and beacons, a range-

free method is adopted. Considering both the robot and the primary subject as nodes, the

distance and heading between both can be acquired by means of extracting and

approximating depth data. Computational resources may be limited because the entire

robot control system must reside on a standalone platform, so a LIDAR-based SLAM

approach to environmental perception is not feasible. However, scanning of the

immediate environment can be carried out by extracting the raw depth frames from the

RGB-D motion capture device. Inspiration is drawn from yet another covered study by

attempting to create a multi-layer map of the environment consisting of the primary target

position awareness, immediate proximity landscape and mid to long-range depth

landscape.

Since one of the most popular wayfinding algorithms revolves around the Potential Field

Method, perhaps a vertical adaptation like the Virtual Force Histogram (VFH) could be

generated using the transformed layers (subject position, immediate proximity and mid to

long-range landscape). The resulting composite array could be used as an indicator for

the best direction to head towards, in case an obstruction is encountered while the robot

attempts to relocate itself so that it is within escort distance to the primary target.

This proposed solution will be further developed into a formal indoor companion robot

navigation system in the next chapter.


3.4 CHAPTER SUMMARY

This chapter aimed to explore existing research that had encountered the two challenges

while implementing indoor robots, then to examine the solutions presented. Beginning with

indoor wayfinding and obstacle avoidance, the survey has found that the indoor

navigation problem had to be categorized into localization, autonomous pathfinding and hardware-related categories due to variations in complications. The primary issue with

indoor navigation is that it must contend with a dynamically shifting environment that

makes preplanning both computationally taxing and rapidly invalidated. More success

could be found in adopting organic algorithms related to the Potential Field Method that

react and adapt to dynamic environment changes. Hardware wise, SLAM is the most

prominent solution for real-time imaging of the surrounding environment for planning

maneuvers between obstacles, but it is also the most expensive and computationally

demanding.

The second half of the chapter explores existing human tracking technologies in the

attempt to identify possible solutions to implement a more reliable and effective way of

consistently tracking a human target’s position relative to the companion robot. The

survey findings were interesting because they documented a rising prevalence of

consumer accessible motion capture devices that do not require worn markers.

Unfortunately, most of these devices are vision-based hardware that suffer from common

limitations such as lens occlusions, refresh rates and shifting lighting conditions. The

literature survey continued to explore other monitoring technology avenues in the attempt

to find alternative tracked human attributes that can complement vision-based motion

capture. The results have shown that there are a multitude of research work documenting

the idea of sensor fusion and its successes at both mitigating hardware limitations of

individual sensors while allowing functionality repurposing.

The result of this chapter is the formation of a solution based on reviewing the existing

technologies covered throughout the literature survey. It proposes composite human

tracking fusion between a COTS motion capture device and a redundant marker tracking


system involving an optical camera and active markers mounted as smart wearables.

Having both tracking systems homed in on a human target will help eliminate false

detections as well as mitigate subject occlusions that tend to plague single vision-based

systems. The use of an RGB-D motion capture device also allows extraction of raw depth

maps that can be processed as mid- to long-range snapshots of the environment.

Combined with immediate proximity data and constant update on the primary target’s

position, these components can be fed into an adapted Potential Field algorithm to help

guide the companion robot around obstacles towards reaching the correct escort distance.

This solution will be expanded and implemented in the next chapter.


Chapter 4: DESIGN AND PROTOTYPING OF THE MULTI-SENSOR FUSION-BASED NAVIGATION MODEL

4.1 INTRODUCTION

Chapter 2 presented the results of a literature survey on the current applications of

assistive robotics in aiding children, the elderly and people with disabilities. It elaborated

on the way robots are used to augment therapies, improve quality of life and serve as a

buffer to compensate for absence or shortages of caregivers. This need for robotic

assistance is most evident in the care for cognitively disabled children, as emphasized in

the application area of this research. Children with ASD and Cerebral Palsy are very

susceptible to self-inflicted injuries and there is a need for more prevalent companion

robots to counter these occurrences. Hence, the chapter ends with a proposed

companion robot planning framework that focuses on Commercial Off-The-Shelf (COTS)

hardware and software components.

In accordance with the framework, the Companion Robot Avatar for the Mitigation of Injuries

(CARMI) was planned and designed as an application case study for this research (Tee,

Lau, Siswoyo Jo & Then 2015). CARMI is intended to act as an observer that tracks and

follows a child autonomously. It identifies the child’s actions using motion capture and

image processing to see if it matches any predefined activities that commonly lead to

injuries (such as punches, jumps and falls) (Lau, Ong & Putra 2014). This system is

extended by adding a notification feature that wirelessly warns a caregiver whenever such

an action is detected. The caregiver may also choose to initiate a video call through the

robot, using CARMI as an avatar to facilitate conversation between the child and

caregiver. Hence, CARMI satisfies the requirements for implementing the Robot-Based

Injury Prevention Strategy as outlined in Chapter 2.

To accomplish this in line with the robot planning framework, the activity tracking system

must be mounted on a basic indoor mobile robot platform that is actuated with dual-drive

DC motors. It senses the child and environment using a combination of depth sensor,

ultrasonic sensor modules and an RGB camera. Video-calling is enabled using a


microphone and a dedicated webcam that come bundled in an on-board computer. These

hardware component selections are laid out in Table 4-1.

Table 4-1: Initial hardware plan for CARMI (Tee, Lau, Siswoyo Jo & Then 2015)

Input
- Direct: Audio/Visual Communication (Microphone, Embedded Webcam)
- Indirect: Collision Detection (Ultrasonic Sensors); Subject Position Tracking (Depth Sensor, Webcam)

Output
- Physical: Relocation (Motor-Actuated Wheels, variable drive system)
- Non-Physical: Audio/Visual Communication (Tablet/Laptop Speakers, Tablet/Laptop Display)

The primary software module for CARMI governs the operation of the autonomous activity tracking system in tandem with robot navigation. Essentially, CARMI needs to relocate and re-orientate itself to optimally detect and track the child’s actions. The human-following and activity-tracking work must be carried out without direct human control. There is also no option for direct interaction between the child and robot; however, triggering the video-call feature will cease this module’s operation. This planned behavior is outlined in Table 4-2.


Table 4-2: Initial software plan for CARMI (Tee, Lau, Siswoyo Jo & Then 2015)

Module: Automated Injury Detection Mode

Layer / Configuration:
- Interaction: Autonomous – full reliance on camera, depth sensor and proximity sensors to track the subject and avoid collisions
- Interface: Motion Tracking using vision processing; Dedicated Vision-based Injury Prevention System; Embedded-level collision detection suite
- Operation: Augment Only – use mobility and separate camera-based tracking to minimize Injury Prevention System Field-Of-View limitations
- Intervention: Monitoring Only – no pre-programmed intervention routines. Caregivers are notified when the Injury Prevention System detects a possible situation.
- Embedded Control: Closed-Loop – all tracking, following and collision avoidance routines are autonomously carried out by embedded microcontrollers.

The CARMI inception process and planning using the framework has highlighted the

importance of human-following for companion robots. In all applications of assistive

robots that accompany a human user, some form of autonomous human tracking and


following is required so that they may continue functioning in the appropriate proximity of

their target.

Table 4-3: Injurious gesture detection performance (Tee, Lau, Siswoyo Jo & Wong 2016).

Gesture/Activity Samples Success Accuracy (%)

Punch (L/R) 30 11 36.667

Push (Both) 30 19 63.333

Jump 20 18 90.0

Fall (Backwards) 20 16 80.0

Overall Accuracy 58.425 %

Thus Chapter 3 encompasses a survey of the state-of-the-art for human tracking methods,

particularly in indoor localization using various combinations of wireless, vision-based and

wearable technologies. One study involving human motion tracking via Microsoft Kinect

has shown great promise in vision-based injury detection (Ann & Lau 2013). This method

was reproduced for the CARMI prototype to examine its performance as part of the

autonomous activity tracking system (Tee, Lau, Siswoyo Jo & Wong 2016). The system’s

visual gestures detection algorithm uses a neural-network that can be trained to identify

possibly injurious motions such as falls, punches, pushes, and jumps with decent

performance, as shown in Table 4-3. More detail of this experiment can be found in

section 4.5.5.

However, the performance of the activity-tracking system is hampered by the limited zone

of optimal detection in addition to optical occlusions, susceptibility to sunlight and effects

from environmental lighting. The depth sensor’s optimal zone for detection is a 57.5° cone

projected in front of it (Figure 4-1). Only subjects situated between 1.2-3.5m within that

cone can be tracked properly (Microsoft Corporation 2013). Other human activity tracking


systems such as Asus Xtion (ASUSTeK Computer Inc. 2014) and Intel RealSense

(Creative Technology Ltd. 2014) also share similar hardware limitations.

Figure 4-1: Microsoft Kinect documentation of hardware limitation. (Microsoft

Corporation 2013)

The literature survey also revealed that there have been numerous attempts to improve

human activity and position tracking performance by augmenting a sensor system with

another redundant one. The auxiliary device is meant to take over sensory perception in

case the primary fails. This technique of sensor-fusion is explored as a possible solution

to overcome that optimal zone limitation, resulting in the attempt to implement a human-

orientation tracking system using an active InfraRed marker.

4.2 HUMAN ORIENTATION TRACKING USING AN ACTIVE INFRARED (IR) MARKER

The primary problem with optical depth-based motion tracking hardware is the narrow

Field of View (FOV). Coupled with the general struggles of vision-based systems such as

susceptibility to environmental lighting and optical occlusions, these systems suffer from

a limited zone for optimal operation.

The first attempt to solve this problem is to find a suitable redundant tracking system that

can help identify the relative position and orientation of the target with reference to the


robot. This study resulted in a hybrid wearable and optical based method (Tee, Lau,

Siswoyo Jo & Lau 2015). The target wears a vest that is lined with IR LEDs that form an

active marker which is visible to the IR camera. OpenCV is used to identify the visible

light blobs and assess their relative positions to form the marker (Figure 4-2).

Figure 4-2: First prototype of the active IR marker. The vest is equipped with hook &

loop strips that allow the IR modules to be mounted in a variety of patterns. An example of a pattern as perceived by the camera is shown. (Tee, Lau, Siswoyo Jo & Lau 2015)

The raw camera feed is processed to correct tilt, cropping and scale so that a standard-

sized marker is acquired. This preconditioning process is illustrated in Figure 4-3. The

mapped blobs are rotated so they form an upright square with two inner blobs indicating

the orientation (midpoint and top-left quadrant for upright configuration). Finally, the outer

blobs are removed so the inner blobs form the standard pattern that can be used for

matching saved templates once scaled to the correct size.


Figure 4-3: IR Active Marker preconditioning process. (Tee, Lau, Siswoyo Jo & Lau

2015)
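To make the blob-identification step concrete, the sketch below shows one plausible way of isolating the IR LED blobs and recovering their pixel centroids with OpenCV; the threshold and minimum blob area are illustrative assumptions rather than values taken from the actual prototype.

```python
# Minimal sketch of IR LED blob extraction, assuming the IR camera frame is a
# single-channel (grayscale) image in which the LEDs appear as bright spots.
# The threshold and minimum area below are illustrative assumptions only.
import cv2
import numpy as np

def extract_led_blobs(ir_frame, threshold=220, min_area=5):
    """Return a list of (X, Y) centroids for bright blobs in the IR frame."""
    _, mask = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)
    num_labels, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    blobs = []
    for label in range(1, num_labels):              # label 0 is the background
        if stats[label, cv2.CC_STAT_AREA] >= min_area:
            blobs.append(tuple(centroids[label]))   # (X, Y) in pixel coordinates
    return blobs

# Example with a synthetic frame containing two bright spots.
frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (100, 200), 3, 255, -1)
cv2.circle(frame, (300, 240), 3, 255, -1)
print(extract_led_blobs(frame))
```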

Once this conditioned marker output can be acquired in real-time, a database set of

patterns can be recorded, each corresponding to a different target orientation relative to

the camera as shown in Figure 4-4. This data set is acquired at the beginning of each

runtime, using a calibration rig depicted in Figure 4-5.

Figure 4-4: Example of an orientation pattern data set. (Tee, Lau, Siswoyo Jo & Lau

2015)

Figure 4-5: Calibration rig for the active IR marker and camera. (Tee, Lau, Siswoyo Jo &

Lau 2015)


During operation, the system was expected to see the target’s constellation of LEDs,

precondition the raw video frame into a pattern, then perform a comparison between this

pattern and each calibrated data set. This comparison process is carried out using the

Template Matching technique (Anderson & Schweitzer 2009). The closest matched data

set should indicate the approximate orientation of the target relative to the robot. This

information can then be used for visual servoing the robot, relocating to where the target

is facing. The use of ultrasonic sensors was initially intended for determining the distance

between the robot and target.
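As a sketch of how the comparison step could be realised, the snippet below scores a preconditioned marker pattern against each calibrated orientation template using OpenCV’s normalised cross-correlation and reports the best match; the dictionary-of-templates structure and orientation labels are assumptions made for illustration, not the prototype’s actual implementation.

```python
# Illustrative sketch: pick the calibrated orientation whose template best
# matches the current preconditioned marker pattern. Both images are assumed
# to be same-sized 8-bit grayscale outputs of the preconditioning step.
import cv2

def best_orientation(pattern, templates):
    """templates: dict mapping an orientation label (e.g. degrees) to an image."""
    best_label, best_score = None, -1.0
    for label, template in templates.items():
        # Normalised cross-correlation; the result collapses to a single score
        # because pattern and template share the same dimensions.
        score = cv2.matchTemplate(pattern, template, cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best_label, best_score = label, score
    return best_label, best_score
```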

Table 4-4 shows the IR-based system’s detection zone, as sampled during morning and

evening conditions in Office and Laboratory environments. Each sample is collected by

gradually moving the marker away from the camera until it is no longer detected. The

distance and angle at which the marker was last visible were recorded. While the

orientation tracking system did operate as intended, it could not maintain consistent

accuracy beyond an average of 1.67m. This did not exceed the optimal zone of detection

for the Microsoft Kinect benchmark device, thus eliminating this system as a viable option

for the sensor-fusion solution. However, it should be noted that the raw hardware

detection performance of the full pattern is double that of the orientation tracking. Figure 4-6

illustrates the hardware detection zone with respect to the orientation tracking

performance.

Table 4-4: Performance results of the human-orientation tracking system. (Tee, Lau, Siswoyo Jo & Lau 2015)

Criteria Office Laboratory

Average Morning Evening Morning Evening

Horizontal FOV (degrees) 54 56 68 61 68 63.25

Vertical FOV (degrees) 63 54 62 54 52 55.5

Hardware Detection

Zone

Straight-Line Distance (meters)

2.6 4.04 4.14 3.2 2.44 3.455

Left-Extreme Distance (meters)

- 2.86 2.63 2.48 2.62 2.648


Right-Extreme Distance (meters)

- 2.35 3.09 2.62 2.51 2.643

Orientation Tracking

Zone

Straight-Line Distance (meters)

- 1.59 1.92 1.61 1.56 1.67

Left-Extreme Distance (meters)

- 1.34 1.47 1.15 1.48 1.36

Right-Extreme Distance (meters)

- 1.21 1.58 0.95 1.32 1.265

Further attempts were taken to gauge the maximum distance between the camera and

IR marker before none of the LEDs are left visible. The results showed that the raw IR

marker can be tracked by the camera to more than 10 meters under ideal indoor lighting

conditions. By extending the constellation of IR LEDs across the entire vest and ignoring

target orientation, the wearer’s relative position can be tracked by the camera within 10m

of the robot. This development has led to the inception of the following multi-sensor

fusion-based navigation model.


Figure 4-6: Illustration of the hardware detection zone performance, relative to

orientation tracking. (Tee, Lau, Siswoyo Jo & Lau 2015)

4.3 SENSOR-FUSION BASED ROBOT NAVIGATION MODEL

The fusion model proposed in this project is aimed at improving the performance of

human activity tracking systems that utilize single depth sensor solutions, by adding an


additional tracking system that relies on an Active Marker. Aside from the benefit of adding

redundant sensors to compensate for loss of tracking, this model presents two algorithms:

a) A method to identify and lock onto the body that represents the primary subject of

the tracking system, utilizing the tracking of the active Infrared Marker.

b) A fusion mechanism that combines the imaging sweeps of the environment using

both ranging sensors array and the depth sensor in addition to the locked position

of the Primary Subject.

4.3.1 Identification and Locking of Primary Subject

When used for human tracking, the depth sensor functions by identifying and tracking

objects in front of it that fit the profile of a human body. Once a person is identified, it is

registered as a body by the sensor system. Often, human tracking depth sensors can

track multiple bodies at a time. However, without any additional data, it is difficult for it to

continuously track a single primary subject’s body, especially in situations where there

are multiple secondary bodies moving around. This model relies on the addition of an

Active IR Marker worn by the primary subject. The position of the IR Marker is picked up

by the IR camera and used to help identify which body belongs to the primary subject.

4.3.1.1 The Depth Sensor Camera Space

To begin, the camera space of the depth sensor must be understood. For expressing this

model, the depth sensor’s camera space or view space is illustrated in Figure 4-7.


Figure 4-7: The depth sensor's camera space illustration.

The camera space is projected as a 2D inverted Cartesian plane, with the origin set at the top right corner $(U_o, V_o)$. $U_{Length}$ and $V_{Length}$ represent the horizontal and vertical spans of the camera space, typically in pixels. Most depth sensor systems identify the positions of bodies by their individual centroids with reference to $(U_o, V_o)$. Thus, the bodies would be tracked as (Eq. 4-1):

$(U_1, V_1), (U_2, V_2), (U_3, V_3) \dots (U_n, V_n)$   (Eq. 4-1)

However, this method is wholly dependent on the sensor hardware’s Field of View (FOV). To decouple the positioning method from hardware dependence, the coordinate system $UV$ is transformed to $uv$ by defining the new origin $(u_o, v_o)$ (denoted by the Δ) at the middle point of the camera space (Eq. 4-2).

$u_0 = \frac{U_{Length}}{2}; \quad v_0 = \frac{V_{Length}}{2}$   (Eq. 4-2)


Thus, the position of each tracked body will be expressed in terms of the $uv$ coordinate system, using the following transformation (Eq. 4-3):

$u = U - u_0; \quad v = V - v_0$   (Eq. 4-3)

The adjusted positions of all tracked bodies will now look as follows (Eq. 4-4):

$(u_1, v_1), (u_2, v_2), (u_3, v_3) \dots (u_n, v_n)$   (Eq. 4-4)
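A minimal sketch of this decoupling step is shown below; the frame dimensions are passed in as parameters, and the example resolution is only illustrative.

```python
# Sketch of Eq. 4-2 and Eq. 4-3: re-express body centroids relative to the
# midpoint of the depth sensor's camera space instead of its corner origin.
def to_centered_uv(bodies_UV, u_length, v_length):
    """bodies_UV: list of (U, V) centroids in pixels; returns centred (u, v)."""
    u0, v0 = u_length / 2.0, v_length / 2.0
    return [(U - u0, V - v0) for (U, V) in bodies_UV]

# Example with an assumed 512 x 424 depth frame.
print(to_centered_uv([(256, 212), (100, 50)], 512, 424))
```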

4.3.1.2 The IR Active Marker Tracker Camera Space

Next, the camera space of the IR Active Marker tracking system is illustrated in Figure

4-8.

Figure 4-8: Illustration of the Active Marker IR Camera's view space.


Since both the IR camera and the depth sensor are vision-based systems, they share the same definition of camera space in terms of an inverted 2D Cartesian coordinate system. The origin $(X_o, Y_o)$ is located at the top left of the plane, while $X_{Length}$ and $Y_{Length}$ represent the horizontal and vertical spans of the camera space. To differentiate the discussion from that of the depth sensor, the figure uses the $XY$ notation to represent the IR Marker system’s camera space.

Unlike the depth sensor, this tracking system only has one Active Marker to look for. The Active Marker consists of a series of infrared light sources (most typically LEDs) arranged in a pattern around the Primary Subject’s body. When worn, multiple IR LEDs should be visible to the IR camera so long as the primary subject (the wearer of the IR Marker) is within the vicinity of the tracking system. The position of each LED blob is expressed with reference to the $XY$ coordinate system’s origin $(X_o, Y_o)$ (Eq. 4-5).

$(X_1, Y_1), (X_2, Y_2), (X_3, Y_3) \dots (X_n, Y_n)$   (Eq. 4-5)

The position of the Active Marker is indicated by the centroid of the cluster of visible IR LEDs with respect to the IR camera’s view space. This can be found by taking the arithmetic mean of the set of visible LED positions (Eq. 4-6).

$\begin{pmatrix} X_{Centroid} \\ Y_{Centroid} \end{pmatrix} = \frac{1}{n}\sum_{i=1}^{n}\begin{pmatrix} X_i \\ Y_i \end{pmatrix}$   (Eq. 4-6)

Like the case of the depth sensor, the raw position data from the IR camera is dependent on the hardware’s FOV. To decouple it, the coordinate system $XY$ must be transformed to $xy$, whose origin $(x_o, y_o)$ (denoted by the Δ) is centered at the midpoint of the camera space (Eq. 4-7).

$x_0 = \frac{X_{Length}}{2}; \quad y_0 = \frac{Y_{Length}}{2}$   (Eq. 4-7)


Finally, the Active Marker position $(x_{Marker}, y_{Marker})$ is found after adjusting the centroid position to the $xy$ coordinate system (Eq. 4-8).

$x_{Marker} = X_{Centroid} - x_0; \quad y_{Marker} = Y_{Centroid} - y_0$   (Eq. 4-8)
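The marker-side computation of Eq. 4-6 to Eq. 4-8 can be sketched in the same style; the blob list is assumed to come from an earlier detection step.

```python
# Sketch of Eq. 4-6 to Eq. 4-8: average the visible LED blob positions, then
# re-express the centroid relative to the midpoint of the IR camera's view space.
def marker_position(led_blobs_XY, x_length, y_length):
    """led_blobs_XY: list of (X, Y) blob centroids; returns (x_marker, y_marker)."""
    n = len(led_blobs_XY)
    x_centroid = sum(X for X, _ in led_blobs_XY) / n
    y_centroid = sum(Y for _, Y in led_blobs_XY) / n
    x0, y0 = x_length / 2.0, y_length / 2.0
    return x_centroid - x0, y_centroid - y0
```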

4.3.1.3 Calibration: Finding the Calibration Offset and Scaling Factor

Even though the coordinate system for both the depth sensor and IR camera has been

decoupled from dependence on the hardware screen dimensions, the perceived Marker

position will not match the appropriate Body position due to the displacement between

sensors as well as the differing lens specifications of each device. All these factors can be addressed by applying appropriate transformations to the position data of one sensor, so that it matches the scale and orientation of the other.

First, we address the issue of sensor displacement. Since this model uses a multi-sensor fusion approach, it is assumed that both sensors exist as separate hardware components mounted at a fixed displacement from each other. Thus, if the primary subject is positioned at the origin of one sensor, that person may be perceived at a position away from the origin of the second sensor. Figure 4-7 and Figure 4-8 show a good example of this scenario. In this case, the Primary Subject is tracked by the depth sensor as Body 3, which is currently positioned at the origin $(u_o, v_o)$. However, due to the position and orientation of both sensors, the IR camera finds that the Marker position is slightly off to the bottom right of its origin $(x_o, y_o)$. To adjust the IR camera position data to match the depth sensor’s, an offset $(x_{Calib}, y_{Calib})$ must be applied. Since the position and orientation of both sensors should not change during operation, the offset values can remain constant. Thus, the following mapping is applied (Eq. 4-9):

$\begin{pmatrix} x_{Marker} - x_{Calib} \\ y_{Marker} - y_{Calib} \end{pmatrix} \Rightarrow \begin{pmatrix} u_{Primary} \\ v_{Primary} \end{pmatrix}$   (Eq. 4-9)


During initial calibration of the sensor fusion system, the Primary Subject must wear and activate the Active Marker, then be positioned in front of the depth sensor so that the Primary Body is located at the origin $(u_o, v_o)$. Then, the marker position is collected from the IR camera. This offset from the origin is used as the Calibration Offset $(x_{Calib}, y_{Calib})$. Thus, when the Primary Subject is positioned at the depth sensor’s origin $(u_o, v_o)$, the marker position (after application of the Calibration Offset) should also be at its origin $(x_o, y_o)$.

While the positions of the Primary Subject’s Body and Marker coincide at the origin, the displacements of both may not be the same after he moves to another position. The use of different lenses between the devices introduces the influences of differing apertures, angles and occlusion. This means that even if the Primary Subject moves to a new position, the displacement between the new position and the origin would not be the same between different sensors. Barring the effects of extreme lens occlusion and angles, this model assumes that the difference in perception of Subject displacement between sensors is a roughly linear problem that can be compensated for by application of a Scaling Factor $k$ (Eq. 4-10).

$k_x = \frac{u_{Primary}}{x_{Marker} - x_{Calib}}; \quad k_y = \frac{v_{Primary}}{y_{Marker} - y_{Calib}}$   (Eq. 4-10)

After successfully finding the Calibration Offset, move the Primary Subject to another position that is still within the camera space of both the depth sensor and the IR camera. Collect the Primary Subject’s Body position $(u_{Primary}, v_{Primary})$ and Marker position $(x_{Marker}, y_{Marker})$. Using the Calibration Offset obtained from the previous step $(x_{Calib}, y_{Calib})$, the Scaling Factor $(k_x, k_y)$ can be found. The Scaling Factor $k$ is the ratio of subject displacement from the origin between sensors.
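The two-step calibration procedure can be summarised as follows; the assumption that a single sample per step is sufficient (and that the second position is away from the origin, so the denominators are non-zero) is an illustrative simplification.

```python
# Sketch of the calibration phase (Eq. 4-9 and Eq. 4-10).
# Step 1: with the Primary Subject standing at the depth sensor's origin, the
#         marker position observed by the IR camera *is* the Calibration Offset.
# Step 2: with the subject at a second position visible to both sensors, the
#         ratio of displacements yields the Scaling Factor.
def calibration_offset(marker_at_depth_origin):
    x_marker, y_marker = marker_at_depth_origin
    return x_marker, y_marker                      # (x_calib, y_calib)

def scaling_factor(body_uv, marker_xy, calib):
    u_primary, v_primary = body_uv
    x_marker, y_marker = marker_xy
    x_calib, y_calib = calib
    k_x = u_primary / (x_marker - x_calib)
    k_y = v_primary / (y_marker - y_calib)
    return k_x, k_y
```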


4.3.1.4 Transforming the Active Marker Position and Finding the Primary Body

After the calibration phase, the Primary Subject is free to roam around the sensor fusion

human tracking system. During typical operation and sampling, there can be multiple

bodies registered by the depth sensor, either due to the addition of other people in the

same room, or cases of false detections because of environmental factors. The system

will attempt to find the body that best represents the Primary Subject by comparing each

registered Body’s position against the transformed Marker position.

During each sampling of the Active Marker position using the IR camera, both the Calibration Offset and the Scaling Factor must be applied. Successful application will transform the Marker position to match the coordinate system, midpoint and scale of the depth sensor’s camera space, $(u_{Marker}, v_{Marker})$ (Eq. 4-11).

$\begin{pmatrix} u_{Marker} \\ v_{Marker} \end{pmatrix} = \begin{pmatrix} k_x & 0 \\ 0 & k_y \end{pmatrix}\begin{pmatrix} x_{Marker} - x_{Calib} \\ y_{Marker} - y_{Calib} \end{pmatrix}$   (Eq. 4-11)

Now that the Active Marker position $(u_{Marker}, v_{Marker})$ can be expressed within the camera space of the depth sensor, comparison can be carried out by assessing the Euclidean distance between each Body position and the Marker position via the Pythagorean theorem, where $n$ represents the total number of registered Bodies during a sample (Eq. 4-12):

$\text{For } i = 1, 2 \dots n: \quad D_{Marker,i} = \sqrt{(u_i - u_{Marker})^2 + (v_i - v_{Marker})^2}$   (Eq. 4-12)

The set of Euclidean distance measurements can be evaluated to find the Body that has the least difference with respect to the Marker position. That Body would be assumed to be the one representing the Primary Subject (Eq. 4-13).

$\text{Primary Target's Body, } B(u, v) = \min_{i=1 \dots n} D_{Marker,i}$   (Eq. 4-13)
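Putting Eq. 4-11 to Eq. 4-13 together, a minimal sketch of the target-locking step might look as follows; variable names mirror the notation above and the sample values are purely illustrative.

```python
# Sketch of the target-locking step: transform the marker position into the
# depth sensor's centred coordinate system (Eq. 4-11), then pick the registered
# body closest to it in Euclidean distance (Eq. 4-12 and Eq. 4-13).
import math

def lock_primary_body(bodies_uv, marker_xy, calib, k):
    x_marker, y_marker = marker_xy
    x_calib, y_calib = calib
    k_x, k_y = k
    u_marker = k_x * (x_marker - x_calib)
    v_marker = k_y * (y_marker - y_calib)
    # The body with the smallest distance to the transformed marker is assumed
    # to be the Primary Subject.
    return min(bodies_uv,
               key=lambda b: math.hypot(b[0] - u_marker, b[1] - v_marker))

# Example: of three registered bodies, the second lies nearest to the
# transformed marker position and is therefore locked as the Primary Subject.
print(lock_primary_body([(-80, 10), (5, -3), (120, 40)],
                        marker_xy=(12, 4), calib=(10, 6), k=(1.2, 1.1)))
```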


4.3.2 Pathfinding and Obstacle Avoidance

Autonomous pathfinding, especially in indoor environments, has been the singular subject

of study for numerous researchers over many years. Unmanned robot navigation must

contend with a complex combination of terrain, obstacle and dynamic motion problems

which demand equally elaborate solutions. While GPS has been integral in its role for

outdoor localization, it cannot help with navigating indoor locations without a clear view

of the sky.

Recent years have witnessed a growing number of studies that explore the fusion of

sensors for indoor localization and navigation, with specific combinations of algorithms

and strategies according to the nature of the operating environment and application. This

model is one such case: having already pinpointed the location of the Primary Subject

with reference to the robot’s camera view, $B(u, v)$, this information can be used in

conjunction with a depth sensor array to navigate around the room and get into a suitable

position where the Subject is within the optimal zone of detection.

To establish an examinable container in which this navigation model is to be built,

some assumptions must first be made. While the Target Locking portion of the model is

designed to accommodate position tracking in both horizontal and vertical planes (x and

y axes), the Navigation portion will only utilize the horizontal component (x axis

coordinate) because the robot platform is assumed to be a ground unit that moves across

the floor.

In addition, all sensors involved in navigation are mounted on the robot itself, with no

reliance on markers, transmitters, cameras or devices placed around the environment.

4.3.2.1 Wandering Standpoint Algorithm (WSA)

With reference to the navigation methods for embedded robotics outlined by Bräunl (2006),

this model assumes that the robot will traverse its immediate environment using the


Wandering Standpoint Algorithm (WSA). Utilizing this method as shown in the Figure 4-9

example, the robot will primarily move towards the Goal in a straight line if unobstructed.

If a barrier is detected, such as at position 1, it will stop and decide which direction to turn

to. It will turn towards position 2 because that direction will require the least amount of

adjustment. Its direction of adjustment is always determined by the general position of the

Goal with reference to itself, as seen at positions 2, 3, and 4.

Figure 4-9: Wandering Standpoint Algorithm.

This sort of robot navigation is both popular and simple to implement, because it only

relies on basic proximity or ranging sensors. These sensors determine the range between

themselves and a solid obstruction using ultrasonic, infrared or laser reflection. However,

ranging sensor arrays are constrained in terms of mounting, resolution and environmental

interference. Figure 4-10 shows an array of eight ultrasonic sensors mounted on a robot.

Note that the size of the detection cones as well as the sensor mountings are not

consistent, resulting in poor resolution. However, this arrangement is usable for roughly

identifying obstructions within the general directions (front, back and sides).


Figure 4-10: Detection zones for an array of ultrasonic sensors on a robot.

Assume that the robot is currently at position 1 of the navigation example in Figure 4-9.

Using the ultrasonic sensors layout of Figure 4-10, the raw ranging feedback can be

represented by Table 4-5.

Table 4-5: Example of raw distance input from ultrasonic sensors array.

Sensor 1 2 3 4 5 6 7 8

Distance 10 25 60 70 75 72 40 35

The raw data can be rearranged into Table 4-6, adjusted according to the general

directions around the robot. The South position feedback (Sensor 5) is ignored as it is

assumed that the robot would not consider a reverse maneuver.

Table 4-6: Adjusted example of ultrasonic sensors feedback.

Direction SW W NW North NE E SE

Degrees -120° -90° -60° 0° 60° 90° 120°

Sensor 6 7 8 1 2 3 4

Distance 72 40 35 10 25 60 70


The feedback value of ranging sensors shows the distance between an obstruction and

itself. Thus, a lower value indicates a fast-approaching barrier, while a higher value hints

at possible open spaces. In the Table 4-6 example, the Forward (North) sensor detected

an obstruction (at distance of 10). To proceed, the robot will have to turn either left or right,

and it decides based on the readings of the NE and NW sensors, both showing 15 and

35 respectively. NW has a higher value, so the robot decides to turn Left (Westward)

towards position 2. In a way, it is possible to visualize the way the robot perceives its

environment by placing the ranging sensor readings into a 1-Dimensional matrix or an

array. Figure 4-11 shows an illustration of the ranging sensor array S, if only the front 5

sensors are considered. The Wandering Standpoint Algorithm will move the robot

forwards towards the Goal direction if there are no obstructions. If the S1 detects a barrier,

the left and right front sensors (S2 and S8) would be compared. The side with the higher

value indicates the clearer direction to turn to. Once the coast is clear, the robot is turned

towards the Goal direction again.

Figure 4-11: Simple visualization of the ranging sensor array, S.
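The turn decision just described can be captured in a short routine; the direction labels follow Table 4-6, while the obstruction threshold is an illustrative assumption rather than a value from the prototype.

```python
# Sketch of the Wandering Standpoint decision step: drive towards the Goal while
# the forward reading is clear; otherwise compare the front-left and front-right
# readings and turn towards the more open side.
def wsa_decision(readings, obstruction_threshold=20):
    """readings: dict keyed by direction, e.g. {'NW': 35, 'North': 10, 'NE': 25}."""
    if readings['North'] > obstruction_threshold:
        return 'forward'                   # path towards the Goal is clear
    # Forward blocked: the larger reading hints at the more open direction.
    return 'turn_left' if readings['NW'] >= readings['NE'] else 'turn_right'

# Example using the adjusted readings of Table 4-6 (NW = 35, North = 10, NE = 25):
print(wsa_decision({'NW': 35, 'North': 10, 'NE': 25}))    # -> 'turn_left'
```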

While the WSA is rather robust for simple navigation, it is subject to a host of caveats due

to the way it is implemented using sensors. The ideal case would be to mount the sensors

at equal angles from one another, e.g. at -90°, -60°, -30°, 0°, 30°, 60° and 90°. However,

this may not be feasible due to the design of the robot frame not being able to


accommodate equal mounting angles. The resolution could be improved by adding more

ranging sensors, but this is not possible due to the risk of overlapping sensor cones.

Ranging sensors consist of an emitter and receiver. By emitting an ultrasonic, Infrared or

laser pulse in a cone then receiving the echo after it bounces off solid barriers, the sensor

can correlate distance with the time or intensity of the received pulse. Clustering the

sensors close together will overlap the cones, causing resonances and receiving stray

pulses from neighboring modules.

Functionally, the WSA is good for short-range navigation, allowing the robot to roughly scan its immediate vicinity, then identify which way to turn in case it runs into an obstruction.

The algorithm depends on the general direction of the Goal position to influence its

decision. In the ideal scenario, this refers to the Primary Target’s position with reference

to the robot’s camera view. Unfortunately, having a constantly moving Goal and not

knowing the general topography may end up causing the robot to take a longer route.

4.3.2.2 Fusion with the Depth Component and Target Lock

A possible solution to improve this method's effectiveness is to augment it with a more informed sensing system that presents knowledge of the shape of obstructions within view. This system updates the Goal position for the Wandering Standpoint Algorithm in real time, biasing its decision-making process to favor directions that lead to fewer medium-range barriers. When applied to the example in Figure 4-9, the robot may instead favor the alternate route A, which presents fewer obstructions later.


Figure 4-12: Template of a depth map.

The raw data from the depth sensor used in the Target Locking phase can be extracted

in the form of an image frame called a depth map, as shown in Figure 4-12. The dimension

of the depth map is the same as the depth sensor’s camera view space as used during

the Target Locking phase. Thus, the horizontal and vertical dimensions are also

expressed by ‘𝑈𝐿𝑒𝑛𝑔𝑡ℎ’ and ‘𝑉𝐿𝑒𝑛𝑔𝑡ℎ’. Each pixel of the image (shown as squares within

the template) contains a depth value that indicates the distance between the sensor and

a solid surface at that position.

Figure 4-13: Example of a depth image frame.


Figure 4-13 shows an example of an 8-bit depth map taken of a person between two

boxes. The box on the left is closest to the depth sensor, followed by the person and then

the box on the right. Note that the colors of the pixels representing each of the three objects vary from very dark to very light, indicating distances from close to far. Since each pixel holds an 8-bit value, this ranges between 0 (black) and 255 (white). The values are unit-less and should be scaled according to the depth sensor's detection range.

The depth sensor can be considered a high-resolution vision-based rangefinder, which captures a snapshot of the distances between itself and all objects within its view space. While this may seem useful as a replacement for the sensor array used by the WSA, the depth map is expressed as a 2-Dimensional array that visualizes obstructions in the Vertical plane. To map onto the existing ranging sensor array S in a useful form, the depth map must be transformed into a 1-D array that represents the Horizontal plane, as

illustrated in Figure 4-14.

Transformation can be as simple as experimental selection of a row of depth data to

gauge the immediate surroundings for possible ways around an obstacle. However, this

method is dangerous because the vision-based nature of the depth sensor produces noisy

depth readings due to environmental lighting and optical effects. Blending several rows

of depth data is also possible to reduce the effect of noise, but since the depth sensor is

to be mounted high enough to capture the full body of the subject from a distance, it would

be unlikely that its view space is wide enough to accommodate short-range obstructions.

This eliminates the option of using the depth sensor as a replacement for the ranging

sensor array. However, the sensor should be able to detect medium-range obstacles

between the robot and the Primary Subject.


Figure 4-14: Illustration of the transformation problem from the Vertical to Horizontal

Plane.

The position of the Primary Subject is also expressed with reference to the depth sensor.

Blending the position of the Primary Subject with knowledge of the surrounding obstacles can be used to compute a Goal position that helps guide the Wandering Standpoint system towards a path that is biased towards fewer obstructions while keeping the

Primary Subject within the human activity tracking system’s detection zone.


4.3.2.3 Introducing the Potential Field Method (PFM)

This is where the Potential Field Method (PFM) comes into play. Proposed by Khatib in

1986, this method was to be a low-level motion planning strategy for mobile robots that

incorporates simultaneous pathfinding and obstacle avoidance. It assumes that the robot

has complete awareness of locations and shapes of boundaries, obstructions, starting

and goal positions in the form of a map. Each component (boundary, obstruction, starting

position or goal) is visualized to emanate a potential field, like how a magnetized object

has a magnetic field surrounding it. Boundaries, obstructions and the starting position

have a repulsive potential field, whereas the goal has an attractive potential field. When

placed in a map, each cell holds a value that is a sum of all attractive and repulsive fields

of its surroundings. The magnitude of the fields experienced by a map’s cell increases as

it is nearer to a component. Hence the map appears as an overall artificial potential field

similar to the illustration excerpted from (Bräunl 2006) in Figure 4-15(a).

Figure 4-15: Illustration of the Potential Field Method. (a) An overhead depiction of a

potential field. (b) The same field reimagined as a contoured slope. (Bräunl 2006)

(Khatib 1986) describes this method of collision avoidance and path planning as applied

to guiding a robot arm’s end effector towards a specific position. The artificial potential

field ‘𝑈𝑎𝑟𝑡’ is the sum of the potential fields exuded by the goal position ‘𝑈𝑋𝑑’ as well as

obstructions ‘𝑈𝑂’ (Eq. 4-14).

$U_{art} = U_{X_d} + U_O$ (Eq. 4-14)


To help formulate a visible motion path, the artificial potential field expression is

transformed into a vector command form ‘𝐹𝑎𝑟𝑡∗’ representing the forces that act upon the

end effector (Eq. 4-15).

$F_{art}^{*} = F_{X_d}^{*} + F_{O}^{*}$ (Eq. 4-15)

$F_{X_d}^{*} = -\mathrm{grad}\left[U_{X_d}(X)\right]$ (Eq. 4-16)

$F_{O}^{*} = -\mathrm{grad}\left[U_{O}(X)\right]$ (Eq. 4-17)

The attractive force '$F_{X_d}^{*}$' is applied to the end effector at position X so that it reaches towards the goal position '$X_d$' (Eq. 4-16). Likewise, '$F_O^{*}$' represents the repulsive force that emanates from the surface of obstacles (Eq. 4-17). These forces are expressed as

attractive and repulsive potential fields adapted as negative gradients as shown in Figure

4-15(b). The reason for this is that the sum of forces acts on the robot end effector so that

it gravitates from its current position towards the goal. Repulsive forces emanating from

obstacles along the path are represented by the hills and peaks. Thus, the valleys from

the top to the bottom of the slope represent the pathway for the robot to take.

The attractive potential field ‘𝑈𝑋𝑑(𝑋)’ described by (Khatib 1986) appears to be inspired

by the elastic potential energy function of springs, with some notable differences.

‘(𝑋 − 𝑋𝑑)2’ indicates that the magnitude of the attractive potential field is proportional to

the square of the distance between the robot and the goal, thus imposing a dissipative

force that slows it down as it approaches its destination. The constant ‘𝑘𝑝’ is a proportional

gain for the motion’s velocity (Eq. 4-18).

$U_{X_d}(X) = \frac{1}{2}\,k_p\,(X - X_d)^2$ (Eq. 4-18)

It can also be observed that the repulsive potential field is structured similarly to its

attractive counterpart. This expression considers the shortest distance to the obstacle P

and the potential field’s limit distance ‘𝑃0’. Khatib (1986) acknowledges that the effects of


the repulsive field should be made negligible if the shortest distance between the robot and

the obstacle ‘𝑃’ is beyond the field’s limit distance ‘𝑃0’. Note that the repulsive field’s

magnitude is inversely proportional to the square of the distance between the robot and

the obstruction. Finally, the constant ‘𝜂’ represents a situational gain (Eq. 4-19).

$U_O(X) = \begin{cases} \dfrac{1}{2}\,\eta\left(\dfrac{1}{P} - \dfrac{1}{P_0}\right)^2 & \text{when } P < P_0 \\ 0 & \text{when } P \geq P_0 \end{cases}$ (Eq. 4-19)

4.3.2.4 Adapting the PFM for Transforming the Depth Component

Application of the PFM in its entirety to a self-contained mobile robot would not be feasible

due to several factors. The robot is only equipped with ranging sensors to roughly scan

its immediate surroundings, and a depth sensor nest that constantly pivots to face a

primary target. This arrangement is not enough to supply the PFM with an overall awareness

of the obstructions, boundaries and target of the entire room. Thus, a complete artificial

potential field is not available to calculate the traversable pathway between the robot and

the subject.

Figure 4-16: Only the obstructions and the target within the field of view are considered when deciding which direction to take. (Tee, Lau, Siswoyo Jo & Lau 2016)


Instead of scanning the entire environment and plotting a path for the robot to maneuver

to, a snapshot of its immediate medium-range obstacles and the target is acquired at

regular intervals, as illustrated in Figure 4-16. Since only the immediate surroundings are acquired, the process should be less resource-consuming. Each pixel of the snapshot is examined for the robot's proximity to the target and obstructions. The results can then be used to bias the robot's decision when it comes to deciding which direction to turn when meeting an obstruction between itself and the target. Because of this, the PFM's reliance on calculating the artificial potential field to plot a traversal path around obstructions is no longer necessary. This is replaced with a 1-D array indicating the direction leading towards the target with the least amount of obstructions.

Figure 4-17: Illustration of the transformed depth map into horizontal form.

The goal of transforming the vertical depth map is to produce its 1-Dimensional array form

in the horizontal plane. Each element corresponds to a visible direction relative to the

robot field of view, as simply illustrated in Figure 4-17. The elements or cells numerically

indicate the likelihood of reaching the target with minimal obstructions if the robot moves

towards the corresponding directions. This likelihood can be represented by a sum of

attractive forces ‘𝐹𝐴𝑡𝑡 ’ (emanating from the Primary Target) and repulsive forces ‘𝐹𝑅𝑒𝑝’

(emanating from walls and obstructions) (Eq. 4-20).

Sum of forces in a cell, $F_{Cell} = F_{Att} + F_{Rep}$ (Eq. 4-20)


The analogy of using linear spring potential energy may not be suitable for this adaptation,

because the sum of forces on each cell is not relative to the potential fields, but to direct

proximity of obstructions and the direction of the target. A closer analogy can be drawn

from Coulomb’s Law, which describes the interacting forces between two charged

particles (Eq. 4-21).

Scalar electrostatic force, $F = k_e\,\dfrac{|q_1 q_2|}{r^2}$ (Eq. 4-21)

Coulomb's Law states that electrically charged particles exert an electrostatic force on each other, being attractive or repulsive depending on whether the particles are oppositely or like signed. '|𝑞1𝑞2|' is the product of the magnitudes of both charges, while '𝑟2' is the square of the distance between the particles. Just like the treatment of repulsive forces from obstructions in the PFM, the strength of the electrostatic force is inversely proportional to the square of that distance. '𝑘𝑒' is the electric force constant. Coulomb's Law can be

used as an inspiration for defining a general form ‘𝐹’, for the attractive and repulsive forces

in this model (Eq. 4-22).

General force form, $F = \dfrac{C\,T}{r^2}$ (Eq. 4-22)

The general force form shares a similar nature with Coulomb's Law, beginning with the numerator '𝑇' representing the sign that indicates a repulsive (negative unit) or attractive force (positive unit). '𝑟2' represents the square of a distance variable, which depends on

application. The constant 𝐶 is a scaling factor to adjust the significance of the force values

according to implementation.

To begin formulating the repulsive force model, the depth sensor’s camera view will have

to be revisited, as in Figure 4-18. The depth map pixels are referenced by ‘𝑈𝑉’ coordinates,

but not all rows will be considered for processing. This is because of lens occlusions and

environmental clutter that may constitute too much noise for the top and bottom rows of


data. Thus, the upper/ceiling (G) and lower/floor (H) trims must be specified. These values

are determined through experimentation and vary according to hardware.

Figure 4-18: Revisited depth sensor camera view with top and bottom trims.

To process the repulsive force for a cell, the distance component consists of the average

depth values contained within a column of the depth map, limited by both ceiling and floor

trims (Eq. 4-23).

$r_{Depth,U} = \frac{1}{V}\sum_{V=G}^{H} D(U, V)$ (Eq. 4-23)

Referring to the general force form, the repulsive force ‘𝐹𝑅𝑒𝑝’ assigns a negative unit to

‘𝑇’, molding the repulsive force component as a subtractive value when solving for the

sum of forces ‘𝐹𝐶𝑒𝑙𝑙’. The magnitude of repulsive force for a cell is inversely proportional

to the square of the average depth value for a column of the trimmed depth map (Eq. 4-

24).

Repulsive Force, $F_{Rep} = C_{Rep}\,\dfrac{-1}{r_{Depth}^{\,2}}$ (Eq. 4-24)


Since the dimension of the transformed depth array corresponds to the horizontal scale

of the depth map, the contents of each cell can be defined by ‘𝐹𝑅𝑒𝑝,𝑈’ (Eq. 4-25).

Repulsive Force of cell U, $F_{Rep,U} = C_{Rep}\,\dfrac{-1}{r_{Depth,U}^{\,2}}$ (Eq. 4-25)

By having the repulsive force be inversely proportional to the square of the distance

between the robot and the obstruction, the surface of the barrier will result in a value closest to

-1. The constant ‘ 𝐶𝑅𝑒𝑝 ’ is used to experimentally adjust the scale of the repulsive

component for performance adjustments.
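The following Python sketch illustrates how Eq. 4-23 and Eq. 4-25 could be applied to a raw depth map; the depth-map layout, the trim indices and the safeguard against zero-depth pixels are illustrative assumptions rather than the thesis implementation:

```python
# Illustrative sketch only: Eq. 4-23 and Eq. 4-25 applied to a depth map stored
# as a 2-D list indexed as depth[V][U]. The trims G/H, the gain C_Rep and the
# zero-depth safeguard are assumptions, not the thesis implementation.

def repulsive_forces(depth, g, h, c_rep=1.0):
    """Transform a trimmed depth map into a 1-D array of repulsive forces."""
    width = len(depth[0])
    forces = []
    for u in range(width):
        # Eq. 4-23: average the depth values of column U between the
        # ceiling (G) and floor (H) trims.
        column = [depth[v][u] for v in range(g, h + 1)]
        r_depth = max(sum(column) / len(column), 1e-6)  # guard against zero depth
        # Eq. 4-25: negative force, inversely proportional to the squared average.
        forces.append(c_rep * -1.0 / (r_depth ** 2))
    return forces
```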

4.3.2.5 Adapting the PFM for Transforming the Target Array

The attractive forces emanate from the Primary Target’s horizontal position ‘𝑈𝑃𝑇 ’ as

shown in Figure 4-19. The awareness of the target’s position can be expressed in the

form of a 1-dimensional Target array ‘ 𝑃 ’ that shares identical dimensions as the

transformed depth array. The magnitudes of the attractive forces for its cells are inversely

proportional to the square of the distance between the cells’ indices and the Primary

target’s horizontal index ‘𝑈𝑃𝑇’.

Figure 4-19: Example of populating the Target Location array.

To begin, the distance component is calculated by finding the absolute distance between

the current cell and the index that corresponds to the target’s position, then incrementing

that value by one. This will ensure that processing the cell representing the target’s

position will not result in a division by zero when calculating for its attractive force (Eq. 4-

26).


$r_{Elem,U} = |U - U_{PT}| + 1$ (Eq. 4-26)

Referring to general force form, the attractive force component assigns a positive unit to

‘𝑇’, molding the attractive force component as an additive value when solving for the sum

of forces ‘𝐹𝐶𝑒𝑙𝑙’ (Eq. 4-27).

Attractive Force, $F_{Att} = C_{Att}\,\dfrac{1}{r_{Elem}^{\,2}}$ (Eq. 4-27)

Since the dimension of the target location array corresponds to the transformed depth

array, the contents of each cell can be defined by ‘𝐹𝐴𝑡𝑡,𝑈’ (Eq. 4-28).

Attractive Force of cell U, $F_{Att,U} = C_{Att}\,\dfrac{1}{r_{Elem,U}^{\,2}}$ (Eq. 4-28)

As with calculating for the transformed depth array, each cell of the Target array contains

an attractive force value that is inversely proportional to the square of the index distance

between itself and the cell representing the target position. Thus, the cell representing

the target position will hold the maximum value of 1, whereas cells radiating away from it

will hold gradually diminishing values. The constant ‘𝐶𝐴𝑡𝑡’ is used to experimentally adjust

the scale of the attractive component for performance adjustments.
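A corresponding sketch for Eq. 4-26 and Eq. 4-28 is given below; the array width, target index and gain value are placeholders chosen only to mirror the later worked examples:

```python
# Illustrative sketch of Eq. 4-26 and Eq. 4-28; the array width, target index
# and C_Att value below are placeholders chosen to mirror the later examples.

def attractive_forces(width, u_pt, c_att=1.0):
    """Populate the Target array with attractive force values."""
    forces = []
    for u in range(width):
        r_elem = abs(u - u_pt) + 1                   # Eq. 4-26 (+1 avoids division by zero)
        forces.append(c_att * 1.0 / (r_elem ** 2))   # Eq. 4-28
    return forces

# The cell holding the target (index 3 here) evaluates to the maximum value of 1.
print(attractive_forces(10, 3))
```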

4.3.2.6 Resolving the Sum of Forces

By having both sets of attractive and repulsive force arrays, it is now possible to combine

them into a bias array that is used to provide a goal direction bias for the Wandering

Standpoint Algorithm (WSA) outlined earlier in this model. To achieve this while providing

flexibility of performance adjustments, a gain constant '𝐶𝐹' is incorporated into the equation for the sum of forces for each cell of the Bias array, '𝐵' (Eq. 4-29).


Adjusted sum of forces for each element in B, $F_B = \dfrac{F_{Att} + F_{Rep}}{C_F}$ (Eq. 4-29)

Interpreting the values in the Bias array is a matter of identifying the index of the element

that holds the maximum value throughout the set. That index corresponds to the direction

that leads towards the target while presenting the least visually perceived medium-range

obstructions. Other elements range from very negative (most obstructions expected) to positive values (indicating possible alternative routes) (Eq. 4-30).

$\max_{i \geq 0} B_i$ (Eq. 4-30)
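The combination step can be illustrated as follows; this is a minimal sketch of Eq. 4-29 and Eq. 4-30 and assumes that both force arrays share the same length:

```python
# Minimal sketch of Eq. 4-29 and Eq. 4-30, assuming both force arrays share the
# same length and that C_F is a hand-tuned gain.

def bias_array(f_att, f_rep, c_f=1.0):
    """Combine the attractive and repulsive arrays into the Bias array B."""
    return [(a + r) / c_f for a, r in zip(f_att, f_rep)]

def best_direction_index(bias):
    """Eq. 4-30: index of the element holding the maximum sum of forces."""
    return max(range(len(bias)), key=lambda i: bias[i])
```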

4.3.2.7 Aligning the Bias and Sensor Arrays to Apply the Goal Direction Bias

This model is developed for use with a mobile robot that has a pivoting head which

houses the depth sensor nest. Since the tracking strategy involves the sensor nest doing

its best to constantly turn and face the Primary Target, at most instances it will not be

aligned with the bottom half of the robot which houses the ranging sensors array and

drive systems. The Bias array must be transformed into the same reference point as the

Ranging Sensors array before the Goal Bias can be finalized.


Figure 4-20: Example of the Bias and Sensor arrays alignment process.

Figure 4-20 illustrates this process in three steps. If the midpoint of the ranging sensor

array ‘S’ is the origin, the magnitude and direction of the angle mismatch between both

arrays is represented by ‘θ’. This information can be acquired via integration of a

potentiometer or other angle feedback sensor into the turning mechanism of the sensor

nest. Next, the index of the Bias (B) Array’s cell with the highest sum of forces can be

transformed by adding an offset that is based on the mismatch angle ‘θ’, adjusted using

a suitable scaling factor ‘𝐶𝐵’ (Eq. 4-31).

Adjusted index for selected cell in Bias array $B = U + C_B\,\theta$ (Eq. 4-31)


The final step involves processing a set of results that represent the decision-making

tendencies of the robot when faced with a barrier between itself and the target. The

tendency towards favoring the direction indicated by the ranging sensor array is

represented by ‘𝛽’, while ‘𝛼’ represents the Bias array counterpart (Eq. 4-32).

$\alpha = k_B(U + C_B\,\theta)$ (Eq. 4-32)

The tendency towards the direction indicated by the Bias array is determined by the

adjusted index for its selected cell. To recap, this cell represents the direction that is

perceived to lead towards the Primary Target with the least amount of medium-range

obstructions. The adjusted index ‘𝑈 + 𝐶𝐵𝜃’ is augmented by a tendency gain ‘𝑘𝐵’. The

tendency gain is applied experimentally for performance adjustments (Eq. 4-33).

$\beta = k_S(S_U)$ (Eq. 4-33)

The tendency value for favoring the direction indicated by the ranging sensors array is

similarly structured, but its defining component is the magnitude of the sensor reading from the favorable side '𝑆𝑈'. Thus, the tendency to turn towards the ranging sensor's favored side will be higher if the perceived short-range clearance of that side is bigger. A tendency gain '$k_S$' is also experimentally applied.

With this model in place, the pathfinding decision-making process is reduced to the following two possibilities:

a) There is no obstruction between the robot and the target: align with the sensor nest

and move forwards.

b) Short-range obstruction detected. Check for target location then sample for

ranging and depth arrays. Either:

i. Both tendencies point to one direction: Turn towards that direction to

navigate around the obstruction.


ii. Both tendencies point to opposite directions: turn towards the direction

indicated by the highest tendency.

Either outcome from (b) is then used to re-orientate the robot before initiating any choice of wall-following or obstacle avoidance algorithm. This ensures less chance of a local method leading into a possibly more complicated path or away from the target. This

research’s proposed model results in suggesting the best direction to initiate maneuvers

but leaves the choice of algorithm implementation open for individual robot customization.
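A hedged sketch of how the two tendencies could be resolved in code is shown below; the mapping of each tendency to a Left or Right label is an assumption made for illustration, since the model itself only defines the magnitudes and gains:

```python
# A sketch of the tendency comparison in Eq. 4-32 and Eq. 4-33. Mapping each
# tendency to a Left/Right label is an illustrative assumption; the model only
# defines the magnitudes (alpha, beta) and their gains.

def resolve_turn(bias_index, mismatch_angle, bias_side,
                 sensor_reading, sensor_side,
                 k_b=1.0, k_s=1.0, c_b=1.0):
    """Choose a turn direction when a short-range obstruction is met."""
    alpha = k_b * (bias_index + c_b * mismatch_angle)   # Eq. 4-32: Bias array tendency
    beta = k_s * sensor_reading                         # Eq. 4-33: ranging sensor tendency
    if bias_side == sensor_side:
        return bias_side                                # both tendencies agree
    return bias_side if alpha >= beta else sensor_side  # otherwise follow the stronger one
```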

4.4 EXAMPLE MODEL APPLICATION

This section illustrates basic examples of how the navigation model can help decide which direction to begin executing maneuvers towards. Figure 4-21 starts with a baseline scenario where there is no obstruction between the robot and the target human. Even

though pathfinding mode is not engaged in this case, the example will help describe how

the decision-making process is done via calculations. The diagram shows the human

entity being within the right-side quadrant of the robot’s Field of View (FOV), so the human

tracking system is assumed to report her relative position as corresponding to index 3.

The target array, sized at 10 elements, shows the contents of attractive forces (calculated using Eq. 4-27) as determined by each cell's index distance from element 3. Split into Left and

Right halves, the totaled values (solely determined by relative position of the Primary

Target) indicate a higher tendency for navigation towards the Right.

Figure 4-22 builds upon the previous example by adding input from the proximity sensors

array which detected a single obstruction ahead of the robot. The Ultrasonic Sensors

Array shows the contents of this input (assuming the resolution of the FOV is also 10

elements). Eq. 4-33 is applied here to determine the repulsive forces resultant from the

immediate vicinity. Note that the repulsive forces are strongest in elements corresponding

to the detected obstacle’s proximity. By summing the Left and Right sided forces in

addition to the previously calculated attractive forces, the verdict still leans towards the


Right-side. This indicates that the path of least resistance involves navigating towards the

right side of the obstacle.
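The Left/Right verdict used in these examples can be sketched as follows; the numbers are placeholders rather than the exact values from Figures 4-21 and 4-22:

```python
# Placeholder numbers only; not the exact values used in Figures 4-21 and 4-22.

def left_right_verdict(cell_forces):
    """Sum the left and right halves of the combined force array and pick a side."""
    half = len(cell_forces) // 2
    left, right = sum(cell_forces[:half]), sum(cell_forces[half:])
    return "Left" if left > right else "Right"

att = [0.04, 0.06, 0.11, 0.25, 1.00, 0.25, 0.11, 0.06, 0.04, 0.03]   # target near index 4
rep = [0.0, 0.0, 0.0, -0.9, -0.9, -0.9, 0.0, 0.0, 0.0, 0.0]          # obstruction dead ahead
combined = [a + r for a, r in zip(att, rep)]
print(left_right_verdict(combined))   # 'Left' for these placeholder numbers
```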

Figure 4-21: Example Model Application 1 – No Obstruction.


Figure 4-22: Example Model Application 2 – Single Primary Obstruction.


Figure 4-23: Example Model Application 3 – Obstruction and Clutter Encounter.


Figure 4-23 further extends the discussion by adding a clutter field in the Right-quadrant

of the FOV. Rudimentary proximity-based local navigation that leans towards a right-sided route will encounter the clutter field, prompting additional routing calculations. This model

acquires the raw Depth Map and processes it into a horizontal average depth histogram,

via Eq. 4-23. This is then used in Eq. 4-25 to create the Depth Array of repulsive forces.

Once again, this is summed in Left and Right halves together with the sensor array

repulsive forces and the relative target position's attractive forces. The resultant verdict advises a Leftward route that will bypass having to contend with the detected clutter field

ahead.

4.5 ROBOT CONTROL PROTOTYPE IMPLEMENTATION

Once the multi-sensor fusion-based navigation model was formulated, it was integrated into the design of the robot system in accordance with this research's case study. Following the findings and conclusions of Chapters 1 and 2, the proposed Robot-Based Injury Prevention Strategy requires the use of a companion robot avatar to act as a mediator

between a child and caregiver. This Injury Prevention Telepresence System was first

modeled in a Use Case Diagram shown in Figure 4-24.

The model depicts which events or actions in the physical environment will trigger the

functions of the robot, e.g. the child performing a dangerous pose, once detected by the

robot, will prompt it to notify the caregiver. At the time when this model was made, motion

tracking hardware limitations dictated that the subject must be facing the sensor device

for the pose recognition to function optimally. Because of this, the child’s orientation and

position are two separate stimuli that will prompt the robot to relocate itself so that the

target stays within the optimal conditions for the device’s operation. The caregiver may

request to engage telepresence mode at any time, while being provided updated

monitoring logs of the child target. This general expression of the required robot system

was refined into a Use Case model as the first stage for software design.


Figure 4-24: Use Case model illustrating the general functions of the required robot

avatar.

4.5.1 Use Case Model

The interaction design of the system assumed that there are three active entities: the child

target, a guardian (parent or caregiver) and the robot (which acts as a pawn). The Injury

Prevention Telepresence System existed as an external component which perceives the actions of the other entities and interacts with them via the robot pawn. The interaction

model is shown in Figure 4-25.

The child entity does not have direct interaction with the robot pawn. Instead, the robot

passively monitors the child’s activity (and in the event of detecting a dangerous action,

sending an alert) for the guardian entity. On the other hand, the guardian has direct control

over the system, manually initiating telepresence so that a conversation can be had with

the child entity via the robot pawn.


[Figure 4-25 diagram elements: actors Child, Guardian and Robot; use cases Monitor Activity, Alert Guardian (<<include>>), Have Conversation and Get Activity Report.]

Figure 4-25: The interaction model for the Injury Prevention Telepresence System.

The interaction model helped define a set of six Use Case stories to develop the

required system functionalities, as listed in Table 4-7. The table lists functions both

rudimentary (the starting and stopping of the monitoring process) and primary

(autonomous monitoring, activity reporting, alert notification and initiation of telepresence).

The initiating actor, level of complexity and development priority are indicated for each

case.


Table 4-7: Summary of generated Use Case stories.

Use Case ID   Use Case Name         Primary Actor   Scope   Complexity   Priority
1             Begin Monitoring      Guardian        In      Low          3
2             Monitor Activity      Robot           In      High         1
3             Get Activity Report   Guardian        In      Med          1
4             Alert Guardian        Robot           In      Med          1
5             Have Conversation     Guardian        In      High         2
6             Stop Monitoring       Guardian        In      Low          3

Since this research is ultimately concerned with applying the novel navigation model to

the case study, only the Monitor Activity case will be highlighted. This Use Case can be

found in Table 4-8. The full scope coverage for the Injury Prevention Telepresence

System’s use case model can be found in Appendix A of this thesis.

Table 4-8: Use Case story encompassing the autonomous monitoring function.

Use Case ID: 2
Application: ANIMA
Name: Monitor Activity
Description: The Robot attempts to look for the Child, detect the Child's current activity and add the result to the monitoring log.
Primary Actor: Robot
Precondition: The Robot is currently in Monitoring Mode.
Trigger: Activation of Monitoring Mode.
Basic Flow:
1. Robot checks to see if Child is visible.

2. Child is visible, so robot locks camera position.

3. Robot checks to see if front of Child is visible.

4. Child’s front is visible. Robot remains stationary.

5. Robot checks to see if Child matches any recognizable

activity.


6. Child’s activity is recognized. Robot takes a picture and

adds entry to Monitoring Log.

7. Robot checks to see if entry type is dangerous.

8. Entry type is not dangerous, Robot resumes monitoring.

9. Robot checks to see if Monitoring Mode is deactivated.

10. Monitoring Mode is active, Robot repeats step 1 – 9.

Alternate Flows:

1 – Child is not visible.

2. a. Robot adds ‘Visibility Lost’ entry to Monitoring Log.

2. b. Robot begins wall-following room exploration.

2. c. Go to step 9.

4 - Child’s front is not acquired.

4. a. Robot adds ‘Orbiting’ entry to Monitoring Log.

4. b. Robot orbits the locked position and avoids obstacles.

4. c. Go to step 9.

6 – Child’s activity is unrecognizable.

6. a. Robot takes a picture and adds ‘Unrecognized Activity’ entry

to Monitoring Log.

6. b. Go to step 9.

8 – Entry type is listed as dangerous.

8. a. Robot Alert Guardian.

8. b. Go to step 9.

This Use Case story is essential to the research effort as it centers on the robot’s ability

to autonomously follow and maintain the target child within the tracking system’s optimal

zone of detection. The story starts right after calibration is done and the monitoring

process has been started.


The robot starts by scanning the environment to see if the target child is visible. Once the

target is acquired, the robot will remain stationary if the child is within the optimal detection

zone. The activity tracking system will continue operating, logging perceived actions. If a

dangerous activity is detected, the robot will search through its database of recorded

trouble profiles. Otherwise, it will continue statically monitoring until manually deactivated.

The story elaborates on the various deviations from the basic flow, e.g. if a child is no

longer visible, the robot will log the loss of tracking and immediately switch to physically

searching around the room. This behavior coincides with the navigation model’s Phase 1,

where the robot will perform a 360° scan of the room until the child’s IR Marker and Body

profile are reacquired for Subject Locking.

Another story deviation occurs when the child is no longer ‘facing’ the correct angle or is

not in the appropriate range of the optimal detection zone. This prompts the robot to enter ‘orbiting mode’, where it is expected to maneuver around obstacles to get into a proper

monitoring location. Again, this behavior mirrors the outcome of the navigation model’s

Phase 2, where the robot adjusts its position and makes decisions on how to circumvent

obstacles using the least effort.

The Use Case stories have been instrumental in determining the sequence of actions

necessary for each of the Injury Prevention Telepresence System’s functions. This

information was used to distill the following activity model.


4.5.2 Activity Model

The activity model of the system is expressed using UML flow charts. The primary

execution loop flow chart is shown in Figure 4-26.


Figure 4-26: Main execution loop of the Injury Prevention Telepresence System.



Figure 4-27: Expression of the "Monitor Activity" subsystem.

The main execution loop represents the primary software operation sequence, directly

responding to the guardian’s commands and activating the corresponding function


subsystems. These subsystems have been abstracted into individual models that can be

found in Appendix B. For furthering the design of the robot navigation system, the “Monitor

Activity” subsystem is excerpted in Figure 4-27.

The deviations in the corresponding Use Case story are graphically expressed clearly in

this subsystem activity model. The activation of this function begins with tracking the

position of the child. If the child is not visible, the robot will switch to a “Roam Strategy”

that can be satisfied using any existing room-searching algorithm.

The legacy requirement of facing the front of the child subject is present in this model.

However, technological advancement at this time has made available motion tracking hardware that can operate without needing to constantly face the front of the subject. Hence, this portion of the model triggers whenever the child is outside the optimal tracking range. This prompts the robot to activate an “Approach Mode” instead of Orbiting. Again,

this coincides with the navigation model’s Phase 2 operation of approaching the child and

circumventing obstacles along the way.

Finally, the model encompasses dealing with the event of detecting a possibly injurious

activity. This event leads to a logging and notification action that will exit this subsystem

loop and return execution focus to the main operation loop where the appropriate

subsystem will be engaged in turn. Otherwise, the execution of this subsystem loops back

to checking for the child’s visibility once more.

With the conceptual control model defined, there is enough information for integration

with an existing vision-based injury detection model. The product of this integration is a

Class diagram depicting the proposed robot’s object-oriented software structure for the

robot, shown in Figure 4-28.


4.5.3 Software Structure Model

The role of the activity tracking system was originally intended to be filled by an existing

vision-based monitoring project for disabled people. The IRESY 2.0 prototype performed

at 91.85% accuracy while mounted statically (Lau, Ong & Putra 2014). This system was

tasked with tracking the child’s activity and triggering the periodic reports to the main robot

control software, renamed to the AutoNomous Injury Mitigation Avatar (ANIMA).

The robot hardware control was modeled to be a separate entity so that ANIMA can be

developed as a generic top-level robot behavior controller that is not tailored to any

specific robotics products. This makes the system modular and workable with an

assortment of consumer and custom robot platforms. The robot hardware control entity is

named “RobotBase”. The robot platform for this case study is modified to have a rotating

sensor nest in its head, actuated using a turn table mechanism shown in the subsequent

hardware development section. The “RobotBase” implementation was planned as an

embedded microcontroller application.

The active IR Marker tracking subsystem from the Human Orientation Tracking system

was extracted and encapsulated into its own standalone subsystem, called the

“IRTrackingSystem”. No longer concerned with template matching for determining the

orientation of the wearer, this system concentrates on visibly locating the marker within

the room and instructing the RobotBase to center the robot head on the target’s position.

At the time of this software structure’s design, the IRTrackingSystem was decoupled from

IRESY 2.0, and was intended to be run using a separate compact PC solution.

The main robot control entity ANIMA is responsible for orchestrating the execution of the

robot’s primary functionalities, which include responding to activation and deactivation of

the monitoring mode, receiving data from IRESY and the IRTrackingSystem via

RobotBase, as well as providing the telepresence functions. ANIMA was also supposed to

directly instruct RobotBase for all actuations required during relocation, obstacle

avoidance and roaming.


Figure 4-28: ANIMA System Structure.


4.5.4 Development of the RobotBase Prototype Platform

The next step was to develop a physical prototype robot platform to function as a

foundation for rapid testing of the new Active IR Marker Tracking System, integration with

IRESY and cross-device communications with ANIMA on an external computer. The

survey on state-of-the-art indoor companion robots in Chapter 1 has shown that there is

an obvious similarity in design between the various case studies. Most of these robots

rely on a cylindrical multi-layer platform and dual-actuator variable drive systems to

maneuver around common rooms and furniture while minimizing impact in case of

collisions with obstacles and people.

The prototype was developed using a DFRobot HCR Mobile Robot Kit which provided a

basic set of these features. However, due to the planned use of two external computing

and sensor systems, additional levels and modifications were required. In addition, the

development of a custom actuated turn table mechanism and rotating robot head

compartment were also required. The custom parts were designed and fabricated using

laser-cut acrylic sheets. The schematics for these custom parts can be found in Appendix

C.

Figure 4-29: RobotBase development montage. (a) Drive and Controller assembly. (b) Reworked mounting and custom power distribution. (c) Completed ultrasonic sensor array. (d) 3D printed housings for the robot head. (e) Completed actuated turn-table

mechanism.


Figure 4-30: Electronics component block diagram.


Figure 4-29 shows a picture montage of the various stages undergone during the development of the RobotBase. Significant effort was spent to plan and

implement a signal network between embedded components, controller and the onboard

computer. A power distribution system was also created to cater to all 3 systems via

battery and hardline tethering during laboratory testing.

The component selection process was extended based on the system block diagram

shown in Table 4-1 and Table 4-2. The resulting electronics component layout is shown

in Figure 4-30. An Arduino Mega microcontroller development board was chosen as the

embedded robot control solution as it possesses a sufficient number of Input and Output ports to

facilitate communications between two computers, the variable drive system, the actuated

turn-table mechanism and the ultrasonic sensor array. The active IR marker tracking

system was intended to be managed by a System-On-a-Chip solution such as a

Raspberry Pi or an Intel Galileo system. Meanwhile, the ANIMA system will reside on an

onboard computer package such as a tablet PC or an Intel NUC unit. The resulting

detailed wiring diagram was created using EAGLE and can be found in Appendix D.

Figure 4-31: The completed version 1 of the RobotBase.

Figure 4-31 shows the first version of the RobotBase completed and ready to be used as

a platform for implementing the subsequent subsystems for testing. The platform was

built to accommodate 6 ultrasonic sensors, physical bump sensors, an actuated robot


head with space for a depth-based motion capture device, an IR camera, two computing

units, and a variable drive system. The onboard computer provides wireless connectivity

for manual remote control while housing the ANIMA software for autonomous operation.

4.5.5 Development of the Human Activity Tracking System

The first application of RobotBase was to serve as a physical foundation for developing

a new version of the IRESY that is optimized for operating on a mobile platform. The

former IRESY system performs activity identification by manually comparing the body

skeleton to a set of predefined poses. While this may be effective in a statically mounted

setting, the system experiences drastic performance detriment due to changing lighting

conditions, rapid subject movement and shifting of the device’s FOV.

The new activity tracking system makes use of an updated version of the sensor hardware

that comes bundled with a suite of tools including the ability to train a neural network

gesture identification system based on AdaBoostTrigger and RFRProgress (Microsoft

2017). Using this feature, an attempt was made to train the system for identifying key

injurious activities as individual gestures, including punching, pushing, jumping and falling

(Tee, Lau, Siswoyo Jo & Wong 2016).

The most notable difference between IRESY and the new system is the integration of the

multi-sensor fusion-based navigation model’s Phase 1 functionality to augment the

activity tracking system. The depth-based motion capture device identifies humanoid

shapes as Body entities but does not have a reliable way to differentiate the identities

between the humanoids nor eliminate false detections. At the same time, the Active IR

Marker Tracking (AIRMT) system can track the relative position of the marker wearer so

long as the target is within its FOV. To recap, Phase 1 performs Target Locking by using

the AIRMT data to identify which detected Body corresponds to the target. Hence, only

the target will be the subject for human activity tracking. This new human activity tracking

system combined with mobile robot platform was renamed to the Companion Avatar

Robot for the Mitigation of Injuries (CARMI).


For the pilot study, the system was trained by instructing a volunteer subject to perform

the injurious activities while being recorded within the device’s optimal zone of detection.

Each action was repeated 4 times per subject orientation (facing front (0°), left (-90°), right

(90°) and backwards (180°)). This ensures that the system’s database includes samples

of the subject performing the actions at different relative angles. A total of 20 samples for

each action were recorded as gestures. In more ideal circumstances, more sampling sets

should translate to better tracking accuracy since the neural network has more training

data.

Figure 4-32 demonstrates how the system operates during runtime. Once Subject

Locking is successful, the target performs an action which is perceived by the motion

capture device. This action is compared to each gesture stored in the database by means

of the neural network examining matches in skeleton pose configurations. The closeness

ratio between the captured action and a recorded gesture is expressed as a “Confidence”

value displayed in the yellow box (usually a value between 0 and 1.0). Similar actions like punches and pushes may exhibit overlapping confidences but can be easily differentiated

from jumps and falls.

Figure 4-32: Example of Visual Gestures sampling for identifying injurious actions. (Tee,

Lau, Siswoyo Jo & Wong 2016)
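For illustration only, the following sketch shows one plausible way such per-gesture confidence values could be interpreted at runtime; the threshold and gesture labels are assumptions, and this is not the actual Visual Gestures tooling:

```python
# Hypothetical interpretation of per-gesture Confidence values (0 to 1.0); the
# threshold and gesture names are assumptions, not the Visual Gestures tooling.

def classify_action(confidences, threshold=0.5):
    """Pick the gesture with the highest confidence, if it clears the threshold."""
    gesture, score = max(confidences.items(), key=lambda kv: kv[1])
    return gesture if score >= threshold else "Unrecognized Activity"

print(classify_action({"punch": 0.42, "push": 0.61, "jump": 0.05, "fall": 0.02}))
```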

An experiment session was conducted using the database of 4 gestures, each refined

using only 20 samples. The subject was tasked to perform each of the gestures at varying

angles and distance within the device’s optimal zone of detection (58° cone at distances


between 1.2m and 3m). The subject was also instructed to shift locations within a 110°

arc in front of CARMI to test the Subject Locking and turn-table centering operations. The results comprise 30 samples for ‘punches’ and ‘pushes’, while ‘jumps’ and

‘falls’ have 20 recorded samples each. The increased sample sizes for ‘punches’ and

‘pushes’ were for more granular examination of the detection accuracy between the two,

because of the close similarities between the two gestures.

The results are tabulated earlier in Table 4-3, confirming the possible detection accuracy

issues between ‘punches’ and ‘pushes’, each registering only 36.667% and 63.333%

correct matches. Both gestures are largely similar, differing only in punches using a single

arm instead of both. Other aspects such as gait, body posture and head orientation during

execution of both gestures may have caused the dip in accuracy for ‘punches’ which often

get registered as ‘pushes’.

Figure 4-33: Functionality exhibition during PECIPTA 2015. (Borneo Post Online 2016)

‘Jumps’ and ‘falls’ presented favorable detection accuracies, with 90% and 80% of tries respectively resulting in correctly estimated gestures. The mean accuracy of the system being able to

identify correctly matching actions is calculated at 58.425%, which is above average.

Given that the samples were collected while the subject continuously moved and changed

orientation in real time, this result is optimistic. Possible optimization can be produced

with more training data to improve and refine the gesture profiles within the database.

The system was exhibited during the PECIPTA 2015 event and garnered positive

feedback from the public (Figure 4-33). The pilot experiment results confirm the


improvements from applying the navigation model’s Phase 1 Subject Locking over the

initial performance of the human activity tracking system. The additional use of Visual

Gestures also helps in rapidly adapting the system to identify new actions when compared

to the former hard-coded profile approach.

4.5.6 Navigation System Implementation using MRDS

The successful construction of the RobotBase and the application of Phase 1 into the

CARMI system greenlighted the continued development of this multi-sensor fusion-based

navigation model into a robot control system. While this can be done by working on top

of the existing ANIMA software, a better alternative was required to overcome the

execution overhead resulting from sequential handling of the motion capture device,

cameras, ranging modules, and actuators. The solution was to switch to a decentralized

service-oriented architecture so that the individual subsystems of the robot may be

handled as separate threads concurrently.

Microsoft Robotics Developer Studio (MRDS) was selected as the development platform

because of its ready provisions for harnessing and simulating the functionalities of Kinect

- the motion capture device used during the human activity tracking system project. In

addition, building the CARMI robot system using MRDS will ease installation and usability amongst the general populace, who are predominantly familiar with using Microsoft Windows operating systems.

MRDS is described as a service-oriented robot control architecture consisting of the

Concurrency and Coordination Runtime (CCR) and Decentralized Software Services

(DSS) components. The CCR provides the syntax and a runtime framework that helps

developers create software services to handle individual robot hardware inputs and

outputs (Microsoft 2012a). These services communicate between each other using

CCR’s messaging system. Therefore, the timing issues and multitasking between

services are handled by the CCR.


On the other hand, DSS is a communications service architecture that enables web

browsers and software applications to interface with the CCR runtime. This allows custom

programs to extract real-time data and issue commands to the robot.

In addition to the CCR and DSS runtime environment, the software suite also includes a

proprietary visual programming language called VPL, and a physics-enabled Visual

Simulation Environment (VSE). VSE was instrumental as the simulations tool used for

testing the functionality and performance of this research’s navigation system.

Figure 4-34: Illustration of a standard telepresence robot services structure in MRDS.

(Microsoft 2012b)

Figure 4-34 is excerpted from the reference platform manual which serves as a guideline

for building indoor telepresence robots optimized for MRDS. The diagram shows an

example structure of the services that the robot control system comprises, including a

Dashboard software interface, interface services for communicating with the robot


microcontroller and the Kinect device, as well as one for encapsulating the high-level

behavior of obstacle avoidance.

MRDS provides a modular robot operating system environment that can accommodate

transitions of control services between simulated and actual hardware system interfaces.

However, this research intends to gauge the performance of the navigation model as

implemented in the form of a simulated robot system. MRDS is primarily used as the

implementation vessel and validation tool in the form of the Visual Simulation Environment (VSE). VSE provides a physics engine and visual editing facilities to rapidly

develop virtual environments for simulation scenarios.

The caveat of using MRDS 4 is that the software product has been discontinued since 2012. Setup and runtime issues arose due to incompatible or malfunctioning software

prerequisites such as legacy versions of DirectX, XNA and Silverlight. Nevertheless,

significant effort was invested in rendering the MRDS runtime sufficiently operational in

the current Windows 10 environment.

4.5.7 CARMI Navigation System State Machine

Following the definition of the navigation model and partial implementation during

development of the CARMI robot’s human activity tracking system, a robot system state

machine design has been established for construction using MRDS 4. Shown in Figure

4-35, the CARMI navigation system consists of two state machine loops that operate

concurrently.

The Subject Locking loop encapsulates the logic behind the existing human activity

tracking system. The robot head will be engaged in scanning mode until both the motion-capture device and the IR marker tracking system acquire the target and successfully achieve a lock onto it. The robot head will swivel itself so that the sensor nest is

always centered onto the target. This process repeats itself whenever target visibility is

lost. Successful Subject Locking will trigger the execution of the Pathfinding loop.


The Pathfinding loop switches between ‘Idle’, ‘Approach’ and ‘Pathfinding’ depending on

the distance of the target and whenever an obstacle is encountered during ‘Approach’. If

the target remains in the motion-capture device’s optimal detection zone, then it remains

‘Idle’. Else, the robot will center its body towards the sensor nest’s heading, then begin

moving towards or away from the target in ‘Approach’. If an obstacle is encountered, the

robot scans the short, mid and long-range vicinity and the ‘Path Decider’ decides on which

direction to begin maneuvering around the obstacle. This process was elaborated in the

navigation system model section.
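The Pathfinding loop's switching behavior can be summarized with the following sketch; the state names follow Figure 4-35, while the transition conditions are paraphrased assumptions rather than the exact MRDS service logic:

```python
# Compact sketch of the Pathfinding loop states in Figure 4-35; the transition
# conditions are paraphrased assumptions, not the exact MRDS service logic.

from enum import Enum

class PathState(Enum):
    IDLE = 1          # target within the optimal detection zone
    APPROACH = 2      # centre on the sensor nest heading and close the distance
    PATHFINDING = 3   # obstacle met: the Path Decider picks a turn direction

def next_state(state, target_in_optimal_zone, obstacle_ahead):
    if target_in_optimal_zone:
        return PathState.IDLE
    if state is PathState.APPROACH and obstacle_ahead:
        return PathState.PATHFINDING
    return PathState.APPROACH
```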

Figure 4-36 illustrates the transformation of the state machine design into MRDS service

architecture. The implemented robot control and navigation system consists of 15 custom

services. Most of these services are responsible for interacting with built-in Application

Programming Interfaces (API) for the proximity sensors array, depth camera, IR camera,

hardware switches, variable drive system and the turn table actuation. A select handful

of services encapsulate the Phase 1: Subject Locking and Phase 2: Pathfinding

algorithms in the form of a nested state machine structure. The ‘AIRMT’, ‘Subject Locking’

and ‘Path Decider’ services were built to include UI controls that enable manual

adjustment of the variable constants.


Figure 4-35: The CARMI navigation system state machine.


Figure 4-36: Overview of the custom-built services for CARMI to be simulated using

MRDS and VSE.


Figure 4-37: Example of the simulated CARMI and Child entities in VSE.

Finally, several services such as the ‘Simulations Engine’ and ‘Simulations Referee’ are

tasked with facilitating communications with VSE. These help stage custom-created scenarios into which the CARMI, Child and Decoy entities can be placed for testing the

performance of the navigation system. An example of a successful scenario set up in

VSE is shown in Figure 4-37.

4.6 CHAPTER SUMMARY

This chapter documented the evolution of the multi-sensor fusion-based navigation

system. Its inception stems from the initial application of the companion robot

development framework proposed in Chapter 1. The combination of COTS components

and amalgamation of existing technologies gave rise to the need for a workable robot

control system that can overcome the challenges of indoor navigation using only those

selections.

This model consists of two phases. Phase 1: Subject Locking utilizes an Active IR Marker

Tracking System (AIRMT) and a motion-capture device to lock onto a target for human


following. For Phase 2: Pathfinding, inspired by the Potential Field Method, the Wandering Standpoint Algorithm and Coulomb's Law, a method of fusing sensor data into a close, mid and long-range obstruction landscape was established. The landscape is a depth snapshot of the surroundings between the robot and a human target, which is a product of fusing data from proximity sensors, AIRMT and a depth camera. This information is used by the robot to decide on turning left or right upon meeting an obstacle, before proceeding with avoidance maneuvers.

To test the applicability of this system, it was to be integrated into a case study of a

Companion Avatar Robot for the Mitigation of Injuries (CARMI). A pilot development effort

was undertaken to create a physical robot platform so that the application of Phase 1 can

be demonstrated. This was done successfully, with both acceptable human activity

tracking results and public reception.

The model was then implemented as the CARMI navigation system using Microsoft Robotics Developer Studio 4. This framework provided the runtime and simulation tools

to help create the CARMI navigation system as a real-time service-based robot control

system. From this point onwards, scenarios can be developed for testing the performance

of the implemented multi-sensor fusion-based navigation system.


Chapter 5: TESTING AND BENCHMARKING RESULTS

5.1 INTRODUCTION

The previous chapter encapsulated the process of incepting the multi-sensor fusion-

based navigation model and its implementation using Microsoft Robotics Developer

Studio (MRDS). The model defined a set of two concurrently executing state-machine

loops that oversees the tracking of the target’s relative position and the robot’s relocation

into the Activity Tracking System’s optimal zone of detection, respectively. While the

development of a physical platform has helped validated the possibility of such a system

being built in physical space, the remainder of the model’s implementation and testing

are conducted via software simulation.

The purpose of this chapter is to document the simulation-based testing process, from

scenario design and implementation to runtime results. The objectives of this chapter are

as follows:

a) Identify the common indoor obstacle avoidance challenges.

b) Develop a set of simulation scenarios that reflect the listed challenges.

c) Create a testing plan to examine the functionality of the implemented navigation system.

d) Search and select suitable indoor robot navigation projects for use as performance benchmarks.

e) Develop scenarios equivalent to test environments used in the benchmark studies.

f) Conduct benchmark scenario simulations and compare performance between this system and the benchmark studies.

This research aims to demonstrate that the system can successfully perform autonomous

obstacle avoidance and path selection for human-following via functional testing. Next, it


seeks to demonstrate the potential of this navigation model in exceeding the human-

following performance of other studies’ systems. This is carried out by recreating the

benchmark studies’ experiment setups and running this prototype system according to

those mission parameters. Thus, the travel distances of both human and robot entities

from both benchmarks and prototype can be compared to discern performance

differences. The recreation process can be carried out without much difficulty due to

VSE’s visual based editing and real-time simulation environment.

5.2 FUNCTIONAL TESTING PLAN

To recap, an indoor robot’s navigation system must be able to guide the robot through a

room and around its contents without any collision. The contents consist of both static

and dynamic obstacles, so relying on periodic snapshots of the environment is not enough.

The system can only utilize the on-board sensor suite and contend with its limited perception of the immediate surroundings. Since the navigation system is aimed at companion robots, its goal is to maneuver through the environment so that the robot can maintain a set proximity to the target while ensuring line-of-sight.

Figure 5-1: Typical indoor environment simulated with VSE (Microsoft Corporation

2012).

Figure 5-1 shows a default living room scenario from the MRDS simulation suite called

the Visual Simulation Environment (VSE). The contents of this example can represent

most furniture and objects encountered in most rooms in a house where a companion


robot can be expected to operate in. Figure 5-2 segregates the myriad of common obstacles into three categories based on size and shape. High obstacles present maximum hindrance to both mobility and visibility – dining tables and people break line-of-sight and must be navigated around. Medium obstacles such as stools and shoe racks also need to be navigated around but do not hinder visual tracking of the target. Low obstacles can be driven across or pushed out of the way. The navigation system must accommodate all of these encounters, but it is assumed that an environment suited for companion robot navigation will have a minimal number and prudent placement of high obstacles. Thus, the challenge is less about maintaining line-of-sight and more about circumventing obstacles to escort a target.

Figure 5-2: Classification of indoor obstacles. (Tee, Lau, Siswoyo Jo & Lau 2016)

A further classification step is taken by separating the obstacles based on shape: some are uniformly shaped (tables, stools, etc.) while others are non-uniform (sofas, bookshelves, etc.). Figure 5-3 (a) shows the baseline scenario created in VSE.

This empty scenario includes only the CARMI entity, a Child target and two Decoy entities.

The baseline was previously used to test the functionality of the Activity Tracking System

and the Navigation System’s target tracking loop. It was used to fine-tune the ability to

differentiate between the Decoys and Child entities by fusing the Active IR Marker and

Kinect body detection system feeds.


Figure 5-3: A baseline and six obstacle scenarios. (Tee, Lau & Siswoyo Jo 2018a)

Building upon the baseline, scenarios (b) and (e) represent encounters with uniform and non-uniform obstacles respectively. When faced with a uniform obstacle, navigating to either side of the block presents a negligible difference in effort. But when a scattered group of obstacles lies to the left (c) or right (d) ahead of the uniform block, the navigation system should be inclined to choose the side with the fewest future impediments. When facing a non-uniform obstacle (e), there is a strong possibility that one or both edges of the obstacle are not visible to the robot. If an edge is visible, the robot should choose that direction to begin wall-following so that the obstacle can be circumvented faster. However, the presence of scattered obstacles ahead ((f) and (g)) could influence it to choose the longer edge instead.

Now that the scenarios have been created via VSE, the testing metrics need to be defined

to validate the system’s ability to help CARMI choose a path of least effort/resistance

while approaching the optimal proximity to the target Child. In this case, the best path

leads CARMI to the Child in the least amount of time or distance. There are only two

possible horizontal routes around any obstacle (left or right of the block).


Figure 5-4: Baseline Scenario Sample. Unit for X-Z world coordinates in meters (m).


To supply simulation data, the navigation system is implemented with separate logging facilities for the 'Referee' and 'PathDecider' services. The 'Referee' service logger outputs the 2D locations of both the CARMI and Child entities, while the 'PathDecider' service logs the numerical content of the Left/Right tendency arrays and the path decisions it makes at each timestamp. When put together in a scatter graph and overlaid with a rough overhead screen capture of the scenario, it is possible to chart the motion path of CARMI towards the Child and examine the decisions made by the navigation system at each point. The runtime for each sample can also be extracted from the logs for documentation purposes.
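As an illustration of how such a combined record could be assembled outside of MRDS, the short Python sketch below merges two hypothetical CSV exports of the 'Referee' and 'PathDecider' logs on their shared timestamps. The file names, column order and field names are assumptions made for illustration only and are not taken from the actual service implementation.

```python
# Illustrative sketch only: merge hypothetical CSV exports of the 'Referee' and
# 'PathDecider' logs by timestamp so that positions and verdicts can be read
# side by side. Field names and file layout are assumptions, not the MRDS code.
import csv
from collections import defaultdict
from datetime import datetime

def load_log(path, fields):
    """Read a CSV log whose first column is a timestamp such as '10:48:28 AM'."""
    rows = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.reader(f):
            stamp = datetime.strptime(row[0].strip(), "%I:%M:%S %p")
            rows[stamp].append(dict(zip(fields, row[1:])))
    return rows

def merge_logs(referee_path, decider_path):
    referee = load_log(referee_path, ["carmi_x", "carmi_z", "child_x", "child_z"])
    decider = load_log(decider_path, ["tendency_left", "tendency_right", "verdict"])
    merged = []
    for stamp in sorted(set(referee) | set(decider)):
        for position in referee.get(stamp, [{}]):
            for decision in decider.get(stamp, [{}]):
                merged.append({"timestamp": stamp.strftime("%I:%M:%S %p"),
                               **position, **decision})
    return merged

if __name__ == "__main__":
    # Hypothetical export file names; print the first few merged records.
    for entry in merge_logs("referee_log.csv", "pathdecider_log.csv")[:5]:
        print(entry)
```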

Since the output results are graphical in nature, a Plot Digitizer is used to approximate CARMI's travel distance in each sample. An alternate route is also charted and digitized to provide the opposing travel distance so that a performance comparison can be made. The navigation system is examined for its consistency in deciding on the shortest path in each scenario. An example of a runtime output can be seen in Figure 5-4: a composite graph of all samples, overlaid with an overhead view of the scenario, is shown in (a), while (b) shows a single sample log with the 'Referee' and 'PathDecider' entries staggered to coincide based on timestamps. The following section presents the functional testing simulation results for the six obstacle scenarios.

5.3 FUNCTIONAL TESTING SIMULATION RESULTS

All 7 scenarios as described in the previous section (Figure 5-3) were created and used

for an average of 5 runtimes each. Data collection for each runtime takes place in the

form of ‘Referee Log’ (containing positional data for simulated entities) and ‘PathDecider’

Log (containing tendency array contents and ‘PathDecider’ verdict), extracted from the

two similarly named services. The logs are combined and staggered according to a

universal timestamp, so that a record of tendency array contents and the resultant path

decisions can be correlated with the position of the CARMI robot and Child at each interval.

The combined results log and the graphed entity positions for all samples are compiled

in Appendix E. To help illustrate the findings, one of the sample datasets for each scenario


is presented here, along with a composite graph showing the CARMI paths from each

sample.

5.3.1 Scenario: Single Uniform Obstruction

For the first attempt at collecting functionality test data, a total of 10 samples were collected, of which 7 completed without service failures. The combined paths for these 7 trials can be found in Figure 5-5.

Figure 5-5: Combined CARMI paths during single uniform obstruction test.

While both paths around a uniform obstruction should be equally viable, the navigation system seemed to favour the left side (5 out of 7 samples). Table 5-1 shows an excerpt from the logs of the Trial 1 sample. Areas highlighted in red indicate timestamps where not all services had successfully loaded yet. The runtime countdown begins after all services are operational.


Table 5-1: Excerpt from the combined logs of Uniform-Obstruction-Clear Sample 1. Unit for X-Z world coordinates in meters (m).

The entries highlighted in yellow indicate the point where the navigation system acknowledges the presence of an impending obstruction and evaluates the contents of the Left and Right tendency arrays. Note that there was a strong preference towards the Left between timestamps 4:25:19 PM and 4:25:26 PM. This may be the result of the system considering the presence of the right-hand Decoy. That entity was placed slightly ahead of the Child and the other Decoy, and was thus registered as another obstruction ahead, prompting the system to favour the left-side path, which potentially avoids that Decoy.

Table 5-2: Elapsed time for each sample in Uniform-Obstruction-Clear.

Samples Runtime (seconds)

Trial 1 60

Trial 2 43

Trial 3 48

Trial 4 47

Trial 5 53

Trial 6 42

Trial 7 45

Average Runtime 48.286

Finally, the elapsed time for each sample is presented in Table 5-2, indicating that the average time it took for CARMI to complete an approach towards the Child while encountering a single uniform obstruction is roughly 48 s. Elapsed time in this case is largely influenced by the choice of actuators, the robot build and the VSE physics settings. It is recorded from the moment all services are activated until CARMI settles into its final position. While elapsed time may not be indicative of performance at this stage, it becomes useful for comparing benchmark results in later sections.

The results of the first scenario show that even in a simple encounter with a uniform object, the navigation system considers the proximity of nearby obstructions beyond its immediate surroundings when deciding which path direction to take. This alone indicates a marked improvement over rudimentary navigation alternatives, which would have produced random outcomes since neither path presents an obvious difference. The datasets for the remaining samples can be found in Appendix E. The next few scenarios explore this capability further.

5.3.2 Scenario: Uniform Obstruction with Scattered Obstacles on the Left

The next scenario is a modification of the uniform obstruction scenario, with the inclusion of scattered objects ahead of the primary one. The scatter field is placed towards the left side of the room, so that the navigation system must contend with additional depth objects at mid-range. The CARMI robot is expected to approach the Child and encounter the uniform obstruction as before. However, with the inclusion of the scatter field, it should consider the proximity of all perceived mid-range objects and produce a right-biased tendency profile.

This behaviour was confirmed upon completion of the five runtime sampling rounds, resulting in the combined motion path graph shown in Figure 5-6 (a). This time, all five samples showed that CARMI veered towards the right side of the main obstruction, even with the inclusion of the right-hand Decoy. Examining the excerpted log in Figure 5-6 (b), the timestamps around the yellow-highlighted portion show a strong inclination towards the right. This indicates that the fused sensor data identified the clearer side well before the point of decision-making was reached.


Figure 5-6: Combined motion graph and sample dataset for Uniform Obstruction with

Leftward Scatter. Unit for X-Z world coordinates in meters (m).


Figure 5-6 (a) also shows the results of plot-digitizing both the average sample route and the longer alternative. The favoured right-side route averaged 3.234 m, compared with the alternative route's 4.185 m needed to circumvent the scatter field before reaching the position for the optimal zone of detection. This behaviour clearly confirms the sensitivity of the navigation system first observed in the previous scenario.
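For readers unfamiliar with the procedure, the distance comparison above amounts to summing the straight-line segments between consecutive digitized waypoints. The brief Python sketch below illustrates the idea; the waypoint values are made up for illustration and are not taken from any actual sample.

```python
# Illustrative only: approximate a travel distance from plot-digitized (x, z)
# waypoints by summing straight-line segment lengths.
from math import hypot

def path_length(points):
    """points: list of (x, z) waypoints in metres, in travel order."""
    return sum(hypot(x2 - x1, z2 - z1)
               for (x1, z1), (x2, z2) in zip(points, points[1:]))

# Made-up waypoints roughly tracing a detour around a block (not real sample data).
route = [(6.0, 9.5), (5.6, 10.4), (5.4, 11.3), (5.9, 12.1), (6.4, 12.8)]
print(f"Approximate travel distance: {path_length(route):.3f} m")
```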

Table 5-3: Elapsed time for each sample in Uniform-Obstruction-with-Left-Scatter.

Samples Runtime (seconds)

Trial 1 26

Trial 2 48

Trial 3 47

Trial 4 57

Trial 5 52

Average Runtime 46

The elapsed times for the five samples are tabulated in Table 5-3, along with the average runtime of 46 s. As before, all datasets can be found in Appendix E.

5.3.3 Scenario: Uniform Obstruction with Scattered Obstacles on the Right

The third scenario was also created to examine the navigation system's response to a scatter field placed ahead of the primary object, this time positioned to the right. Since the system's behaviour in the previous scenario correctly guided the robot towards the side opposite the scatter field, this scenario helps reinforce that validity by switching the side of the field placement. Here, the system is expected to guide CARMI towards the left side of the uniform obstacle, thus ending up on the shorter path.

As with the previous attempt, a total of five successful samples were collected for this functionality examination. The resultant motion paths from all samples can be seen in the combined graph in Figure 5-7 (a). As anticipated, the CARMI entity sided towards the left in all samples. Reviewing the log excerpt in Figure 5-7 (b), it can be observed that the system developed a clear bias towards the left once the mid-range view-space registered the presence of the field as well as the rightmost Decoy. This predictably resulted in a strong bias towards the clearer left side of the obstruction. The sample datasets are in Appendix E.

Table 5-4: Elapsed time for each sample in Uniform-Obstruction-with-Right-Scatter.

Samples Runtime (seconds)

Trial 1 45

Trial 2 44

Trial 3 39

Trial 4 45

Trial 5 39

Average Runtime 42.4

The plot digitization of the combined graph also shows that the average travelled path measures 2.815 m versus the estimated alternate route of 4.226 m. Compared to the Left-Scatter scenario's obstacle placement, this setup presented a much clearer shortest path to the robot. The elapsed times for all samples can be found in Table 5-4; the average for this scenario's samples is 42.4 s. The runtime differences between the three uniform obstacle scenarios were not expected to be significant, since only the circumvention of the primary object is necessary before CARMI reaches the optimal proximity to the Child entity. However, the following three scenarios are expected to produce clearly differing runtimes, as a non-uniform object is used as the primary obstacle.


Figure 5-7: Combined motion graph and sample dataset for Uniform Obstruction with

Rightward Scatter. Unit for X-Z world coordinates in meters (m).


5.3.4 Scenario: Single Non-Uniform Obstruction

This scenario and the two that follow examine the navigation system's capability of assessing the short- and mid-range obstacle landscape when encountering a non-uniformly shaped large obstruction, such as a bench or a short counter. To begin, a bare environment with just the non-uniform obstruction is presented, along with the usual setup for the CARMI, Child and Decoy entities.

Figure 5-8: Combined CARMI paths during the single non-uniform obstruction scenario, along with plot digitization of the alternate route and their approximated travel distance

comparisons.

This time, when CARMI approached the obstacle in the attempt to close the distance

between itself and the Child, it arrived at the decision-making trigger point near one end.

This setting and the composite travel paths for all five samples are presented in Figure

5-8.


Table 5-5: A runtime sample excerpt from the combined logs of the Single Non-Uniform-Obstruction scenario. Unit for X-Z world coordinates in meters (m).


The results show an 80% tendency for the system to favour the right-side route, as portions of the right end of the obstacle were visible within the sensor's Field of View (FOV). However, one of the five samples shows that CARMI opted for the left-side route. One possible explanation for this outcome is that the robot approached the obstacle at an angle which obscured the nearer right end. Given that the contents of the FOV showed no clear way around the immediate obstruction, the navigation system evaluated the proximity of the Child and Decoy entities. Since the right-side Decoy was nearer to the robot, the navigation system decided to take the opposite direction. Table 5-5 shows an excerpt from one of the samples' adjusted logs. The highlighted timestamps indicate the moment when the navigation system approached the trigger point for decision-making. It appears that while the robot was still sufficiently far away, it saw the right end of the obstruction and the tendency-array verdicts favoured the right-side route. However, as it approached the obstruction, the verdicts swayed between left and right since no clear option was present. The presence of additional scatter fields may influence this result, as observed in the following scenarios.

Table 5-6: Elapsed time for each sample in Single Non-Uniform-Obstruction scenario.

Samples Runtime (seconds)

Trial 1 60

Trial 2 40

Trial 3 92

Trial 4 41

Trial 5 40

Average Runtime 54.6

Table 5-6 presents the elapsed times for all five samples collected throughout the single non-uniform obstruction scenario runs. Due to a momentary service delay during Trial 3, the average runtime for this experiment is 54.6 s. The average runtime is only slightly longer than in the previous uniform obstruction scenario tests, owing to the small gap between the right end of the obstruction and the robot. Figure 5-8 also includes the digitized route for the alternate path, which shows a clear difference in travel distance between the two choices (2.815 m vs 4.226 m). At a constant robot travel speed, an increase in travel distance translates to a longer time needed for CARMI to circumvent obstacles. The raw datasets for all samples can be found in Appendix E.

5.3.5 Scenario: Non-Uniform Obstruction with Scattered Obstacles on the Left

This scenario continues the examination of the navigation system's ability to circumvent a non-uniform obstruction, this time with the addition of a scatter field ahead. The scatter field is placed on the left side of the entities. This should reinforce the robot's decision to favour the closer right end of the obstruction when beginning wall-following. Figure 5-9 shows the combined motion path results from a total of six sample runtimes for this scenario. All dataset results are in line with this expectation, as CARMI took only the right-side route. The diagram also includes the plot-digitized alternate route, showing a much larger difference in travel distances due to the inclusion of the scatter field. If the robot had opted for the left-side route, it would have needed to travel 5.829 m before reaching the correct proximity while achieving a clear line of sight to the Child. Table 5-7 shows an excerpt from one of the combined logs. Timestamp records leading to the decision point (highlighted entries) show a strong tendency towards the right-side route. The presence of the left scatter field and the indeterminate left end of the obstruction resulted in a composite Left tendency that indicated more impedance than the closer proximity of the right-hand Decoy.


Figure 5-9: Combined CARMI paths during the non-uniform obstruction scenario with

left scatter field, along with plot digitization of the alternate route and their approximated travel distance comparisons.

Table 5-8 presents the elapsed time for each sample in this scenario. As in the single non-uniform obstruction scenario, the average runtime is 51 s despite the presence of the scatter field. This is because in both scenarios the robot strongly opted for the same route (right side), ignoring the lengthier left end of the obstruction and the left scatter field entirely. The next scenario attempts to induce a left-side route decision by repositioning the scatter field to the right half of the room.


Table 5-7: A runtime sample excerpt from the combined logs of the Non-Uniform-Obstruction-with-Left-Scatter-Field scenario. Unit for X-Z world coordinates in meters

(m).


Table 5-8: Elapsed time for each sample in the Non-Uniform-Obstruction-with-Left-Scatter-Field scenario.

Samples Runtime (seconds)

Trial 1 67

Trial 2 45

Trial 3 46

Trial 4 51

Trial 5 50

Trial 6 47

Average Runtime 51

5.3.6 Scenario: Non-Uniform Obstruction with Scattered Obstacles on the Right

The final functionality testing scenario involves placing the scatter field in the right half of the room, between the main non-uniform obstruction and the Child and Decoy entities. Figure 5-10 shows this arrangement, along with the combined motion paths taken by CARMI throughout four sample runtimes. In contrast to the previous scenarios, CARMI decided strongly on taking the left-side route around the main obstruction. Considering the placement of the robot at the beginning of the test, it perceives a hint of the right end being the closest edge before reaching the decision-making position.


Figure 5-10: Combined CARMI paths during the non-uniform obstruction scenario with

right scatter field, along with plot digitization of the alternate route and their approximated travel distance comparisons.


Table 5-9: A runtime sample excerpt from the combined logs of the Non-Uniform-Obstruction-with-Right-Scatter-Field scenario. Unit for X-Z world coordinates in meters

(m).

Referee Log                                               PathDecider Log
Time-Stamp    CARMI X    CARMI Z    Child X    Child Z    Time-Stamp    Tendency L    Tendency R    Verdict
10:48:28 AM   6          9.5        6          14.5
10:48:29 AM   6          9.5        6          14.5
10:48:30 AM   6          9.5        6          14.5
10:48:31 AM   5.99592    9.500831   6.000019   14.4998
10:48:32 AM   6.009068   9.629572   6.000019   14.4998

10:48:33 AM -132.71 -121.923 Right

10:48:34 AM 6.018241 9.722364 6.000019 14.49996 10:48:34 AM -134.793 -120.342 Right

10:48:35 AM 6.018951 9.722609 6.000019 14.49997 10:48:35 AM -132.379 -120.669 Right

10:48:36 AM 6.019325 9.72325 6.000019 14.49997 10:48:36 AM -131.267 -122.602 Right

10:48:37 AM 6.020663 9.723968 6.000019 14.49996 10:48:37 AM -132.982 -121.241 Right

10:48:38 AM 6.021132 9.724427 6.000019 14.4998 10:48:38 AM -132.61 -121.392 Right

10:48:39 AM 6.022233 9.724319 6.000019 14.49996 10:48:39 AM -130.948 -122.342 Right

10:48:40 AM 6.022468 9.724886 6.000019 14.4998 10:48:40 AM -130.999 -121.761 Right

10:48:41 AM 6.022222 9.767302 6.000019 14.49985 10:48:41 AM -130.463 -121.9 Right

10:48:42 AM 6.016362 10.10283 6.000019 14.49985 10:48:42 AM -130.08 -125.872 Right

10:48:43 AM -125.861 -126.63 Left

10:48:44 AM 6.00977 10.48014 6.000019 14.49996 10:48:44 AM -117.647 -125.408 Left

10:48:45 AM 6.00391 10.81541 6.000019 14.49985 10:48:45 AM -108.618 -122.762 Left

10:48:46 AM 5.987459 11.06953 6.000019 14.49996 10:48:46 AM -109.612 -123.527 Left

10:48:47 AM 5.973427 11.06703 6.000019 14.49985 10:48:47 AM -116.899 -114.422 Right

10:48:48 AM 5.96084 11.07514 6.000019 14.49985 10:48:49 AM 5.952713 11.08417 6.000019 14.49997 10:48:49 AM -125.39 -112.528 Right

10:48:50 AM 5.985577 11.10983 6.000019 14.4998 10:48:50 AM -134.342 -108.657 Right

10:48:51 AM 6.028952 11.12534 6.000019 14.49985 10:48:51 AM -140.865 -110.024 Right

10:48:52 AM 6.087961 11.13871 6.000019 14.49996 10:48:52 AM -148.442 -114.168 Right

10:48:53 AM 6.13534 11.14533 6.000019 14.4998 10:48:53 AM -146.531 -109.406 Right

10:48:54 AM 6.191909 11.13982 6.000019 14.49997 10:48:54 AM -149.177 -109.559 Right

10:48:55 AM -155.428 -109.39 Right

10:48:56 AM 6.240949 11.13314 6.000019 14.49997 10:48:56 AM -158.071 -108.683 Right

10:48:57 AM 6.282056 11.11869 6.000019 14.49997 10:48:57 AM -159.71 -107.75 Right

10:48:58 AM 6.332334 11.09471 6.000019 14.4998 10:48:58 AM -159.748 -107.205 Right

10:48:59 AM 6.392854 11.07143 6.000019 14.49985 10:48:59 AM -158.088 -107.451 Right

10:49:00 AM 6.441599 11.0538 6.000019 14.4998 10:49:00 AM -158.056 -108.785 Right

10:49:01 AM 6.488751 11.03888 6.000019 14.49997 10:49:01 AM -159.709 -108.889 Right


10:49:02 AM 6.535773 11.02227 6.000019 14.49996 10:49:02 AM -160.04 -108.38 Right

10:49:03 AM 6.588978 11.00392 6.000019 14.4998 10:49:04 AM 6.636698 10.98684 6.000019 14.4998 10:49:04 AM -159.712 -109.505 Right

10:49:05 AM 6.680408 10.96762 6.000019 14.4998 10:49:05 AM -159.402 -108.284 Right

10:49:06 AM 6.72623 10.95802 6.000019 14.49997 10:49:06 AM -159.332 -107.888 Right

10:49:07 AM 6.77289 10.94839 6.000019 14.49997 10:49:07 AM -158.034 -108.114 Right

10:49:08 AM -157.738 -108.273 Right

10:49:09 AM 6.828001 10.93324 6.000019 14.49985 10:49:09 AM -157.738 -108.324 Right

10:49:10 AM 6.892042 10.92458 6.000019 14.49996 10:49:10 AM -157.022 -107.175 Right

10:49:11 AM 6.935096 10.91855 6.000019 14.49985 10:49:11 AM -156.743 -108.717 Right

10:49:12 AM 6.980083 10.90996 6.000019 14.4998 10:49:12 AM -157.483 -108.025 Right

10:49:13 AM 7.017516 10.90872 6.000019 14.49997 10:49:13 AM -155.416 -109.184 Right

10:49:14 AM 7.063556 10.90722 6.000019 14.49997 10:49:14 AM -155.407 -108.072 Right

10:49:15 AM 7.118571 10.89903 6.000019 14.4998 10:49:15 AM -156.236 -109.088 Right

10:49:16 AM 7.174786 10.89298 6.000019 14.4998 10:49:16 AM -156.419 -109.466 Right

10:49:17 AM 7.215933 10.88938 6.000019 14.49996 10:49:17 AM -156.236 -110.046 Right

10:49:18 AM 7.264478 10.89126 6.000019 14.49985 10:49:18 AM -155.73 -111.292 Right

10:49:20 AM 7.317478 10.88759 6.000019 14.49985 10:49:20 AM -156.758 -110.175 Right

10:49:21 AM 7.370412 10.88661 6.000019 14.4998 10:49:21 AM -156.802 -109.903 Right

10:49:22 AM 7.431563 10.88682 6.000019 14.49996 10:49:22 AM -156.407 -109.974 Right

10:49:23 AM 7.481662 10.88469 6.000019 14.49985 10:49:23 AM -156.403 -111.606 Right

10:49:24 AM 7.534972 10.89207 6.000019 14.49997 10:49:24 AM -157.08 -110.23 Right

10:49:25 AM 7.587667 10.88978 6.000019 14.4998 10:49:25 AM -156.744 -112.294 Right

10:49:26 AM 7.630478 10.89282 6.000019 14.49996 10:49:26 AM -157.09 -112.263 Right

10:49:27 AM 7.681082 10.89302 6.000019 14.49985 10:49:27 AM -157.731 -110.933 Right

10:49:28 AM 7.729269 10.88848 6.000019 14.4998 10:49:28 AM -159.713 -110.795 Right

10:49:29 AM 7.784951 10.88866 6.000019 14.4998 10:49:29 AM -159.737 -111.53 Right

10:49:30 AM -158.392 -110.936 Right

10:49:31 AM 7.830725 10.88566 6.000019 14.4998 10:49:31 AM -158.401 -110.239 Right

10:49:32 AM 7.890501 10.89056 6.000019 14.49985 10:49:32 AM -158.374 -110.504 Right

10:49:33 AM 7.939007 10.88898 6.000019 14.49997 10:49:33 AM -159.051 -111.423 Right

10:49:34 AM 7.986757 10.88636 6.000019 14.49985 10:49:34 AM -159.405 -111.284 Right

10:49:35 AM 8.039586 10.88745 6.000019 14.49996 10:49:35 AM -158.385 -109.397 Right

10:49:36 AM 8.094811 10.88987 6.000019 14.49985 10:49:37 AM 8.131251 10.89508 6.000019 14.49997 10:49:37 AM -158.523 -110.836 Right

10:49:38 AM 8.176563 10.90366 6.000019 14.49997 10:49:38 AM -155.523 -111.281 Right

10:49:39 AM 8.231293 10.91576 6.000019 14.49985 10:49:39 AM -152.779 -110.483 Right

10:49:40 AM 8.289344 10.94089 6.000019 14.49997 10:49:40 AM -153.349 -112.228 Right

10:49:41 AM -150.696 -110.81 Right

10:49:42 AM 8.345519 10.96112 6.000019 14.49996 10:49:42 AM -151.937 -110.94 Right


10:49:43 AM 8.389972 10.98054 6.000019 14.49985 10:49:43 AM -148.683 -111.759 Right

10:49:44 AM 8.431435 11.00328 6.000019 14.49996 10:49:44 AM -145.437 -111.531 Right

10:49:45 AM 8.469328 11.03376 6.000019 14.49985 10:49:45 AM -143.15 -109.378 Right

10:49:46 AM 8.503297 11.06351 6.000019 14.49997 10:49:46 AM -143.65 -110.372 Right

10:49:47 AM 8.532343 11.09614 6.000019 14.49996 10:49:47 AM -139.962 -113.601 Right

10:49:48 AM 8.55698 11.13955 6.000019 14.4998 10:49:48 AM -136.877 -107.859 Right

10:49:49 AM 8.586799 11.1834 6.000019 14.49997 10:49:49 AM -136.966 -110.191 Right

10:49:50 AM 8.60596 11.22488 6.000019 14.49996 10:49:50 AM -134.525 -112.419 Right

10:49:51 AM 8.620514 11.27305 6.000019 14.49997 10:49:51 AM -129.555 -106.039 Right

10:49:52 AM 8.636838 11.32127 6.000019 14.49996 10:49:52 AM -135.688 -106.03 Right

10:49:53 AM -138.026 -106.087 Right

10:49:54 AM 8.660669 11.37096 6.000019 14.49985 10:49:55 AM 8.680889 11.4193 6.000019 14.49997 10:49:55 AM -138.332 -105.745 Right

10:49:56 AM 8.696641 11.45739 6.000019 14.49985 10:49:56 AM -138.194 -105.767 Right

10:49:57 AM 8.714813 11.50523 6.000019 14.49997 10:49:57 AM -137.605 -105.56 Right

10:49:58 AM 8.733751 11.55114 6.000019 14.49997 10:49:58 AM -138.131 -105.691 Right

10:49:59 AM 8.753076 11.59394 6.000019 14.49985 10:49:59 AM -138.673 -105.829 Right

10:50:00 AM 8.768402 11.63455 6.000019 14.49985 10:50:00 AM -137.724 -105.804 Right

10:50:01 AM 8.777872 11.67525 6.000019 14.49996 10:50:01 AM -134.969 -106.888 Right

10:50:02 AM 8.787521 11.71886 6.000019 14.49996 10:50:02 AM -134.061 -107.205 Right

10:50:03 AM 8.797791 11.76655 6.000019 14.4998 10:50:03 AM -133.896 -106.367 Right

10:50:04 AM 8.798552 11.81849 6.000019 14.49996 10:50:04 AM -133.149 -107.397 Right

10:50:05 AM -131.328 -107.105 Right

10:50:06 AM 8.801516 11.87918 6.000019 14.49996 10:50:06 AM -128.457 -108.196 Right

10:50:07 AM 8.792281 11.92729 6.000019 14.49996 10:50:07 AM -125.662 -109.687 Right

10:50:08 AM 8.782799 11.97552 6.000019 14.4998 10:50:08 AM -125.944 -108.996 Right

10:50:09 AM 8.772814 12.01702 6.000019 14.49997 10:50:09 AM -120.094 -109.701 Right

10:50:10 AM 8.753618 12.06511 6.000019 14.4998 10:50:10 AM -120.897 -109.682 Right

10:50:11 AM 8.738868 12.10582 6.000019 14.49997 10:50:12 AM 8.71206 12.1451 6.000019 14.4998 10:50:12 AM -119.367 -109.212 Right

10:50:13 AM 8.68756 12.1848 6.000019 14.4998 10:50:13 AM -118.161 -111.606 Right

10:50:14 AM 8.671518 12.20762 6.000019 14.49997 10:50:14 AM -109.393 -112.875 Left

10:50:15 AM 8.502569 12.34295 6.000019 14.49996 10:50:15 AM -111.599 -112.478 Left

10:50:16 AM 8.243203 12.55266 6.000019 14.49996 10:50:16 AM -109.938 -114.899 Left

10:50:17 AM 8.059642 12.70112 6.000019 14.49996 10:50:17 AM -108.31 -115.247 Left

10:50:18 AM 8.059722 12.70107 6.000019 14.4998 10:50:18 AM -109.198 -115.031 Left

10:50:19 AM -109.188 -115.021 Left

10:50:20 AM 8.059722 12.70107 6.000019 14.49985 10:50:20 AM -109.185 -115.018 Left

10:50:21 AM 8.059722 12.70107 6.000019 14.49996 10:50:21 AM -109.191 -115.031 Left

10:50:22 AM 8.059722 12.70107 6.000019 14.49997


Table 5-10: Elapsed time for each sample in the Non-Uniform-Obstruction-with-Right-Scatter-Field scenario.

Samples Runtime (seconds)

Trial 1 91

Trial 2 104

Trial 3 104

Trial 4 63

Average Runtime 90.5

The left side of the obstruction extends beyond the boundaries of the system's FOV, so close-range depth evaluations point strongly towards the right-side route. However, the mid-range components register the presence of the scatter field and the right-hand Decoy. This can be observed by examining the timestamps leading to the highlighted decision-point entries in the excerpted log (Table 5-9). At this point, the tendency arrays show that the combined presence of the scatter field and the nearest Decoy carries enough weight to motivate the 'PathDecider' to attempt the left-side route despite the indefinite length of the main obstruction's left end.

The plot digitization in Figure 5-10 shows that both paths present an almost negligible difference in distance (6.069 m vs 6.756 m). However, graphical examination of both paths shows that the right-side route requires more bouts of reorientation than the left-side route. Elapsed time records for all four runtimes can be found in Table 5-10, averaging 90.5 s. The sample datasets can be found in Appendix E.

5.3.7 Functional Testing Simulation Findings and Discussion

The baseline scenario was created to serve as the testbed for fine-tuning the behaviour of the navigation system in performing Subject Locking, in addition to serving as the initial template for spawning the subsequent obstacle avoidance situations. Repeated tests and adjustments made during the baseline runtimes ensured that the simulated CARMI robot, armed with the navigation system, can search for and identify the correct target before proceeding to approach the Child entity until she is within the programmed zone of optimal tracking.

The first scenario recreates a basic situation where the robot encounters a medium obstacle while performing the 'Approach' action. The target Child is still being tracked by CARMI, but a direct route to reach the appropriate proximity between them is not possible. The test presented a uniform block, examining the state machine's ability to interrupt the 'Approach' state, engage the 'PathDecider' to evaluate the environment via all sensor feeds, decide on a direction, and then begin wall-following. This test was largely successful, with 7 out of 10 runtime attempts completed without service failures. The failures were found to be due to insufficient time between runtimes to properly terminate and restart every service. In actual physical operation, this will not be an issue provided the robot's operating system is correctly set up at the beginning of use.

The next two scenarios were created to examine the navigation system's tendency array functionality, and how the close-to-mid-range sensor feeds are transformed, fused and used to decide on the least impeded direction. Since the main obstruction is uniformly shaped, the close-range perception data is negligible. The scatter fields and the incidentally closer proximity of the right-hand Decoy present mid-range depth obstacles that were factored into the tendency array calculations. Thus, the resulting decisions translated into the motion path results presented in the previous sections. All ten runtime attempts resulted in the robot choosing the side opposite the scatter field, indicating 100% confirmed functionality of the navigation system's 'Subject Locking' and 'Pathfinding' state machines.
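To make the tendency-array mechanism concrete, the following Python sketch shows one simplified way the Left/Right comparison could be expressed, assuming each sensor feed has already been reduced to a per-side repulsion value and that the side with the less negative (less impeded) composite is chosen, which is consistent with the verdicts recorded in the logs. The feed names, weights and values are illustrative assumptions, not the actual 'PathDecider' implementation.

```python
# Simplified sketch of a Left/Right tendency comparison (not the MRDS service).
# It assumes each feed (proximity array, depth camera, AIRMT) has been binned
# into per-side repulsion values, where more strongly negative means more impeded.
def fuse_tendencies(feeds, weights):
    """feeds: name -> (left_value, right_value); weights: name -> float."""
    left = sum(weights[name] * values[0] for name, values in feeds.items())
    right = sum(weights[name] * values[1] for name, values in feeds.items())
    return left, right

def decide_path(left, right):
    # The less impeded (less negative) side wins, mirroring the logged verdicts.
    return "Left" if left > right else "Right"

# Illustrative numbers only.
feeds = {
    "proximity": (-40.0, -35.0),   # close range
    "depth":     (-70.0, -95.0),   # mid-to-long range
    "airmt":     (-15.0, -10.0),   # marker-relative offsets
}
weights = {"proximity": 1.0, "depth": 1.0, "airmt": 1.0}

left, right = fuse_tendencies(feeds, weights)
print(decide_path(left, right))   # -> "Left" for these illustrative numbers
```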

The next set of scenarios was aimed at examining the tuning reliability of the navigation system's performance variables. These affect the modifiers applied to each sensor feed as well as how the tendency arrays' contents are used to make a path decision. First, the non-uniform obstruction scenario helps test the navigation system's ability to weigh the trade-off between a direction with an indefinite end and the other with a visible end but a single Decoy obstruction ahead. The results show that the system has an 80% tendency to favour the closer right end despite the presence of the single Decoy. The single sample runtime that resulted in a left-side decision could have occurred because the robot entity arrived before the obstruction at a skewed angle, leading it to attempt the indefinitely long left end.

The next two scenarios attempt to influence the navigation system's 'PathDecider' into either reinforcing its decision to take the nearer right end or attempting the indefinitely long left end. The former is achieved by adding the left scatter field: on top of the close-range sensors perceiving the nearer right end, the mid-range components register the leftward scatter field, solidifying the right-side route as the least impeded choice. The addition of a right-side scatter field is aimed at convincing the navigation system to attempt a left-side route. In every sample for all three scenarios involving a non-uniform obstacle, the navigation system performed according to expectation. This translates to 100% functionality correctness and reliability.

However, it must be noted that a peculiar navigation problem was discovered during the freeform simulation tests. Part of the navigation model's implementation includes consideration of when to abort a wall-following routine in case the maneuver starts to lead the robot farther from the Primary Target. After deciding on a direction and initiating wall-following, if the robot's heading deviates from the target by more than a threshold of 120°, the operation is terminated. The robot is reoriented to face the Child, re-evaluates the environment, and then decides on and reattempts the maneuver. This theoretically helps abort a fruitless obstruction circumvention along a side with an indefinite end.
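A minimal sketch of such an abort check is shown below, assuming the 120° threshold is measured as the deviation between the robot's current heading and the bearing to the Primary Target; the function and variable names are hypothetical and do not come from the MRDS services.

```python
# Hypothetical illustration of the heading-based wall-following abort rule.
from math import atan2, degrees

ABORT_THRESHOLD_DEG = 120.0

def bearing_to_target(robot_xz, target_xz):
    """Bearing (degrees) from the robot to the target on the X-Z plane."""
    dx, dz = target_xz[0] - robot_xz[0], target_xz[1] - robot_xz[1]
    return degrees(atan2(dx, dz))

def should_abort_wall_following(robot_heading_deg, robot_xz, target_xz):
    error = abs(robot_heading_deg - bearing_to_target(robot_xz, target_xz))
    error = min(error, 360.0 - error)          # wrap to [0, 180] degrees
    return error > ABORT_THRESHOLD_DEG

# Example: robot heading 150 degrees while the Child lies directly ahead (0 deg).
print(should_abort_wall_following(150.0, (6.0, 9.5), (6.0, 14.5)))  # True
```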

However, if the encountered obstruction has a plateau/lagoon-shaped profile, the robot may potentially get stuck in an infinite loop. To examine the existence and extent of this condition, a scenario was built to represent the Local Minima problem. The layout is shown in Figure 5-11, along with the runtime motion path of the robot. The result confirms the existence of this condition: the robot's navigation system could not exit the lagoon profile and failed to reposition itself so that the Child was within the optimal zone for tracking.


Currently, there are two possible solutions to address this condition. First, the heading-based abort threshold can be removed entirely, allowing the robot to eventually complete its wall-following routine even though the process results in a much longer journey. The second solution is to include a timer-based leeway, allowing the robot to continue the wall-following routine away from the Child, but only for a set grace period; beyond that, the system aborts the operation and reattempts the pathfinding process. Implementing either solution would require a retooling of the existing scenarios and is considered outside the planned scope of this research effort. It is assumed that a future physical implementation of the navigation system will include a requirement that the furniture within the operating environment be arranged to avoid such a plateau/lagoon profile.
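The timer-based leeway could, for example, take a form similar to the following sketch, in which a hypothetical grace period must elapse after the heading threshold is first exceeded before the maneuver is aborted. This is only an illustration of the proposed (unimplemented) mitigation, with assumed parameter values.

```python
# Hypothetical sketch of the proposed timer-based leeway for wall-following abort.
import time

GRACE_PERIOD_S = 10.0        # assumed value, not tuned in this research

class WallFollowMonitor:
    def __init__(self):
        self.threshold_exceeded_since = None

    def update(self, heading_error_deg, threshold_deg=120.0):
        """Return True when the wall-following maneuver should be aborted."""
        if heading_error_deg <= threshold_deg:
            self.threshold_exceeded_since = None    # back within limits, reset timer
            return False
        if self.threshold_exceeded_since is None:
            self.threshold_exceeded_since = time.monotonic()
        return time.monotonic() - self.threshold_exceeded_since > GRACE_PERIOD_S
```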

Figure 5-11: CARMI motion path logged during the Local Minima Problem recreation

scenario.


5.4 EXISTING INDOOR ROBOT NAVIGATION STUDIES AND BENCHMARK SCENARIOS SELECTION

The functional testing simulations have shown that a companion robot entity equipped with the multi-sensor fusion-based navigation system is able to select the shortest (least impeded) route around an immediate obstacle between itself and the target by considering a combination of close- and mid-range depth profiles of its surroundings. The system operated according to expected outcomes in 31 out of 35 simulation runtimes, translating to a cognition-dependent decision-making accuracy of 88.571%. However, this result only indicates the system's ability to function correctly when encountering single obstacles as separate events. Its pathfinding capability can only be gauged by observing its behaviour in a more elaborate environment. In addition, its actions need to be measured against suitable alternative robot navigation projects so that a quantifiable measure of performance difference can be obtained, indicative of its advantages for developing future autonomous robot companions.

To carry this out, a literature survey was conducted, identifying three studies that present alternative sensor-fusion methods and navigational algorithms contributing towards better autonomous indoor robot navigation. Each of the selected works includes a pathfinding and obstacle avoidance experiment to observe the performance of its system. This research attempts to recreate the setup of each experiment in the hope of achieving similar or better performance using the proposed navigation system instead.

5.4.1 Benchmark Study 1

The first benchmark study selected presents a multimodal navigation system enabling autonomous person-following for telepresence robots used in audio-video calls (Pang, Seet & Yao 2013). The aim was to relieve the robot operator from having to manually drive the robot during a telepresence session while also concentrating on speaking with the escorted target. Doing both adds cognitive load on the caller and can be avoided if the robot autonomously escorts the target from a leading, flanking or following position. The navigation system operates on a fully vision-based combination of three depth cameras that simultaneously record the colour (HSV) histogram of the target and use this information to perform subject tracking. Their system was able to perform the escort function while avoiding obstacles using a potential-field variant in which, much like this research's navigation system, perceived objects are treated as sources of repelling forces.

Figure 5-12: Graphical test results of the Multimodal Person-Following System. (Pang,

Seet & Yao 2013)

Figure 5-12 shows the experiment setup that was used to test the escort and obstacle avoidance capabilities of the telepresence robot. The navigation system was able to emulate an escorting human by keeping the depth cameras centred on the target while maintaining a set social distance between them. The outcome of the experiment showed that the robot does not imitate the human target's motion, because of the need to keep the target within the system's optimal zone and distance for tracking. This caused the motion path to be less conservative than necessary. The robot's movement rate and the maintained distance between itself and the target were not stable because of fluctuations in the navigation system's calculation of the person's orientation.

While the performance of this benchmark system is satisfactory for telepresence robots escorting a human target and differentiating the target from other subjects by means of colour histogram profiles, the method is highly susceptible to dynamic lighting conditions. Changing lighting and a crowded environment may jeopardize its following performance if it loses line-of-sight or erroneously follows the wrong person.

5.4.2 Benchmark Study 2

The next benchmark study attempts to tackle this challenge by presenting a robot navigation system which tracks and follows a person in a crowded environment (Harada et al. 2017). This is achieved using another sensor-fusion technique, involving a depth camera and a Laser Range Finder (LRF). The project is implemented via RT Middleware, which helps encapsulate the functionalities of the navigation system into reusable packages that can be applied to other robots supporting the same framework.

Their navigation algorithm perceives and records only the torso depth data for use in

tracking the human target. Once this information is acquired, the system actively scans

each frame for a matching pattern that corresponds to that torso profile. This method

helps the robot identify the correct target from other adjacent human entities. The LRF is

used to acquire horizontal depth snapshots of the immediate surrounds, registering

furniture, humans and miscellaneous objects as obstacles to be avoided.

Figure 5-13: Experimental pathfinding test results of the Meemo robot for tracking a

person in a gathering. (Harada et al. 2017)


Figure 5-13 shows the results of the study's implemented robot navigation system following a designated human target around several obstacles. It was noted that while the human-following action was consistent, the set proximity between the two entities fluctuated and could not be maintained, due to the robot taking too long to reorient itself towards the human target after avoiding obstructions.

This benchmark study represents a suitable comparison for examining the prototype navigation system's ability to improve pathfinding through the use of a separately rotating robot head, which tracks the human target while the body maneuvers around obstacles. It also appears that this benchmark system relies solely on the visual depth component for identifying the correct human target, a method that is likewise susceptible to environmental lighting interference and lens occlusion.

5.4.3 Benchmark Study 3

The third selected benchmark study concentrates on presenting a fuzzy logic robot controller that autonomously makes path-direction decisions while exploring ambiguous environments (Montaner & Ramirez-Serrano 1998). It relies only on a fusion of seven ultrasonic ranging sensors but perceives the obstacles in the immediate environment as repulsive forces, like the first benchmark and this research's navigation system. The fuzzy logic controller uses an inference engine to sort through the sensor array feed and decide on the best path direction to take after encountering an obstruction.

While this benchmark study does not involve any human-following component, it presents an applicable navigation model for exploration robots that roam disaster environments alongside human rescue workers. In this case, an explorer robot would mimic the travel path of a human target while its fuzzy logic controller oversees circumventing the obstacles around them. As a benchmark, its test results represent how an autonomously roaming robot would behave when following a human worker exploring an unknown environment. The robot attempts to stay close while choosing the best heading to avoid getting stuck; however, the proximity requirement and the behaviour of the controller force it to mimic the human's motion path.

Figure 5-14: Experimental pathfinding results of the fuzzy logic controller's performance

simulation. (Montaner & Ramirez-Serrano 1998)

This research's navigation system can track a target human's position rather than simply mimicking the target's motion path, but its advantage over the latter must be proven through comparative experimental data. The study's simulation scenario and robot path (shown in Figure 5-14) are used as a baseline sample to achieve this.

5.5 PERFORMANCE BENCHMARK SCENARIO DESIGN AND RESULTS

The testing environments of all three selected benchmark studies have been replicated via the Visual Simulation Environment (VSE). The distances and sizes of the obstructions used are approximated to varying degrees, depending on the data available in the studies' reported results. The three benchmark scenarios are shown in Figure 5-15. Diagram (a) corresponds to the autonomous navigation system for telepresence, which examines escort functionality with simultaneous obstacle avoidance (Pang, Seet & Yao 2013). Diagram (b) shows the replicated testing scenario for examining the robot's ability to track and follow a person through a crowded area (Harada et al. 2017). Finally, Diagram (c) recreates an exploration zone for the robot to roam, similar to the one used for testing the fuzzy-based controller (Montaner & Ramirez-Serrano 1998).

Figure 5-15: Attempted replication of benchmark testing environments for performance

measurement.

Since the exact dimensions of the entities and the distances between them cannot be replicated, a direct route-to-route length comparison is not possible without adopting a standard scale. Instead, a motion ratio of robot-to-human travel path is used, measuring the robot's conservation of movement against the escorted human's total movement. For most rudimentary follower routines, the robot's travel path will be almost identical to the human's, as is taken to be the case for Benchmark 3. Improvements can be made to reduce robot motion by idling while the human lingers within close distance, approximating an interception heading to coincide with a walking target, and so on. Hence, the benchmark tests aim to compare the degree of motion conservation between the selected studies and this research's navigation system. As before, each scenario is run to collect a total of five samples. Since each scenario's motion path is elaborate, only one sample is selected for discussion in this chapter; the remaining results are situated in Appendix E.


5.5.1 Benchmark 1 Simulation Results

First, the test results from the Multimodal Navigation System for Telepresence Robot project (Pang, Seet & Yao 2013) are extracted and plot-digitized to approximate the path lengths of both the Robot and the Human. Part of this process is shown in Figure 5-16, indicating that the study's Human target travelled an estimated 10.782 m throughout the test. The full travel distances for both Robot and Human can be found in Table 5-12.

Figure 5-16: Plot digitization of the Human motion path from the Multimodal

Telepresence Robot project. (Pang, Seet & Yao 2013)

Figure 5-17 shows the travel paths of the CARMI and Child entities for a selected sample. The Child entity was manually driven to follow the Human's route from the benchmark study. It can be observed that while the Human and Child routes are roughly similar, the motion of the two Robot entities is clearly different. This research's navigation system helped produce a more linear path while satisfying the proximity requirement for escorting the target.


Figure 5-17: A graphical path result from a selected runtime sample of Benchmark 1

performance tests.

Table 5-11 shows the plot-digitized path lengths for both the Robot and Human entities across the five sample datasets. The average travel lengths are then used to calculate the human-following performance in Table 5-12.

Due to the differences in system implementation, testing environment characteristics and experiment conditions between the benchmark studies and this simulated prototype, a direct comparison of entity travel distances will not yield suitable data for comparing human-following performance. Instead, this research proposes using the ratio of Robot travel distance per unit of Human travel distance. For instance, a direct Robot imitation of Human travel results in a baseline performance value of 1. An ideal robot aims to maintain human-following while minimizing its motion relative to the followed subject (a lower Following-Ratio being better).
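Computed from the values in Table 5-12, the Following-Ratio is simply the mean robot travel distance divided by the mean human travel distance. The brief Python sketch below reproduces the two Benchmark 1 figures from those tabulated means.

```python
# Illustrative computation of the Following-Ratio used in the benchmark
# comparisons: robot travel distance divided by human travel distance, where
# 1.0 means the robot exactly mirrors the human and lower is better.
def following_ratio(robot_distance_m, human_distance_m):
    return robot_distance_m / human_distance_m

# Values taken from the Benchmark 1 comparison (Table 5-12).
print(f"Benchmark study: {following_ratio(8.475, 10.782):.4f}")    # ~0.786
print(f"This research:   {following_ratio(17.6928, 28.5496):.4f}") # ~0.6197
```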

The benchmark study's system achieved a human-following value of 0.786, indicating that its Robot moved a total of 78.6% of the escorted Human's motion. This research's navigation system completed the challenge with a value of 0.6197, or almost 62% of the Human's motion. This shows that the multi-sensor fusion-based navigation system has a 21.52% performance advantage over the benchmark study.

Table 5-11: Plot digitized travel distances for both Robot and Child entities for all samples of Benchmark 1 performance tests. Travel distances are measured in meters

(m).

Sample       1        2        3        4        5        Mean
Robot Path   18.16    16.124   19.057   18.073   17.05    17.6928
Human Path   26.553   28.594   30.587   28.321   28.713   28.5496

Table 5-12: Tabulated calculations of Robot-To-Human travel distances for both Benchmark 1 study and simulation results.

Digitized Path     Benchmark Path    Mean Simulations Path
Robot              8.475             17.6928
Human              10.782            28.5496
Following Ratio    0.786 (78.6%)     0.6197 (61.97%)

5.5.2 Benchmark 2 Simulation Results

The second performance benchmarking test is done using the recreated setup presented

in the study for robot navigation in crowded human gatherings (Harada et al. 2017). This

scenario includes a more elaborate obstacle placement and motion path for the Human


entity. As before, the travel paths of both the Human target and the Robot were estimated using plot digitization, as partially shown in Figure 5-18. Figure 5-19 presents a graphical view of this research's runtime travel paths from one of the sample attempts. The Child entity was again manually driven to imitate the motion of the Human target from the benchmark study. Observation of these results shows that the manual mimicry was carried out slightly better than in Benchmark 1, but the general Robot travel paths of the benchmark version and CARMI do not present much difference in routing.

Figure 5-18: Plot digitization of the Robot motion path from the Human-Following in

Crowded Environment project. (Harada et al. 2017)


Figure 5-19: A graphical path result from a selected runtime sample of Benchmark 2

performance tests.

Table 5-13 lists the travel distances of both the CARMI and Child entities in all five samples collected from the benchmark scenario simulations. The mean distances are then used in Table 5-14 to determine the human-following performance difference between the benchmark study and the simulation results. The benchmark study's robot performed its escort duties at a performance value of 0.6938, or total movement equivalent to 69.4% of the Human target's. This research's navigation system helped the CARMI entity complete the same challenge at a performance value of 0.4632, moving only 46.32% of the Child entity's total motion and bringing a performance increase of 33.26% over its benchmark study counterpart.


Table 5-13: Plot digitized travel distances for both Robot and Child entities for all samples of Benchmark 2 performance tests.

Sample       1        2        3        4        5        Mean
Robot Path   16.141   15.324   13.902   14.723   16.429   15.3038
Human Path   31.803   32.141   32.453   34.337   34.467   33.0402

Table 5-14: Tabulated calculations of Robot-To-Human travel distances for both Benchmark 2 study and simulation results.

Digitized Path     Benchmark Path    Mean Simulations Path
Robot              12.936            15.3038
Human              18.645            33.0402
Following Ratio    0.6938            0.4632

5.5.3 Benchmark 3 Simulation Results

Benchmark 3 is an attempt to gauge the motion conservation performance of this research's navigation system against the benchmark study's fuzzy knowledge-based controller (Montaner & Ramirez-Serrano 1998) when applied to an exploration robot armed with rudimentary human-following capabilities. It is assumed that such a hypothetical system would perfectly imitate the Human target's heading and speed, resulting in a human-following performance value of 1.0. Thus, the aim is to show that the human-following performance metric is effective in revealing differences in direct motion conservation and pathfinding optimization.


Figure 5-20: Plot digitization of the Robot motion path from the Fuzzy Knowledge-based

Controller project. (Montaner & Ramirez-Serrano 1998)

Figure 5-20 shows the excerpted test setup results from the benchmark study, along with

plot digitization of its travel path for approximating the total distance travelled. Figure 5-21

shows one of the sample results from running the benchmark simulations using CARMI

equipped with the multi-sensor fusion-based navigation system. As with the previous

benchmarks, the Child entity was piloted manually to roughly emulate the Human travel

route in the study. It can be observed that the routes taken by the Robot and CARMI entities differ in amplitude and shape. The CARMI route appears smoother and more linear than the benchmark study version.


Figure 5-21: A graphical path result from a selected runtime sample of Benchmark 3 performance tests.


Table 5-15: Plot digitized travel distances for both Robot and Child entities for all samples of Benchmark 3 performance tests.

Sample        1        2        3        4        5        Mean
Robot Path    15.15    12.893   16.021   16.183   15.685   15.1864
Human Path    20.352   22.621   22.907   22.495   22.504   22.1758

Table 5-16: Tabulated calculations of Robot-to-Human travel distances for both the Benchmark 3 study and simulation results.

Digitized Path     Benchmark Path         Mean Simulations Path
Robot              23.268                 15.1864
Human              Full Mimic – 23.268    22.1758
Following Ratio    1.0                    0.6848

Table 5-15 lists the travel distances for both CARMI and Child entities in all samples

carried out in the simulation exercise. The mean values are used to populate Table 5-16

and calculate the human-following performance values for both the benchmark study and

this research’s navigation systems. As the benchmark system is assumed to be applied

with a perfectly imitative follower logic, its resultant motion is assumed to be the same as

the Human target, indicated by a value of 1.0 (the robot moves 100% equivalent to the

Human’s travel distance). This research’s system achieved a performance value of

0.6848, moving CARMI only 68.48% of the escorted Child's total movement. This improvement of 31.52% over the benchmark study directly corresponds to the graphical difference in routing shape: the simulation's CARMI path in Figure 5-21 is less pronounced and more linear than the Robot's path in Figure 5-20. Hence, this result shows that the navigation system reduces travel length and navigation effort by 31.52% compared with rudimentary robot tethering or imitative human-following.


5.6 CHAPTER SUMMARY

This chapter is instrumental in gauging the degree of functionality and performance

capabilities of the multi-sensor fusion-based navigation model with implementation in a

simulated environment. The simulation scenarios were designed to examine the model’s

response when encountering obstructions while performing human following. Thus, a set

of scenarios were created to examine the implementation’s ability to circumvent uniform

and non-uniform obstacles, as well as how its decision-making process is influenced by

the presence of a scattering of obstacles beyond the immediate object. The conclusion

of the functional testing scenario simulations reports that the system was capable of 88.57% accuracy in choosing path directions that result in the least impeded travel

before reaching the proximity threshold for optimal tracking. Furthermore, three external

studies had been chosen for benchmarking human-following capabilities. Scenarios were

made to recreate the testing environments used in those three projects and the simulation

routines were carried out, this time with the Child entity manually controlled to imitate

the travel route of the benchmark Human counterparts. The results reveal a pathfinding

performance advantage of 21.52%, 33.26% and 31.52% respectively. This shows that

sensor fusion of multiple person-tracking and proximity sensors at varying ranges offers a clear benefit to autonomous standalone human-following robots over relying on

compartmentalized methods for separate person tracking and obstacle avoidance.

Publications were produced from the results of both the functional testing scenarios (Tee, Lau & Siswoyo 2018a) and the performance benchmarking simulations (Tsun, Theng & Jo 2017; Tee, Lau & Siswoyo Jo 2018b).


Chapter 6: CONCLUSION

6.1 INTRODUCTION

This research has come a long way: from initial literature surveys of the state of the art in companion robots, to identifying common challenges in implementing their human-following capabilities, to developing and testing a potential navigational solution for them. This chapter discusses the accomplishment of the research questions and

objectives and addresses them according to the findings and efforts invested throughout

the project. Following this, a discussion over the limitations of the implemented system

and possible improvements as future work is presented.

6.2 CONTRIBUTIONS

Recapping the inception of this research, the aim was to find out why companion robots are not available as Commercial Off-the-Shelf (COTS) products despite the prevalence of Assistive Technologies applied to elderly care, aiding the disabled and

accompanying children with cognitive disabilities. This occurrence contrasts with

domestic access to novelty telepresence robots and various autonomous robots

marketed as intelligent household appliances. Early literature survey revealed that

although there are a multitude of research publications related to assistive robotics and

autonomous robot operation, there are rarely any documented build characteristics

shared amongst them. Without knowledge of enough common characteristics in hand, it

is difficult to implement any form of indoor companion robot platform that can cater to the

variety of applications. The contributions are summarized in the following sub-sections.

6.2.1 Identification of Navigational Challenges for Indoor Companion Robots

This research studied how existing works applied assistive robotics. The

literature review explored the projects that were applying general Assistive Technologies

to prevent injuries in cognitively disabled children. Those diagnosed with Cerebral Palsy


(CP) and Autism Spectrum Disorder (ASD) had a tendency to experience self-injury due to inadequate motor control or stereotypy, with falls being chief amongst these. The

automation hardware came in the form of smart wearables, embedded environments and

vision-based systems. Smart wearables were good for constant monitoring but risked being obtrusive to the user. Embedded environments were useful for overall environment-

wide localization but expensive and often unfeasible to implement. Vision-based

monitoring seemed to offer the best compromise but was plagued by optical hardware

weaknesses and the limited Field-of-View (FOV). These limitations could possibly be

lifted via integration into mobile robots.

Thus, the focus is on robotics as an Assistive Technology. Robotics, or simply automation,

were first commonly employed as technological augmentation to existing therapeutic

practices for the elderly, disabled and cognitively impaired. Intelligent machines that

emulate dynamic challenges and adjust the difficulty levels by learning the user’s

performance histories helped revolutionize physiotherapy sessions. Some systems were

developed experimentally to one day allow patients to participate in sessions at home

without the physical need for a physician.

While studying automated therapies, it was found that robotics gained popular use

in the field of social interaction rehabilitation for children with ASD. These children were

observed to have more affinity towards robotic constructs than with fellow human beings,

leading these machines to being used as surrogate therapists. The Triadic Interactions

model describes how a robot “puppet” can be remotely controlled by a therapist to gain

the child’s trust before gradually introducing a human playmate into the session.

Humanoid robots were also used to elicit responses in active play and gesture mimicking

exercises aimed at improving a cognitively impaired child’s social interaction skill.

Finally, the classification of companion robots arose as indoor autonomous mobile robots

that are intended for long-term accompaniment of a user. These robots had to function safely without full-time control by a human operator, be able to maneuver around typical human-populated rooms, and stay within a designated proximity of the primary


user while carrying out its intended duties. These may include provision of iterative

therapy-reinforcement games, communication, and help in performing Activities of Daily

Living (ADL) or passive monitoring in case of injuries.

To identify the challenges faced during implementation and operation of companion

robots, a robot planning template was proposed. This template championed the inception

of applications built around a multi-purpose mobile robot that consists of COTS

components, making their construction rapid, economical and domestically possible by

the average consumer. However, there are two challenges that cannot be easily

overcome by this template: the need for a robust navigation system and a means for

reliable human tracking. These two problems come together as the main navigational

challenges that conventionally require exclusively developed solutions on a project-to-project basis. Incidentally, both these challenges combine into what is referred to as

the autonomous human-following problem.

The indoor navigation problem consists of three dilemmas: location-specific, autonomous

wayfinding and hardware. The Location-specific dilemma describes the common indoor

operating environment as cluttered, dynamically shifting and populated by erratically

moving people. The Autonomous wayfinding dilemma encapsulates the difficulties in

applying theoretical pathfinding algorithms to inherently incomplete and rapidly

invalidated maps. Due to the location-specific dilemmas, it is practically impossible to rely

on techniques that work on full a priori knowledge. Utilizing reflex methods can be equally

ineffective because the randomness of shifting obstacles in real-world locations can lead

a robot into dead ends or leave it unable to maintain consistent proximity to the primary target.

Finally, the hardware dilemma dictates that real-world implementations of sensor and

actuation hardware have device-specific offsets and nuances that skew the input and

output of the robot’s wayfinding system. Some measure of real-time adjustment to both

the environment and robot hardware characteristics is required.

The human tracking challenge is described as a need for a way to consistently perceive

the primary user’s position, relative to the companion robot. When acquired effectively


and consistently, this allows the robot to engage its navigation system and drive itself so

that it can always remain at a set proximity from its target. Outdoor unmanned drones

capitalize on a satellite network, GPS, to perform near-accurate localization on Earth. Unfortunately, indoor environments lack clear access to the sky and must rely on

alternate means of tracking individual human position. This challenge is largely related to

the availability of less-obtrusive, easily acquirable motion capture devices that work

indoors, along with the inherent hardware limitations that these devices suffer from. The

solution to this problem is to find a way to mitigate said limitations, either via sensor

redundancy or an alternate means of acquiring indoor individual human position.

6.2.2 Design of a Novel Indoor Robot Navigation Model to Perform Real-time Human-following and Obstacle Avoidance

Beginning with the indoor navigation problem, navigation systems usually operate as either global or local methods. Global methods involve acquiring a complete

map of the operating environment, then utilizing a search or wayfinding algorithm to derive

a shortest path if at least one exists. There are several caveats to using this approach.

First, acquiring a full snapshot of the environment is difficult and requires possibly costly

technologies such as LIDAR. Despite numerous attempts at optimizations, the wayfinding

algorithms have always been heavily mathematical processes that require significant

computation. Finally, processing a complete snapshot yields a path that stays valid only while portions of the environment remain unchanged. Afterwards, the entire process

must be repeated. In the real world, environments shift and change rapidly, thus making

global methods hard to justify. A notable global technique is the Potential Field Method

that represents the goal position as a source of positive force surrounded by obstacles

emanating repulsive forces. A snapshot of the environment translates into a potential field

where a trail of the most positive sums of forces represents the best travel path between the

origin and the goal.

Local methods are inspired by biological creatures and how they find their way through

natural biomes while only perceiving their immediate vicinity. It is assumed that each

entity has sensory coverage over a limited sphere around itself and must decide on the next


direction to move towards based on this information. Multiple techniques range from

random selection to learning algorithms that build upon previous pathfinding history and

alternate interpretations of the Potential Field Method such as Vector Field Histogram

(VFH). This class of techniques incurs lower computation requirements and needs only basic sensors, but suffers from a limited perception scope.

This research was inspired by the alternate local method adaptation of the Potential Field

Method, where perception of the immediate vicinity is translated into repulsive and

attractive forces. The potential path to take is indicated by the direction corresponding to the highest sum of forces. This technique is sufficiently reactive to the dynamic changes in human-populated environments while being decoupled from defining any actuator- or sensor-specific variables. However, current versions of this method such as VFH are very sensitive to changing thresholds and are still affected by the short-range limitations of their

proximity sensors. Additional layers of sensory scope are needed as input into this

Potential Field adaptation.
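To make this local adaptation concrete, the following is a minimal sketch in Python (for illustration only; the actual system in this research was implemented as MRDS services): each angular sector receives an attractive contribution for pointing towards the target and a repulsive contribution from nearby obstacles, and the sector with the highest net force is chosen. The sector layout, gains and function names are assumptions rather than values taken from this research.

import math

def choose_direction(target_bearing_deg, proximity_readings,
                     attract_gain=1.0, repulse_gain=2.0):
    """Pick the sector whose net (attractive - repulsive) force is highest.

    proximity_readings maps a sector centre angle in degrees (robot frame)
    to the nearest obstacle distance in metres sensed in that sector.
    Gains and the sector layout are illustrative assumptions.
    """
    best_angle, best_score = None, -math.inf
    for angle, distance in proximity_readings.items():
        # Attraction is strongest for sectors aligned with the target bearing.
        attraction = attract_gain * math.cos(math.radians(angle - target_bearing_deg))
        # Repulsion grows as obstacles get closer (clamped to avoid division by zero).
        repulsion = repulse_gain / max(distance, 0.1)
        score = attraction - repulsion
        if score > best_score:
            best_angle, best_score = angle, score
    return best_angle

# Example: target ahead-left, close obstacle on the front-right.
readings = {-90: 2.5, -45: 1.8, 0: 3.0, 45: 0.4, 90: 2.0}
print(choose_direction(target_bearing_deg=-30, proximity_readings=readings))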

Meanwhile, the human tracking problem has been understood to be mostly caused by

hardware limitations. The advantage of modern companion robot projects today is the

commercial availability of vision-based motion capture devices. Often implemented as

monocular, stereoscopic or RGB-D depth cameras, these devices capture video frames

of the world as depth maps – 2D photographs with a distance component in each pixel.

Accompanying proprietary software makes use of the depth maps to identify human

bodies and estimate gesture profiles for a wide variety of human activity tracking. These

offerings provide a huge leap forward from traditional systems, which used to be exclusively professional equipment requiring subjects to wear bodysuits mounted with active

markers. Unfortunately, these motion capture devices suffer from limited FOV, lens

occlusions and latency that result in frequently losing sight of the primary subject, false

detections and switched body targets.

This research recognizes that a vision-based motion capture device is an integral part of

the navigational solution, but it cannot be considered as reliable on its own. A survey of

alternative human tracking systems was carried out to find possible candidate


technologies that can augment the motion capture devices. One option was to induce

redundancy by combining multiple sets of the same motion capture product. However,

this method may not present significant improvement because the same inherent

hardware limitation is shared across all redundant sets.

The survey covered three areas of human tracking: self-localization, body tracking and

biomonitoring. Overviewing localization has helped categorize the methods to discern

entity locations as being range-based and range-free. Range-based methods produce

more accurate position reports but require calibration to estimate distances based on

units of RF signal strength, pixel depth and similar metrics. Dynamically changing

environmental attributes affect the long-term validity of these calibrations so periodic

adjustments are required. Range-free systems approximate proximity between nodes rather than depending on set calibration values. Using the depth component from depth maps to approximate the heading and position of an object relative to the host is one example of a range-free localization method. This solidifies the choice of using a vision-

based motion capture device as the primary human tracking solution because localization

relative to rooms is not necessary in purely human-following applications.

Other body tracking and biomonitoring methods involve using wearable devices

embedded with Microelectromechanical systems (MEMS) that help monitor discrete

motions the wearer makes. While these miscellaneous systems do not directly aid in

tracking the spatial position of the wearer, these projects demonstrated that wearables

can be used to mount active markers. Thus, a separate system that combines vision-

based IR tracking and a wearable installed with active IR markers could offer wider FOV,

longer tracking range and better detection threshold against changing lighting conditions

when compared to RGB-D-equipped motion tracking suites.

Thus, this research proposed a solution that consists of combining a COTS RGB-D

motion capture device with a redundant active IR marker tracking system to overcome

the inherent hardware limitations of the former’s human tracking performance. Meanwhile,

the raw depth map from the motion capture device can be adapted to provide the mid to


long-range depth component for a modified Potential Field pathfinding algorithm.

Alongside the proximity sensor array data that forms the short-range component, and the

body tracking data as the goal component, a possible indoor navigation solution can be

formed.

The solution development began with applying the companion robot planning template

proposed in Chapter 1. The template helped define a structure to arrange all required

functionalities as a collection of self-contained components, inspired by the Object-

Oriented Approach. The functionality of each component had to be realized using COTS

material so that the robot system can be recreated rapidly and domestically. As a case

study, this template was used to guide the conceptual development of CARMI – a

Companion Avatar Robot for the Mitigation of Injuries. CARMI is intended for use as the

robot entity in a Robot-Based Injury Prevention Strategy that involves a participating

caregiver and the cognitively impaired child as part of the intervention method. Every

component of the robot template was able to be catered for via readily available parts,

except for the robot navigation system.

Following a literature survey into existing assistive robot companion

projects, two major challenges were identified as the common source of complications

preventing the possibility of a standard companion robot platform for general purpose

indoor use. Neither indoor autonomous navigation nor human tracking functionality was easily portable between applications because each requires tailored

solutions based on applied platforms and specific environment characteristics.

Further literature survey into possible technologies that could help formulate a solution yielded enough results. The existing works inspired the inception of a navigation

system adapted from the Potential Field Method. Instead of using a fixed goal position

and map-wide obstacles, the current human position translates into an attraction force

while the combined depth map obstacles and readings from the close-range proximity

sensors array form the repulsive force. The resultant Potential Field array indicates the

best direction to turn to, in a local wayfinding method fashion. This way, the robot will


select the next best path in a reflex-like manner, but the perception data is generated after

considering the goal position (current position of the primary target), the mid to long-range

landscape (depth map) and the immediate surroundings (proximity array readings).

Several studies were carried out to design, develop and experiment on how portions of

this solution can be realized. The first attempt was to gain first-hand experience over how

a selected motion capture suite called Microsoft Kinect could be utilized simultaneously

for tracking both human activity and position while extracting the raw depth map feed. It

was also aimed at proving that COTS solutions can be depended on to provide human

activity tracking, which was done using the proprietary Visual Gesture Builder studio. Four

injurious actions (falls, punches, kicks and jumps) were processed into gesture profiles

via a brief series of recording and neural network training sessions. During runtime, the

system demonstrated above average performance (58.425% accuracy) at identifying the

correct actions as reenacted by test subjects. Higher success rates were estimated if

more vigorous training sessions were to be carried out. However, the study also

confirmed the effects of the limited FOV, detection distance and changing environmental

lighting.

The next study attempted to develop an Active IR Marker Tracking (AIRMT) system as

the redundant system to complement the motion capture device. The system consists of

a worn active marker vest and a repurposed monocular IR camera. The vest was fitted

with IR emitting LED modules across its surface while the camera’s high-pass filter

removes all background images except for visible LED modules. The original intent for

the system was to track a specific pattern alignment so that the robot could steer itself to

an optimal position via visual servoing. This attempt failed to function optimally beyond

the confines of the motion capture device. However, it was found that raw detection of

modules could be carried out at longer distances and wider FOV than the Kinect. Thus,

the system was repurposed to utilize this ability.

The first stage in solving the navigation challenges was to address the human tracking

portion using the proposed sensor fusion technique. Called the ‘Subject Locking’ phase,


the viewpoint coordinate system of the AIRMT was transformed to align its view-space

to the Kinect’s. This allows displacement of the Active Marker’s centroid to correspond to

a displacement of the primary subject’s body. Other detected bodies that do not coincide

with that centroid are eliminated from subsequent computations. This algorithm helped

reduce the effects of false detections, decoy bodies and loss of LOS during shifting

environmental lighting. Via compartmentalization, the Subject Locking phase could be

carried out consistently by nesting the Kinect and AIRMT in a rotating “sensor nest” head.

By using the offset between the primary subject’s position and the view-space midpoint

as the error, a PID controller was used to actuate the turntable so that the robot head

always centers on the child. This allows the lower half of CARMI to move around

obstacles without having to account for human tracking during maneuvers.
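As an illustration of the 'Subject Locking' phase described above, the sketch below (Python, illustrative only; the thesis implementation consists of MRDS services) aligns the AIRMT centroid with the Kinect view-space, keeps only the tracked body nearest to the marker centroid, and feeds the horizontal offset from the view-space midpoint into a PID controller that drives the turntable. Calibration values, gains and structure names are hypothetical.

def map_airmt_to_kinect(cx, cy, scale_x, scale_y, offset_x, offset_y):
    # Affine alignment of the AIRMT view-space onto the Kinect view-space
    # (scale and offset would come from a one-off calibration).
    return cx * scale_x + offset_x, cy * scale_y + offset_y

def lock_subject(marker_centroid, tracked_bodies):
    # Keep only the Kinect body closest to the active-marker centroid,
    # discarding decoys and false detections.
    mx, my = marker_centroid
    return min(tracked_bodies,
               key=lambda b: (b["x"] - mx) ** 2 + (b["y"] - my) ** 2)

class TurntablePID:
    """PID on the horizontal offset between the subject and the view midpoint."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, subject_x, midpoint_x, dt):
        # Error is the subject's horizontal displacement from the midpoint;
        # the return value is the turntable actuation command.
        error = subject_x - midpoint_x
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative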

The second stage of the navigation algorithm is dubbed the ‘Pathfinding’ phase. Here,

the orientation of the turntable is used to indicate the relative position of the primary

subject. This data is transformed into the attractive force component in the algorithm. Next,

the depth map and data from the proximity sensors array are transformed and merged

into the repulsive force component. Once the array indexes are appropriately aligned, all force components are summed to determine whether the highest total "tendency" is situated on the 'Left' or 'Right' half of the robot.
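A minimal sketch of this fusion step follows (Python, for illustration), assuming the three inputs have already been resampled into index-aligned per-sector arrays; the array layout, names and sign conventions are assumptions.

def pathfinding_tendency(attraction, depth_repulsion, proximity_repulsion):
    """Sum index-aligned force arrays and report which half dominates.

    attraction: per-sector attractive force derived from the turntable bearing.
    depth_repulsion / proximity_repulsion: per-sector repulsive forces from
    the depth map and the proximity array (expressed as negative values).
    """
    assert len(attraction) == len(depth_repulsion) == len(proximity_repulsion)
    tendency = [a + d + p for a, d, p in
                zip(attraction, depth_repulsion, proximity_repulsion)]
    mid = len(tendency) // 2
    left_sum, right_sum = sum(tendency[:mid]), sum(tendency[mid:])
    return ("Left" if left_sum >= right_sum else "Right"), tendency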

When the robot is outside the escort distance between itself and the child (primary target),

it will engage an ‘Approach’ mode and attempt to move towards the target. However,

encountering an obstacle will prompt it to assume ‘Obstacle Avoidance’ mode, which can

be accomplished using any of the widely available wall-following or avoidance strategies.

However, the robot must choose whether to begin maneuvers from the ‘Left’ or ‘Right’

side of the obstacle. Existing projects involve examining the landscape beforehand (a

priori) or randomly deciding on a side to begin maneuvers (local method). The robot cannot

have access to complete a priori maps due to the limitations of its implementation but

relying on a random number generator could potentially lead it away from the subject or into a dead end. This algorithm supplies a "best guess tendency" towards a direction based on considering short-, mid- and long-range obstacle data and the subject's position.
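A simplified sketch of this mode selection follows (Python, for illustration); the mode names come from the description above, while the escort distance threshold and function signature are assumptions.

def select_mode(distance_to_child, obstacle_ahead, tendency_side,
                escort_distance=1.5):
    """Return the behaviour mode and, for avoidance, the side to start from.

    escort_distance is in metres; the value used here is illustrative.
    """
    if distance_to_child <= escort_distance:
        return ("Idle", None)                        # already within escort range
    if obstacle_ahead:
        return ("ObstacleAvoidance", tendency_side)  # begin on the tendency side
    return ("Approach", None)                        # clear path towards the target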


6.2.3 Evaluation of the Effectiveness of the Proposed Navigation Model in Indoor Human-following and Obstacle Avoidance

To gauge the performance of this proposed navigation algorithm, a prototype robot

system was developed using Microsoft Robotics Developer Studio (MRDS) and simulated

via the Visual Simulation Environment (VSE). The complete robot software is structured

as two state machines (one for each phase) that run concurrently. The implemented

system consists of a total of 15 developed robot services that interlock and interact with

the other stock subsystem services. The resulting robot system is technically portable

between hardware and simulated entities, but this research's scope covers only proof of concept through simulation.

The design of the simulation scenarios starts with envisioning the typical contents of a living room. It was determined that all possible obstacles can be classified as Low, Medium and High types based on whether they obstruct visibility of the target and whether they are passable by the robot. The scenarios were populated by Medium obstacles, which are impassable but do not obscure visibility. The entire simulation experiment consists of two types of scenarios:

functional tests and performance benchmarks.

Seven functional tests were created, one being a baseline scenario with only the robot,

primary subject and decoy targets. This baseline is used as the debug test to help develop

and calibrate the ‘Subject Locking’ phase of the navigation algorithm. Scenarios were

made by adding a uniform and a non-uniform entity respectively. These test the immediate decision-making performance upon encountering an obstacle. Subsequent

scenarios examine the effects of clutter in influencing the decisions. The goal of the

functional tests was to ensure that in single instances, the navigation system was able to

pick sides that will most likely lead to a less impeded, shorter path between itself and the

primary subject. Observations of the scenario runtimes report that 3 out of 10 attempts

were disqualified due to MRDS malfunctions causing service load failures. However, the

remaining runtimes resulted in 100% selection of sides that lead to shorter travel before

reaching the optimal proximity between robot and child. Graphical approximation of travel

distances was derived from the performance logs to compare distances from both choices


in each scenario. In the scenario with non-uniform obstruction and clutter, the inability to

perceive the other end of the obstruction led to a 20% chance of picking the longer travel

path. Even including cases with these uncertainties, the navigation system has shown

acceptable decision-making reliability under simulated conditions (accuracy of 88.571%

in 31 out of 35 attempts).

While carrying out the functional tests, it was deemed prudent that the extent of the local minima problem faced by other Potential Field-related systems be determined for this navigation system. A scenario was built, and the experiment was carried out. The problem was observed to also affect this system, although furniture in the real world would have to be arranged in a specific lagoon configuration before the plateau effect could take place. It is assumed that the extent of this problem is negligible following mindful arrangement of

medium and high obstacles in the event of future physical field testing.

The next category of simulations is for benchmarking the performance of the navigation

system when applied to a series of obstacle avoidance instances while following a mobile

human target. Three existing human-following and exploration research projects were

selected so their test environment can be replicated in VSE as individual benchmark

scenarios. Each scenario simulation was carried out by manually controlling the child

entity and following the same route reported in these published works. The goal was to

determine if the navigation system can consistently maintain proximity to the target and

do so with less travel distance than the benchmark studies. Because of

implementation differences, the environment scales of the benchmark studies and VSE differ. Thus, performance is measured using a ratio of Robot to Human travel

distances. A more efficient human following attempt is presented by a robot that moves

markedly less than the human subject while escorting, as opposed to a basic system that

mimics the target's motion at a 1:1 ratio.
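The Robot-to-Human following ratio can be reproduced directly from the mean travel distances; the short check below (Python, for illustration) recovers the simulation ratios reported in Tables 5-12, 5-14 and 5-16.

def following_ratio(robot_distance_m, human_distance_m):
    # Ratio of robot travel to human travel; a lower value means less motion
    # was needed to keep following the target.
    return robot_distance_m / human_distance_m

# Mean simulation distances from Tables 5-12, 5-14 and 5-16:
print(round(following_ratio(17.6928, 28.5496), 4))  # Benchmark 1 -> 0.6197
print(round(following_ratio(15.3038, 33.0402), 4))  # Benchmark 2 -> 0.4632
print(round(following_ratio(15.1864, 22.1758), 4))  # Benchmark 3 -> 0.6848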

The first benchmark scenario showed an improvement of 21.52% by the navigation

system over the existing study (62% vs 78.6%). The second scenario reported 33.26%

better human following performance by the navigation system (46.32% vs 69.4%). The


final scenario was based on an autonomous exploration method, so it was assumed as

applied to a fully mimicking robot. Against a 1:1 benchmark, the results show that this

navigation system travelled 31.52% less distance while following the subject.

The simulation results show that the implemented model has viable potential as a solution to the navigational challenges, one which could be encapsulated as a component in the proposed robot planning template. This implementation consists of COTS hardware and range-free localization fed into a Potential Field Method-inspired algorithm that provides informed decisions on which direction to begin executing any chosen obstacle avoidance approach. This research has provided a documented journey from the inception of a solution to the indoor robot navigational challenges to the examination of its performance.

6.3 LIMITATIONS AND FUTURE WORK

Both the functional tests and benchmark scenarios have shown that the implemented

navigation system had achieved its objective of providing a means to perform indoor

autonomous maneuvering and human following using COTS components. This system

itself could be encapsulated as a standalone module within the Companion Robot

Planning Template, making it adaptable to other indoor companion robot projects so long

as the necessary requirements for motion capture devices, proximity sensors, active

marker tracker and computing resources are met. However, there are issues and

drawbacks surrounding the solution implementation that could not be addressed due to

the limits of this research’s scope.

Firstly, the performance metric used in this research is a graphically approximated travel

distance. Runtime logs contain traveled waypoints which were projected in a top-down

map fashion when transformed into a 2D graph. The pathway that the waypoints form

could be traced and digitized to obtain a best-guess distance value. Performance

comparison can also be carried out by measuring elapsed operation time. Elapsed time

is garnered from the recorded sessions by measuring the timestamp difference from the


moment all services are initiated to the end of the journey. Unfortunately, elapsed time

varies depending on the platform construction, implementation of actuation, and the

choice of obstacle avoidance algorithm used. In addition, the benchmark studies did not

publish comprehensive elapsed time data for their test runs, so it could not be

incorporated as part of the measurement process.
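For illustration, a travel distance can also be approximated programmatically from the logged waypoints by summing straight-line segment lengths over the top-down projection; the sketch below (Python) stands in for the manual plot digitization used here and is not the exact procedure followed in this research.

import math

def path_length(waypoints):
    """Approximate travel distance from a list of (x, y) waypoints in metres."""
    return sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))

# Example: a short L-shaped path of 3 m + 4 m = 7 m.
print(path_length([(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]))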

Another issue related to the design of the test and benchmark scenarios is the use of

medium size obstacles. As the aim was to create a range-free non-exclusive human

following navigation model, the development process requires consistent LOS between

the robot and primary target. Hence, the obstacles used during unit, functional and

benchmark tests were of the type that obstructs movement but not visibility. Low and high

obstacles are also present in real-world environments, so the presence of these varieties

should also be factored into future tests. They will represent additional layers of difficulty

because the system must contend with the loss of LOS and determine which obstacles

could be safely driven over. Consideration of these types of obstacles will be done if the

system can be augmented with an additional layer of sensors.

The navigation system created in this research is still susceptible to the local minima

problem that is faced by almost all Potential Field Method related applications. The

situation exists because a summation of the transformed data feeds can result in a plateau effect - a combination of weighted sensor inputs evening each other out. In this system's

case, the state machine’s failsafe method of preventing the robot moving farther away

while pathfinding was the leading cause of the local minima. This can be solved with a

longer watchdog timer setting or rearrangement of furniture.

However, a better solution is to implement an adaptable weight-adjustment system for

changing the significance of sensor data sources based on trend. This can be done via

machine learning techniques such as neural networks or fuzzy logic, so that the robot

recognizes it is in a local minimum after several instances of wandering within a plateau.
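As an illustration of this idea (future work, not implemented in this research), a simple heuristic could flag a suspected local minimum whenever the robot keeps wandering within a small radius without closing the distance to the target, after which the sensor weights could be adjusted. All names and thresholds below are hypothetical.

import math
from collections import deque

class PlateauDetector:
    """Flag a potential local minimum when recent positions stay within a
    small radius while the distance to the target does not improve."""
    def __init__(self, window=50, radius_m=0.5, min_progress_m=0.2):
        self.positions = deque(maxlen=window)
        self.targets = deque(maxlen=window)
        self.radius_m = radius_m
        self.min_progress_m = min_progress_m

    def update(self, robot_xy, dist_to_target):
        self.positions.append(robot_xy)
        self.targets.append(dist_to_target)
        if len(self.positions) < self.positions.maxlen:
            return False  # not enough history yet
        xs = [p[0] for p in self.positions]
        ys = [p[1] for p in self.positions]
        spread = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
        progress = self.targets[0] - self.targets[-1]
        return spread < self.radius_m and progress < self.min_progress_m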

The navigation system was developed with human following in mind, mitigating any needed deviations from the approach vector. However, it also had to account for


situations where the primary subject moves too near the robot, placing him outside the

optimal detection zone. In these cases, CARMI was able to reverse itself. However, the

reverse maneuver only has access to physical bump switches as a means for collision

detection. In the event of a collision, the robot stops the actuation and attempts to turn

sideways. This could be further improved in terms of sensory FOV and more degrees of

freedom.

The introduction of the Active IR Marker Tracking (AIRMT) system can potentially be a point of weakness in the system. While it consists of COTS components, it must be built to

provide the type of tracking performance as indicated in this research. Perhaps a better

COTS alternative suite is available or could be adapted to provide similar functionality

with less effort. In addition, the tracking system is still limited by optical sensor hardware.

An appropriate replacement should also be capable of handling more visible markers and identifying them at longer range and over a wider FOV.

The overall purpose of this research is to contribute towards making companion robots a

consumer-accessible, mass-producible technology. It does this by identifying how one can

be planned and developed using as many standalone products and solutions as possible.

However, homogenous human following, consisting of autonomous wayfinding and

human tracking, could not be easily realized. The research aimed to explore how a

solution to this problem could be made, demonstrated by its development process from

a model to an implemented robot control system. The research could only accommodate

functional and benchmark tests using software simulation that came bundled with the

development kit. Although the results were optimistic, there is no substitute for actual

physical tests in validating real-world performance. However, aside from limited project

scope, the reason a physical test was not included was that the optimal configuration

for tuning the navigation system varies from case to case (due to different hardware and

build specifications). Unless there is some form of adaptable recalibration added into the

algorithm, a physical test could only present performance values exclusive to the machine involved.


A physical implementation is still an obvious next step to follow up on this research. Doing

so can be useful in studying the extent of obtrusiveness the robot imposes on its user.

Factors such as approach speed and direction, cosmetic design, and possible escalating

levels of interaction could influence the tendencies for its user to react towards a

companion robot with aggression or acceptance. In addition, separate studies with physical

implementations can better gauge the performance of additional navigational features

such as localization between rooms or floors as well as connectivity. This project presents an ideal springboard into robot networks and interconnected healthcare services that can

be incorporated as part of the Robot-Based Injury Prevention Strategy. Finally, future

works can also explore alternative augmentations to the motion capture system and

AIRMT, such as facial and voice recognition. Additional system components such as

these can also simultaneously provide input for interactive play or direct communications

between robot and user, as opposed to the robot acting purely as a passive observer.

6.4 RESEARCH SUMMARY

This research has investigated the reasons why companion robots have not become

widely available as Commercial Off-the-Shelf (COTS) products even though the use of

robotics as an Assistive Technology has made positive breakthroughs over the years.

The literature surveys helped narrow down the factors that led to this outcome, including

the difficulty in determining similar characteristics amongst projects studying various

forms of robotic application. The research proposed a robot planning template that helps

in organizing indoor companion robot functionalities and implementations in an Object-

Oriented Approach, emphasizing the creation of each module as a standalone subsystem

built using COTS components. However, the navigational challenges of companion

robots could not be easily fulfilled in a homogenous way. Thus, the research embarked

on an effort to survey for possible technologies that can help provide a solution to the

identified challenges: a robust indoor pathfinding method and a more reliable human

tracking method. These two challenges combine to form the human-following capability

integral in every companion robot.


A solution was formed using a fusion of a redundant motion capture device, Active IR Marker tracking, a proximity sensor array, and an adapted Potential Field Method. This

algorithm accounts for close, mid and long-range obstacle proximity in addition to the

relative position of the primary subject. The subject’s identification and tracking are

carried out via the ‘Subject Locking’ phase, while the cumulative force calculations are

transformed into a ‘tendency’ array in the ‘Pathfinding’ phase. The ‘tendency’ array

indicates which direction around an obstacle will most likely lead to a less impeded travel

path.

The proposed model was implemented using Microsoft Robotics Developer Studio

(MRDS) and tested in the included Visual Simulation Environment (VSE). The navigation

system was designed as two simultaneously interacting state machines, reflecting the two

phases of the model. A total of 15 custom robot services were developed to realize this system. Seven base scenarios were created to test and adjust the navigation system's

functionality for mitigating individually encountered obstacles. An additional scenario was

made to investigate the extent of the local minima problem distinct to projects related to

the Potential Field Method.

Three existing robot navigation studies were selected to adopt their experimentation

results as a performance benchmark. Their test environments were modeled in VSE as scenarios. Running all functional tests and benchmark scenarios generated results indicating that the navigation system is capable of significant improvements in human-following efficiency. The outcome of this experiment shows that the proposed model is a

viable solution to the indoor robot navigation challenges identified during the literature

survey.

Future work includes consideration of more obstacle varieties, reverse motion obstacle

avoidance, machine learning and curbing the local minima problem. Physical tests involving implementation of the navigation system in a robot body are a definite next step, which opens more avenues for research, ranging from further validation of the system's


functionality, development of assistive robot networks and alternate redundant systems

that can replace the AIRMT.


REFERENCES

A. Jung Moon 2014, ‘NAO Next Gen now available for a wider audience’, Robohub,

viewed 19 December, 2017, <http://robohub.org/nao-next-gen-now-available-for-

the-consumer-market/>.

Aldebaran Robotics 2014, ‘NAO Key Features’, Aldebaran Robotics, viewed 25 February,

2014, <http://www.aldebaran-robotics.com/en/Discover-NAO/Key-

Features/hardware-platform.html>.

Alsaab, A & Bicker, R 2014, ‘Improving velocity obstacle approach for obstacle avoidance

in indoor environments’, 2014 UKACC International Conference on Control,

CONTROL 2014 - Proceedings, no. July, pp. 325–330.

Aly, H & Youssef, M 2013, ‘Dejavu: An Accurate Energy-Efficient Outdoor Localization

System’, Proceedings of the 21st ACM SIGSPATIAL International Conference on

Advances in Geographic Information Systems - SIGSPATIAL’13, ACM Press, New

York, New York, USA, pp. 154–163, viewed 13 September, 2015,

<http://swinburnedb.librarynet.com.my:2300/citation.cfm?id=2525314.2525338>.

Amirabdollahian, F, Robins, B, Dautenhahn, K & Ji, Z 2011, ‘Investigating tactile event

recognition in child-robot interaction for use in autism therapy.’, Conference

proceedings : ... Annual International Conference of the IEEE Engineering in

Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society.

Conference, vol. 2011, IEEE, pp. 5347–51, viewed 28 January, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=6091323>.

Anderson, RF & Schweitzer, H 2009, ‘Fixed time template matching’, 2009 IEEE

International Conference on Systems, Man and Cybernetics, IEEE, pp. 1359–1364,

viewed 30 January, 2015,

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5346256>.

ASUSTeK Computer Inc. 2014, ‘Xtion PRO Specifications’, viewed 24 January, 2015,

<http://www.asus.com/Multimedia/Xtion_PRO/specifications/>.

Atia, MM, Liu, S, Nematallah, H, Karamat, TB & Noureldin, A 2015, ‘Integrated Indoor

Navigation System for Ground Vehicles With Automatic 3-D Alignment and Position

Initialization’, IEEE Transactions on Vehicular Technology, vol. 64, no. 4, pp. 1279–


1292, viewed 18 April, 2018, <http://ieeexplore.ieee.org/document/7027835/>.

Balaguer, C & Gimenez, A 2006, ‘The MATS robot: service climbing robot for personal

assistance’, IEEE Robotics & Automation Magazine, no. March, pp. 51–58, viewed

6 April, 2014, <http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1598053>.

Balaguer, C, Gimenez, A, Huete, AJ, Sabatini, AM, Topping, M & Bolmsjo, G 2006, ‘The

MATS robot: service climbing robot for personal assistance’, IEEE Robotics &

Automation Magazine, vol. 13, no. 1, IEEE, pp. 51–58, viewed 30 March, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=1598053>.

Bloom, R, Przekop, A & Sanger, TD 2010, ‘Prolonged electromyogram biofeedback

improves upper extremity function in children with cerebral palsy.’, Journal of child

neurology, vol. 25, no. 12, pp. 1480–4, viewed 27 January, 2014,

<http://www.ncbi.nlm.nih.gov/pubmed/20525944>.

Boccanfuso, L & O’Kane, JM 2011, ‘CHARLIE : An Adaptive Robot Design with Hand and

Face Tracking for Use in Autism Therapy’, International Journal of Social Robotics,

vol. 3, no. 4, pp. 337–347, viewed 22 January, 2014,

<http://link.springer.com/10.1007/s12369-011-0110-2>.

Borneo Post Online 2016, ‘Swinburne student’s robot wins bronze medal’, Borneo Post

Online, 3 February, Kuching, Sarawak, Malaysia, viewed

<http://www.theborneopost.com/2016/02/03/swinburne-students-robot-wins-bronze-

medal/>.

Bräunl, T 2006, Embedded Robotics: Mobile Robot Design and Applications with

Embedded Systems, 2nd ed, Springer Berlin Heidelberg, Berlin, Heidelberg, viewed

<http://link.springer.com/10.1007/3-540-34319-9>.

Budiharto, W, Purwanto, D & Jazidie, A 2011, ‘A robust obstacle avoidance for service

robot using Bayesian approach’, International Journal of Advanced Robotic Systems,

vol. 8, no. 1, pp. 37–44.

Cabibihan, J-J, Javed, H, Ang, M & Aljunied, SM 2013, ‘Why Robots? A Survey on the

Roles and Benefits of Social Robots in the Therapy of Children with Autism’,

International Journal of Social Robotics, vol. 5, no. 4, pp. 593–618, viewed 21 March,

2014, <http://link.springer.com/10.1007/s12369-013-0202-2>.

Canavese, G, Stassi, S, Fallauto, C, Corbellini, S, Cauda, V, di Donato, M, Pirola, M &


Pirri, FC 2014, ‘Stretchable and Wearable Piezoresistive Insole for Continuous

Pressure Monitoring’, Key Engineering Materials, vol. 605, pp. 474–477, viewed 14

April, 2014, <http://www.scientific.net/KEM.605.474>.

Carnegie Mellon University 2018, ‘CMUcam: Open Source Programmable Embedded

Color Vision Sensors’, viewed 20 April, 2018, <http://www.cmucam.org/>.

Cavallo, F, Aquilano, M, Bonaccorsi, M, Limosani, R, Manzi, A, Carrozza, MC & Dario, P

2013, ‘On the design, development and experimentation of the ASTRO assistive

robot integrated in smart environments’, 2013 IEEE International Conference on

Robotics and Automation, vol. 2, Ieee, pp. 4310–4315, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6631187>.

Chen, H, Sezaki, K, Deng, P & So, HC 2008, ‘An improved DV-Hop localization algorithm

for wireless sensor networks’, 2008 3rd IEEE Conference on Industrial Electronics

and Applications, pp. 1557–1561, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4582780>.

Chen, Z, Xia, F, Huang, T, Bu, F & Wang, H 2011, ‘A localization method for the Internet

of Things’, The Journal of Supercomputing, vol. 63, no. 3, Springer US, pp. 657–674,

viewed 3 July, 2015,

<http://swinburnedb.librarynet.com.my:2367/article/10.1007/s11227-011-0693-2>.

Choi, W, Pantofaru, C & Savarese, S 2011, ‘Detecting and tracking people using an RGB-

D camera via multiple detector fusion’, 2011 IEEE International Conference on

Computer Vision Workshops (ICCV Workshops), IEEE, Barcelona, Spain, pp. 1076–

1083, viewed 18 April, 2018, <http://ieeexplore.ieee.org/document/6130370/>.

Cohen, IL, Gardner, JM, Karmel, BZ & Kim, S-Y 2014, ‘Rating scale measures are

associated with Noldus EthoVision-XT video tracking of behaviors of children on the

autism spectrum’, Molecular Autism, vol. 5, no. 1, pp. 1–27.

Colton, M, Ricks, D & Goodrich, M 2009, ‘Toward therapist-in-the-loop assistive robotics

for children with autism and specific language impairment’, autism, pp. 1–5, viewed


2 February, 2014,

<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.160.125&rep=rep1&typ

e=pdf>.

Cosgun, A, Florencio, DA & Christensen, HI 2013, ‘Autonomous person following for

telepresence robots’, 2013 IEEE International Conference on Robotics and

Automation, IEEE, pp. 4335–4342, viewed 25 June, 2014,

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6631191>.

Costa, S, Santos, C, Soares, F, Ferreira, M & Moreira, F 2010, ‘Promoting interaction

amongst autistic adolescents using robots.’, Conference proceedings : ... Annual

International Conference of the IEEE Engineering in Medicine and Biology Society.

IEEE Engineering in Medicine and Biology Society. Conference, vol. 2010, pp. 3856–

9, viewed <http://www.ncbi.nlm.nih.gov/pubmed/21097267>.

Costa, S, Soares, F, Santos, C, Ferreira, MJ, Moreira, F, Pereira, AP & Cunha, F 2011,

‘An approach to promote social and communication behaviors in children with autism

spectrum disorders: Robot based intervention’, 2011 RO-MAN, IEEE, pp. 101–106,

viewed 21 February, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=6005244>.

Creative Technology Ltd. 2014, ‘Intel RealSense 3D Camera Specifications’, viewed 24

January, 2015, <http://support.creative.com/kb/ShowArticle.aspx?sid=124661>.

Feil-Seifer, D & Matarić, MJ 2011, ‘Ethical Principles for Socially Assistive Robotics’, viewed

10 February, 2014,

<http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.185.3202>.

Delft Haptics Lab 2018, ‘HapticMaster’, Moog FCS Robotics, viewed 25 February, 2014,

<http://www.delfthapticslab.nl/device/hapticmaster/>.

Demiris, Y 2009, ‘Knowing when to assist: Developmental issues in lifelong assistive

robotics’, Engineering in Medicine and Biology Society, 2009 …, pp. 3357–3360,

viewed 27 January, 2014,

<http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5333182>.

Dickstein-Fischer, L, Alexander, E, Yan, X, Su, H, Harrington, K & Fischer, GS 2011, ‘An

affordable compact humanoid robot for Autism Spectrum Disorder interventions in

children.’, Conference proceedings : ... Annual International Conference of the IEEE


Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and

Biology Society. Conference, vol. 2011, pp. 5319–22, viewed

<http://www.ncbi.nlm.nih.gov/pubmed/22255539>.

Doherty, ST, Lemieux, CJ & Canally, C 2014, ‘Tracking human activity and well-being in

natural environments using wearable sensors and experience sampling’, Social

science & medicine, vol. 106, pp. 83–92, viewed 12 June, 2014,

<http://www.sciencedirect.com/science/article/pii/S0277953614000756>.

Double Robotics 2014, ‘Double Telepresence Robot’, Apple Inc., viewed 7 April, 2014,

<http://store.apple.com/us/product/HE494LL/A/double-telepresence-robot>.

Ferrari, E, Robins, B & Dautenhahn, K 2009, ‘Therapeutic and educational objectives in

robot assisted play for children with autism’, RO-MAN 2009 - The 18th IEEE

International Symposium on Robot and Human Interactive Communication, Ieee, pp.

108–114, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5326251>.

Fluet, GG, Saleh, S, Ramirez, D, Adamovich, S, Kelly, D & Parikh, H 2009, ‘Robot-

assisted virtual rehabilitation (NJIT-RAVR) system for children with upper extremity

hemiplegia’, 2009 Virtual Rehabilitation International Conference, Ieee, pp. 189–192,

viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5174230>.

Garmin Ltd. 2015, ‘Garmin Vivoactive Smartwatch’, viewed 13 September, 2015,

<http://sites.garmin.com/en-US/vivo/vivoactive/>.

Gaspar, T & Oliveira, P 2015, ‘New Depth From Focus Filters in

Active Monocular Vision Systems for Indoor 3-D Tracking’, IEEE Transactions on

Control Systems Technology, vol. 23, no. 5, pp. 1827–1839.

Goulart, C, Castillo, J, Valadão, C, Bastos, T & Caldeira, E 2014, ‘EEG analysis and

mobile robot as tools for emotion characterization in autism’, BMC Proceedings, vol.

8, no. Suppl 4, p. P85, viewed 20 December, 2014,

<http://www.biomedcentral.com/1753-6561/8/S4/P85>.

Guo, X, Zhang, D, Wu, K & Ni, LM 2014, ‘MODLoc: Localizing Multiple Objects in Dynamic

Indoor Environment’, IEEE Transactions on Parallel and Distributed Systems, vol. 25,

no. 11, IEEE, pp. 2969–2980, viewed 30 June, 2015,


<http://swinburnedb.librarynet.com.my:2179/articleDetails.jsp?arnumber=6662344>.

Harada, AC, Rolim, R, Fujimoto, K, Suzuki, K, Matsuhira, N & Yamaguchi, T 2017,

‘Development of basic functions for a following robot in a human gathering

environment’, SII 2016 - 2016 IEEE/SICE International Symposium on System

Integration, pp. 717–722.

Hu, C, Ma, X, Dai, X & Qian, K 2010, ‘Reliable people tracking approach for mobile robot

in indoor environments’, Robotics and Computer-Integrated Manufacturing, vol. 26,

no. 2, Elsevier, pp. 174–179, viewed <http://dx.doi.org/10.1016/j.rcim.2009.07.004>.

Jones, M, Trapp, T & Jones, N 2011, ‘Engaging Children with Severe Physical Disabilities

via Teleoperated Control of a Robot Piano Player’, … Design and Children ( …,

viewed 21 February, 2014,

<http://vip.gatech.edu/wiki/images/6/65/Robotpianoplayer.pdf>.

Khatib, O 1986, ‘Real-Time Obstacle Avoidance for Manipulators and Mobile Robots’,

The International Journal of Robotics Research, vol. 5, no. 1, pp. 90–98.

Khosla, R, Chu, M-T & Nguyen, K 2013, ‘Affective Robot Enabled Capacity and Quality

Improvement of Nursing Home Aged Care Services in Australia’, 2013 IEEE 37th

Annual Computer Software and Applications Conference Workshops, Ieee, pp. 409–

414, viewed 8 April, 2014,

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=6605825>.

Kidd, CD, Taggart, W & Turkle, S 2006, ‘A sociable robot to encourage social interaction

among the elderly’, Proceedings 2006 IEEE International Conference on Robotics

and Automation, 2006. ICRA 2006., IEEE, pp. 3972–3976, viewed 30 March, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=1642311>.

Kim, H & Lee, J 2013, ‘Stereo AoA system for indoor SLAM’, 13th International

Conference on Control, Automation and Systems (ICCAS), IEEE, Gwangju, pp.

1164–1169.

Lapierre, L & Zapata, R 2012, ‘A guaranteed obstacle avoidance guidance system the

safe maneuvering zone’, Autonomous Robots, vol. 32, no. 3, pp. 177–187.

Page 228: Modelling a real-time multi-sensor fusion-based navigation ...

210

Lau, BT, Ong, CA & Putra, FA 2014, ‘Non-Invasive Monitoring of People with Disabilities

via Motion Detection’, International Journal of Signal Processing Systems, vol. 2, no.

1, pp. 37–41.

Liu, C, Conn, K, Sarkar, N & Stone, W 2008, ‘Online Affect Detection and Robot Behavior

Adaptation for Intervention of Children With Autism’, IEEE Transactions on Robotics,

vol. 24, no. 4, pp. 883–896.

Majumder, AJA, Zerin, I, Ahamed, SI & Smith, RO 2014, ‘A multi-sensor approach for fall

risk prediction and prevention in elderly’, ACM SIGAPP Applied Computing Review,

vol. 14, no. 1, ACM, pp. 41–52, viewed 14 April, 2014,

<http://dl.acm.org/citation.cfm?id=2600617.2600621>.

Mankoff, KD & Russo, TA 2013, ‘The Kinect: a low-cost, high-resolution, short-range 3D

camera’, Earth Surface Processes and Landforms, vol. 38, no. 9, pp. 926–936,

viewed 10 April, 2014, <http://doi.wiley.com/10.1002/esp.3332>.

Marti, P, Pollini, A & Rullo, A 2009, ‘Creative interactive play for disabled children’, …

Design and Children, pp. 3–6, viewed 27 January, 2014,

<http://dl.acm.org/citation.cfm?id=1551871>.

McCann, E, Medvedev, M, Brooks, DJ & Saenko, K 2013, ‘“Off the grid”: Self-contained

landmarks for improved indoor probabilistic localization’, 2013 IEEE Conference on

Technologies for Practical Robot Applications (TePRA), IEEE, pp. 1–6, viewed 9

September, 2015,

<http://swinburnedb.librarynet.com.my:2179/articleDetails.jsp?arnumber=6556349>.

McMurrough, C, Ferdous, S, Papangelis, A, Boisselle, A & Heracleia, FM 2012, ‘A survey

of assistive devices for cerebral palsy patients’, Proceedings of the 5th International

Conference on PErvasive Technologies Related to Assistive Environments -

PETRA ’12, ACM Press, New York, New York, USA, p. 1, viewed

<http://dl.acm.org/citation.cfm?doid=2413097.2413119>.

Michaud, FÝ, Salter, TÜ, Duquette, AÞ & Mercier, HÞ 2006, ‘Assistive Technologies and

Children-Robot Interaction’, American Association for Artificial Intelligence, pp. 0–2.

Microsoft 2014, ‘Kinect for Windows’, viewed 31 July, 2014,

<http://www.microsoft.com/en-us/kinectforwindows/meetkinect/default.aspx>.

Microsoft 2012a, ‘Microsoft Robotics Developer Studio 4 Overview’, Microsoft

Page 229: Modelling a real-time multi-sensor fusion-based navigation ...

211

Corporation, viewed 1 December, 2017, <https://msdn.microsoft.com/en-

us/library/bb483024.aspx>.

Microsoft 2012b, Robotics Developer Studio Reference Platform Design, Microsoft

Corporation.

Microsoft 2017, ‘Visual Gesture Builder: Overview’, Microsoft Developer Network, viewed

29 September, 2016, <https://msdn.microsoft.com/en-us/library/dn785529.aspx>.

Microsoft Corporation 2013, ‘Kinect for Windows - Human Interface Guidelines v1.8’,

Microsoft Corporation, pp. 1–142, viewed <https://msdn.microsoft.com/en-

us/library/jj663791.aspx>.

Microsoft Corporation 2012, ‘Welcome to Robotics Developer Studio’, viewed 25 January,

2015, <https://msdn.microsoft.com/en-us/library/bb648760.aspx>.

Mohammad Khansari-Zadeh, S & Billard, A 2012, ‘A dynamical system approach to

realtime obstacle avoidance’, Autonomous Robots, vol. 32, no. 4, pp. 433–454.

Montaner, MB & Ramirez-Serrano, A 1998, ‘Fuzzy knowledge-based controller design for

autonomous robot navigation’, Expert Systems with Applications, vol. 14, no. 1–2,

pp. 179–186, viewed

<http://www.sciencedirect.com/science/article/pii/S0957417497000596>.

Montesano, L, Díaz, M, Bhaskar, S & Minguez, J 2010, ‘Towards an intelligent wheelchair

system for users with cerebral palsy.’, IEEE transactions on neural systems and

rehabilitation engineering : a publication of the IEEE Engineering in Medicine and

Biology Society, vol. 18, no. 2, pp. 193–202, viewed

<http://www.ncbi.nlm.nih.gov/pubmed/20071276>.

Morato, C, Kaipa, KN, Zhao, B & Gupta, SK 2014, ‘Toward Safe Human Robot

Collaboration by Using Multiple Kinects Based Real-time Human Tracking’, Journal

of Computing and Information Science in Engineering, vol. 14, no. 1, American

Society of Mechanical Engineers, p. 11006, viewed 2 July, 2014,

<http://computingengineering.asmedigitalcollection.asme.org/article.aspx?articleid=

1763548>.

Mubashir, M, Shao, L & Seed, L 2013, ‘A survey on fall detection: Principles and

approaches’, Neurocomputing, vol. 100, Elsevier, pp. 144–152, viewed 25 March,

2014, <http://linkinghub.elsevier.com/retrieve/pii/S0925231212003153>.

Page 230: Modelling a real-time multi-sensor fusion-based navigation ...

212

Neishaboori, A & Harras, K 2013, ‘Energy saving strategies in WiFi indoor localization’,

Proceedings of the 16th ACM international conference on Modeling, analysis &

simulation of wireless and mobile systems - MSWiM ’13, ACM Press, New York, New

York, USA, pp. 399–404, viewed 9 September, 2015,

<http://swinburnedb.librarynet.com.my:2300/citation.cfm?id=2507924.2507997>.

Northern Digital Inc. 2015, ‘NDI Optotrak Certus’, viewed 13 September, 2015,

<http://www.ndigital.com/msci/products/optotrak-certus/>.

Oguejiofor, OS, Okorogu, VN, Adewale, A & Osuesu, BO 2013, ‘Outdoor Localization

System Using RSSI Measurement of Wireless Sensor Network’, International

Journal of Innovative Technology and Exploring Engineering, vol. 2, no. 2, pp. 1–6.

Ong, CA, Lau, BT & Bagha, H 2013, ‘Real Time Injury and Related Activities Monitoring

with Single Rotatable Infrared Sensor’, International Journal of New Computer

Architectures and their Applications (IJNCAA), vol. 3, no. 1, The Society of Digital

Information and Wireless Communication, pp. 11–21, viewed 25 January, 2015,

<http://sdiwc.net/digital-library/real-time-injury-and-related-activities-monitoring-

with-single-rotatable-infrared-sensor>.

Pang, WC, Seet, G & Yao, X 2013, ‘A multimodal person-following system for

telepresence applications’, Proceedings of the 19th ACM Symposium on Virtual

Reality Software and Technology - VRST ’13, ACM Press, New York, New York,

USA, p. 157, viewed 25 June, 2014,

<http://dl.acm.org/citation.cfm?id=2503713.2503722>.

Peters, C, Hermann, T, Wachsmuth, S & Hoey, J 2014, ‘Automatic Task Assistance for

People with Cognitive Disabilities in Brushing Teeth - A User Study with the TEBRA

System’, ACM Transactions on Accessible Computing, vol. 5, no. 4, ACM, pp. 1–34,

viewed 9 September, 2015,

<http://swinburnedb.librarynet.com.my:2300/citation.cfm?id=2599989.2579700>.

Pham, A 2009, ‘E3: Microsoft shows off gesture control technology for Xbox 360’, Los

Angeles Times, viewed 25 January, 2015,

<http://latimesblogs.latimes.com/technology/2009/06/microsofte3.html>.

Polar Electro 2015, ‘Polar M400’, viewed 13 September, 2015,

<http://www.polar.com/en/products/improve_fitness/running_multisport/M400>.

Page 231: Modelling a real-time multi-sensor fusion-based navigation ...

213

Quattrone, A, Kulik, L & Tanin, E 2015, ‘Combining range-based and range-free methods’,

Proceedings of the 23rd SIGSPATIAL International Conference on Advances in

Geographic Information Systems - GIS ’15, ACM Press, New York, New York, USA,

pp. 1–4, viewed 1 May, 2018,

<http://dl.acm.org/citation.cfm?doid=2820783.2820839>.

Rammer, JR, Krzak, JJ, Riedel, SA & Harris, GF 2014, ‘Evaluation of upper extremity

movement characteristics during standardized pediatric functional assessment with

a Kinect®-based markerless motion analysis system.’, Conference proceedings : ...

Annual International Conference of the IEEE Engineering in Medicine and Biology

Society. IEEE Engineering in Medicine and Biology Society. Annual Conference, vol.

2014, IEEE, pp. 2525–8, viewed 9 September, 2015,

<http://swinburnedb.librarynet.com.my:2179/articleDetails.jsp?arnumber=6944136>.

Regh, JM, Rozga, A, Abowd, GD & Goodwin, MS 2014, ‘Behavioral Imaging and Autism’,

Pervasive Computing, IEEE, no. Apr.-June., pp. 84–87, viewed 9 September, 2015,

<http://swinburnedb.librarynet.com.my:2179/xpls/icp.jsp?arnumber=6818509&tag=

1>.

Reha Technology AG 2012, ‘Geo Gait System’, Ectron, viewed 25 February, 2014,

<http://www.ectron.co.uk/neuro-rehabilitation-geo-gait-system>.

Ricks, DJ & Colton, MB 2010, ‘Trends and considerations in robot-assisted autism

therapy’, 2010 IEEE International Conference on Robotics and Automation, Ieee, pp.

4354–4359, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5509327>.

Riek, LD, Rabinowitch, T-C, Chakrabarti, B & Robinson, P 2009, ‘Empathizing with

robots: Fellow feeling along the anthropomorphic spectrum’, 2009 3rd International

Conference on Affective Computing and Intelligent Interaction and Workshops, Ieee,

pp. 1–6, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5349423>.

Robins, B, Dautenhahn, K & Dickerson, P 2009, ‘From Isolation to Communication: A

Case Study Evaluation of Robot Assisted Play for Children with Autism with a

Minimally Expressive Humanoid Robot’, 2009 Second International Conferences on

Advances in Computer-Human Interactions, Ieee, pp. 205–211, viewed 27 January,

Page 232: Modelling a real-time multi-sensor fusion-based navigation ...

214

2014, <http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4782516>.

Seo, W, Hwang, S, Park, J & Lee, J-M 2012, ‘Precise outdoor localization with a GPS–

INS integration system’, Robotica, vol. 31, pp. 1–9.

Sgorbissa, A & Zaccaria, R 2012, ‘Planning and obstacle avoidance in mobile robotics’,

Robotics and Autonomous Systems, vol. 60, no. 4, Elsevier B.V., pp. 628–638,

viewed <http://dx.doi.org/10.1016/j.robot.2011.12.009>.

Shamsuddin, S, Yussof, H, Ismail, LI, Mohamed, S, Hanapiah, FA & Zahari, NI 2012,

‘Humanoid Robot NAO Interacting with Autistic Children of Moderately Impaired

Intelligence to Augment Communication Skills’, Procedia Engineering, vol. 41, pp.

1533–1538, viewed 21 February, 2014,

<http://www.sciencedirect.com/science/article/pii/S1877705812027464>.

Shamsuddin, S, Yussof, H, Mohamed, S, Hanapiah, FA & Ismail, LI 2013, ‘Stereotyped

behavior of autistic children with lower IQ level in HRI with a humanoid robot’, 2013

IEEE Workshop on Advanced Robotics and its Social Impacts, IEEE, pp. 175–180,

viewed 21 February, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=6705525>.

Shen, S, Michael, N & Kumar, V 2013, ‘Obtaining Liftoff Indoors: Autonomous Navigation

in Confined Indoor Environments’, IEEE Robotics & Automation Magazine, vol. 20,

no. 4, pp. 40–48, viewed 18 April, 2018,

<http://ieeexplore.ieee.org/document/6678604/>.

Singleton, G, Warren, S & Piersel, W 2014, ‘Clinical overview of the need for technologies

for around-the-clock monitoring of the health status of severely disabled autistic

children’, 2014 36th Annual International Conference of the IEEE Engineering in

Medicine and Biology Society, IEEE, pp. 789–791, viewed 20 December, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=6943709>.

Sony Computer Entertainment Inc 2013, ‘PlayStation 4 Eye Press Release’, viewed 25

January, 2015, <http://www.scei.co.jp/corporate/release/130221b_e.html>.

Spasova, V & Iliev, I 2014, ‘A Survey on Automatic Fall Detection in the Context of

Ambient Assisted Living Systems 2 . Ambient Assisted Living – an Personalized

Healthcare’, International Journal of Advanced Computer Research, vol. 4, no. 14.

Stephanidis, C & Antona, M (eds) 2013, Universal Access in Human-Computer

Page 233: Modelling a real-time multi-sensor fusion-based navigation ...

215

Interaction. Design Methods, Tools, and Interaction Techniques for eInclusion,

Springer Berlin Heidelberg, Berlin, Heidelberg, viewed 21 February, 2014,

<http://link.springer.com/10.1007/978-3-642-39188-0>.

Su, C 2013, ‘Personal rehabilitation exercise assistant with kinect and dynamic time

warping’, International Journal of Information and Education Technology, vol. 3, no.

4, p. 448, viewed 9 September, 2015,

<http://swinburnedb.librarynet.com.my:2155/docview/1441228964/fulltextPDF?acco

untid=14205>.

Tapus, A, Peca, A, Aly, A, Pop, C, Jisa, L, Pintea, S, Rusu, AS & David, DO 2012,

‘Children with autism social engagement in interaction with Nao, an imitative robot:

A series of single case experiments’, Interaction Studies, vol. 13, no. 3, John

Benjamins Publishing Company, pp. 315–347, viewed 21 February, 2014,

<http://www.ingentaconnect.com/content/jbp/is/2012/00000013/00000003/art00001

>.

Taraporewalla, JN 2014, ‘Integrated Autonomous Emergency Detection and Warning

Systems’, Journal of Medical and Bioengineering, vol. 4, no. 1, pp. 54–58.

Tee, MKT, Lau, BT & Siswoyo, HJ 2018a, ‘An improved indoor robot human-following

navigation model using depth camera, active IR marker and proximity sensors fusion’,

Robotics, vol. 7, no. 1.

Tee, MKT, Lau, BT & Siswoyo, HJ 2018b, ‘An Improved Indoor Robot Human-Following

Navigation Model Using Depth Camera, Active IR Marker and Proximity Sensors

Fusion’, Robotics, vol. 7, no. 1, Multidisciplinary Digital Publishing Institute, p. 4,

viewed 24 February, 2018, <http://www.mdpi.com/2218-6581/7/1/4>.

Tee, MKT, Lau, BT & Siswoyo, HJ 2018c, ‘Exploring the Performance of a Sensor-Fusion-

based Navigation System for Human Following Companion Robots’, International

Journal of Mechanical Engineering and Robotics Research (IJMERR).

Tee, MKT, Lau, BT & Siswoyo, HJ 2014, ‘Exploring the Possibility of Companion Robots

for Injury Prevention for People with Disabilities’, The 19th International Conference

on Transformative Science & Engineering, Business & Social Innovation (SDPS

2014), Kuching, Sarawak, Malaysia, pp. 199 – 210.

Tee, MKT, Lau, BT & Siswoyo, HJ 2017, ‘Pathfinding decision-making using proximity

Page 234: Modelling a real-time multi-sensor fusion-based navigation ...

216

sensors, depth camera and active IR marker tracking data fusion for human following

companion robot’, ACM International Conference Proceeding Series.

Tee, MKT, Lau, BT, Siswoyo, HJ & Lau, SL 2015, ‘A Human Orientation Tracking System

using Template Matching and Active Infrared Marker’, 2015 International Conference

on Smart Sensors and Application (ICSSA 2015), Kuala Lumpur, Malaysia.

Tee, MKT, Lau, BT, Siswoyo, HJ & Lau, SL 2016, ‘Potential of Human Tracking in

Assistive Technologies for Children with Cognitive Disabilities’, Supporting the

Education of Children with Autism Spectrum Disorders, IGI Global, pp. 245–247,

viewed

<https://books.google.com/books?hl=en&lr=&id=hxwRDQAAQBAJ&oi=fnd&pg=PA

245&dq=potential+of+human+tracking+in+assistive+technologies+for+children+wit

h+cognitive+disabilities&ots=Tns3PcOKbt&sig=Ndby64oOnd7mKypAVSskjXiurhs>.

Tee, MKT, Lau, BT, Siswoyo, HJ & Then, PHH 2015, ‘Robotics for Assisting Children with

Physical and Cognitive Disabilities’, in LB Theng (ed.), Assistive Technologies for

Physical and Cognitive Disabilities, IGI Global, pp. 78–120, viewed 20 February,

2015, <http://www.igi-global.com/chapter/robotics-for-assisting-children-with-

physical-and-cognitive-disabilities/122905>.

Tee, MKT, Lau, BT, Siswoyo, HJ & Wong, DML 2016, ‘Integrating Visual Gestures for

Activity Tracking in the Injury Mitigation Strategy using CARMI’, RESKO Technical

Conference 2016: The 2nd Asian Meeting on Rehabilitation Engineering and

Assistive Technology (AMoRE AT), Rehabilitation Engineering & Assistive

Technology Society of Korea (RESKO), Goyang, Korea, pp. 61–62.

Tee, MKT, Lau, BT, Siswoyo Jo, H & Lau, SL 2016, ‘Proposing a Sensor Fusion

Technique Utilizing Depth and Ranging Sensors for Combined Human Following and

Indoor Robot Navigation’, Proceedings of the Fifth International Conference on

Network, Communication and Computing (ICNCC 2016), ACM Press, New York,

USA, pp. 331–335, viewed 20 June, 2017,

<http://dl.acm.org/citation.cfm?doid=3033288.3033345>.

Thrun, S 1998, ‘Learning metric-topological maps for indoor mobile robot navigation’,

Artificial Intelligence, vol. 99, no. 1, pp. 21–71.

Trevor, a. JB, Howard, a. M & Kemp, CC 2009, ‘Playing with toys: Towards autonomous

Page 235: Modelling a real-time multi-sensor fusion-based navigation ...

217

robot manipulation for therapeutic play’, 2009 IEEE International Conference on

Robotics and Automation, Ieee, pp. 2139–2145, viewed

<http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5152589>.

Wainer, J, Ferrari, E, Dautenhahn, K & Robins, B 2010, ‘The effectiveness of using a

robotics class to foster collaboration among groups of children with autism in an

exploratory study’, Personal and Ubiquitous Computing, vol. 14, no. 5, pp. 445–455,

viewed 27 January, 2014, <http://link.springer.com/10.1007/s00779-009-0266-z>.

Wood, LJ, Dautenhahn, K, Rainer, A, Robins, B, Lehmann, H & Syrdal, DS 2013, ‘Robot-

mediated interviews--how effective is a humanoid robot as a tool for interviewing

young children?’, PloS one, vol. 8, no. 3, Public Library of Science, p. e59448, viewed

30 January, 2014,

<http://www.plosone.org/article/info:doi/10.1371/journal.pone.0059448#pone-

0059448-g003>.

Wu, Y-N, Hwang, M, Ren, Y, Gaebler-Spira, D & Zhang, L-Q 2011, ‘Combined passive

stretching and active movement rehabilitation of lower-limb impairments in children

with cerebral palsy using a portable robot.’, Neurorehabilitation and neural repair, vol.

25, no. 4, pp. 378–85, viewed 22 January, 2014,

<http://www.ncbi.nlm.nih.gov/pubmed/21343525>.

Yang, Y, Pu, F, Li, Y, Li, S, Fan, Y & Li, D 2014, ‘Reliability and Validity of Kinect RGB-D

Sensor for Assessing Standing Balance’, IEEE Sensors Journal, vol. 14, no. 5, IEEE,

pp. 1633–1638, viewed 24 August, 2015,

<http://swinburnedb.librarynet.com.my:2179/articleDetails.jsp?arnumber=6695766>.

Yu, X 2008, ‘Approaches and principles of fall detection for elderly and patient’,

HealthCom 2008 - 10th International Conference on e-health Networking,

Applications and Services, IEEE, pp. 42–47, viewed 30 March, 2014,

<http://ieeexplore.ieee.org/articleDetails.jsp?arnumber=4600107>.

Zeng, Q, Burdet, E & Teo, CL 2009, ‘Evaluation of a collaborative wheelchair system in

cerebral palsy and traumatic brain injury users.’, Neurorehabilitation and neural

repair, vol. 23, no. 5, pp. 494–504, viewed 27 January, 2014,

<http://www.ncbi.nlm.nih.gov/pubmed/19074687>.

Page 236: Modelling a real-time multi-sensor fusion-based navigation ...

218

Page 237: Modelling a real-time multi-sensor fusion-based navigation ...

219

APPENDICES

APPENDIX A – USE CASE MODELLING

Use Case ID  Use Case Name        Primary Actor  Scope  Complexity  Priority
1            Begin Monitoring     Guardian       In     Low         3
2            Monitor Activity     Robot          In     High        1
3            Get Activity Report  Guardian       In     Med         1
4            Alert Guardian       Robot          In     Med         1
5            Have Conversation    Guardian       In     High        2
6            Stop Monitoring      Guardian       In     Low         3

Use Case ID 1

Application ANIMA

Name Begin Monitoring

Description The Guardian sets the Robot to Monitoring Mode.

Primary Actor Guardian

Precondition The Robot is not currently monitoring the Child.

Trigger The Guardian requests the Robot to toggle Monitoring Mode.

Basic Flow
1. Guardian toggles the mode setting of the Robot to Monitoring Mode.
2. Robot checks all prerequisites of the tracking system.
3. All prerequisites to begin tracking are met, so Robot switches to Monitoring Mode.
4. Robot sends Monitoring Mode activation result to Guardian.

Alternate Flows

3 - Failed tracking prerequisites.

3. a. Update activation result with list of failed prerequisites.

3. b. Go to step 4.


Use Case ID 2

Application ANIMA

Name Monitor Activity

Description The Robot attempts to look for the Child, detect the Child’s current activity, and add the result to the Monitoring Log.

Primary Actor Robot

Precondition The Robot is currently in Monitoring Mode.

Trigger Activation of Monitoring Mode.

Basic Flow
1. Robot checks to see if Child is visible.
2. Child is visible, so Robot locks camera position.
3. Robot checks to see if front of Child is visible.
4. Child’s front is visible. Robot remains stationary.
5. Robot checks to see if Child matches any recognizable activity.
6. Child’s activity is recognized. Robot takes a picture and adds entry to Monitoring Log.
7. Robot checks to see if entry type is dangerous.
8. Entry type is not dangerous, Robot resumes monitoring.
9. Robot checks to see if Monitoring Mode is deactivated.
10. Monitoring Mode is active, Robot repeats step 1 – 9.

Alternate Flows

1 – Child is not visible.

2. a. Robot adds ‘Visibility Lost’ entry to Monitoring Log.

2. b. Robot begin wall-following room exploration.

2. c. Go to step 9.

4 - Child’s front is not acquired.

4. a. Robot adds ‘Orbiting’ entry to Monitoring Log.

4. b. Robot orbits locked position and avoids obstacles.

4. c. Go to step 9.

6 – Child’s activity is unrecognizable.

6. a. Robot takes a picture and adds ‘Unrecognized Activity’ entry

to Monitoring Log.

6. b. Go to step 9.


8 – Entry type is listed as dangerous.

8. a. Robot triggers Alert Guardian (Use Case 4).

8. b. Go to step 9.
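
The Monitor Activity basic flow above is essentially a repeating sense-classify-log loop. The short Python sketch below is only an illustrative reading of that flow, not the thesis implementation (which was built as MRDS services); every identifier in it, including the helper methods and the example dangerous-entry set, is a hypothetical placeholder.

# Illustrative sketch of the Monitor Activity loop (Use Case 2).
# All objects and helper methods are hypothetical placeholders.
DANGEROUS_ENTRIES = {"Fall Detected"}          # assumed example entry type only

def monitor_activity(robot, log):
    while robot.monitoring_mode_active():      # steps 9-10: repeat until deactivated
        if not robot.is_child_visible():       # alternate flow 1
            log.add("Visibility Lost")
            robot.set_strategy("Roam")         # wall-following room exploration
            continue
        robot.lock_camera_position()           # step 2
        if not robot.is_child_front_visible(): # alternate flow 4
            log.add("Orbiting")
            robot.set_strategy("Orbit")        # orbit locked position, avoid obstacles
            continue
        robot.set_strategy("Stationary")       # step 4
        activity = robot.recognise_activity()  # step 5
        if activity is None:                   # alternate flow 6
            log.add("Unrecognized Activity", picture=robot.take_picture())
            continue
        log.add(activity, picture=robot.take_picture())  # step 6
        if activity in DANGEROUS_ENTRIES:      # steps 7-8
            robot.request_alert_guardian()     # hands over to Use Case 4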

Use Case ID 3

Application ANIMA

Name Get Activity Report

Description The Guardian requests a status update and the Robot sends a

summary of the current monitoring log to the Guardian.

Primary Actor Guardian

Precondition The Robot is currently monitoring the Child.

Trigger The Guardian requests the Robot to generate Current Status

Report.

Basic Flow
1. Guardian requests an Activity Report.
2. Robot receives request and sends the Monitoring Log report to Guardian.
3. Guardian console displays report results.

Alternate Flows

2 – Robot does not receive request.

2. a. Guardian console timeout.

2. b. Guardian console reports timeout as result.

2. c. Go to step 3.

Use Case ID 4

Application ANIMA

Name Alert Guardian

Description The Robot sends an Alert message to the Guardian when a

possible injury or danger entry is added into the Monitoring Log.

Primary Actor Robot


Precondition The Robot is currently monitoring the Child.

List of Dangerous Report Types is available.

Trigger A Current Report listed as Dangerous is added to the Monitoring Log.

Basic Flow
1. Robot sends an Alert notification to the Guardian.
2. Guardian receives alert notification and confirms success.
3. Robot adds “Alert Success” entry to Monitoring Log.

Alternate Flows

2 – Guardian did not receive alert notification

2. a. Robot alert timeout.

2. b. Go to step 1.
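
Use Case 4 is a simple send-and-confirm protocol that retries on timeout. A minimal sketch under assumed names follows; the notification and confirmation calls and the 10-second timeout are illustrative assumptions, not the documented interface.

# Illustrative sketch of the Alert Guardian retry loop (Use Case 4).
# notify_guardian(), wait_for_confirmation() and the timeout value are assumptions.
def alert_guardian(robot, log, timeout_s=10.0):
    while True:
        robot.notify_guardian("ALERT")                  # step 1
        if robot.wait_for_confirmation(timeout_s):      # step 2
            log.add("Alert Success")                    # step 3
            return
        log.add("Alert Failure and Retry")              # alternate flow 2.a-2.b: retry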

Use Case ID 5

Application ANIMA

Name Have Conversation

Description The Guardian establishes a video call to the Robot to talk to the

Child.

Primary Actor Guardian

Precondition The Robot is currently monitoring the Child.

The Robot is not currently being used for Conversation.

Trigger The Guardian requests the Robot to Establish Telepresence.

Basic Flow
1. Guardian sends request for Video Link.
2. Robot receives Video Link request and video calls Guardian.
3. Guardian accepts video call and establishes communication.
4. Guardian communicates with Child and terminates communication.
5. Robot adds “Video Link” entry to Monitoring Log.

Alternate Flows

2 – Robot did not receive Video Link request.

2. a. Guardian console timeout.

2. b. Guardian console reports request failure.


3 – Guardian did not accept video call.

3. a. Robot video call timeout.

3. b. Robot adds “Video Link Failure” entry to Monitoring Log.

3. c. Robot sends call failure notification to Guardian.

Use Case ID 6

Application ANIMA

Name Stop Monitoring

Description The Guardian sets the Robot to stop tracking the Child.

Primary Actor Guardian

Precondition The Robot is currently monitoring the Child.

The Robot is not currently being used for Conversation.

Trigger The Guardian requests the Robot to toggle Monitoring Mode.

Basic Flow
1. Guardian toggles the mode setting of the Robot to Idle Mode.
2. Robot sends Monitoring Mode Deactivation result to Guardian.

Alternate Flows


APPENDIX B – ACTIVITY MODEL FLOW CHARTS

Activity flow chart - Main Execution Loop.
Activity flow chart - Monitor Activity.
Activity flow chart - Get Activity Report.
Activity flow chart - Alert Guardian.
Activity flow chart - Have Conversation.
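
The Main Execution Loop chart dispatches between the four monitoring-time use cases while Monitoring Mode is active. A minimal dispatcher sketch is given below; the request labels and handler functions are hypothetical placeholders standing in for the corresponding use cases, not the actual MRDS service interfaces.

# Illustrative sketch of the Main Execution Loop flow chart.
# Request labels and handler functions are hypothetical placeholders.
def main_execution_loop(robot, log):
    while robot.monitoring_mode_active():
        request = robot.poll_request()        # None when no request is pending
        if request == "ActivityReport":
            get_activity_report(robot, log)   # Use Case 3
        elif request == "AlertGuardian":
            alert_guardian(robot, log)        # Use Case 4
        elif request == "Conversation":
            have_conversation(robot, log)     # Use Case 5
        else:
            monitor_activity_step(robot, log) # one pass of Use Case 2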


APPENDIX C - CUSTOM STRUCTURAL SCHEMATICS

Arduino Mounting Plate


Motor Driver Platform


Robot Platform Template


Turn-Table Bottom Platform


Turn-Table Bottom Platform Bearing Spacer


Turn-Table Top Platform


Ultrasonic Sensor Bracket


Ultrasonic Sensor Mounting Plate


APPENDIX D – WIRING DIAGRAM

Signal and Power Distribution Wiring.


APPENDIX E – FUNCTIONAL TESTING SCENARIOS SIMULATION RESULTS

Single Uniform Obstruction Sample 1

Motion Path Graph - Single Uniform obstruction - Sample 1.

(Motion path plot 22-5-16-24, showing CARMI and Child trajectories; axes span approximately X = 5 to 7 and Z = 8 to 15.)


Combined logs - Single Uniform obstruction - Sample 1.

Referee Log (TimeStamp, CARMI X, CARMI Z, Child X, Child Z)   PathDecider Log (TimeStamp, Tendencies L, Tendencies R, Verdict)

4:24:51 PM 6 9.5 6 14.5

4:24:52 PM 6 9.5 6 14.5

4:24:53 PM 6 9.5 6 14.5

4:24:54 PM 6 9.5 6 14.5

4:24:55 PM 5.992651 9.502109 5.999981 14.49985

4:24:56 PM 6.004373 9.558583 5.999981 14.49996

4:24:57 PM -26.9501 -13.718 Right

4:24:58 PM 6.013047 9.573608 5.999981 14.49996 4:24:58 PM -42.6381 -13.214 Right

4:24:59 PM 6.017072 9.574073 5.999981 14.49996 4:24:59 PM -23.2525 -14.9393 Right

4:25:00 PM 6.017901 9.634454 5.999981 14.49985 4:25:00 PM -13.029 -34.673 Left

4:25:01 PM 6.015293 9.633926 5.999981 14.49997 4:25:01 PM -15.0249 -18.7145 Left

4:25:02 PM 6.015908 9.633844 5.999981 14.49996 4:25:02 PM -25.7751 -15.3791 Right

4:25:03 PM 6.017391 9.635232 5.999981 14.49985 4:25:03 PM -13.856 -15.1758 Left

4:25:04 PM 6.015132 9.724289 5.999981 14.49997 4:25:04 PM -15.4861 -19.5559 Left

4:25:05 PM 6.015524 9.79536 5.999981 14.49997 4:25:05 PM -15.4458 -13.8589 Right

4:25:06 PM 6.02018 10.01487 5.999981 14.49985 4:25:06 PM -21.3677 -14.6147 Right

4:25:07 PM 6.019746 10.05604 5.999981 14.49985 4:25:07 PM -16.6169 -16.5297 Right

4:25:08 PM 6.01063 10.31189 5.999981 14.49996 4:25:08 PM -16.0176 -22.0945 Left

4:25:09 PM -17.9192 -18.4521 Left

4:25:10 PM 6.009624 10.32631 5.999981 14.49985 4:25:10 PM -23.0018 -18.736 Right

4:25:11 PM 6.017926 10.66066 5.999981 14.4998

4:25:12 PM 6.029659 10.9949 5.999981 14.49997 4:25:12 PM -32.5512 -29.028 Right

4:25:13 PM 6.042346 11.00828 5.999981 14.49996 4:25:13 PM -41.4539 -42.951 Left

4:25:14 PM 6.061162 11.01945 5.999981 14.49997 4:25:14 PM -136.882 -76.3328 Right

4:25:15 PM 6.065367 11.02473 5.999981 14.49985 4:25:15 PM -131.378 -14.7197 Right

4:25:16 PM 6.05588 11.01888 5.999981 14.49985 4:25:16 PM -138.943 -40.0176 Right

4:25:17 PM 6.050369 11.01087 5.999981 14.4998 4:25:17 PM -127.644 -42.9589 Right

4:25:18 PM 6.042248 11.00822 5.999981 14.4998 4:25:18 PM -114.333 -44.0849 Right

4:25:19 PM 6.040653 10.97397 5.999981 14.49985 4:25:19 PM -37.4561 -59.8465 Left

4:25:20 PM 6.029764 10.94127 5.999981 14.4998 4:25:20 PM -36.9719 -37.0549 Left

4:25:21 PM 6.015032 10.93412 5.999981 14.4998 4:25:21 PM -60.8264 -92.0799 Left

4:25:22 PM -76.7321 -95.0659 Left

4:25:23 PM 6.044283 10.98443 5.999981 14.49996 4:25:23 PM -66.0759 -98.3046 Left

4:25:24 PM 6.072146 11.02569 5.999981 14.4998 4:25:24 PM -76.3523 -128.273 Left

4:25:25 PM 6.103318 11.05598 5.999981 14.49997 4:25:25 PM -95.8822 -167.888 Left

4:25:26 PM 6.155082 11.08659 5.999981 14.49996 4:25:26 PM -110.905 -200.067 Left

4:25:27 PM 6.198971 11.1044 5.999981 14.49997 4:25:27 PM -118.299 -12.7816 Right

4:25:28 PM 6.264235 11.11884 5.999981 14.49985 4:25:28 PM -113.612 -12.9004 Right

4:25:29 PM 6.315813 11.13479 5.999981 14.49997 4:25:29 PM -98.7733 -13.0937 Right

4:25:30 PM 6.368991 11.16461 5.999981 14.49985 4:25:31 PM -86.5439 -12.9257 Right

4:25:31 PM 6.406761 11.18642 5.999981 14.4998 4:25:32 PM -78.1498 -365.212 Left

4:25:32 PM 6.444426 11.22542 5.999981 14.4998 4:25:33 PM -71.7668 -12.8892 Right

4:25:34 PM 6.489717 11.27814 5.999981 14.49985 4:25:34 PM -70.7147 -13.0342 Right

4:25:35 PM 6.521106 11.31754 5.999981 14.4998 4:25:35 PM -61.6177 -13.6078 Right

4:25:36 PM 6.544494 11.38179 5.999981 14.49996 4:25:36 PM -45.608 -13.0661 Right

4:25:37 PM 6.56184 11.42163 5.999981 14.4998 4:25:37 PM -46.6599 -188.117 Left

4:25:38 PM 6.578347 11.46073 5.999981 14.49997 4:25:38 PM -58.5396 -178.498 Left

4:25:39 PM 6.602067 11.51317 5.999981 14.49996 4:25:39 PM -55.8121 -13.2812 Right

4:25:40 PM 6.618358 11.55351 5.999981 14.49997 4:25:40 PM -55.0514 -13.2783 Right

4:25:41 PM 6.639855 11.60311 5.999981 14.49996 4:25:41 PM -54.9736 -103.072 Left

4:25:42 PM 6.67061 11.6648 5.999981 14.49985 4:25:42 PM -54.8665 -13.3548 Right

4:25:43 PM 6.688007 11.72417 5.999981 14.49996 4:25:43 PM -53.9276 -13.4583 Right

4:25:44 PM 6.705129 11.76586 5.999981 14.4998 4:25:44 PM -51.1037 -13.4935 Right

4:25:46 PM 6.705971 11.83268 5.999981 14.49997 4:25:45 PM -36.1457 -15.6595 Right

4:25:47 PM 6.712987 11.87186 5.999981 14.49997 4:25:46 PM -32.4774 -13.6567 Right

4:25:48 PM 6.716211 11.93488 5.999981 14.49985 4:25:47 PM -36.3891 -13.3958 Right

4:25:49 PM 6.709541 11.96891 5.999981 14.49985 4:25:48 PM -25.1876 -15.4558 Right

4:25:50 PM 6.708804 11.96955 5.999981 14.4998 4:25:50 PM -12.6525 -23.4199 Left

4:25:51 PM 6.707758 11.96898 5.999981 14.49996 4:25:51 PM -15.1018 -21.1385 Left

4:25:52 PM 6.70869 11.96882 5.999981 14.4998 4:25:52 PM -20.356 -13.7829 Right

4:25:53 PM 6.711358 11.97085 5.999981 14.49996 4:25:53 PM -17.6539 -15.4597 Right

4:25:54 PM 6.710649 11.97074 5.999981 14.49985 4:25:54 PM -12.9871 -32.4171 Left

4:25:55 PM 6.70836 11.97029 5.999981 14.49985 4:25:55 PM -14.684 -19.5544 Left

4:25:56 PM 6.707629 11.97108 5.999981 14.49997 4:25:56 PM -14.718 -14.8763 Left

4:25:57 PM -17.5134 -13.8212 Right

4:25:58 PM 6.707671 11.97116 5.999981 14.49996 4:25:58 PM -15.9359 -15.4044 Right

4:25:59 PM 6.707596 11.97114 5.999981 14.4998 4:25:59 PM -15.9327 -15.3972 Right

4:26:00 PM 6.707594 11.97114 5.999981 14.49996
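
Each row of the combined log pairs a Referee entry (simulated CARMI and Child positions on the X-Z plane) with a PathDecider entry (left and right tendency values and the turn verdict). The sketch below shows one way a PathDecider row could be parsed; the rule of picking the side with the larger (less negative) tendency is only an observation inferred from these samples, not a statement of the PathDecider's documented decision logic.

# Illustrative parser for one PathDecider log entry, e.g. "4:24:57 PM -26.9501 -13.718 Right".
# The inferred verdict rule (larger, i.e. less negative, tendency wins) is an
# assumption drawn from inspecting these samples, not the documented algorithm.
def parse_path_decider(line):
    fields = line.split()
    timestamp = " ".join(fields[:2])
    left, right = float(fields[2]), float(fields[3])
    logged_verdict = fields[4]
    inferred_verdict = "Left" if left > right else "Right"
    return timestamp, left, right, logged_verdict, inferred_verdict

print(parse_path_decider("4:24:57 PM -26.9501 -13.718 Right"))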



Single Uniform Obstruction Sample 2

Motion Path Graph - Single Uniform obstruction - Sample 2.

(Motion path plot 23-5-13-42, showing CARMI and Child trajectories; axes span approximately X = 5 to 7 and Z = 8 to 15.)


Combined logs - Single Uniform obstruction - Sample 2.

Referee Log (TimeStamp, CARMI X, CARMI Z, Child X, Child Z)   PathDecider Log (TimeStamp, Tendencies L, Tendencies R, Verdict)

1:42:16 PM 6 9.5 6 14.5

1:42:17 PM 6 9.5 6 14.5

1:42:18 PM 6 9.5 6 14.5

1:42:19 PM 6 9.5 6 14.5

1:42:20 PM 5.991889 9.50129 5.999981 14.49997

1:42:21 PM 5.990463 9.50124 5.999981 14.49997 1:42:21 PM -28.7972 -13.6605 Right

1:42:22 PM 5.992763 9.501761 5.999981 14.49997 1:42:22 PM -41.7807 -14.7239 Right

1:42:23 PM 5.995845 9.50164 5.999981 14.49985 1:42:23 PM -20.0164 -14.9934 Right

1:42:24 PM 6.017421 9.782832 5.999981 14.4998 1:42:24 PM -15.2665 -14.8591 Right

1:42:25 PM -19.0469 -14.9901 Right

1:42:26 PM 6.043427 10.12263 5.999981 14.49997 1:42:26 PM -24.0836 -15.7712 Right

1:42:27 PM 6.046795 10.13761 5.999981 14.49996 1:42:27 PM -20.8383 -17.1498 Right

1:42:28 PM 6.049041 10.13826 5.999981 14.49985 1:42:28 PM -15.8732 -19.119 Left

1:42:29 PM 6.037561 10.42277 5.999981 14.4998 1:42:29 PM -18.808 -22.2914 Left

1:42:30 PM 6.034553 10.47023 5.999981 14.49996

1:42:31 PM 6.034654 10.5471 5.999981 14.49985 1:42:31 PM -19.3341 -19.8957 Left

1:42:32 PM 6.039503 10.89222 5.999981 14.49997 1:42:32 PM -25.6076 -21.6171 Right

1:42:33 PM 6.053028 11.01005 5.999981 14.49996 1:42:33 PM -43.9809 -41.4203 Right

1:42:34 PM 6.063476 11.0228 5.999981 14.49985 1:42:34 PM -117.705 -51.5432 Right

1:42:35 PM 6.071899 11.03582 5.999981 14.49985 1:42:35 PM -140.651 -81.0693 Right

1:42:36 PM 6.055428 11.061 5.999981 14.49996 1:42:36 PM -159.29 -93.5426 Right

1:42:37 PM -172.461 -86.5653 Right

1:42:38 PM 5.99304 11.10075 5.999981 14.49997 1:42:38 PM -235.424 -98.7037 Right

1:42:39 PM 5.937294 11.1189 5.999981 14.49997 1:42:39 PM -252.907 -108.965 Right

1:42:40 PM 5.896004 11.13508 5.999981 14.49985 1:42:40 PM -13.0811 -121.093 Left

1:42:41 PM 5.845477 11.13525 5.999981 14.49985 1:42:41 PM -12.8851 -128.729 Left

1:42:42 PM 5.790585 11.13493 5.999981 14.49996 1:42:42 PM -12.6299 -116.001 Left

1:42:43 PM 5.741357 11.14796 5.999981 14.49996 1:42:43 PM -12.9799 -98.6647 Left

1:42:44 PM 5.688997 11.16628 5.999981 14.4998 1:42:44 PM -12.973 -96.8418 Left

1:42:45 PM 5.64477 11.18364 5.999981 14.49996 1:42:45 PM -12.6531 -97.3358 Left

1:42:46 PM 5.598882 11.20877 5.999981 14.49997 1:42:46 PM -12.7989 -88.1293 Left

1:42:47 PM 5.570353 11.23466 5.999981 14.49996 1:42:47 PM -12.823 -73.1488 Left

1:42:48 PM 5.529398 11.27607 5.999981 14.49997 1:42:48 PM -12.7637 -61.9158 Left

1:42:49 PM -12.6985 -56.2428 Left

1:42:50 PM 5.500767 11.32429 5.999981 14.49985

1:42:51 PM 5.478791 11.38216 5.999981 14.49996 1:42:51 PM -12.7173 -52.5571 Left

1:42:52 PM 5.457008 11.44357 5.999981 14.49996 1:42:52 PM -315.588 -52.6775 Right

1:42:53 PM 5.437121 11.50699 5.999981 14.49996 1:42:53 PM -12.821 -50.8265 Left

1:42:54 PM 5.421289 11.56236 5.999981 14.4998 1:42:54 PM -162.654 -54.5584 Right

1:42:55 PM 5.39052 11.61724 5.999981 14.49996 1:42:55 PM -145.791 -56.4564 Right

1:42:56 PM 5.367041 11.66942 5.999981 14.49985 1:42:56 PM -114.881 -56.5946 Right

1:42:57 PM 5.353395 11.70844 5.999981 14.49997 1:42:57 PM -12.7932 -56.5943 Left

1:42:58 PM 5.331506 11.75788 5.999981 14.49996 1:42:58 PM -12.9362 -54.8904 Left

1:42:59 PM 5.318673 11.80679 5.999981 14.49985 1:42:59 PM -14.7588 -47.7623 Left

1:43:00 PM 5.310825 11.87943 5.999981 14.4998 1:43:00 PM -15.2694 -29.8043 Left

1:43:01 PM -185.096 -21.6661 Right

1:43:02 PM 5.307692 11.88899 5.999981 14.4998 1:43:02 PM -185.083 -17.9435 Right

1:43:03 PM 5.306461 11.88945 5.999981 14.49996 1:43:03 PM -185.07 -16.0929 Right

1:43:04 PM 5.306663 11.88937 5.999981 14.49985 1:43:04 PM -185.042 -15.1816 Right

1:43:05 PM 5.306663 11.88937 5.999981 14.4998 1:43:05 PM -185.248 -14.967 Right

1:43:06 PM 5.306663 11.88937 5.999981 14.4998 1:43:06 PM -184.686 -15.5179 Right

1:43:07 PM 5.306663 11.88937 5.999981 14.49997 1:43:07 PM -185.248 -14.9656 Right

1:43:08 PM 5.306663 11.88937 5.999981 14.49996



Single Uniform Obstruction Sample 3

Motion Path Graph - Single Uniform obstruction - Sample 3.

(Motion path plot 23-5-14-7, showing CARMI and Child trajectories; axes span approximately X = 5 to 7 and Z = 8 to 15.)


Combined logs - Single Uniform obstruction - Sample 3.

Referee Log (TimeStamp, CARMI X, CARMI Z, Child X, Child Z)   PathDecider Log (TimeStamp, Tendencies L, Tendencies R, Verdict)

2:07:04 PM 6 9.5 6 14.5

2:07:05 PM 6 9.5 6 14.5

2:07:06 PM 6 9.5 6 14.5

2:07:08 PM 6 9.5 6 14.5

2:07:09 PM 5.992098 9.502001 5.999981 14.4998

2:07:10 PM 5.999874 9.553609 5.999981 14.49997 2:07:10 PM -26.9946 -13.7645 Right

2:07:11 PM 6.002772 9.554681 5.999981 14.49985 2:07:11 PM -40.8162 -14.9504 Right

2:07:12 PM 6.005116 9.554918 5.999981 14.49997 2:07:12 PM -19.0335 -15.2253 Right

2:07:13 PM 6.021955 9.785067 5.999981 14.49996 2:07:13 PM -15.2758 -13.8297 Right

2:07:14 PM 6.035243 9.957998 5.999981 14.49985 2:07:14 PM -21.876 -14.5452 Right

2:07:15 PM 6.038996 9.959197 5.999981 14.49985 2:07:15 PM -20.184 -16.2563 Right

2:07:16 PM 6.034621 10.09502 5.999981 14.49997 2:07:16 PM -16.3634 -16.3941 Left

2:07:17 PM 6.021325 10.43603 5.999981 14.49996 2:07:17 PM -15.6022 -18.7412 Left

2:07:18 PM 6.007824 10.78228 5.999981 14.4998 2:07:18 PM -20.1268 -23.5267 Left

2:07:19 PM 5.992911 11.03858 5.999981 14.49997 2:07:19 PM -37.5482 -42.0444 Left

2:07:20 PM -50.0936 -125.638 Left

2:07:21 PM 5.980575 11.04763 5.999981 14.4998 2:07:21 PM -74.8932 -155.861 Left

2:07:22 PM 5.968372 11.05799 5.999981 14.49996 2:07:22 PM -93.6173 -185.655 Left

2:07:23 PM 5.998254 11.09252 5.999981 14.4998 2:07:23 PM -94.0929 -229.112 Left

2:07:24 PM 6.048455 11.12658 5.999981 14.49996 2:07:24 PM -106.298 -283.338 Left

2:07:25 PM 6.09778 11.14987 5.999981 14.49997

2:07:26 PM 6.158193 11.16004 5.999981 14.4998 2:07:26 PM -114.542 -13.569 Right

2:07:27 PM 6.231093 11.16616 5.999981 14.49997 2:07:27 PM -124.802 -12.7983 Right

2:07:28 PM 6.285115 11.18408 5.999981 14.49985 2:07:28 PM -109.864 -12.9048 Right

2:07:29 PM 6.343492 11.20709 5.999981 14.4998 2:07:29 PM -99.6987 -13.0458 Right

2:07:30 PM 6.393831 11.23446 5.999981 14.49996 2:07:30 PM -94.9677 -12.9784 Right

2:07:31 PM -86.5289 -12.9591 Right

2:07:32 PM 6.432129 11.26231 5.999981 14.4998 2:07:32 PM -79.0818 -12.9892 Right

2:07:33 PM 6.473826 11.30286 5.999981 14.49997 2:07:33 PM -72.5969 -13.0913 Right

2:07:34 PM 6.506058 11.34505 5.999981 14.49996 2:07:34 PM -64.0725 -13.0308 Right

2:07:35 PM 6.53012 11.38914 5.999981 14.49985 2:07:35 PM -52.8344 -334.488 Left

2:07:36 PM 6.541178 11.43418 5.999981 14.49985 2:07:36 PM -47.5939 -229.459 Left

2:07:37 PM 6.561899 11.48053 5.999981 14.4998 2:07:37 PM -54.104 -200.719 Left

2:07:38 PM 6.57794 11.54364 5.999981 14.49996 2:07:38 PM -55.0867 -159.123 Left

2:07:39 PM 6.594798 11.59196 5.999981 14.4998 2:07:39 PM -54.8515 -128.873 Left

2:07:40 PM 6.616853 11.63945 5.999981 14.49996 2:07:40 PM -56.7885 -13.2659 Right

2:07:41 PM 6.637027 11.68808 5.999981 14.49996 2:07:41 PM -58.5293 -13.3884 Right

2:07:42 PM 6.658021 11.74481 5.999981 14.49985 2:07:42 PM -49.182 -28.2239 Right

2:07:43 PM 6.670767 11.80093 5.999981 14.4998 2:07:43 PM -39.8064 -13.578 Right

2:07:44 PM -38.0427 -13.5266 Right

2:07:45 PM 6.678015 11.85639 5.999981 14.4998

2:07:46 PM 6.677867 11.91977 5.999981 14.49997 2:07:46 PM -30.6544 -15.3457 Right

2:07:47 PM 6.659755 11.97205 5.999981 14.49997 2:07:47 PM -13.0477 -20.3378 Left

2:07:48 PM 6.656477 11.9768 5.999981 14.49997 2:07:48 PM -15.2798 -21.5138 Left

2:07:49 PM 6.656309 11.97697 5.999981 14.49985 2:07:49 PM -21.2677 -13.8963 Right

2:07:50 PM 6.658786 11.97789 5.999981 14.49997 2:07:50 PM -23.2763 -15.4816 Right

2:07:51 PM 6.659901 11.97934 5.999981 14.49985 2:07:51 PM -12.9914 -16.4808 Left

2:07:52 PM 6.657941 11.98021 5.999981 14.4998 2:07:52 PM -14.7018 -19.6848 Left

2:07:53 PM 6.657022 11.98034 5.999981 14.49985 2:07:53 PM -14.7392 -14.9733 Left

2:07:54 PM 6.65702 11.98034 5.999981 14.49997 2:07:54 PM -17.5463 -13.8637 Right

2:07:55 PM 6.65702 11.98034 5.999981 14.49997 2:07:55 PM -17.1516 -14.2501 Right

2:07:56 PM 6.657019 11.98034 5.999981 14.49996 2:07:56 PM -16.1527 -15.2481 Right

2:07:57 PM -16.1528 #NAME? Left

2:07:58 PM 6.65702 11.98034 5.999981 14.49997 2:07:58 PM -16.153 -15.2493 Right

2:07:59 PM 6.65702 11.98034 5.999981 14.49996 2:07:59 PM -16.1528 -15.2482 Right

2:08:00 PM 6.65702 11.98034 5.999981 14.49997 2:08:00 PM -16.153 -15.2487 Right

2:08:01 PM 6.65702 11.98034 5.999981 14.49997 2:08:01 PM -16.1526 -15.248 Right

2:08:02 PM 6.65702 11.98034 5.999981 14.49997

2:08:03 PM 6.65702 11.98034 5.999981 14.49997 2:08:03 PM -16.1528 -15.2485 Right

2:08:04 PM 6.65702 11.98034 5.999981 14.4998 2:08:04 PM -16.1531 -15.2484 Right

2:08:05 PM 6.65702 11.98034 5.999981 14.49997 2:08:05 PM -16.153 -15.2483 Right

2:08:06 PM 6.657021 11.98034 5.999981 14.4998 2:08:06 PM -16.1529 -15.2484 Right



Single Uniform Obstruction Sample 4

Motion Path Graph - Single Uniform obstruction - Sample 4.

(Motion path plot 23-5-14-23, showing CARMI and Child trajectories; axes span approximately X = 5 to 7 and Z = 8 to 15.)


Combined logs - Single Uniform obstruction - Sample 4.

Referee Log (TimeStamp, CARMI X, CARMI Z, Child X, Child Z)   PathDecider Log (TimeStamp, Tendencies L, Tendencies R, Verdict)

2:24:00 PM 6 9.5 6 14.5

2:24:01 PM 6 9.5 6 14.5

2:24:02 PM 6 9.5 6 14.5

2:24:03 PM 6 9.5 6 14.5

2:24:04 PM 5.991642 9.503012 5.999981 14.4998

2:24:05 PM 5.989827 9.503398 5.999981 14.49997 2:24:05 PM -30.7473 -13.5811 Right

2:24:06 PM 5.991612 9.503591 5.999981 14.4998

2:24:07 PM -34.0078 -15.0101 Right

2:24:08 PM 5.993536 9.503521 5.999981 14.4998 2:24:08 PM -22.5796 -15.2161 Right

2:24:09 PM 5.994484 9.504256 5.999981 14.49996 2:24:09 PM -23.5277 -13.3261 Right

2:24:10 PM 5.995536 9.50525 5.999981 14.4998 2:24:10 PM -23.5784 -13.2763 Right

2:24:11 PM 5.999249 9.505614 5.999981 14.49997 2:24:11 PM -21.7762 -15.0817 Right

2:24:12 PM 6.000371 9.551789 5.999981 14.4998 2:24:12 PM -15.0524 -15.0388 Right

2:24:13 PM 5.997384 9.887675 5.999981 14.4998 2:24:13 PM -14.2233 -15.9 Left

2:24:14 PM 5.994301 10.23417 5.999981 14.49996 2:24:14 PM -15.5388 -17.3487 Left

2:24:15 PM 5.991281 10.57542 5.999981 14.49985 2:24:15 PM -17.9376 -19.9814 Left

2:24:16 PM 5.988273 10.91667 5.999981 14.4998 2:24:16 PM -26.4472 -29.0557 Left

2:24:17 PM 5.980721 11.0435 5.999981 14.4998 2:24:17 PM -47.255 -47.368 Left

2:24:18 PM 5.964969 11.05122 5.999981 14.49996 2:24:18 PM -60.3566 -146.316 Left

2:24:19 PM -87.7807 -166.652 Left

2:24:20 PM 5.954296 11.06089 5.999981 14.49985 2:24:20 PM -73.5484 -167.724 Left

2:24:21 PM 5.957585 11.07703 5.999981 14.49997 2:24:21 PM -88.5119 -228.897 Left

2:24:22 PM 6.004911 11.11397 5.999981 14.49996 2:24:22 PM -97.8165 -283.308 Left

2:24:23 PM 6.050199 11.14217 5.999981 14.49985 2:24:23 PM -112.701 -334.207 Left

2:24:24 PM 6.094389 11.16249 5.999981 14.49997

2:24:25 PM 6.146997 11.16607 5.999981 14.49997 2:24:25 PM -124.794 -12.8637 Right

2:24:26 PM 6.208316 11.16912 5.999981 14.49985 2:24:26 PM -127.605 -12.9631 Right

2:24:27 PM 6.249608 11.17535 5.999981 14.49997 2:24:27 PM -114.574 -13.0288 Right

2:24:28 PM 6.301901 11.1987 5.999981 14.49985 2:24:28 PM -96.9561 -27.1994 Right

2:24:29 PM 6.357595 11.23395 5.999981 14.49997 2:24:29 PM -92.1243 -12.9502 Right

2:24:30 PM 6.404997 11.26928 5.999981 14.4998 2:24:30 PM -86.5461 -12.9724 Right

2:24:31 PM 6.440885 11.31119 5.999981 14.49996 2:24:31 PM -75.3952 -13.0778 Right

2:24:32 PM -62.2642 -13.079 Right

2:24:33 PM 6.462356 11.34398 5.999981 14.49996 2:24:33 PM -54.7869 -13.0567 Right

2:24:34 PM 6.484216 11.38859 5.999981 14.49985 2:24:34 PM -50.4268 -13.163 Right

2:24:35 PM 6.50139 11.43284 5.999981 14.49985 2:24:35 PM -50.4847 -13.2486 Right

2:24:36 PM 6.522586 11.49151 5.999981 14.49997 2:24:36 PM -52.2708 -283.639 Left

2:24:37 PM 6.547487 11.53818 5.999981 14.49985 2:24:37 PM -55.7413 -229.54 Left

2:24:38 PM 6.566504 11.58569 5.999981 14.4998 2:24:38 PM -54.1337 -13.3853 Right

2:24:39 PM 6.583529 11.63368 5.999981 14.49997 2:24:39 PM -55.1095 -13.2875 Right

2:24:40 PM 6.603369 11.68077 5.999981 14.49985 2:24:40 PM -55.1443 -13.1094 Right

2:24:41 PM 6.627034 11.72534 5.999981 14.49985 2:24:41 PM -54.7697 -13.689 Right

2:24:42 PM 6.64632 11.7841 5.999981 14.4998 2:24:42 PM -42.8075 -161.01 Left

2:24:43 PM 6.656025 11.84265 5.999981 14.49996 2:24:43 PM -34.2168 -19.851 Right

2:24:44 PM -29.707 -15.3036 Right

2:24:45 PM 6.656774 11.89533 5.999981 14.49985

2:24:46 PM 6.655472 11.9461 5.999981 14.4998 2:24:46 PM -22.2871 -15.4173 Right

2:24:47 PM 6.656337 11.94676 5.999981 14.49985 2:24:47 PM -12.9085 -33.1484 Left

2:24:48 PM 6.654506 11.94646 5.999981 14.49996 2:24:48 PM -15.1668 -18.4486 Left

2:24:49 PM 6.654568 11.94666 5.999981 14.49985 2:24:49 PM -21.3737 -13.7049 Right

2:24:50 PM 6.656505 11.94637 5.999981 14.49997 2:24:50 PM -19.519 -15.4722 Right

2:24:51 PM 6.656658 11.94659 5.999981 14.49985 2:24:51 PM -12.9536 -248.776 Left

2:24:52 PM 6.656202 11.94633 5.999981 14.49996 2:24:52 PM -14.6937 -248.976 Left

2:24:53 PM 6.656202 11.94633 5.999981 14.49996 2:24:53 PM -14.6903 -248.979 Left

2:24:54 PM 6.656202 11.94633 5.999981 14.49996 2:24:54 PM -14.7179 -248.952 Left

2:24:55 PM 6.656202 11.94633 5.999981 14.4998 2:24:55 PM -14.7214 -248.953 Left

2:24:56 PM -14.6902 -248.979 Left

2:24:57 PM 6.656202 11.94633 5.999981 14.49997

2:24:58 PM 6.656202 11.94633 5.999981 14.49997 2:24:58 PM -14.6901 -248.979 Left

2:24:59 PM 6.656203 11.94633 5.999981 14.49996 2:24:59 PM -14.6903 -248.98 Left

2:25:00 PM 6.656203 11.94633 5.999981 14.4998 2:25:00 PM -14.685 -248.975 Left

2:25:01 PM 6.656203 11.94633 5.999981 14.4998 2:25:01 PM -14.7179 -248.952 Left



Single Uniform Obstruction Sample 5

Motion Path Graph - Single Uniform obstruction - Sample 5.

(Motion path plot 23-5-14-36, showing CARMI and Child trajectories; axes span approximately X = 5 to 7 and Z = 8 to 15.)


Combined logs - Single Uniform obstruction - Sample 5.

Referee Log (TimeStamp, CARMI X, CARMI Z, Child X, Child Z)   PathDecider Log (TimeStamp, Tendencies L, Tendencies R, Verdict)

2:36:28 PM 6 9.5 6 14.5

2:36:29 PM 6 9.5 6 14.5

2:36:30 PM 6 9.5 6 14.5

2:36:31 PM 6 9.5 6 14.5

2:36:32 PM 5.992746 9.501021 5.999981 14.49985

2:36:34 PM 6.007417 9.582978 5.999981 14.4998 2:36:34 PM -30.0922 -13.6283 Right

2:36:35 PM 6.009819 9.583772 5.999981 14.49997 2:36:35 PM -39.0405 -14.9825 Right

2:36:36 PM 6.01228 9.583688 5.999981 14.49997 2:36:36 PM -23.8771 -15.1842 Right

2:36:37 PM 6.019166 9.657411 5.999981 14.49996 2:36:37 PM -15.3707 -18.3197 Left

2:36:38 PM 6.032834 9.764443 5.999981 14.49997 2:36:38 PM -23.6962 -14.36 Right

2:36:39 PM 6.038038 9.790393 5.999981 14.49996 2:36:39 PM -35.7012 -15.5458 Right

2:36:40 PM 6.039547 9.791645 5.999981 14.4998 2:36:40 PM -21.622 -15.6614 Right

2:36:41 PM 6.043862 9.916765 5.999981 14.49996

2:36:42 PM -15.6847 -23.3744 Left

2:36:43 PM 6.044794 9.945532 5.999981 14.49996 2:36:43 PM -17.5249 -14.823 Right

2:36:44 PM 6.054964 10.01599 5.999981 14.49985 2:36:44 PM -30.8444 -16.3091 Right

2:36:45 PM 6.05662 10.01635 5.999981 14.4998 2:36:45 PM -18.6087 -16.4086 Right

2:36:46 PM 6.052341 10.06703 5.999981 14.49996 2:36:46 PM -16.1921 -23.9914 Left

2:36:47 PM 6.06334 10.18729 5.999981 14.49996 2:36:47 PM -27.2906 -15.7063 Right

2:36:48 PM 6.063964 10.18735 5.999981 14.49985 2:36:48 PM -32.2744 -17.3988 Right

2:36:49 PM 6.066131 10.18779 5.999981 14.49985 2:36:49 PM -20.8783 -17.6071 Right

2:36:50 PM 6.071425 10.46762 5.999981 14.4998 2:36:50 PM -18.1744 -17.8435 Right

2:36:51 PM 6.077846 10.80885 5.999981 14.49997 2:36:51 PM -23.1739 -23.013 Right

2:36:52 PM 6.084408 11.03062 5.999981 14.49997 2:36:52 PM -49.7358 -44.9287 Right

2:36:53 PM 6.087319 11.03093 5.999981 14.49997 2:36:53 PM -46.6126 -47.9759 Left

2:36:54 PM 6.096694 11.03804 5.999981 14.49997 2:36:54 PM -43.8086 -52.6335 Left

2:36:55 PM -145.064 -73.6089 Right

2:36:56 PM 6.105868 11.04598 5.999981 14.49996 2:36:56 PM -173.743 -89.7996 Right

2:36:57 PM 6.11417 11.05821 5.999981 14.4998 2:36:57 PM -172.448 -75.3695 Right

2:36:58 PM 6.103453 11.07753 5.999981 14.49996 2:36:58 PM -219.803 -94.0106 Right

2:36:59 PM 6.045853 11.11095 5.999981 14.49985 2:36:59 PM -235.421 -109.872 Right

2:37:00 PM 5.993153 11.13041 5.999981 14.49996

2:37:01 PM 5.936425 11.13254 5.999981 14.4998 2:37:01 PM -219.867 -117.349 Right

2:37:02 PM 5.887553 11.12776 5.999981 14.49997 2:37:02 PM -13.1128 -131.36 Left

2:37:03 PM 5.835094 11.1264 5.999981 14.49996 2:37:03 PM -12.6219 -136.521 Left

2:37:04 PM 5.778492 11.13452 5.999981 14.49985 2:37:04 PM -290.969 -117.819 Right

2:37:05 PM 5.724764 11.14765 5.999981 14.49997 2:37:05 PM -13.1135 -98.7504 Left

2:37:06 PM 5.649559 11.17921 5.999981 14.49996 2:37:06 PM -12.7039 -99.1119 Left

2:37:07 PM -453.762 -90.8212 Right

2:37:08 PM 5.601518 11.21611 5.999981 14.4998 2:37:08 PM -12.4984 -71.999 Left

2:37:09 PM 5.562678 11.2585 5.999981 14.49997 2:37:09 PM -27.609 -65.6256 Left

2:37:10 PM 5.538737 11.29534 5.999981 14.49997 2:37:10 PM -12.6948 -59.952 Left

2:37:11 PM 5.512889 11.36313 5.999981 14.49996 2:37:11 PM -12.6604 -48.7911 Left

2:37:12 PM 5.490465 11.41259 5.999981 14.49997 2:37:12 PM -375.087 -49.832 Right

2:37:13 PM 5.47027 11.47035 5.999981 14.49996 2:37:13 PM -12.8178 -53.6879 Left

2:37:14 PM 5.450334 11.51438 5.999981 14.49985 2:37:14 PM -12.7879 -54.6147 Left

2:37:15 PM 5.428233 11.56512 5.999981 14.49997 2:37:15 PM -12.7249 -54.5971 Left

2:37:16 PM 5.404184 11.61186 5.999981 14.49985 2:37:16 PM -162.73 -54.7042 Right

2:37:17 PM 5.388108 11.65124 5.999981 14.4998

2:37:18 PM 5.364429 11.71439 5.999981 14.49996 2:37:18 PM -131.737 -54.7618 Right

2:37:19 PM -12.8862 -54.916 Left

2:37:20 PM 5.345765 11.7772 5.999981 14.49997 2:37:20 PM -164.713 -52.2816 Right

2:37:21 PM 5.340353 11.83129 5.999981 14.49985 2:37:21 PM -12.9225 -32.5273 Left

2:37:22 PM 5.330233 11.88993 5.999981 14.4998 2:37:22 PM -13.1033 -34.7145 Left

2:37:23 PM 5.327972 11.93691 5.999981 14.49985 2:37:23 PM -14.9929 -31.5138 Left

2:37:24 PM 5.328048 11.97961 5.999981 14.4998 2:37:24 PM -15.0799 -19.5488 Left

2:37:25 PM 5.328823 11.98095 5.999981 14.49996 2:37:25 PM -24.6793 -14.1889 Right

2:37:26 PM 5.33059 11.9813 5.999981 14.49997 2:37:26 PM -19.7997 -16.2016 Right

2:37:27 PM 5.331334 11.98141 5.999981 14.4998 2:37:27 PM -13.3778 -18.2063 Left

2:37:28 PM 5.331336 11.98141 5.999981 14.4998 2:37:28 PM -13.5521 -18.0721 Left

2:37:29 PM 5.33134 11.98141 5.999981 14.49997 2:37:29 PM -13.7984 -17.8204 Left

2:37:31 PM 5.331341 11.98141 5.999981 14.49985 2:37:31 PM -13.7985 -17.8208 Left

2:37:32 PM 5.331342 11.98141 5.999981 14.49996 2:37:32 PM -13.7986 -17.8205 Left

2:37:33 PM 5.331342 11.98141 5.999981 14.49996 2:37:33 PM -13.8118 -17.8346 Left

2:37:34 PM 5.331343 11.98141 5.999981 14.4998 2:37:34 PM -13.811 -17.8345 Left

2:37:35 PM 5.331323 11.98142 5.999981 14.49997 2:37:35 PM -13.8126 -17.8352 Left

2:37:36 PM 5.331327 11.98141 5.999981 14.4998 2:37:36 PM -13.7998 -17.8234 Left

2:37:37 PM 5.331337 11.98141 5.999981 14.4998 2:37:37 PM -13.7994 -17.8207 Left

2:37:38 PM 5.331334 11.98141 5.999981 14.49996 2:37:38 PM -13.7996 -17.82 Left

2:37:39 PM 5.331337 11.98141 5.999981 14.49997 2:37:39 PM -13.7949 -17.8157 Left

2:37:40 PM 5.331342 11.98141 5.999981 14.49996

2:37:41 PM 5.331341 11.98141 5.999981 14.49996 2:37:41 PM -13.7997 -17.8208 Left

2:37:42 PM 5.331332 11.98142 5.999981 14.4998 2:37:42 PM -13.8 -17.82 Left

2:37:43 PM -15.3493 -16.2713 Left

2:37:44 PM 5.331334 11.98142 5.999981 14.49997 2:37:44 PM -13.7997 -17.821 Left



Single Uniform Obstruction Sample 6

Motion Path Graph - Single Uniform obstruction - Sample 6.

X Z X Z L R

2:36:28 PM 6 9.5 6 14.5

2:36:29 PM 6 9.5 6 14.5

2:36:30 PM 6 9.5 6 14.5

2:36:31 PM 6 9.5 6 14.5

2:36:32 PM 5.992746 9.501021 5.999981 14.49985

2:36:34 PM 6.007417 9.582978 5.999981 14.4998 2:36:34 PM -30.0922 -13.6283 Right

2:36:35 PM 6.009819 9.583772 5.999981 14.49997 2:36:35 PM -39.0405 -14.9825 Right

2:36:36 PM 6.01228 9.583688 5.999981 14.49997 2:36:36 PM -23.8771 -15.1842 Right

2:36:37 PM 6.019166 9.657411 5.999981 14.49996 2:36:37 PM -15.3707 -18.3197 Left

2:36:38 PM 6.032834 9.764443 5.999981 14.49997 2:36:38 PM -23.6962 -14.36 Right

2:36:39 PM 6.038038 9.790393 5.999981 14.49996 2:36:39 PM -35.7012 -15.5458 Right

2:36:40 PM 6.039547 9.791645 5.999981 14.4998 2:36:40 PM -21.622 -15.6614 Right

2:36:41 PM 6.043862 9.916765 5.999981 14.49996

2:36:42 PM -15.6847 -23.3744 Left

2:36:43 PM 6.044794 9.945532 5.999981 14.49996 2:36:43 PM -17.5249 -14.823 Right

2:36:44 PM 6.054964 10.01599 5.999981 14.49985 2:36:44 PM -30.8444 -16.3091 Right

2:36:45 PM 6.05662 10.01635 5.999981 14.4998 2:36:45 PM -18.6087 -16.4086 Right

2:36:46 PM 6.052341 10.06703 5.999981 14.49996 2:36:46 PM -16.1921 -23.9914 Left

2:36:47 PM 6.06334 10.18729 5.999981 14.49996 2:36:47 PM -27.2906 -15.7063 Right

2:36:48 PM 6.063964 10.18735 5.999981 14.49985 2:36:48 PM -32.2744 -17.3988 Right

2:36:49 PM 6.066131 10.18779 5.999981 14.49985 2:36:49 PM -20.8783 -17.6071 Right

2:36:50 PM 6.071425 10.46762 5.999981 14.4998 2:36:50 PM -18.1744 -17.8435 Right

2:36:51 PM 6.077846 10.80885 5.999981 14.49997 2:36:51 PM -23.1739 -23.013 Right

2:36:52 PM 6.084408 11.03062 5.999981 14.49997 2:36:52 PM -49.7358 -44.9287 Right

2:36:53 PM 6.087319 11.03093 5.999981 14.49997 2:36:53 PM -46.6126 -47.9759 Left

2:36:54 PM 6.096694 11.03804 5.999981 14.49997 2:36:54 PM -43.8086 -52.6335 Left

2:36:55 PM -145.064 -73.6089 Right

2:36:56 PM 6.105868 11.04598 5.999981 14.49996 2:36:56 PM -173.743 -89.7996 Right

2:36:57 PM 6.11417 11.05821 5.999981 14.4998 2:36:57 PM -172.448 -75.3695 Right

2:36:58 PM 6.103453 11.07753 5.999981 14.49996 2:36:58 PM -219.803 -94.0106 Right

2:36:59 PM 6.045853 11.11095 5.999981 14.49985 2:36:59 PM -235.421 -109.872 Right

2:37:00 PM 5.993153 11.13041 5.999981 14.49996

2:37:01 PM 5.936425 11.13254 5.999981 14.4998 2:37:01 PM -219.867 -117.349 Right

2:37:02 PM 5.887553 11.12776 5.999981 14.49997 2:37:02 PM -13.1128 -131.36 Left

2:37:03 PM 5.835094 11.1264 5.999981 14.49996 2:37:03 PM -12.6219 -136.521 Left

2:37:04 PM 5.778492 11.13452 5.999981 14.49985 2:37:04 PM -290.969 -117.819 Right

2:37:05 PM 5.724764 11.14765 5.999981 14.49997 2:37:05 PM -13.1135 -98.7504 Left

2:37:06 PM 5.649559 11.17921 5.999981 14.49996 2:37:06 PM -12.7039 -99.1119 Left

2:37:07 PM -453.762 -90.8212 Right

2:37:08 PM 5.601518 11.21611 5.999981 14.4998 2:37:08 PM -12.4984 -71.999 Left

2:37:09 PM 5.562678 11.2585 5.999981 14.49997 2:37:09 PM -27.609 -65.6256 Left

2:37:10 PM 5.538737 11.29534 5.999981 14.49997 2:37:10 PM -12.6948 -59.952 Left

2:37:11 PM 5.512889 11.36313 5.999981 14.49996 2:37:11 PM -12.6604 -48.7911 Left

2:37:12 PM 5.490465 11.41259 5.999981 14.49997 2:37:12 PM -375.087 -49.832 Right

2:37:13 PM 5.47027 11.47035 5.999981 14.49996 2:37:13 PM -12.8178 -53.6879 Left

2:37:14 PM 5.450334 11.51438 5.999981 14.49985 2:37:14 PM -12.7879 -54.6147 Left

2:37:15 PM 5.428233 11.56512 5.999981 14.49997 2:37:15 PM -12.7249 -54.5971 Left

2:37:16 PM 5.404184 11.61186 5.999981 14.49985 2:37:16 PM -162.73 -54.7042 Right

2:37:17 PM 5.388108 11.65124 5.999981 14.4998

2:37:18 PM 5.364429 11.71439 5.999981 14.49996 2:37:18 PM -131.737 -54.7618 Right

2:37:19 PM -12.8862 -54.916 Left

2:37:20 PM 5.345765 11.7772 5.999981 14.49997 2:37:20 PM -164.713 -52.2816 Right

2:37:21 PM 5.340353 11.83129 5.999981 14.49985 2:37:21 PM -12.9225 -32.5273 Left

2:37:22 PM 5.330233 11.88993 5.999981 14.4998 2:37:22 PM -13.1033 -34.7145 Left

2:37:23 PM 5.327972 11.93691 5.999981 14.49985 2:37:23 PM -14.9929 -31.5138 Left

2:37:24 PM 5.328048 11.97961 5.999981 14.4998 2:37:24 PM -15.0799 -19.5488 Left

2:37:25 PM 5.328823 11.98095 5.999981 14.49996 2:37:25 PM -24.6793 -14.1889 Right

2:37:26 PM 5.33059 11.9813 5.999981 14.49997 2:37:26 PM -19.7997 -16.2016 Right

2:37:27 PM 5.331334 11.98141 5.999981 14.4998 2:37:27 PM -13.3778 -18.2063 Left

2:37:28 PM 5.331336 11.98141 5.999981 14.4998 2:37:28 PM -13.5521 -18.0721 Left

2:37:29 PM 5.33134 11.98141 5.999981 14.49997 2:37:29 PM -13.7984 -17.8204 Left

2:37:31 PM 5.331341 11.98141 5.999981 14.49985 2:37:31 PM -13.7985 -17.8208 Left

2:37:32 PM 5.331342 11.98141 5.999981 14.49996 2:37:32 PM -13.7986 -17.8205 Left

2:37:33 PM 5.331342 11.98141 5.999981 14.49996 2:37:33 PM -13.8118 -17.8346 Left

2:37:34 PM 5.331343 11.98141 5.999981 14.4998 2:37:34 PM -13.811 -17.8345 Left

2:37:35 PM 5.331323 11.98142 5.999981 14.49997 2:37:35 PM -13.8126 -17.8352 Left

2:37:36 PM 5.331327 11.98141 5.999981 14.4998 2:37:36 PM -13.7998 -17.8234 Left

2:37:37 PM 5.331337 11.98141 5.999981 14.4998 2:37:37 PM -13.7994 -17.8207 Left

2:37:38 PM 5.331334 11.98141 5.999981 14.49996 2:37:38 PM -13.7996 -17.82 Left

2:37:39 PM 5.331337 11.98141 5.999981 14.49997 2:37:39 PM -13.7949 -17.8157 Left

2:37:40 PM 5.331342 11.98141 5.999981 14.49996

2:37:41 PM 5.331341 11.98141 5.999981 14.49996 2:37:41 PM -13.7997 -17.8208 Left

2:37:42 PM 5.331332 11.98142 5.999981 14.4998 2:37:42 PM -13.8 -17.82 Left

2:37:43 PM -15.3493 -16.2713 Left

2:37:44 PM 5.331334 11.98142 5.999981 14.49997 2:37:44 PM -13.7997 -17.821 Left

Column legend: the Referee Log records TimeStamp, CARMI position (X, Z) and Child position (X, Z); the PathDecider Log records TimeStamp, Tendencies (L, R) and the resulting Verdict.
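The combined log rows above interleave the two sources on a single line. Purely as a reading aid, the short Python sketch below splits one such row back into its Referee and PathDecider parts; it is not part of the thesis software (which was built on Microsoft Robotics Developer Studio), and the field names and the four-coordinate / three-token layout assumption are illustrative only.

import re

def parse_combined_row(row):
    """Split one combined appendix row into its Referee and PathDecider parts."""
    timestamp = re.compile(r"\d{1,2}:\d{2}:\d{2} [AP]M")
    stamps = timestamp.findall(row)
    chunks = [part.split() for part in timestamp.split(row)[1:]]
    record = {"referee": None, "pathdecider": None}
    for stamp, tokens in zip(stamps, chunks):
        if len(tokens) == 4:
            # Referee log entry: CARMI X, CARMI Z, Child X, Child Z
            record["referee"] = {
                "time": stamp,
                "carmi": (float(tokens[0]), float(tokens[1])),
                "child": (float(tokens[2]), float(tokens[3])),
            }
        elif len(tokens) == 3:
            # PathDecider log entry: L tendency, R tendency, Verdict
            # (rows containing Excel error cells such as "#NAME?" are not handled)
            record["pathdecider"] = {
                "time": stamp,
                "tendencies": (float(tokens[0]), float(tokens[1])),
                "verdict": tokens[2],
            }
    return record

# Example using a row copied from the Sample 6 table below:
print(parse_combined_row(
    "2:53:52 PM 6.009953 9.581203 5.999981 14.4998 "
    "2:53:52 PM -24.1744 -13.7619 Right"))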

[Figure: motion path graph for run 23-5-14-53 (Single Uniform obstruction, Sample 6). CARMI and Child paths, Z (8 to 15) plotted against X (5 to 7).]
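Each motion path graph in this appendix plots the CARMI and Child (X, Z) positions taken from the Referee log. The sketch below is a minimal, hypothetical way to redraw such a graph; matplotlib and the three hard-coded points (copied from the Sample 6 table) stand in for the full parsed log and are not the original plotting toolchain.

import matplotlib.pyplot as plt

# Three (X, Z) pairs copied from the Sample 6 Referee log; a real plot would
# use every parsed row (see the parsing sketch above).
carmi = [(6.0, 9.5), (6.009953, 9.581203), (6.031202, 9.803303)]
child = [(6.0, 14.5), (5.999981, 14.4998), (5.999981, 14.49997)]

def plot_motion_path(carmi, child):
    """Redraw an appendix-style motion path graph: Z plotted against X."""
    plt.plot([x for x, _ in carmi], [z for _, z in carmi], label="CARMI")
    plt.plot([x for x, _ in child], [z for _, z in child], label="Child")
    plt.xlabel("X")
    plt.ylabel("Z")
    plt.xlim(5, 7)    # axis ranges taken from the appendix figures
    plt.ylim(8, 15)
    plt.legend()
    plt.show()

plot_motion_path(carmi, child)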


Combined logs - Single Uniform obstruction - Sample 6.

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

2:53:22 PM 6 9.5 6 14.5

2:53:23 PM 6 9.5 6 14.5

2:53:24 PM 6 9.5 6 14.5

2:53:26 PM 6 9.5 6 14.5

2:53:27 PM 5.995378 9.50098 5.999981 14.4998

2:53:28 PM 5.982534 9.504639 5.999981 14.49985

2:53:29 PM 5.972011 9.506713 5.999981 14.4998

2:53:30 PM 5.966234 9.511897 5.999981 14.4998

2:53:31 PM 5.956379 9.517919 5.999981 14.49985

2:53:32 PM 5.948141 9.528584 5.999981 14.4998

2:53:45 PM 6 9.5 6 14.5

2:53:47 PM 6 9.5 6 14.5

2:53:48 PM 6 9.5 6 14.5

2:53:49 PM 6 9.5 6 14.5

2:53:50 PM 5.999637 9.500959 5.999981 14.49985

2:53:51 PM 5.989192 9.501268 5.999981 14.4998

2:53:52 PM 6.009953 9.581203 5.999981 14.4998 2:53:52 PM -24.1744 -13.7619 Right

2:53:53 PM -38.1995 -14.8678 Right

2:53:54 PM 6.011352 9.582403 5.999981 14.49996 2:53:54 PM -23.7347 -15.317 Right

2:53:55 PM 6.013167 9.582652 5.999981 14.49985 2:53:55 PM -17.9856 -13.6688 Right

2:53:56 PM 6.031202 9.803303 5.999981 14.49997 2:53:56 PM -18.9116 -14.8296 Right

2:53:57 PM 6.059042 10.14826 5.999981 14.49985 2:53:57 PM -24.0805 -15.7896 Right

2:53:58 PM 6.068552 10.22809 5.999981 14.49985

2:53:59 PM 6.069862 10.24462 5.999981 14.4998 2:53:59 PM -22.2815 -17.587 Right

2:54:00 PM 6.071493 10.30073 5.999981 14.4998 2:54:00 PM -17.8255 -22.0933 Left

2:54:01 PM 6.081372 10.45183 5.999981 14.49985 2:54:01 PM -20.5162 -16.7644 Right

2:54:02 PM 6.082294 10.45186 5.999981 14.49997 2:54:02 PM -31.3897 -17.7404 Right

2:54:03 PM 6.084512 10.45175 5.999981 14.49996 2:54:03 PM -26.5856 -19.7632 Right

2:54:04 PM 6.090533 10.64822 5.999981 14.49985 2:54:04 PM -17.0844 -19.7062 Left

2:54:05 PM -25.6252 -22.7553 Right

2:54:06 PM 6.105381 10.97767 5.999981 14.4998 2:54:06 PM -45.2909 -49.1814 Left

2:54:07 PM 6.108428 10.98384 5.999981 14.4998 2:54:07 PM -42.9896 -41.1448 Right

2:54:08 PM 6.109766 10.98386 5.999981 14.4998 2:54:08 PM -39.4612 -87.5404 Left

2:54:09 PM 6.097194 10.98714 5.999981 14.49985 2:54:09 PM -46.9477 -104.485 Left

2:54:10 PM 6.087941 10.99595 5.999981 14.4998 2:54:10 PM -82.4227 -121.018 Left

2:54:11 PM 6.08036 11.00706 5.999981 14.49985

2:54:12 PM 6.120484 11.0537 5.999981 14.4998 2:54:12 PM -72.6487 -117.417 Left

2:54:13 PM 6.164454 11.09466 5.999981 14.49997 2:54:13 PM -73.6245 -158.409 Left

2:54:14 PM 6.210436 11.13085 5.999981 14.49985 2:54:14 PM -88.4607 -245.75 Left

2:54:15 PM 6.251112 11.154 5.999981 14.49985 2:54:15 PM -100.657 -307.041 Left

2:54:16 PM 6.311867 11.17426 5.999981 14.49997 2:54:16 PM -112.671 -365.172 Left

2:54:17 PM 6.35881 11.19403 5.999981 14.49985 2:54:17 PM -98.6339 -12.8954 Right

2:54:18 PM -79.259 -12.9476 Right

2:54:19 PM 6.408604 11.22843 5.999981 14.49996 2:54:19 PM -77.2713 -12.9919 Right

2:54:20 PM 6.446264 11.2663 5.999981 14.49996 2:54:20 PM -73.5519 -12.8835 Right

2:54:21 PM 6.47231 11.29582 5.999981 14.49997

2:54:22 PM 6.496563 11.32801 5.999981 14.49985 2:54:22 PM -56.1056 -13.6863 Right

2:54:23 PM 6.517154 11.37788 5.999981 14.4998 2:54:23 PM -46.6246 -13.0017 Right

2:54:24 PM 6.537557 11.42929 5.999981 14.49997 2:54:24 PM -53.2391 -13.1359 Right

2:54:25 PM 6.560007 11.48385 5.999981 14.49997 2:54:25 PM -54.8647 -13.2653 Right

2:54:26 PM 6.571776 11.5283 5.999981 14.49997 2:54:26 PM -48.1041 -214.537 Left

2:54:27 PM 6.589194 11.57422 5.999981 14.49997 2:54:27 PM -45.7202 -12.6899 Right

2:54:28 PM 6.611063 11.62781 5.999981 14.49996 2:54:28 PM -61.3046 -123.48 Left

2:54:29 PM 6.63242 11.69119 5.999981 14.49985 2:54:29 PM -64.3123 -19.2208 Right

2:54:31 PM 6.647907 11.74242 5.999981 14.49997 2:54:31 PM -49.5957 -14.0962 Right

2:54:32 PM 6.667361 11.79609 5.999981 14.49997 2:54:32 PM -38.9248 -13.5605 Right

2:54:33 PM 6.670047 11.84444 5.999981 14.49996 2:54:33 PM -34.3969 -170.425 Left

2:54:34 PM 6.67282 11.89478 5.999981 14.49997 2:54:34 PM -30.5215 -13.6087 Right

2:54:35 PM 6.675263 11.94117 5.999981 14.49996 2:54:35 PM -25.2071 -15.5397 Right

2:54:36 PM 6.678015 11.95598 5.999981 14.49997 2:54:36 PM -12.9488 -21.5554 Left

2:54:37 PM 6.676054 11.95506 5.999981 14.4998 2:54:37 PM -15.2631 -21.6175 Left

2:54:38 PM 6.675122 11.95514 5.999981 14.4998



Single Uniform Obstruction Sample 7

Motion Path Graph - Single Uniform obstruction - Sample 7.

Combined logs - Single Uniform obstruction - Sample 7.

[Figure: motion path graph for run 23-5-16-26. CARMI and Child paths, Z (8 to 15) plotted against X (5 to 7).]

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

4:26:07 PM 6 9.5 6 14.5

4:26:08 PM 6 9.5 6 14.5

4:26:09 PM 6 9.5 6 14.5

4:26:10 PM 6 9.5 6 14.5

4:26:12 PM 5.99493 9.500999 5.999981 14.49996

4:26:13 PM 5.985316 9.503292 5.999981 14.49997

4:26:14 PM 5.977532 9.508522 5.999981 14.4998

4:26:15 PM 5.969895 9.521795 5.999981 14.4998

4:26:16 PM 5.962986 9.527066 5.999981 14.49985

4:26:17 PM 5.958001 9.536101 5.999981 14.49996

4:26:57 PM 6 9.5 6 14.5

4:26:58 PM 6 9.5 6 14.5

4:27:00 PM 6 9.5 6 14.5

4:27:01 PM 6 9.5 6 14.5

4:27:02 PM 6.000072 9.500964 5.999981 14.4998

4:27:03 PM 5.990367 9.502106 5.999981 14.49997

4:27:04 PM 6.011034 9.59257 5.999981 14.49985 4:27:04 PM -24.1753 -13.7776 Right

4:27:05 PM 6.012823 9.591856 5.999981 14.49985 4:27:05 PM -35.0501 -15.2392 Right

4:27:06 PM 6.014133 9.593542 5.999981 14.49997 4:27:06 PM -22.8171 -15.3363 Right

4:27:07 PM 6.034587 9.847871 5.999981 14.49985 4:27:07 PM -15.3375 -13.9715 Right

4:27:08 PM 6.054775 10.10355 5.999981 14.49985 4:27:08 PM -21.0661 -15.1408 Right

4:27:09 PM -23.4253 -16.987 Right

4:27:10 PM 6.056666 10.10434 5.999981 14.49997 4:27:10 PM -19.6587 -17.0278 Right

4:27:11 PM 6.056307 10.14029 5.999981 14.49996 4:27:11 PM -15.8541 -21.8172 Left

4:27:12 PM 6.041775 10.39607 5.999981 14.4998 4:27:12 PM -18.5775 -21.0774 Left

4:27:13 PM 6.038926 10.39728 5.999981 14.4998 4:27:13 PM -18.5231 -19.2477 Left

4:27:14 PM 6.039114 10.47541 5.999981 14.49996 4:27:14 PM -25.4049 -19.3215 Right

4:27:15 PM 6.047258 10.69636 5.999981 14.49985

4:27:16 PM 6.048785 10.69682 5.999981 14.49985 4:27:16 PM -26.9319 -23.9316 Right

4:27:17 PM 6.026943 10.95504 5.999981 14.4998 4:27:17 PM -22.2185 -25.896 Left

4:27:18 PM 6.024652 10.95601 5.999981 14.49996 4:27:18 PM -36.181 -41.3379 Left

4:27:19 PM 6.019865 10.95852 5.999981 14.49985 4:27:19 PM -36.2427 -40.3296 Left

4:27:20 PM 6.006553 10.96539 5.999981 14.49985 4:27:20 PM -38.0919 -35.6699 Right

4:27:21 PM -57.6676 -98.3272 Left

4:27:22 PM 5.99994 10.97151 5.999981 14.49997 4:27:22 PM -84.253 -108.996 Left

4:27:23 PM 5.99402 10.98467 5.999981 14.4998 4:27:23 PM -71.6522 -106.932 Left

4:27:24 PM 6.028559 11.02058 5.999981 14.49996 4:27:24 PM -87.5358 -134.778 Left

4:27:25 PM 6.06708 11.05093 5.999981 14.49985 4:27:25 PM -101.506 -158.424 Left

4:27:26 PM 6.128067 11.084 5.999981 14.4998 4:27:26 PM -110.811 -14.7392 Right

4:27:27 PM 6.197443 11.08626 5.999981 14.49997 4:27:27 PM -133.208 -12.8842 Right

4:27:28 PM 6.245965 11.09443 5.999981 14.49997 4:27:28 PM -109.782 -12.8407 Right

4:27:29 PM 6.309918 11.11571 5.999981 14.49985 4:27:29 PM -84.8274 -246.026 Left

4:27:30 PM 6.360104 11.15333 5.999981 14.49997

4:27:31 PM 6.407171 11.1833 5.999981 14.49996 4:27:31 PM -84.931 -13.0222 Right

4:27:32 PM 6.438076 11.22461 5.999981 14.4998 4:27:32 PM -80.9943 -365.304 Left

4:27:33 PM -66.9843 -12.9446 Right

4:27:34 PM 6.470629 11.26311 5.999981 14.49997 4:27:34 PM -64.1645 -13.0147 Right

4:27:35 PM 6.495314 11.30925 5.999981 14.49985 4:27:35 PM -51.7825 -606.21 Left

4:27:36 PM 6.513294 11.36181 5.999981 14.49996 4:27:36 PM -46.6 -365.382 Left

4:27:37 PM 6.534553 11.41395 5.999981 14.49996 4:27:37 PM -50.432 -246.373 Left

4:27:38 PM 6.557011 11.47013 5.999981 14.49997 4:27:38 PM -61.3928 -200.706 Left

4:27:39 PM 6.575093 11.51267 5.999981 14.49996 4:27:39 PM -60.4424 -178.61 Left

4:27:40 PM 6.602152 11.5899 5.999981 14.4998 4:27:40 PM -53.7903 -168.411 Left

4:27:41 PM 6.612155 11.63607 5.999981 14.49996 4:27:41 PM -49.5158 -12.7875 Right

4:27:42 PM 6.639353 11.70051 5.999981 14.49985 4:27:42 PM -63.1739 -13.2962 Right

4:27:43 PM 6.657452 11.74518 5.999981 14.4998 4:27:43 PM -57.7977 -17.5059 Right

4:27:44 PM 6.671871 11.79709 5.999981 14.49996 4:27:44 PM -31.771 -13.7893 Right

4:27:45 PM 6.676064 11.86286 5.999981 14.49985 4:27:45 PM -30.4718 -168.78 Left

4:27:46 PM -30.5825 -13.5749 Right

4:27:47 PM 6.683097 11.91554 5.999981 14.49985

4:27:48 PM 6.670757 11.96261 5.999981 14.49997 4:27:48 PM -19.5505 -15.5052 Right

4:27:49 PM 6.668495 11.97244 5.999981 14.49997 4:27:49 PM -12.5221 -25.5081 Left

4:27:50 PM 6.668192 11.97108 5.999981 14.49985 4:27:50 PM -15.2574 -18.2345 Left

4:27:51 PM 6.668219 11.97125 5.999981 14.49985 4:27:51 PM -22.3541 -15.4667 Right

4:27:52 PM 6.670714 11.97234 5.999981 14.49985 4:27:52 PM -12.9883 -15.4975 Left

4:27:53 PM 6.670653 11.97348 5.999981 14.49985

4:27:54 PM 6.668746 11.97258 5.999981 14.49997 4:27:54 PM -14.6944 -22.5089 Left





Uniform Obstruction with Left Scatter Sample 1

Motion Path Graph - Uniform obstruction with Left Scatter – Sample 1.

[Figure: motion path graph for run 29-6-9-47. CARMI and Child paths, Z (9 to 15) plotted against X (5 to 7).]


Combined logs - Uniform obstruction with Left Scatter - Sample 1.

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

9:47:12 AM 6 9.5 6 14.5

9:47:13 AM 6 9.5 6 14.5

9:47:14 AM 6 9.5 6 14.5

9:47:15 AM 6 9.5 6 14.5

9:47:16 AM 5.987337 9.501452 5.999981 14.49985

9:47:18 AM 5.988928 9.500646 5.999981 14.49997

9:47:19 AM 5.991797 9.500566 5.999981 14.49996

9:47:20 AM 5.992589 9.501349 5.999981 14.49985

9:47:21 AM 5.993534 9.501087 5.999981 14.49996

9:47:22 AM 5.995075 9.501016 5.999981 14.49997

9:47:23 AM 5.995943 9.500745 5.999981 14.49996

9:47:24 AM 5.995262 9.483018 5.999981 14.49996

9:47:25 AM 6.003499 9.420257 5.999981 14.49997

9:47:26 AM 6.023458 9.38566 5.999981 14.49996

9:47:27 AM 6.054671 9.353238 5.999981 14.49985

9:47:28 AM 6.090059 9.319435 5.999981 14.4998

9:47:30 AM 6.121851 9.301648 5.999981 14.4998

9:47:31 AM 6.17718 9.271253 5.999981 14.4998

9:47:32 AM 6.153265 9.288312 5.999981 14.49996 9:47:32 AM -115.903 -136.007 Left

9:47:33 AM 6.093799 9.311602 5.999981 14.49997 9:47:33 AM -113.891 -140.081 Left

9:47:34 AM 6.049867 9.339276 5.999981 14.49985 9:47:34 AM -115.026 -130.319 Left

9:47:35 AM 6.008853 9.37044 5.999981 14.4998 9:47:35 AM -117.518 -126.369 Left

9:47:36 AM 5.963525 9.421406 5.999981 14.4998 9:47:36 AM -114.813 -126.778 Left

9:47:37 AM 5.929592 9.469126 5.999981 14.49997 9:47:37 AM -115.132 -121.537 Left

9:47:38 AM 5.906971 9.505761 5.999981 14.4998

9:47:39 AM 5.885407 9.568006 5.999981 14.49997 9:47:39 AM -115.167 -119.052 Left

9:47:40 AM -114.555 -119.626 Left

9:47:41 AM 5.870071 9.63638 5.999981 14.4998 9:47:41 AM -115.13 -117.879 Left

9:47:42 AM 5.865932 9.661238 5.999981 14.49996 9:47:42 AM -117.715 -111.898 Right

9:47:43 AM 5.863786 9.890822 5.999981 14.49985 9:47:43 AM -122.866 -109.72 Right

9:47:44 AM 5.860491 10.24262 5.999981 14.4998 9:47:44 AM -126.129 -111.446 Right

9:47:45 AM 5.857299 10.58391 5.999981 14.49985 9:47:45 AM -128.219 -112.273 Right

9:47:46 AM 5.854057 10.93046 5.999981 14.49997 9:47:46 AM -126.979 -111.443 Right

9:47:47 AM 5.838756 11.05261 5.999981 14.49997 9:47:47 AM -125.024 -116.554 Right

9:47:48 AM 5.830475 11.05593 5.999981 14.49997 9:47:48 AM -124.047 -117.462 Right

9:47:49 AM 5.822028 11.06103 5.999981 14.4998 9:47:49 AM -142.527 -113.971 Right

9:47:51 AM 5.816976 11.06668 5.999981 14.49996 9:47:51 AM -142.48 -115.723 Right

9:47:52 AM 5.811285 11.07549 5.999981 14.49985 9:47:52 AM -134.459 -123.817 Right

9:47:53 AM 5.831675 11.09174 5.999981 14.49996 9:47:53 AM -144.642 -123.288 Right

9:47:54 AM 5.87175 11.10656 5.999981 14.49996 9:47:54 AM -149.996 -121.005 Right

9:47:55 AM 5.93866 11.12216 5.999981 14.4998 9:47:55 AM -148.146 -125.495 Right

9:47:56 AM 5.990708 11.12964 5.999981 14.49996 9:47:56 AM -150.659 -129.23 Right

9:47:57 AM 6.028641 11.12573 5.999981 14.4998 9:47:57 AM -154.512 -125.42 Right

9:47:58 AM 6.069986 11.1176 5.999981 14.49985 9:47:58 AM -154.828 -118.768 Right



Uniform Obstruction with Left Scatter Sample 2

Motion Path Graph - Uniform obstruction with Left Scatter – Sample 2.

Combined logs - Uniform obstruction with Left Scatter - Sample 2.

[Figure: motion path graph for run 29-6-9-48. CARMI and Child paths, Z (9 to 15) plotted against X (5 to 7).]

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

9:48:03 AM 6 9.5 6 14.5

9:48:04 AM 6 9.5 6 14.5

9:48:05 AM 6 9.5 6 14.5

9:48:07 AM 5.989539 9.500969 5.999981 14.4998

9:48:08 AM 5.984222 9.48856 5.999981 14.4998

9:48:09 AM 5.960499 9.443511 5.999981 14.4998

9:48:10 AM 5.921671 9.415379 5.999981 14.49985

9:48:11 AM 5.913706 9.40899 5.999981 14.49996

9:48:12 AM 5.955582 9.448895 5.999981 14.49985 9:48:12 AM -135.086 -115.03 Right

9:48:13 AM 5.991875 9.495641 5.999981 14.4998 9:48:13 AM -137.806 -107.979 Right

9:48:14 AM 6.020525 9.53996 5.999981 14.49985 9:48:14 AM -119.496 -116.305 Right

9:48:15 AM -124.849 -116.264 Right

9:48:16 AM 6.04394 9.594741 5.999981 14.4998 9:48:16 AM -132.185 -112.092 Right

9:48:17 AM 6.064877 9.645852 5.999981 14.49985 9:48:17 AM -122.547 -111.827 Right

9:48:18 AM 6.074235 9.685211 5.999981 14.4998 9:48:18 AM -116.574 -114.748 Right

9:48:19 AM 6.076669 9.685261 5.999981 14.4998 9:48:19 AM -119.221 -112.346 Right

9:48:20 AM 6.078529 9.685247 5.999981 14.49985 9:48:20 AM -118.702 -112.932 Right

9:48:21 AM 6.079652 9.690499 5.999981 14.49996 9:48:21 AM -115.761 -115.03 Right

9:48:22 AM 6.07695 10.02347 5.999981 14.4998

9:48:23 AM 6.07415 10.3699 5.999981 14.49996 9:48:23 AM -120.879 -114.956 Right

9:48:24 AM 6.071434 10.70584 5.999981 14.49996 9:48:24 AM -124.651 -114.623 Right

9:48:25 AM 6.068777 11.02084 5.999981 14.4998 9:48:25 AM -128.483 -111.964 Right

9:48:26 AM -132.286 -111.381 Right

9:48:27 AM 6.081495 11.02713 5.999981 14.4998 9:48:27 AM -116.973 -110.039 Right

9:48:28 AM 6.092523 11.03154 5.999981 14.49997 9:48:28 AM -116.035 -119.467 Left

9:48:29 AM 6.101803 11.03939 5.999981 14.4998 9:48:29 AM -118.244 -130.927 Left

9:48:30 AM 6.108337 11.04859 5.999981 14.49996 9:48:30 AM -114.67 -130.079 Left

9:48:31 AM 6.075421 11.07258 5.999981 14.49996 9:48:31 AM -114.446 -133.945 Left

9:48:32 AM 6.020397 11.10131 5.999981 14.49997 9:48:32 AM -115.426 -138.165 Left

9:48:33 AM 5.969902 11.12522 5.999981 14.49985 9:48:33 AM -113.648 -139.498 Left

9:48:34 AM 5.929872 11.13881 5.999981 14.49997

9:48:35 AM -111.132 -141.565 Left

9:48:36 AM 5.871634 11.14168 5.999981 14.49985 9:48:36 AM -110.162 -147.574 Left

9:48:37 AM 5.82583 11.13651 5.999981 14.4998 9:48:37 AM -116.455 -149.18 Left

9:48:38 AM 5.772073 11.13676 5.999981 14.49997 9:48:38 AM -115.975 -144.742 Left

9:48:39 AM 5.730596 11.14291 5.999981 14.49985 9:48:39 AM -115.315 -139.473 Left

9:48:40 AM 5.672811 11.15937 5.999981 14.49985 9:48:40 AM -116.231 -141.171 Left

9:48:41 AM 5.62881 11.17478 5.999981 14.4998 9:48:41 AM #NAME? #NAME? Mid

9:48:42 AM 5.590019 11.19199 5.999981 14.49996 9:48:42 AM -117.939 -138.012 Left

9:48:43 AM 5.550181 11.22446 5.999981 14.49985 9:48:43 AM -115.684 -133.22 Left

9:48:44 AM 5.511716 11.26664 5.999981 14.4998 9:48:44 AM -117.732 -133.865 Left

9:48:45 AM 5.466749 11.32162 5.999981 14.4998 9:48:45 AM -125.388 -132.517 Left

9:48:46 AM 5.450275 11.36241 5.999981 14.4998

9:48:47 AM 5.42876 11.40299 5.999981 14.49997 9:48:47 AM -119.432 -123.327 Left

9:48:48 AM -118.308 -116.1 Right

9:48:49 AM 5.412917 11.45558 5.999981 14.49985 9:48:49 AM -116.318 -128.303 Left

9:48:50 AM 5.381623 11.52646 5.999981 14.49997 9:48:50 AM -119.223 -130.755 Left

9:48:51 AM 5.365721 11.56622 5.999981 14.49985 9:48:51 AM -120.13 -129.811 Left

9:48:52 AM 5.340682 11.63942 5.999981 14.4998 9:48:52 AM -115.879 -127.178 Left

9:48:53 AM 5.318073 11.68324 5.999981 14.49996 9:48:53 AM -119.028 -132.238 Left

9:48:54 AM 5.300752 11.72077 5.999981 14.49996 9:48:54 AM -124.315 -133.235 Left

9:48:55 AM 5.288986 11.77509 5.999981 14.49985 9:48:55 AM -125.666 -124.846 Right

9:48:56 AM 5.277324 11.82756 5.999981 14.49996 9:48:56 AM -125.324 -127.783 Left

9:48:57 AM 5.279469 11.88951 5.999981 14.4998 9:48:57 AM -124.506 -128.075 Left

9:48:59 AM 5.281568 11.9309 5.999981 14.49985 9:48:59 AM -125.409 -124.389 Right

9:49:00 AM 5.285477 11.94316 5.999981 14.49996 9:49:00 AM -120.673 -119.517 Right

9:49:01 AM 5.285779 11.94318 5.999981 14.49985 9:49:01 AM -123.937 -120.516 Right

9:49:02 AM 5.284322 11.94364 5.999981 14.49997 9:49:02 AM -125.261 -122.851 Right

9:49:04 AM 5.282274 11.94449 5.999981 14.49985 9:49:04 AM -126.032 -122.59 Right

9:49:05 AM 5.282291 11.94449 5.999981 14.49997 9:49:05 AM -126.888 -120.201 Right

9:49:06 AM 5.28231 11.94448 5.999981 14.49996 9:49:06 AM -125.039 -122.037 Right





Uniform Obstruction with Left Scatter Sample 3

Motion Path Graph - Uniform obstruction with Left Scatter – Sample 3.

Combined logs - Uniform obstruction with Left Scatter - Sample 3.

[Figure: motion path graph for run 29-6-9-49. CARMI and Child paths, Z (9 to 15) plotted against X (5 to 7).]

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

9:49:23 AM 6 9.5 6 14.5

9:49:24 AM 6 9.5 6 14.5

9:49:27 AM 5.990247 9.501098 5.999981 14.49985

9:49:28 AM 6.006161 9.585439 5.999981 14.49985 9:49:28 AM -112.068 -117.462 Left

9:49:29 AM -121.448 -111.362 Right

9:49:30 AM 6.007299 9.586416 5.999981 14.49997 9:49:30 AM -120.412 -111.959 Right

9:49:31 AM 6.009 9.587243 5.999981 14.49985 9:49:31 AM -119.405 -111.379 Right

9:49:32 AM 6.010558 9.587162 5.999981 14.49996

9:49:33 AM 6.012883 9.587519 5.999981 14.49985 9:49:33 AM -118.443 -110.829 Right

9:49:34 AM 6.01392 9.858509 5.999981 14.49985 9:49:34 AM -117.534 -113.545 Right

9:49:35 AM 6.015444 10.21553 5.999981 14.49985 9:49:35 AM -121.3 -114.512 Right

9:49:36 AM 6.017012 10.58303 5.999981 14.49997 9:49:36 AM -125.439 -114.659 Right

9:49:37 AM 6.018445 10.91909 5.999981 14.4998 9:49:37 AM -129.452 -111.18 Right

9:49:38 AM 6.021758 11.08074 5.999981 14.49997 9:49:38 AM -126.899 -110.434 Right

9:49:39 AM -117.661 -112.276 Right

9:49:40 AM 6.031547 11.08267 5.999981 14.4998 9:49:40 AM -118.313 -117.977 Right

9:49:41 AM 6.047211 11.07371 5.999981 14.49996

9:49:42 AM 6.052978 11.07791 5.999981 14.49997 9:49:42 AM -119.363 -125.76 Left

9:49:43 AM 6.059465 11.08437 5.999981 14.49996 9:49:43 AM -117.24 -123.76 Left

9:49:44 AM 6.063909 11.0908 5.999981 14.49985 9:49:44 AM -112.777 -129.06 Left

9:49:45 AM 6.0028 11.13235 5.999981 14.49985 9:49:45 AM -115.587 -135.043 Left

9:49:46 AM 5.96297 11.15077 5.999981 14.49996 9:49:46 AM -116.001 -137.089 Left

9:49:47 AM 5.921112 11.16725 5.999981 14.4998 9:49:47 AM -114.439 -138.936 Left

9:49:48 AM 5.859396 11.18007 5.999981 14.49996 9:49:48 AM -114.199 -142.834 Left

9:49:49 AM -116.969 -146.289 Left

9:49:50 AM 5.80068 11.18311 5.999981 14.4998 9:49:50 AM -116.233 -148.387 Left

9:49:51 AM 5.730271 11.1985 5.999981 14.49996 9:49:51 AM -115.684 -142.432 Left

9:49:52 AM 5.680724 11.21501 5.999981 14.4998 9:49:52 AM -119.419 -137.763 Left

9:49:53 AM 5.648708 11.23523 5.999981 14.49997 9:49:53 AM -116.15 -135.482 Left

9:49:54 AM 5.61039 11.26261 5.999981 14.49985

9:49:55 AM 5.573277 11.30005 5.999981 14.49985 9:49:55 AM -120.8 -135.158 Left

9:49:56 AM 5.534227 11.34663 5.999981 14.49997 9:49:56 AM -121.587 -130.395 Left

9:49:57 AM 5.51938 11.37982 5.999981 14.49985 9:49:57 AM -133.881 -125.947 Right

9:49:58 AM 5.504006 11.42471 5.999981 14.49996 9:49:58 AM -122.529 -118.344 Right

9:49:59 AM 5.484873 11.47762 5.999981 14.4998 9:49:59 AM -124.391 -128.29 Left

9:50:00 AM 5.468014 11.51742 5.999981 14.49997 9:50:00 AM -120.77 -129.273 Left

9:50:01 AM -124.157 -128.181 Left

9:50:02 AM 5.454309 11.56304 5.999981 14.49997 9:50:02 AM -121.817 -129.561 Left

9:50:03 AM 5.433574 11.61482 5.999981 14.49997 9:50:03 AM -120.675 -129.265 Left

9:50:04 AM 5.413979 11.66395 5.999981 14.4998 9:50:04 AM -122.72 -131.32 Left

9:50:05 AM 5.400045 11.70569 5.999981 14.49996 9:50:05 AM -120.805 -131.515 Left

9:50:06 AM 5.378882 11.7506 5.999981 14.49996

9:50:07 AM 5.368788 11.79686 5.999981 14.49985 9:50:07 AM -125.437 -132.457 Left

9:50:08 AM 5.353352 11.84335 5.999981 14.49985 9:50:08 AM -118.325 -125.026 Left

9:50:09 AM 5.344953 11.89445 5.999981 14.49996 9:50:09 AM -123.876 -127.563 Left

9:50:10 AM 5.341011 11.95598 5.999981 14.4998 9:50:10 AM -125.192 -130.355 Left

9:50:11 AM 5.341657 11.98195 5.999981 14.4998 9:50:11 AM -127.904 -128.598 Left

9:50:12 AM -124.013 -120.393 Right

9:50:13 AM 5.343054 11.98208 5.999981 14.49997 9:50:13 AM -120.808 -121.112 Left

9:50:14 AM 5.342436 11.98178 5.999981 14.49985 9:50:14 AM -124.526 -127.061 Left

9:50:15 AM 5.340897 11.98321 5.999981 14.49996 9:50:15 AM -126.05 -126.105 Left

9:50:16 AM 5.340421 11.98296 5.999981 14.49996 9:50:16 AM -126.808 -122.442 Right

9:50:17 AM 5.340442 11.98295 5.999981 14.49985 9:50:17 AM -126.582 -122.563 Right

9:50:18 AM 5.340468 11.98294 5.999981 14.4998

9:50:19 AM 5.340486 11.98294 5.999981 14.49985 9:50:19 AM -126.589 -122.571 Right

9:50:20 AM -126.626 -122.595 Right





Uniform Obstruction with Left Scatter Sample 4

Motion Path Graph - Uniform obstruction with Left Scatter – Sample 4.

Combined logs - Uniform obstruction with Left Scatter - Sample 4.

[Figure: motion path graph for run 29-6-9-50. CARMI and Child paths, Z (9 to 15) plotted against X (5 to 7).]

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

9:50:35 AM 6 9.5 6 14.5

9:50:36 AM 6 9.5 6 14.5

9:50:38 AM 6 9.5 6 14.5

9:50:39 AM 5.992033 9.500932 5.999981 14.49996

9:50:40 AM 5.982918 9.495651 5.999981 14.49996

9:50:41 AM 5.965286 9.452658 5.999981 14.49996

9:50:43 AM 5.935833 9.396638 5.999981 14.49997

9:50:44 AM 5.900529 9.345466 5.999981 14.49996

9:50:45 AM 5.862112 9.30288 5.999981 14.49996

9:50:46 AM 5.835354 9.306749 5.999981 14.49985

9:50:47 AM 5.797246 9.296778 5.999981 14.49997

9:50:48 AM 5.751979 9.290651 5.999981 14.4998

9:50:49 AM 5.805103 9.300448 5.999981 14.49997 9:50:49 AM -141.224 -114.622 Right

9:50:50 AM 5.847622 9.310148 5.999981 14.49996 9:50:50 AM -140.256 -114.785 Right

9:50:51 AM 5.891947 9.33126 5.999981 14.4998 9:50:51 AM -137.189 -114.589 Right

9:50:52 AM -135.441 -112.594 Right

9:50:53 AM 5.927948 9.347473 5.999981 14.49996 9:50:53 AM -129.679 -112.388 Right

9:50:54 AM 5.972642 9.385774 5.999981 14.49985

9:50:55 AM 6.01306 9.436497 5.999981 14.49997 9:50:55 AM -128.985 -114.427 Right

9:50:56 AM 6.036311 9.466467 5.999981 14.49985 9:50:56 AM -128.68 -112.12 Right

9:50:57 AM 6.058016 9.510811 5.999981 14.49996 9:50:57 AM -123.546 -112.662 Right

9:50:58 AM 6.075562 9.555101 5.999981 14.49997 9:50:58 AM -121.452 -113.771 Right

9:50:59 AM 6.092412 9.600558 5.999981 14.49985 9:50:59 AM -123.72 -112.473 Right

9:51:00 AM 6.09944 9.633196 5.999981 14.49997 9:51:00 AM -122.155 -111.153 Right

9:51:01 AM 6.096567 9.749199 5.999981 14.49996 9:51:01 AM -113.865 -117.882 Left

9:51:02 AM 6.094957 9.750555 5.999981 14.49996 9:51:02 AM -111.771 -119.903 Left

9:51:03 AM -118.113 -115.477 Right

9:51:04 AM 6.100585 9.870809 5.999981 14.49996 9:51:04 AM -121.892 -113.187 Right

9:51:05 AM 6.10114 9.870079 5.999981 14.4998 9:51:05 AM -118.546 -114.644 Right

9:51:06 AM 6.095837 10.05738 5.999981 14.49985

9:51:07 AM 6.084424 10.39807 5.999981 14.4998 9:51:07 AM -121.816 -116.319 Right

9:51:08 AM 6.073164 10.73291 5.999981 14.4998 9:51:08 AM -126.438 -114.119 Right

9:51:09 AM 6.065636 11.0088 5.999981 14.4998 9:51:09 AM -129.306 -110.574 Right

9:51:10 AM 6.072446 11.0112 5.999981 14.49997 9:51:10 AM -120.818 -111.02 Right

9:51:11 AM 6.084439 11.0161 5.999981 14.49996 9:51:11 AM -116.894 -112.894 Right

9:51:12 AM 6.091426 11.02125 5.999981 14.49997 9:51:12 AM -117.755 -122.583 Left

9:51:13 AM -118.115 -128.913 Left

9:51:14 AM 6.097644 11.02645 5.999981 14.49997 9:51:14 AM -110.282 -127.571 Left

9:51:15 AM 6.075973 11.04914 5.999981 14.49985 9:51:15 AM -112.153 -138.25 Left

9:51:16 AM 6.030676 11.07591 5.999981 14.49985

9:51:17 AM 5.988955 11.09675 5.999981 14.4998 9:51:17 AM -117.982 -141.948 Left

9:51:18 AM 5.947705 11.10923 5.999981 14.49997 9:51:18 AM -113.099 -140.849 Left

9:51:19 AM 5.894887 11.11926 5.999981 14.4998 9:51:19 AM -111.674 -143.529 Left

9:51:20 AM 5.859352 11.11634 5.999981 14.49997 9:51:20 AM -113.138 -147.49 Left

9:51:21 AM 5.814218 11.11827 5.999981 14.49985 9:51:21 AM -115.187 -150.509 Left

9:51:22 AM 5.765913 11.1298 5.999981 14.49996 9:51:22 AM -119.04 -140.767 Left

9:51:23 AM -114.366 -137.759 Left

9:51:24 AM 5.719851 11.14345 5.999981 14.49985 9:51:24 AM -116.066 -139.195 Left

9:51:25 AM 5.674833 11.16323 5.999981 14.4998 9:51:25 AM -118.379 -141.961 Left

9:51:26 AM 5.636488 11.18308 5.999981 14.49997 9:51:26 AM -116.041 -137.193 Left

9:51:27 AM 5.593248 11.21056 5.999981 14.49985

9:51:28 AM 5.555815 11.2404 5.999981 14.49996 9:51:28 AM -116.584 -134.762 Left

9:51:29 AM 5.521909 11.27086 5.999981 14.4998 9:51:29 AM -117.846 -133.757 Left

9:51:30 AM 5.489244 11.31269 5.999981 14.49996 9:51:30 AM -121.019 -132.14 Left

9:51:31 AM 5.464964 11.36073 5.999981 14.49985 9:51:31 AM -125.493 -128.432 Left

9:51:32 AM 5.44524 11.40433 5.999981 14.49996 9:51:32 AM -117.556 -124.887 Left

9:51:33 AM 5.42282 11.45891 5.999981 14.49985 9:51:33 AM -115.972 -125.244 Left

9:51:34 AM -116.677 -127.455 Left

9:51:35 AM 5.39315 11.52384 5.999981 14.49996 9:51:35 AM -122.047 -129.672 Left

9:51:36 AM 5.38341 11.56829 5.999981 14.49997 9:51:36 AM -115.94 -125.697 Left

9:51:37 AM 5.357516 11.61091 5.999981 14.49996 9:51:37 AM -119.221 -130.562 Left

9:51:38 AM 5.34116 11.66051 5.999981 14.49985 9:51:38 AM -123.953 -132.117 Left

9:51:39 AM 5.322598 11.7019 5.999981 14.49996 9:51:39 AM -123.509 -129.53 Left

9:51:40 AM 5.314125 11.74172 5.999981 14.49996

9:51:41 AM 5.304253 11.7815 5.999981 14.49997 9:51:41 AM -121.5 -126.523 Left

9:51:42 AM 5.296102 11.83753 5.999981 14.49996 9:51:42 AM -124.387 -129.128 Left

9:51:43 AM 5.300335 11.88195 5.999981 14.49985 9:51:43 AM -128.3 -129.23 Left

9:51:44 AM 5.299342 11.88287 5.999981 14.49996 9:51:44 AM -124.344 -123.061 Right

9:51:45 AM -124.703 -123.206 Right

9:51:46 AM 5.297827 11.88361 5.999981 14.49985 9:51:46 AM -125.292 -127.356 Left

9:51:47 AM 5.295514 11.88386 5.999981 14.49996 9:51:47 AM -127.382 -123.893 Right

9:51:48 AM 5.29448 11.88433 5.999981 14.49985 9:51:48 AM -127.344 -122.725 Right





Uniform Obstruction with Left Scatter Sample 5

Motion Path Graph - Uniform obstruction with Left Scatter – Sample 5.

Combined logs - Uniform obstruction with Left Scatter - Sample 5.

[Figure: motion path graph for run 29-6-9-52. CARMI and Child paths, Z (9 to 15) plotted against X (5 to 7).]

Referee Log: TimeStamp, CARMI (X, Z), Child (X, Z) | PathDecider Log: TimeStamp, Tendencies (L, R), Verdict

9:52:03 AM 6 9.5 6 14.5

9:52:04 AM 6 9.5 6 14.5

9:52:05 AM 6 9.5 6 14.5

9:52:06 AM 5.986923 9.501168 5.999981 14.49996

9:52:07 AM -114.247 -116.774 Left

9:52:08 AM 5.985776 9.500831 5.999981 14.49997

9:52:09 AM 5.986289 9.500549 5.999981 14.49996 9:52:09 AM -126.844 -109.334 Right

9:52:10 AM 5.987936 9.50121 5.999981 14.49985 9:52:10 AM -119.296 -112.33 Right

9:52:11 AM 5.989825 9.501816 5.999981 14.4998 9:52:11 AM -119.035 -111.647 Right

9:52:12 AM 5.991438 9.501507 5.999981 14.49996 9:52:12 AM -119.947 -110.943 Right

9:52:13 AM 5.994099 9.501693 5.999981 14.49997 9:52:13 AM -120.233 -110.404 Right

9:52:14 AM 5.996954 9.501394 5.999981 14.49985 9:52:14 AM -117.913 -109.889 Right

9:52:15 AM 5.992976 9.814143 5.999981 14.49985 9:52:15 AM -116.118 -113.455 Right

9:52:16 AM 5.9884 10.17606 5.999981 14.4998 9:52:16 AM -121.528 -113.426 Right

9:52:17 AM -125.733 -113.562 Right

9:52:18 AM 5.983995 10.52225 5.999981 14.4998 9:52:18 AM -129.233 -111.506 Right

9:52:19 AM 5.979523 10.87368 5.999981 14.49997

9:52:20 AM 5.981913 11.05711 5.999981 14.49985 9:52:20 AM -127.362 -111.066 Right

9:52:21 AM 5.991177 11.059 5.999981 14.49997 9:52:21 AM -118.371 -113.215 Right

9:52:22 AM 5.998302 11.07402 5.999981 14.49996 9:52:22 AM -111.838 -115.564 Left

9:52:23 AM 6.005992 11.08325 5.999981 14.4998 9:52:23 AM -111.101 -110.24 Right

9:52:24 AM 5.992832 11.08154 5.999981 14.49985 9:52:24 AM -106.847 -107.324 Left

9:52:25 AM 5.985376 11.07502 5.999981 14.49997 9:52:25 AM -111.303 -107.679 Right

9:52:26 AM 5.976749 11.07048 5.999981 14.49985 9:52:26 AM -112.173 -107.702 Right

9:52:27 AM -117.917 -111.765 Right

9:52:28 AM 5.978275 11.05606 5.999981 14.49997 9:52:28 AM -122.525 -119.825 Right

9:52:29 AM 5.999763 11.00662 5.999981 14.49985 9:52:29 AM -119.697 -120.466 Left

9:52:30 AM 6.024755 10.9677 5.999981 14.4998 9:52:30 AM -118.504 -121.348 Left

9:52:31 AM 6.032842 10.96397 5.999981 14.49985

9:52:32 AM 6.014199 10.9949 5.999981 14.49996 9:52:32 AM -117.925 -125.723 Left

9:52:33 AM 5.987284 11.02559 5.999981 14.4998 9:52:33 AM -116.407 -126.901 Left

9:52:34 AM 5.953606 11.05565 5.999981 14.49985 9:52:34 AM -115.08 -129 Left

9:52:35 AM 5.898875 11.09244 5.999981 14.4998 9:52:35 AM -115.281 -133.179 Left

9:52:36 AM 5.862541 11.11813 5.999981 14.49997 9:52:36 AM -111.91 -135.706 Left

9:52:37 AM 5.80668 11.13608 5.999981 14.49996 9:52:37 AM -113.123 -140.932 Left

9:52:38 AM 5.75741 11.14399 5.999981 14.49997 9:52:38 AM -115.412 -143.464 Left

9:52:39 AM -115.86 -147.343 Left

9:52:40 AM 5.71046 11.15507 5.999981 14.49996 9:52:40 AM -115.989 -140.025 Left

9:52:41 AM 5.665432 11.17138 5.999981 14.4998 9:52:41 AM -118.114 -140.827 Left

9:52:42 AM 5.616353 11.19842 5.999981 14.49985 9:52:42 AM -121.249 -135.504 Left

9:52:43 AM 5.583124 11.22611 5.999981 14.49997 9:52:43 AM -117.91 -131.563 Left

9:52:44 AM 5.5418 11.26998 5.999981 14.49985

9:52:45 AM 5.508549 11.32217 5.999981 14.49996 9:52:45 AM -122.432 -131.554 Left

9:52:46 AM 5.48676 11.37243 5.999981 14.49985 9:52:46 AM -125.706 -126.489 Left

9:52:47 AM 5.468561 11.41863 5.999981 14.4998 9:52:47 AM -118.398 -124.538 Left

9:52:48 AM 5.454777 11.45083 5.999981 14.49997 9:52:48 AM -116.756 -125.79 Left

9:52:49 AM 5.433987 11.49951 5.999981 14.49985 9:52:49 AM -117.402 -127.23 Left

9:52:50 AM -119.579 -127.874 Left

9:52:51 AM 5.419274 11.54935 5.999981 14.49996 9:52:51 AM -119.162 -129.584 Left

9:52:52 AM 5.392208 11.60023 5.999981 14.4998 9:52:52 AM -118.521 -130.245 Left

9:52:53 AM 5.376558 11.64423 5.999981 14.49997 9:52:53 AM -117.961 -130.065 Left

9:52:54 AM 5.357267 11.68128 5.999981 14.49996 9:52:54 AM -121.425 -131.033 Left

9:52:55 AM 5.341269 11.72774 5.999981 14.49985 9:52:55 AM -125.744 -131.34 Left

9:52:56 AM 5.324807 11.78022 5.999981 14.49997 9:52:56 AM -125.978 -126.602 Left

9:52:57 AM 5.318489 11.83174 5.999981 14.49985

9:52:58 AM 5.312061 11.87229 5.999981 14.49996 9:52:58 AM -123.377 -125.145 Left

9:52:59 AM 5.315662 11.94361 5.999981 14.49985 9:52:59 AM -125.563 -129.555 Left

9:53:00 AM 5.316074 11.95011 5.999981 14.49997 9:53:00 AM -126.156 -127.504 Left

9:53:01 AM -124.699 -119.457 Right

9:53:02 AM 5.316076 11.95011 5.999981 14.49985 9:53:02 AM -122.596 -119.9 Right

9:53:03 AM 5.316077 11.95011 5.999981 14.49996 9:53:03 AM -125.269 -123.511 Right

9:53:04 AM 5.31516 11.95057 5.999981 14.4998 9:53:04 AM -125.74 -124.766 Right

9:53:05 AM 5.314207 11.94992 5.999981 14.49985 9:53:05 AM -126.238 -122.88 Right

9:53:06 AM 5.31402 11.94994 5.999981 14.49996 9:53:06 AM -126.052 -121.72 Right



X Z X Z L R

9:52:03 AM 6 9.5 6 14.5

9:52:04 AM 6 9.5 6 14.5

9:52:05 AM 6 9.5 6 14.5

9:52:06 AM 5.986923 9.501168 5.999981 14.49996

9:52:07 AM -114.247 -116.774 Left

9:52:08 AM 5.985776 9.500831 5.999981 14.49997

9:52:09 AM 5.986289 9.500549 5.999981 14.49996 9:52:09 AM -126.844 -109.334 Right

9:52:10 AM 5.987936 9.50121 5.999981 14.49985 9:52:10 AM -119.296 -112.33 Right

9:52:11 AM 5.989825 9.501816 5.999981 14.4998 9:52:11 AM -119.035 -111.647 Right

9:52:12 AM 5.991438 9.501507 5.999981 14.49996 9:52:12 AM -119.947 -110.943 Right

9:52:13 AM 5.994099 9.501693 5.999981 14.49997 9:52:13 AM -120.233 -110.404 Right

9:52:14 AM 5.996954 9.501394 5.999981 14.49985 9:52:14 AM -117.913 -109.889 Right

9:52:15 AM 5.992976 9.814143 5.999981 14.49985 9:52:15 AM -116.118 -113.455 Right

9:52:16 AM 5.9884 10.17606 5.999981 14.4998 9:52:16 AM -121.528 -113.426 Right

9:52:17 AM -125.733 -113.562 Right

9:52:18 AM 5.983995 10.52225 5.999981 14.4998 9:52:18 AM -129.233 -111.506 Right

9:52:19 AM 5.979523 10.87368 5.999981 14.49997

9:52:20 AM 5.981913 11.05711 5.999981 14.49985 9:52:20 AM -127.362 -111.066 Right

9:52:21 AM 5.991177 11.059 5.999981 14.49997 9:52:21 AM -118.371 -113.215 Right

9:52:22 AM 5.998302 11.07402 5.999981 14.49996 9:52:22 AM -111.838 -115.564 Left

9:52:23 AM 6.005992 11.08325 5.999981 14.4998 9:52:23 AM -111.101 -110.24 Right

9:52:24 AM 5.992832 11.08154 5.999981 14.49985 9:52:24 AM -106.847 -107.324 Left

9:52:25 AM 5.985376 11.07502 5.999981 14.49997 9:52:25 AM -111.303 -107.679 Right

9:52:26 AM 5.976749 11.07048 5.999981 14.49985 9:52:26 AM -112.173 -107.702 Right

9:52:27 AM -117.917 -111.765 Right

9:52:28 AM 5.978275 11.05606 5.999981 14.49997 9:52:28 AM -122.525 -119.825 Right

9:52:29 AM 5.999763 11.00662 5.999981 14.49985 9:52:29 AM -119.697 -120.466 Left

9:52:30 AM 6.024755 10.9677 5.999981 14.4998 9:52:30 AM -118.504 -121.348 Left

9:52:31 AM 6.032842 10.96397 5.999981 14.49985

9:52:32 AM 6.014199 10.9949 5.999981 14.49996 9:52:32 AM -117.925 -125.723 Left

9:52:33 AM 5.987284 11.02559 5.999981 14.4998 9:52:33 AM -116.407 -126.901 Left

9:52:34 AM 5.953606 11.05565 5.999981 14.49985 9:52:34 AM -115.08 -129 Left

9:52:35 AM 5.898875 11.09244 5.999981 14.4998 9:52:35 AM -115.281 -133.179 Left

9:52:36 AM 5.862541 11.11813 5.999981 14.49997 9:52:36 AM -111.91 -135.706 Left

9:52:37 AM 5.80668 11.13608 5.999981 14.49996 9:52:37 AM -113.123 -140.932 Left

9:52:38 AM 5.75741 11.14399 5.999981 14.49997 9:52:38 AM -115.412 -143.464 Left

9:52:39 AM -115.86 -147.343 Left

9:52:40 AM 5.71046 11.15507 5.999981 14.49996 9:52:40 AM -115.989 -140.025 Left

9:52:41 AM 5.665432 11.17138 5.999981 14.4998 9:52:41 AM -118.114 -140.827 Left

9:52:42 AM 5.616353 11.19842 5.999981 14.49985 9:52:42 AM -121.249 -135.504 Left

9:52:43 AM 5.583124 11.22611 5.999981 14.49997 9:52:43 AM -117.91 -131.563 Left

9:52:44 AM 5.5418 11.26998 5.999981 14.49985

9:52:45 AM 5.508549 11.32217 5.999981 14.49996 9:52:45 AM -122.432 -131.554 Left

9:52:46 AM 5.48676 11.37243 5.999981 14.49985 9:52:46 AM -125.706 -126.489 Left

9:52:47 AM 5.468561 11.41863 5.999981 14.4998 9:52:47 AM -118.398 -124.538 Left

9:52:48 AM 5.454777 11.45083 5.999981 14.49997 9:52:48 AM -116.756 -125.79 Left

9:52:49 AM 5.433987 11.49951 5.999981 14.49985 9:52:49 AM -117.402 -127.23 Left

9:52:50 AM -119.579 -127.874 Left

9:52:51 AM 5.419274 11.54935 5.999981 14.49996 9:52:51 AM -119.162 -129.584 Left

9:52:52 AM 5.392208 11.60023 5.999981 14.4998 9:52:52 AM -118.521 -130.245 Left

9:52:53 AM 5.376558 11.64423 5.999981 14.49997 9:52:53 AM -117.961 -130.065 Left

9:52:54 AM 5.357267 11.68128 5.999981 14.49996 9:52:54 AM -121.425 -131.033 Left

9:52:55 AM 5.341269 11.72774 5.999981 14.49985 9:52:55 AM -125.744 -131.34 Left

9:52:56 AM 5.324807 11.78022 5.999981 14.49997 9:52:56 AM -125.978 -126.602 Left

9:52:57 AM 5.318489 11.83174 5.999981 14.49985

9:52:58 AM 5.312061 11.87229 5.999981 14.49996 9:52:58 AM -123.377 -125.145 Left

9:52:59 AM 5.315662 11.94361 5.999981 14.49985 9:52:59 AM -125.563 -129.555 Left

9:53:00 AM 5.316074 11.95011 5.999981 14.49997 9:53:00 AM -126.156 -127.504 Left

9:53:01 AM -124.699 -119.457 Right

9:53:02 AM 5.316076 11.95011 5.999981 14.49985 9:53:02 AM -122.596 -119.9 Right

9:53:03 AM 5.316077 11.95011 5.999981 14.49996 9:53:03 AM -125.269 -123.511 Right

9:53:04 AM 5.31516 11.95057 5.999981 14.4998 9:53:04 AM -125.74 -124.766 Right

9:53:05 AM 5.314207 11.94992 5.999981 14.49985 9:53:05 AM -126.238 -122.88 Right

9:53:06 AM 5.31402 11.94994 5.999981 14.49996 9:53:06 AM -126.052 -121.72 Right

Column key: Referee Log = TimeStamp, CARMI X, CARMI Z, Child X, Child Z; PathDecider Log = TimeStamp, Tendency L, Tendency R, Verdict.
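Note on reading the combined logs: each row interleaves a Referee Log entry (CARMI and Child X/Z positions) with a PathDecider Log entry (left/right tendency values and the resulting verdict); rows that carry only one half simply omit the other. The short sketch below is illustrative only and not part of the thesis toolchain; it assumes the rows are whitespace-separated exactly as printed here, and the function name parse_combined_row is hypothetical.

    def parse_combined_row(line):
        # Split one printed row back into its Referee and PathDecider halves.
        tokens = line.split()
        referee = decider = None
        # Referee half: "<h:mm:ss> <AM|PM>" followed by four floats
        # (CARMI X, CARMI Z, Child X, Child Z).
        if len(tokens) >= 6 and tokens[1] in ("AM", "PM"):
            try:
                carmi_x, carmi_z, child_x, child_z = (float(t) for t in tokens[2:6])
                referee = {"time": tokens[0] + " " + tokens[1],
                           "carmi": (carmi_x, carmi_z), "child": (child_x, child_z)}
                tokens = tokens[6:]
            except ValueError:
                pass  # this row carries only the PathDecider half
        # PathDecider half: optional "<h:mm:ss> <AM|PM>", tendency values (kept as
        # strings, since some rows contain spreadsheet artefacts such as "#NAME?"),
        # then the verdict word (e.g. Left, Right, Mid).
        if tokens:
            time = None
            if len(tokens) >= 2 and tokens[1] in ("AM", "PM"):
                time, tokens = tokens[0] + " " + tokens[1], tokens[2:]
            decider = {"time": time, "tendencies": tokens[:-1], "verdict": tokens[-1]}
        return referee, decider

    # Example:
    # parse_combined_row("9:52:13 AM 5.994099 9.501693 5.999981 14.49997 9:52:13 AM -120.233 -110.404 Right")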


Uniform Obstruction with Right Scatter Sample 1

Motion Path Graph - Uniform obstruction with Right Scatter – Sample 1.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-9-54.]


Combined logs - Uniform obstruction with Right Scatter - Sample 1.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

9:54:20 AM 6 9.5 6 14.5

9:54:21 AM 6 9.5 6 14.5

9:54:22 AM 6 9.5 6 14.5

9:54:24 AM 5.992743 9.501327 5.999981 14.49996

9:54:25 AM 6.014415 9.652831 5.999981 14.49985 9:54:25 AM -109.355 -121.732 Left

9:54:26 AM 6.016137 9.652969 5.999981 14.49997 9:54:26 AM -120.416 -115.859 Right

9:54:27 AM 6.018673 9.652266 5.999981 14.49997 9:54:27 AM -124.603 -112.088 Right

9:54:28 AM 6.022757 9.734281 5.999981 14.49996 9:54:28 AM -114.683 -115.343 Left

9:54:29 AM 6.035034 10.07532 5.999981 14.49997 9:54:29 AM -111.127 -122.824 Left

9:54:30 AM 6.047583 10.42157 5.999981 14.49985 9:54:30 AM -112.634 -126.391 Left

9:54:31 AM 6.050531 10.42703 5.999981 14.49997 9:54:31 AM -112.889 -127.154 Left

9:54:32 AM -110.945 -125.513 Left

9:54:33 AM 6.045438 10.59026 5.999981 14.49985 9:54:33 AM -110.314 -126.542 Left

9:54:34 AM 6.035212 10.93034 5.999981 14.4998 9:54:34 AM -107.805 -122.514 Left

9:54:35 AM 6.029407 11.02936 5.999981 14.49985 9:54:35 AM -108.739 -114.683 Left

9:54:36 AM 6.018277 11.03932 5.999981 14.49997 9:54:36 AM -115.581 -107.849 Right

9:54:37 AM 6.017547 11.05508 5.999981 14.4998

9:54:38 AM 6.012517 11.05829 5.999981 14.49997 9:54:38 AM -126.3 -107.902 Right

9:54:39 AM 6.028029 11.08198 5.999981 14.49996 9:54:39 AM -130.589 -113.127 Right

9:54:40 AM 6.065265 11.11689 5.999981 14.4998 9:54:40 AM -129.694 -108.345 Right

9:54:41 AM 6.111082 11.14374 5.999981 14.4998 9:54:41 AM -135.096 -109.961 Right

9:54:42 AM -142.776 -109.965 Right

9:54:43 AM 6.158471 11.16322 5.999981 14.49997 9:54:43 AM -143.954 -106.512 Right

9:54:44 AM 6.205731 11.16906 5.999981 14.49996 9:54:44 AM -143.341 -107.78 Right

9:54:45 AM 6.242528 11.17795 5.999981 14.49985 9:54:45 AM -139.898 -109.889 Right

9:54:46 AM 6.29443 11.19541 5.999981 14.49996 9:54:46 AM -137.761 -110.354 Right

9:54:47 AM 6.337522 11.21556 5.999981 14.49997 9:54:47 AM -134.745 -110.4 Right

9:54:48 AM 6.385379 11.24562 5.999981 14.49996

9:54:49 AM 6.422552 11.27703 5.999981 14.4998 9:54:49 AM -132.099 -111.216 Right

9:54:50 AM 6.450261 11.30324 5.999981 14.49996 9:54:50 AM -129.517 -112.148 Right

9:54:51 AM 6.484265 11.34326 5.999981 14.49985 9:54:51 AM #NAME? #NAME? Mid

9:54:52 AM 6.518316 11.39919 5.999981 14.49985 9:54:52 AM -125.377 -116.424 Right

9:54:53 AM 6.531538 11.44027 5.999981 14.4998 9:54:53 AM -119.71 -117.881 Right

9:54:54 AM -122.98 -108.096 Right

9:54:55 AM 6.549782 11.48136 5.999981 14.4998 9:54:55 AM -123.539 -114.581 Right

9:54:56 AM 6.573968 11.53471 5.999981 14.4998 9:54:56 AM -121.92 -114.437 Right

9:54:57 AM 6.586883 11.57396 5.999981 14.49996 9:54:57 AM -122.01 -115.686 Right

9:54:58 AM 6.599334 11.61252 5.999981 14.49985 9:54:58 AM -124.641 -112.713 Right

9:54:59 AM 6.614729 11.64415 5.999981 14.4998 9:54:59 AM -123.006 -113.922 Right

9:55:00 AM 6.635639 11.69306 5.999981 14.49996 9:55:00 AM -122.358 -116.383 Right

9:55:01 AM 6.658867 11.74774 5.999981 14.4998

9:55:02 AM 6.669861 11.78621 5.999981 14.49985 9:55:02 AM -120.054 -118.531 Right

9:55:03 AM 6.673469 11.83668 5.999981 14.4998 9:55:03 AM -116.947 -119.705 Left

9:55:04 AM 6.6792 11.87639 5.999981 14.49996 9:55:04 AM -113.345 -118.26 Left

9:55:05 AM -116.976 -117.922 Left

9:55:06 AM 6.680484 11.92096 5.999981 14.49997 9:55:06 AM -115.768 -123.232 Left

9:55:07 AM 6.674488 11.96908 5.999981 14.4998 9:55:07 AM -110.095 -123.105 Left

9:55:08 AM 6.673965 11.96892 5.999981 14.49985 9:55:08 AM -110.483 -120.978 Left

9:55:09 AM 6.67272 11.96874 5.999981 14.49996 9:55:09 AM -107.896 -122.012 Left

9:55:10 AM 6.672424 11.9693 5.999981 14.4998 9:55:10 AM -110.864 -121.478 Left

9:55:11 AM 6.672506 11.96932 5.999981 14.49996 9:55:11 AM -110.537 -121.547 Left

9:55:12 AM 6.672506 11.96932 5.999981 14.49985 9:55:12 AM -108.211 -122.777 Left


Uniform Obstruction with Right Scatter Sample 2

Motion Path Graph - Uniform obstruction with Right Scatter – Sample 2.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-9-55.]


Combined logs - Uniform obstruction with Right Scatter - Sample 2.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

9:55:25 AM 6 9.5 6 14.5

9:55:26 AM 6 9.5 6 14.5

9:55:27 AM 6 9.5 6 14.5

9:55:29 AM 5.997623 9.500815 5.999981 14.49985

9:55:30 AM 5.988283 9.501217 5.999981 14.49997 9:55:30 AM -111.921 -120.332 Left

9:55:31 AM 5.99008 9.501486 5.999981 14.4998 9:55:31 AM -129.2 -110.807 Right

9:55:32 AM 5.990736 9.50109 5.999981 14.49997 9:55:32 AM -128.366 -109.722 Right

9:55:33 AM 5.992559 9.507976 5.999981 14.49985 9:55:33 AM -107.885 -120.328 Left

9:55:34 AM 6.015729 9.66961 5.999981 14.49985 9:55:34 AM -109.284 -121.959 Left

9:55:36 AM 6.02519 9.704388 5.999981 14.49996 9:55:36 AM -123.774 -113.495 Right

9:55:37 AM 6.026832 9.70393 5.999981 14.4998 9:55:37 AM -120.867 -113.435 Right

9:55:38 AM 6.038556 9.898973 5.999981 14.49996 9:55:38 AM -109.734 -123.514 Left

9:55:39 AM 6.059045 10.24453 5.999981 14.4998 9:55:39 AM -111.636 -127.029 Left

9:55:40 AM 6.064431 10.27414 5.999981 14.49985 9:55:40 AM -114.657 -124.947 Left

9:55:41 AM 6.065457 10.27537 5.999981 14.49997 9:55:41 AM -114.294 -124.039 Left

9:55:42 AM 6.057249 10.58068 5.999981 14.4998 9:55:42 AM -110.138 -129.492 Left

9:55:43 AM 6.047837 10.93047 5.999981 14.49996 9:55:43 AM -109.817 -125.938 Left

9:55:44 AM -108.363 -119.862 Left

9:55:45 AM 6.036555 11.04493 5.999981 14.49985 9:55:45 AM -109.689 -108.021 Right

9:55:46 AM 6.029049 11.0474 5.999981 14.49997 9:55:46 AM -118.704 -107.957 Right

9:55:47 AM 6.024122 11.06406 5.999981 14.49997

9:55:48 AM 6.018104 11.06892 5.999981 14.49997 9:55:48 AM -125.172 -110.427 Right

9:55:49 AM 6.039644 11.10099 5.999981 14.49996 9:55:49 AM -128.314 -108.153 Right

9:55:50 AM 6.081065 11.13148 5.999981 14.4998 9:55:50 AM -132.476 -109.466 Right

9:55:51 AM 6.128121 11.15453 5.999981 14.49997 9:55:51 AM -139.565 -109.474 Right

9:55:52 AM 6.170222 11.17085 5.999981 14.49996 9:55:52 AM -141.764 -110.459 Right

9:55:53 AM 6.215927 11.17769 5.999981 14.49985 9:55:53 AM -144.497 -106.811 Right

9:55:54 AM -143.525 -109.554 Right

9:55:55 AM 6.274438 11.18949 5.999981 14.49996 9:55:55 AM -138.297 -109.77 Right

9:55:56 AM 6.323219 11.20669 5.999981 14.49997 9:55:56 AM -136.206 -110.422 Right

9:55:57 AM 6.378795 11.23668 5.999981 14.4998 9:55:57 AM -133.81 -112.324 Right

9:55:58 AM 6.424065 11.27384 5.999981 14.49985 9:55:58 AM -130.528 -110.918 Right

9:55:59 AM 6.463225 11.30717 5.999981 14.4998

9:56:00 AM 6.509186 11.35582 5.999981 14.49997 9:56:00 AM -129.205 -112.825 Right

9:56:01 AM 6.537572 11.39879 5.999981 14.49997 9:56:01 AM -124.932 -117.751 Right

9:56:02 AM 6.557451 11.44825 5.999981 14.49996 9:56:02 AM -120.974 -115.47 Right

9:56:03 AM 6.577597 11.48911 5.999981 14.4998 9:56:03 AM -123.077 -109.536 Right

9:56:04 AM 6.590925 11.52767 5.999981 14.49996 9:56:04 AM -123.449 -115.81 Right

9:56:05 AM -122.369 -114.732 Right

9:56:06 AM 6.608106 11.56886 5.999981 14.49997 9:56:06 AM -121.47 -116.422 Right

9:56:07 AM 6.623736 11.61112 5.999981 14.4998 9:56:07 AM -126.571 -110.798 Right

9:56:08 AM 6.65342 11.67208 5.999981 14.49996 9:56:08 AM -122.908 -115.925 Right

9:56:09 AM 6.682631 11.74033 5.999981 14.49997 9:56:09 AM -121.75 -117.973 Right

9:56:10 AM 6.695972 11.78157 5.999981 14.49985 9:56:10 AM -117.141 -118.419 Left

9:56:11 AM 6.704181 11.83208 5.999981 14.49997 9:56:11 AM -115.505 -119.768 Left

9:56:12 AM 6.710687 11.87988 5.999981 14.4998

9:56:13 AM 6.706898 11.93766 5.999981 14.49996 9:56:13 AM -115.857 -120.153 Left

9:56:14 AM 6.707311 11.93733 5.999981 14.49985 9:56:14 AM -111.201 -121.07 Left

9:56:15 AM 6.707311 11.93733 5.999981 14.49985 9:56:15 AM -108.14 -121.309 Left

9:56:16 AM -108.108 -121.252 Left

9:56:17 AM 6.707311 11.93733 5.999981 14.49997 9:56:17 AM -108.109 -121.257 Left

9:56:18 AM 6.707311 11.93733 5.999981 14.49996 9:56:18 AM -108.11 -121.256 Left

9:56:19 AM 6.707311 11.93733 5.999981 14.49985 9:56:19 AM -108.111 -121.274 Left


Uniform Obstruction with Right Scatter Sample 3

Motion Path Graph - Uniform obstruction with Right Scatter – Sample 3.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-9-56.]


Combined logs - Uniform obstruction with Right Scatter - Sample 3.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

9:56:33 AM 6 9.5 6 14.5

9:56:34 AM 6 9.5 6 14.5

9:56:35 AM 6 9.5 6 14.5

9:56:36 AM 5.995082 9.501253 5.999981 14.49985

9:56:37 AM -108.99 -121.201 Left

9:56:38 AM 6.006018 9.59443 5.999981 14.49996

9:56:39 AM 6.018891 9.647563 5.999981 14.4998 9:56:39 AM -118.808 -115.094 Right

9:56:40 AM 6.020895 9.646997 5.999981 14.49997 9:56:40 AM -119.743 -113.432 Right

9:56:41 AM 6.022592 9.687309 5.999981 14.49996 9:56:41 AM -110.94 -117.35 Left

9:56:42 AM 6.024922 10.03388 5.999981 14.49997 9:56:42 AM -113.492 -121.404 Left

9:56:43 AM 6.027252 10.38044 5.999981 14.49985 9:56:43 AM -113.169 -125.448 Left

9:56:44 AM 6.029546 10.72174 5.999981 14.4998 9:56:44 AM -111.101 -128.459 Left

9:56:45 AM 6.031813 11.05793 5.999981 14.4998 9:56:45 AM -108.403 -125.918 Left

9:56:46 AM 6.026261 11.06274 5.999981 14.49996 9:56:46 AM -109.4 -118.314 Left

9:56:47 AM 6.017118 11.06535 5.999981 14.49997 9:56:47 AM -114.102 -109.689 Right

9:56:48 AM -120.613 -113.018 Right

9:56:49 AM 6.008781 11.06935 5.999981 14.49985

9:56:50 AM 6.002533 11.07401 5.999981 14.49997 9:56:50 AM -122.447 -109.911 Right

9:56:51 AM 5.99748 11.07964 5.999981 14.4998 9:56:51 AM -125.98 -108.241 Right

9:56:52 AM 6.026755 11.1105 5.999981 14.49985 9:56:52 AM -133.218 -108.974 Right

9:56:53 AM 6.069405 11.14307 5.999981 14.49985 9:56:53 AM -134.032 -109.163 Right

9:56:54 AM 6.106963 11.16498 5.999981 14.49996 9:56:54 AM -136.311 -110.887 Right

9:56:55 AM 6.144246 11.1833 5.999981 14.49997 9:56:55 AM -141.633 -111.348 Right

9:56:56 AM 6.183669 11.19402 5.999981 14.49996 9:56:56 AM -144.741 -106.667 Right

9:56:57 AM 6.221784 11.196 5.999981 14.4998 9:56:57 AM -144.266 -108.355 Right

9:56:58 AM 6.275648 11.21893 5.999981 14.49985

9:56:59 AM -138.604 -110.43 Right

9:57:00 AM 6.32091 11.23946 5.999981 14.49997 9:57:00 AM -136.629 -108.489 Right

9:57:01 AM 6.374705 11.26873 5.999981 14.49985 9:57:01 AM -133.432 -111.818 Right

9:57:02 AM 6.409009 11.29554 5.999981 14.49997 9:57:02 AM -129.942 -111.85 Right

9:57:03 AM 6.4429 11.32918 5.999981 14.49985 9:57:03 AM -127.518 -112.007 Right

9:57:04 AM 6.468696 11.35887 5.999981 14.49985 9:57:04 AM -124.39 -113.133 Right

9:57:05 AM 6.503281 11.42072 5.999981 14.49996 9:57:05 AM -122.957 -120.454 Right

9:57:06 AM 6.518909 11.46089 5.999981 14.4998 9:57:06 AM -121.716 -113.31 Right

9:57:07 AM 6.535764 11.50683 5.999981 14.49985 9:57:07 AM -123.608 -113.982 Right

9:57:08 AM 6.554189 11.55095 5.999981 14.49996 9:57:08 AM -123.459 -113.277 Right

9:57:09 AM 6.566879 11.58492 5.999981 14.49996 9:57:09 AM -123.731 -108.69 Right

9:57:10 AM -124.006 -113.585 Right

9:57:11 AM 6.591447 11.63612 5.999981 14.49997

9:57:12 AM 6.607576 11.68182 5.999981 14.49997 9:57:12 AM -122.801 -114.706 Right

9:57:13 AM 6.629003 11.73351 5.999981 14.49997 9:57:13 AM -122.17 -117.626 Right

9:57:14 AM 6.638561 11.77993 5.999981 14.49996 9:57:14 AM -119.621 -118.539 Right

9:57:15 AM 6.650188 11.84339 5.999981 14.49985 9:57:15 AM -118.487 -120.691 Left

9:57:16 AM 6.659105 11.90425 5.999981 14.49996 9:57:16 AM -115.089 -120.502 Left

9:57:17 AM 6.658787 11.94016 5.999981 14.49985 9:57:17 AM -115.377 -120.652 Left

9:57:18 AM 6.659345 11.94735 5.999981 14.49996 9:57:18 AM -110.968 -121.346 Left

9:57:19 AM -107.701 -122.393 Left

9:57:20 AM 6.659346 11.94735 5.999981 14.49996 9:57:20 AM -108.151 -121.729 Left

9:57:21 AM 6.659349 11.94735 5.999981 14.49997 9:57:21 AM -109.076 -121.405 Left

9:57:22 AM 6.65935 11.94735 5.999981 14.49985 9:57:22 AM -109.107 -121.409 Left


Uniform Obstruction with Right Scatter Sample 4

Motion Path Graph - Uniform obstruction with Right Scatter – Sample 4.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-9-57.]


Combined logs - Uniform obstruction with Right Scatter - Sample 4.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

9:57:32 AM 6 9.5 6 14.5

9:57:33 AM 6 9.5 6 14.5

9:57:34 AM 6 9.5 6 14.5

9:57:35 AM 5.994548 9.500986 5.999981 14.49996

9:57:36 AM -109.011 -120.832 Left

9:57:37 AM 6.006451 9.595366 5.999981 14.49985 9:57:37 AM -118.064 -115.032 Right

9:57:38 AM 6.011172 9.61203 5.999981 14.4998

9:57:39 AM 6.012843 9.612447 5.999981 14.49996 9:57:39 AM -115.645 -115.769 Left

9:57:40 AM 6.014937 9.613797 5.999981 14.4998 9:57:40 AM -112.795 -117.482 Left

9:57:41 AM 6.017207 9.614681 5.999981 14.49997 9:57:41 AM -114.784 -115.365 Left

9:57:42 AM 6.009337 9.84693 5.999981 14.49996 9:57:42 AM -112.888 -118.412 Left

9:57:43 AM 5.996538 10.1879 5.999981 14.49985 9:57:43 AM -112.895 -123.559 Left

9:57:44 AM 5.983344 10.53899 5.999981 14.49996 9:57:44 AM -112.299 -127.131 Left

9:57:45 AM 5.97054 10.87854 5.999981 14.49985 9:57:45 AM -109.298 -128.267 Left

9:57:46 AM 5.960775 11.06625 5.999981 14.49996 9:57:46 AM -106.639 -126.349 Left

9:57:47 AM 5.952388 11.0663 5.999981 14.4998 9:57:47 AM -108.208 -115.141 Left

9:57:48 AM -114.495 -112.658 Right

9:57:49 AM 5.943923 11.06854 5.999981 14.49997

9:57:50 AM 5.943664 11.07973 5.999981 14.4998 9:57:50 AM -119.886 -113.928 Right

9:57:51 AM 5.936495 11.08825 5.999981 14.49997 9:57:51 AM -123.583 -108.658 Right

9:57:52 AM 5.931809 11.09559 5.999981 14.49996 9:57:52 AM -131.65 -108.474 Right

9:57:53 AM 5.969792 11.12473 5.999981 14.49985 9:57:53 AM -134.739 -108.887 Right

9:57:54 AM 6.010285 11.14679 5.999981 14.4998 9:57:54 AM -137.277 -110.01 Right

9:57:55 AM 6.047515 11.16355 5.999981 14.49996 9:57:55 AM -141.899 -110.14 Right

9:57:56 AM 6.113077 11.17416 5.999981 14.4998 9:57:56 AM -143.16 -110.918 Right

9:57:57 AM 6.162657 11.1731 5.999981 14.49996 9:57:57 AM -148.983 -106.421 Right

9:57:58 AM -146.597 -109.746 Right

9:57:59 AM 6.198093 11.17299 5.999981 14.49997 9:57:59 AM -142.169 -109.001 Right

9:58:00 AM 6.246221 11.182 5.999981 14.49985

9:58:01 AM 6.287482 11.19607 5.999981 14.49996 9:58:01 AM -139.258 -108.498 Right

9:58:02 AM 6.330459 11.21676 5.999981 14.49997 9:58:02 AM -138.008 -110.519 Right

9:58:03 AM 6.363225 11.23319 5.999981 14.49985 9:58:03 AM -135.52 -110.245 Right

9:58:04 AM 6.399527 11.26743 5.999981 14.4998 9:58:04 AM -131.53 -113.533 Right

9:58:05 AM 6.439667 11.31122 5.999981 14.4998 9:58:05 AM -128.758 -111.366 Right

9:58:06 AM 6.470792 11.35169 5.999981 14.49997 9:58:06 AM -126.635 -114.149 Right

9:58:07 AM 6.501385 11.40107 5.999981 14.49997 9:58:07 AM -122.3 -114.156 Right

9:58:08 AM 6.515336 11.43874 5.999981 14.49996 9:58:08 AM -122.708 -114.139 Right

9:58:09 AM -123.045 -116.371 Right

9:58:10 AM 6.532215 11.47868 5.999981 14.4998 9:58:10 AM -121.162 -119.805 Right

9:58:11 AM 6.55264 11.53074 5.999981 14.49985 9:58:11 AM -123.572 -113.19 Right

9:58:12 AM 6.570231 11.58547 5.999981 14.49996

9:58:13 AM 6.587126 11.63303 5.999981 14.49985 9:58:13 AM -122.133 -117.49 Right

9:58:14 AM 6.599205 11.66648 5.999981 14.49996 9:58:14 AM -121.686 -114.865 Right

9:58:15 AM 6.617399 11.70682 5.999981 14.49985 9:58:15 AM -125.86 -110.926 Right

9:58:16 AM 6.636156 11.74861 5.999981 14.49997 9:58:16 AM -122.107 -117.28 Right

9:58:17 AM 6.6454 11.78609 5.999981 14.49985 9:58:17 AM -119.438 -119.395 Right

9:58:18 AM 6.661684 11.85751 5.999981 14.49997 9:58:18 AM -115.43 -118.168 Left

9:58:19 AM -116.605 -122.597 Left

9:58:20 AM 6.65705 11.90391 5.999981 14.49997 9:58:20 AM -107.669 -123.22 Left

9:58:21 AM 6.658482 11.92179 5.999981 14.4998 9:58:21 AM -109.358 -121.231 Left

9:58:22 AM 6.660076 11.92215 5.999981 14.49997 9:58:22 AM -110.572 -120.749 Left

9:58:23 AM 6.661672 11.92333 5.999981 14.49985 9:58:23 AM -110.661 -123.473 Left

9:58:24 AM 6.663239 11.92305 5.999981 14.4998

9:58:25 AM 6.664297 11.92356 5.999981 14.4998 9:58:25 AM -111.16 -123.086 Left


Uniform Obstruction with Right Scatter Sample 5

Motion Path Graph - Uniform obstruction with Right Scatter – Sample 5.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-9-58.]


Combined logs - Uniform obstruction with Right Scatter – Sample 5.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

9:58:45 AM 6 9.5 6 14.5

9:58:46 AM 6 9.5 6 14.5

9:58:47 AM 6 9.5 6 14.5

9:58:48 AM 6 9.5 6 14.5

9:58:50 AM 5.993189 9.501262 5.999981 14.49997

9:58:51 AM 6.008582 9.61451 5.999981 14.49996 9:58:51 AM -110.171 -121.082 Left

9:58:52 AM 6.010566 9.615026 5.999981 14.49985 9:58:52 AM -116.409 -116.397 Right

9:58:53 AM 6.012881 9.615509 5.999981 14.49996 9:58:53 AM -115.724 -115.42 Right

9:58:54 AM 6.015862 9.718835 5.999981 14.49996 9:58:54 AM -112.066 -116.652 Left

9:58:55 AM 6.019601 10.07033 5.999981 14.49997 9:58:55 AM -114.08 -120.006 Left

9:58:56 AM 6.02334 10.42181 5.999981 14.4998 9:58:56 AM -114.112 -124.132 Left

9:58:57 AM 6.026967 10.7628 5.999981 14.49997 9:58:57 AM -112.058 -126.733 Left

9:58:58 AM 6.025583 11.06142 5.999981 14.49997 9:58:58 AM -109.408 -123.925 Left

9:58:59 AM 6.012806 11.06401 5.999981 14.49997 9:58:59 AM -109.165 -112.242 Left

9:59:00 AM -120.305 -108.132 Right

9:59:01 AM 6.00105 11.06982 5.999981 14.49997 9:59:01 AM -128.477 -110.022 Right

9:59:02 AM 5.994045 11.07644 5.999981 14.49997

9:59:03 AM 6.012623 11.09851 5.999981 14.49985 9:59:03 AM -130.794 -108.339 Right

9:59:04 AM 6.043941 11.12405 5.999981 14.49996 9:59:04 AM -133.159 -108.999 Right

9:59:05 AM 6.10221 11.1601 5.999981 14.49996 9:59:05 AM -136.71 -109.699 Right

9:59:06 AM 6.140649 11.17582 5.999981 14.4998 9:59:06 AM -140.868 -111.454 Right

9:59:07 AM 6.173244 11.18021 5.999981 14.49985 9:59:07 AM -145.563 -106.88 Right

9:59:08 AM 6.207083 11.18391 5.999981 14.49997 9:59:08 AM -145.895 -110.107 Right

9:59:09 AM 6.255499 11.19372 5.999981 14.49996 9:59:09 AM -139.945 -109.159 Right

9:59:10 AM -139.337 -108.066 Right

9:59:11 AM 6.298922 11.2069 5.999981 14.49996 9:59:11 AM -136.126 -111.213 Right

9:59:12 AM 6.352117 11.23507 5.999981 14.49985 9:59:12 AM -132.905 -111.341 Right

9:59:13 AM 6.388076 11.25852 5.999981 14.49996 9:59:13 AM -130.994 -111.339 Right

9:59:14 AM 6.436852 11.30429 5.999981 14.4998 9:59:14 AM -128.745 -113.657 Right

9:59:15 AM 6.474067 11.34313 5.999981 14.49997

9:59:16 AM 6.505203 11.38421 5.999981 14.49996 9:59:16 AM -124.746 -113.107 Right

9:59:17 AM 6.532163 11.43241 5.999981 14.49985 9:59:17 AM -122.59 -113.534 Right

9:59:18 AM 6.545402 11.47203 5.999981 14.49996 9:59:18 AM -122.432 -115.954 Right

9:59:19 AM 6.561539 11.50831 5.999981 14.4998 9:59:19 AM -123.261 -109.104 Right

9:59:20 AM 6.587187 11.5717 5.999981 14.49996 9:59:20 AM -122.661 -115.484 Right

9:59:21 AM -121.97 -115.392 Right

9:59:22 AM 6.610212 11.62936 5.999981 14.4998 9:59:22 AM -121.599 -115.747 Right

9:59:23 AM 6.628486 11.67712 5.999981 14.49985 9:59:23 AM -121.599 -115.948 Right

9:59:24 AM 6.641671 11.71919 5.999981 14.4998 9:59:24 AM -121.04 -116.998 Right

9:59:25 AM 6.65768 11.77047 5.999981 14.49985 9:59:25 AM -119.608 -117.796 Right

9:59:26 AM 6.670365 11.82412 5.999981 14.49996 9:59:26 AM -117.651 -118.717 Left

9:59:27 AM 6.676343 11.86964 5.999981 14.49985 9:59:27 AM -115.078 -119.698 Left

9:59:28 AM 6.677368 11.91808 5.999981 14.49997 9:59:28 AM -114.894 -122.213 Left

9:59:29 AM 6.67787 11.93672 5.999981 14.49996 9:59:29 AM -110.996 -121.218 Left

9:59:30 AM 6.678367 11.93714 5.999981 14.49997

9:59:31 AM -109.572 -120.981 Left

9:59:32 AM 6.678356 11.93713 5.999981 14.49997 9:59:32 AM -109.391 -121.091 Left

9:59:33 AM 6.678356 11.93713 5.999981 14.49996 9:59:33 AM -109.391 -121.107 Left

9:59:34 AM 6.678356 11.93714 5.999981 14.4998 9:59:34 AM -109.392 -121.097 Left

9:59:35 AM 6.678356 11.93713 5.999981 14.49996 9:59:35 AM -109.391 -121.112 Left

9:59:36 AM 6.678357 11.93713 5.999981 14.4998 9:59:36 AM -109.391 -121.095 Left


Single Non-Uniform Obstruction Sample 1

Motion Path Graph - Single Non-Uniform obstruction – Sample 1.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-10-13.]


Combined logs – Single Non-Uniform obstruction – Sample 1.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

10:13:45 AM 6 9.5 6 14.5

10:13:46 AM 6 9.5 6 14.5

10:13:47 AM 6 9.5 6 14.5

10:13:48 AM 5.991876 9.501328 5.999981 14.49997

10:13:49 AM 5.986232 9.492682 5.999981 14.49996

10:13:51 AM 5.963475 9.437613 5.999981 14.4998

10:13:52 AM 5.934978 9.393324 5.999981 14.49985

10:13:53 AM 5.910014 9.365203 5.999981 14.49996

10:13:54 AM 5.873061 9.333778 5.999981 14.4998

10:13:55 AM 5.839613 9.307987 5.999981 14.4998

10:13:56 AM 5.879478 9.32711 5.999981 14.49996 10:13:56 AM -143.018 -116.469 Right

10:13:57 AM 5.919246 9.351189 5.999981 14.4998 10:13:57 AM -142.429 -109.693 Right

10:13:58 AM 5.958128 9.391424 5.999981 14.49997 10:13:58 AM -140.122 -114.848 Right

10:13:59 AM 5.992304 9.430294 5.999981 14.49985 10:13:59 AM -138.026 -113.818 Right

10:14:00 AM 6.016098 9.468188 5.999981 14.49996 10:14:00 AM -135.606 -112.005 Right

10:14:01 AM 6.042299 9.515686 5.999981 14.4998 10:14:01 AM -135.022 -112.19 Right

10:14:02 AM -134.784 -113.094 Right

10:14:03 AM 6.059335 9.562017 5.999981 14.49997 10:14:03 AM -131.914 -111.386 Right

10:14:04 AM 6.069165 9.594148 5.999981 14.49985 10:14:04 AM -130.484 -113.099 Right

10:14:05 AM 6.070604 9.603067 5.999981 14.49996

10:14:06 AM 6.078282 9.952428 5.999981 14.49996 10:14:06 AM -132.67 -113.611 Right

10:14:07 AM 6.085866 10.29835 5.999981 14.49985 10:14:07 AM -131.297 -115.268 Right

10:14:08 AM 6.093336 10.63891 5.999981 14.49997 10:14:08 AM -125.151 -115.122 Right

10:14:09 AM 6.10093 10.98456 5.999981 14.49985 10:14:09 AM -114.693 -111.836 Right

10:14:10 AM 6.128466 11.00007 5.999981 14.49997 10:14:10 AM -111.018 -108.294 Right

10:14:11 AM 6.137124 11.0077 5.999981 14.49996 10:14:11 AM -109.46 -110.532 Left

10:14:12 AM -110.378 -107.551 Right

10:14:13 AM 6.143859 11.01498 5.999981 14.49985 10:14:13 AM -106.25 -107.319 Left

10:14:14 AM 6.145187 11.01672 5.999981 14.49985 10:14:14 AM -106.427 -107.547 Left

10:14:15 AM 6.140853 11.00891 5.999981 14.49997 10:14:15 AM -110.11 -107.514 Right

10:14:16 AM 6.134578 11.00257 5.999981 14.49985 10:14:16 AM -111.484 -107.63 Right

10:14:17 AM 6.126868 10.99637 5.999981 14.49985 10:14:17 AM -109.464 -108.303 Right

10:14:18 AM 6.142395 10.95863 5.999981 14.49997 10:14:18 AM -109.082 -120.846 Left

10:14:19 AM 6.148995 10.95172 5.999981 14.49997 10:14:19 AM -107.394 -121.711 Left

10:14:20 AM 6.14954 10.96553 5.999981 14.4998

10:14:21 AM 6.120224 11.00731 5.999981 14.49997 10:14:21 AM -110.192 -121.456 Left

10:14:22 AM 6.11064 11.02822 5.999981 14.49985 10:14:22 AM -110.177 -127.288 Left

10:14:23 AM 6.07086 11.05933 5.999981 14.49997 10:14:23 AM #NAME? #NAME? Mid

10:14:24 AM -110.714 -132.566 Left

10:14:25 AM 6.027144 11.08917 5.999981 14.49996 10:14:25 AM -111.257 -136.161 Left

10:14:26 AM 5.976007 11.11179 5.999981 14.4998 10:14:26 AM -112.081 -140.23 Left

10:14:27 AM 5.94027 11.12349 5.999981 14.4998 10:14:27 AM -112.447 -142.929 Left

10:14:28 AM 5.896355 11.13249 5.999981 14.49997 10:14:28 AM -110.031 -144.912 Left

10:14:29 AM 5.856781 11.12952 5.999981 14.49996 10:14:29 AM -110.321 -146.758 Left

10:14:30 AM 5.80122 11.13281 5.999981 14.4998 10:14:30 AM -106.377 -149.661 Left

10:14:31 AM 5.749688 11.14359 5.999981 14.49997 10:14:31 AM -107.998 -142.594 Left

10:14:32 AM 5.702169 11.15886 5.999981 14.49985 10:14:32 AM #NAME? #NAME? Mid

10:14:33 AM 5.642862 11.19348 5.999981 14.49996 10:14:33 AM -107.457 -138.866 Left

10:14:34 AM 5.609009 11.21532 5.999981 14.49996 10:14:34 AM -106.3 -137.481 Left

10:14:35 AM 5.56469 11.24916 5.999981 14.49997 10:14:35 AM -106.688 -135.484 Left

10:14:36 AM -107.583 -133.49 Left

10:14:37 AM 5.529091 11.28092 5.999981 14.49996 10:14:37 AM -108.618 -132.668 Left

10:14:38 AM 5.495217 11.3248 5.999981 14.49985 10:14:38 AM #NAME? #NAME? Mid

10:14:39 AM 5.475418 11.36049 5.999981 14.49997

10:14:40 AM 5.461507 11.40364 5.999981 14.49985 10:14:40 AM -109.518 -124.934 Left

10:14:41 AM 5.440051 11.45829 5.999981 14.49997 10:14:41 AM -109.664 -126.322 Left

10:14:42 AM 5.421883 11.50379 5.999981 14.49996 10:14:42 AM -108.333 -127.681 Left

10:14:43 AM 5.401868 11.55611 5.999981 14.49997 10:14:43 AM -108.116 -128.118 Left

10:14:44 AM 5.380318 11.60897 5.999981 14.49985 10:14:44 AM -109.132 -130.318 Left

10:14:45 AM 5.364387 11.65057 5.999981 14.4998 10:14:45 AM -109.022 -130.6 Left

10:14:46 AM -110.706 -132.62 Left

10:14:47 AM 5.341948 11.69557 5.999981 14.49996 10:14:47 AM -109.805 -129.706 Left

10:14:48 AM 5.329875 11.75236 5.999981 14.49996 10:14:48 AM -110.187 -131.28 Left

10:14:49 AM 5.314241 11.80533 5.999981 14.49985 10:14:49 AM -108.399 -129.789 Left

10:14:50 AM 5.305806 11.85946 5.999981 14.4998

10:14:51 AM 5.306743 11.89437 5.999981 14.49996 10:14:51 AM -112.841 -119.404 Left

10:14:52 AM 5.305887 11.89471 5.999981 14.4998 10:14:52 AM -111.333 -121.329 Left

10:14:53 AM 5.30481 11.89482 5.999981 14.49996 10:14:53 AM -112.74 -120.444 Left

10:14:54 AM 5.304423 11.89456 5.999981 14.49996 10:14:54 AM -111.456 -123.375 Left

10:14:55 AM 5.30336 11.895 5.999981 14.49985 10:14:55 AM -111.91 -123.153 Left

10:14:56 AM 5.30139 11.89341 5.999981 14.49985 10:14:56 AM -111.031 -123.401 Left

10:14:57 AM -110.793 -123.839 Left

10:14:58 AM 5.300543 11.89305 5.999981 14.49985 10:14:58 AM -111.952 -121.689 Left

10:14:59 AM 5.300544 11.89305 5.999981 14.49997 10:14:59 AM -113.44 -120.002 Left

10:15:00 AM 5.300546 11.89305 5.999981 14.49996 10:15:00 AM -113.355 -119.982 Left

10:15:01 AM 5.300547 11.89305 5.999981 14.49997 10:15:01 AM -113.349 -119.971 Left

10:15:02 AM 5.300554 11.89305 5.999981 14.49985 10:15:02 AM -113.457 -119.98 Left

10:15:03 AM 5.300559 11.89304 5.999981 14.49996 10:15:03 AM -113.344 -119.975 Left

10:15:04 AM 5.30056 11.89304 5.999981 14.49996 10:15:04 AM -113.351 -119.983 Left


Single Non-Uniform Obstruction Sample 2

Motion Path Graph - Single Non-Uniform obstruction – Sample 2.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-10-15.]

Combined logs – Single Non-Uniform obstruction – Sample 2.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

10:15:11 AM 6 9.5 6 14.5

10:15:13 AM 6 9.5 6 14.5

10:15:14 AM 6 9.5 6 14.5

10:15:15 AM 5.994308 9.500896 5.999981 14.49985

10:15:16 AM 6.011705 9.688725 5.999981 14.49997 10:15:16 AM -133.196 -114.818 Right

10:15:17 AM 6.012587 9.688594 5.999981 14.49997 10:15:17 AM -133.468 -111.689 Right

10:15:18 AM 6.012868 9.68847 5.999981 14.49985 10:15:18 AM -131.599 -112.553 Right

10:15:19 AM 6.013505 9.688615 5.999981 14.4998 10:15:19 AM -131.134 -112.334 Right

10:15:20 AM 6.014375 9.689281 5.999981 14.49997 10:15:20 AM -131.074 -112.977 Right

10:15:21 AM -131.058 -112.709 Right

10:15:22 AM 6.014764 9.689977 5.999981 14.49997 10:15:22 AM -130.746 -112.007 Right

10:15:23 AM 6.015401 9.689497 5.999981 14.49985 10:15:23 AM -130.371 -111.646 Right

10:15:24 AM 6.014698 9.76862 5.999981 14.49997

10:15:25 AM 6.008034 10.11512 5.999981 14.49985 10:15:25 AM -129.743 -113.956 Right

10:15:26 AM 6.001469 10.45637 5.999981 14.4998 10:15:26 AM -125.08 -114.417 Right

10:15:27 AM 5.994901 10.79762 5.999981 14.49997 10:15:27 AM -117.042 -113.094 Right

10:15:28 AM 5.986585 11.05514 5.999981 14.4998 10:15:28 AM #NAME? #NAME? Mid

10:15:29 AM 5.977194 11.05714 5.999981 14.49985 10:15:29 AM -110.042 -111.487 Left

10:15:30 AM 5.968297 11.06085 5.999981 14.4998 10:15:30 AM -116.185 -110.696 Right

10:15:31 AM 5.959096 11.06872 5.999981 14.49997 10:15:31 AM -121.786 -110.306 Right

10:15:32 AM -128.323 -108.181 Right

10:15:33 AM 5.952369 11.07753 5.999981 14.49996 10:15:33 AM -134.193 -108.237 Right

10:15:34 AM 5.94794 11.08485 5.999981 14.49996 10:15:34 AM -137.615 -107.843 Right

10:15:35 AM 5.984583 11.10779 5.999981 14.49985 10:15:35 AM -139.084 -109.197 Right

10:15:36 AM 6.03102 11.12298 5.999981 14.4998 10:15:36 AM -143.162 -109.015 Right

10:15:37 AM 6.08515 11.13496 5.999981 14.49997

10:15:38 AM 6.131732 11.14088 5.999981 14.4998 10:15:38 AM -147.214 -109.419 Right

10:15:39 AM 6.180035 11.14217 5.999981 14.49997 10:15:39 AM -149.494 -110.027 Right

10:15:40 AM 6.241859 11.12912 5.999981 14.49996 10:15:40 AM -155.086 -109.518 Right

10:15:41 AM 6.287606 11.11476 5.999981 14.4998 10:15:41 AM -157.074 -108.382 Right

10:15:42 AM 6.328498 11.09644 5.999981 14.49997 10:15:42 AM -159.059 -108.906 Right

10:15:43 AM -155.406 -105.672 Right

10:15:44 AM 6.377422 11.07776 5.999981 14.49985 10:15:44 AM -158.41 -108.969 Right

10:15:45 AM 6.427683 11.05554 5.999981 14.4998 10:15:45 AM 124 Recenter-Right

10:15:46 AM 6.448325 11.04522 5.999981 14.49997 10:15:46 AM -158.062 -106.754 Right

10:15:47 AM 6.447668 11.04526 5.999981 14.49996 10:15:47 AM -155.091 -107.359 Right

10:15:48 AM 6.447338 11.04475 5.999981 14.49997 10:15:48 AM -155.756 -109.431 Right

10:15:49 AM 6.447227 11.04415 5.999981 14.49985 10:15:49 AM -156.493 -107.918 Right

10:15:50 AM 6.446936 11.04281 5.999981 14.49996 10:15:50 AM -157.333 -107.337 Right

10:15:51 AM 6.447393 11.0419 5.999981 14.49985

10:15:52 AM 6.448669 11.04137 5.999981 14.49997 10:15:52 AM -157.18 -106.901 Right

10:15:53 AM 6.448472 11.04022 5.999981 14.49997 10:15:53 AM -154.86 -107.584 Right

10:15:54 AM 6.448925 11.03927 5.999981 14.49996 10:15:54 AM -155.106 -107.471 Right

10:15:55 AM -155.361 -106.97 Right

10:15:56 AM 6.449743 11.03851 5.999981 14.4998 10:15:56 AM -154.355 -106.969 Right

10:15:57 AM 6.450462 11.03801 5.999981 14.49997 10:15:57 AM -153.511 -107.182 Right

10:15:58 AM 6.452027 11.03657 5.999981 14.49985

10:15:59 AM 6.452987 11.03602 5.999981 14.49997 10:15:59 AM -153.908 -107.077 Right

10:16:00 AM 6.453202 11.03454 5.999981 14.49996 10:16:00 AM -153.938 -107.089 Right


Single Non-Uniform Obstruction Sample 3

Motion Path Graph - Single Non-Uniform obstruction – Sample 3.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-10-16.]

Combined logs – Single Non-Uniform obstruction – Sample 3.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

10:16:53 AM 6 9.5 6 14.5

10:16:54 AM 6 9.5 6 14.5

10:16:55 AM 6 9.5 6 14.5

10:16:56 AM 5.993452 9.500679 5.999981 14.49996

10:16:57 AM 5.999599 9.552241 5.999981 14.49996 10:16:57 AM -132.124 -116.025 Right

10:16:58 AM 6.000326 9.552691 5.999981 14.49985 10:16:58 AM -134.157 -110.572 Right

10:16:59 AM -131.811 -112.343 Right

10:17:00 AM 6.000171 9.553128 5.999981 14.49996 10:17:00 AM -132.424 -112.206 Right

10:17:01 AM 6.000528 9.553026 5.999981 14.49996 10:17:01 AM -132.63 -111.586 Right

10:17:02 AM 6.000697 9.554759 5.999981 14.4998 10:17:02 AM -132.506 -110.914 Right

10:17:03 AM 6.002162 9.555518 5.999981 14.49996

10:17:04 AM 6.003139 9.555337 5.999981 14.49985 10:17:04 AM -130.727 -112.015 Right

10:17:05 AM 6.004873 9.623884 5.999981 14.49997 10:17:05 AM -129.967 -111.937 Right

10:17:06 AM 6.011419 9.969964 5.999981 14.49996 10:17:06 AM -131.387 -113.284 Right

10:17:07 AM 6.017867 10.31073 5.999981 14.4998 10:17:07 AM -129.489 -114.628 Right

10:17:08 AM 6.024119 10.64093 5.999981 14.49985 10:17:08 AM -124.298 -113.608 Right

10:17:09 AM 6.030694 10.98676 5.999981 14.49997 10:17:09 AM -115.538 -111.615 Right

10:17:10 AM 6.04765 10.99352 5.999981 14.49996 10:17:10 AM -110.186 -108.531 Right

10:17:11 AM -109.742 -112.638 Left

10:17:12 AM 6.068041 10.9871 5.999981 14.49985 10:17:12 AM -110.432 -107.518 Right

10:17:13 AM 6.077093 10.99598 5.999981 14.4998 10:17:13 AM -105.983 -107.361 Left

10:17:14 AM 6.071382 10.98927 5.999981 14.49997 10:17:14 AM -110.106 -107.525 Right

10:17:15 AM 6.064232 10.98177 5.999981 14.49996 10:17:15 AM -111.033 -107.44 Right

10:17:16 AM 6.057449 10.97628 5.999981 14.49985 10:17:16 AM -110.042 -107.665 Right

10:17:17 AM 6.05679 10.96443 5.999981 14.49997 10:17:17 AM -106.477 -118.164 Left

10:17:18 AM 6.072156 10.93973 5.999981 14.49996 10:17:18 AM -108.472 -119.953 Left

10:17:19 AM 6.072358 10.94778 5.999981 14.4998

10:17:20 AM 6.067204 10.9676 5.999981 14.49996 10:17:20 AM -109.532 -119.306 Left

10:17:21 AM 6.046095 11.00017 5.999981 14.4998 10:17:21 AM -109.895 -123.513 Left

10:17:22 AM 6.014355 11.03808 5.999981 14.49996 10:17:22 AM -109.449 -126.515 Left

10:17:23 AM -110.324 -128.663 Left

10:17:24 AM 5.972888 11.07598 5.999981 14.49996 10:17:24 AM -110.986 -132.091 Left

10:17:25 AM 5.929632 11.10214 5.999981 14.49985 10:17:25 AM -111.862 -136.251 Left

10:17:26 AM 5.883834 11.12943 5.999981 14.4998 10:17:26 AM -112.447 -138.189 Left

10:17:27 AM 5.842865 11.14613 5.999981 14.49997 10:17:27 AM -110.127 -142.235 Left

10:17:28 AM 5.793264 11.14915 5.999981 14.49985 10:17:28 AM -109.969 -146.577 Left

10:17:29 AM 5.74009 11.1601 5.999981 14.49985 10:17:29 AM -106.888 -146.841 Left

10:17:30 AM 5.69971 11.17113 5.999981 14.49997 10:17:30 AM -108.855 -140.573 Left

10:17:31 AM 5.655742 11.19149 5.999981 14.49997 10:17:31 AM -107.092 -141.764 Left

10:17:32 AM 5.611956 11.21838 5.999981 14.49996 10:17:32 AM -106.573 -137.738 Left

10:17:33 AM 5.573955 11.24575 5.999981 14.4998 10:17:33 AM -106.656 -133.394 Left

10:17:34 AM -106.971 -131.146 Left

10:17:35 AM 5.54641 11.27501 5.999981 14.49996 10:17:35 AM -108.498 -130.312 Left

10:17:36 AM 5.519897 11.31632 5.999981 14.49985 10:17:36 AM -115.724 -124.703 Left

10:17:37 AM 5.504065 11.35276 5.999981 14.4998

10:17:38 AM 5.484795 11.39576 5.999981 14.49985 10:17:38 AM -106.949 -122.888 Left

10:17:39 AM 5.465627 11.44603 5.999981 14.49996 10:17:39 AM -110.239 -126.36 Left

10:17:40 AM 5.450945 11.49048 5.999981 14.49985 10:17:40 AM -110.013 -126.911 Left

10:17:41 AM 5.429731 11.54324 5.999981 14.49985 10:17:41 AM -109.354 -127.565 Left

10:17:42 AM 5.413665 11.59411 5.999981 14.49997 10:17:42 AM -107.401 -127.635 Left

10:17:43 AM 5.397177 11.63756 5.999981 14.49996 10:17:43 AM -109.198 -128.934 Left

10:17:44 AM 5.376156 11.69708 5.999981 14.49985 10:17:44 AM -108.216 -129.755 Left

10:17:45 AM 5.361761 11.74582 5.999981 14.49996 10:17:45 AM -111.217 -134.354 Left

10:17:46 AM -109 -127.934 Left

10:17:47 AM 5.340284 11.78464 5.999981 14.4998 10:17:47 AM -110.223 -130.236 Left

10:17:48 AM 5.327333 11.82891 5.999981 14.49996 10:17:48 AM -108.933 -129.492 Left

10:17:49 AM 5.315457 11.88585 5.999981 14.4998 10:17:49 AM -110.138 -127.054 Left

10:17:50 AM 5.315816 11.92651 5.999981 14.4998 10:17:50 AM -110.513 -125.454 Left

10:17:51 AM 5.314696 11.92731 5.999981 14.49996 10:17:51 AM -109.53 -125.917 Left

10:17:52 AM 5.313026 11.92741 5.999981 14.49985


Single Non-Uniform Obstruction Sample 4

Motion Path Graph - Single Non-Uniform obstruction – Sample 4.

[Motion path plot: X-Z trajectories of CARMI and Child; axis ranges 9 to 15 and 5 to 7; chart label 29-6-10-18.]

Combined logs – Single Non-Uniform obstruction – Sample 4.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendency L, Tendency R, Verdict

10:18:05 AM 6 9.5 6 14.5

10:18:06 AM 6 9.5 6 14.5

10:18:07 AM 6 9.5 6 14.5

10:18:09 AM 5.993934 9.500367 5.999981 14.49997

10:18:10 AM 6.010031 9.619991 5.999981 14.4998 10:18:10 AM -132.834 -116.985 Right

10:18:11 AM 6.011676 9.621343 5.999981 14.49996 10:18:11 AM -134.814 -110.081 Right

10:18:12 AM 6.012438 9.621804 5.999981 14.49985 10:18:12 AM -132.063 -111.835 Right

10:18:13 AM 6.013259 9.623123 5.999981 14.4998 10:18:13 AM -131.044 -113.105 Right

10:18:14 AM 6.014624 9.623962 5.999981 14.49985 10:18:14 AM -131.16 -112.389 Right

10:18:15 AM 6.015474 9.624219 5.999981 14.49997 10:18:15 AM -131.011 -111.704 Right

10:18:16 AM 6.014895 9.797959 5.999981 14.49997 10:18:16 AM -130.022 -113.046 Right

10:18:17 AM -131.57 -113.303 Right

10:18:18 AM 6.015023 10.14977 5.999981 14.4998 10:18:18 AM -126.974 -113.786 Right

10:18:19 AM 6.015141 10.47532 5.999981 14.49996 10:18:19 AM -117.066 -111.781 Right

10:18:20 AM 6.015267 10.82188 5.999981 14.4998 10:18:20 AM -109.104 -109.444 Left

10:18:21 AM 6.026482 11.04233 5.999981 14.4998

10:18:22 AM 6.034119 11.04236 5.999981 14.4998 10:18:22 AM -110.76 -108.059 Right

10:18:23 AM 6.04326 11.04776 5.999981 14.49985 10:18:23 AM -110.612 -115.867 Left

10:18:24 AM 6.059678 11.04736 5.999981 14.49997 10:18:24 AM -110.446 -123.748 Left

10:18:25 AM 6.05544 11.05963 5.999981 14.49996 10:18:25 AM -110.835 -127.309 Left

10:18:26 AM 6.010893 11.09063 5.999981 14.4998 10:18:26 AM -111.74 -134.211 Left

10:18:27 AM 5.976772 11.11469 5.999981 14.49997 10:18:27 AM -111.982 -136.212 Left

10:18:28 AM -112.318 -137.912 Left

10:18:29 AM 5.936329 11.12874 5.999981 14.49985 10:18:29 AM -112.341 -140.625 Left

10:18:30 AM 5.878414 11.14417 5.999981 14.49996 10:18:30 AM -110.242 -145.539 Left

10:18:31 AM 5.828092 11.14292 5.999981 14.49985 10:18:31 AM -109.433 -146.94 Left

10:18:32 AM 5.779953 11.14536 5.999981 14.49997 10:18:32 AM -106.493 -146.371 Left

10:18:33 AM 5.717555 11.16523 5.999981 14.4998 10:18:33 AM -106.874 -142.403 Left

10:18:34 AM 5.675188 11.18161 5.999981 14.49997 10:18:34 AM -107.15 -140.403 Left

10:18:35 AM 5.641723 11.19602 5.999981 14.49996 10:18:35 AM -107.341 -139.42 Left

10:18:36 AM 5.606854 11.22756 5.999981 14.49985

10:18:37 AM 5.581649 11.25686 5.999981 14.49996 10:18:37 AM -114.069 -133.043 Left

10:18:38 AM 5.5511 11.29459 5.999981 14.49985 10:18:38 AM -106.67 -128.454 Left

10:18:39 AM -107.816 -130.315 Left

10:18:40 AM 5.522519 11.33867 5.999981 14.49996 10:18:40 AM -108.905 -128.1 Left

10:18:41 AM 5.504525 11.39319 5.999981 14.49996 10:18:41 AM -107.914 -124.633 Left

10:18:42 AM 5.48465 11.44384 5.999981 14.49985 10:18:42 AM -108.392 -123.207 Left

10:18:43 AM 5.46643 11.48882 5.999981 14.49985 10:18:43 AM -107.316 -127.406 Left

10:18:44 AM 5.443839 11.53905 5.999981 14.4998 10:18:44 AM -109.651 -128.393 Left

10:18:45 AM 5.422859 11.59307 5.999981 14.49996 10:18:45 AM -109.555 -129.139 Left

10:18:46 AM 5.40688 11.63516 5.999981 14.49997 10:18:46 AM #NAME? #NAME? Mid

10:18:47 AM 5.385821 11.69192 5.999981 14.49996 10:18:47 AM -108.106 -129.574 Left

10:18:48 AM 5.363208 11.74637 5.999981 14.49985 10:18:48 AM -110.058 -131.333 Left

10:18:49 AM 5.350856 11.78458 5.999981 14.4998 10:18:49 AM -108.528 -133.115 Left

10:18:51 AM 5.348593 11.83465 5.999981 14.49996 10:18:51 AM -111.234 -120.562 Left

10:18:52 AM 5.34094 11.87351 5.999981 14.49997 10:18:52 AM -109.552 -121.474 Left

10:18:53 AM 5.339965 11.87398 5.999981 14.49985 10:18:53 AM -110.263 -128.251 Left

10:18:54 AM 5.338418 11.87351 5.999981 14.49996 10:18:54 AM -110.1 -127.286 Left

10:18:55 AM 5.337015 11.87264 5.999981 14.49996 10:18:55 AM -111.259 -123.338 Left

10:18:56 AM 5.336518 11.87282 5.999981 14.4998 10:18:56 AM -112.944 -123.609 Left

10:18:57 AM 5.33497 11.87354 5.999981 14.4998 10:18:57 AM -111.497 -125.395 Left

Referee Log PathDecider Log

TimeStamp

CARMI Child

TimeStamp

Tendencies

Verdict



Single Non-Uniform Obstruction Sample 5

Motion Path Graph - Single Non-Uniform obstruction – Sample 5.

Combined logs – Single Non-Uniform obstruction – Sample 5.

[Motion path plot (label 29-6-10-19): X axis 5 to 7, Z axis 9 to 15; series: CARMI, Child.]

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:19:14 AM 6 9.5 6 14.5

10:19:15 AM 6 9.5 6 14.5

10:19:16 AM 6 9.5 6 14.5

10:19:18 AM 5.994834 9.500858 5.999981 14.4998

10:19:19 AM 6.005515 9.621281 5.999981 14.49997 10:19:19 AM -133.298 -112.951 Right

10:19:20 AM 6.006132 9.620813 5.999981 14.49996 10:19:20 AM -131.588 -112.189 Right

10:19:21 AM 6.00739 9.62138 5.999981 14.49996 10:19:21 AM -130.589 -112.424 Right

10:19:22 AM 6.017666 9.909372 5.999981 14.49996 10:19:22 AM -132.701 #NAME? Left

10:19:23 AM 6.029618 10.24524 5.999981 14.49985 10:19:23 AM -131.171 -114.231 Right

10:19:24 AM 6.042325 10.60209 5.999981 14.4998 10:19:24 AM -123.365 -113.751 Right

10:19:25 AM -115.057 -111.575 Right

10:19:26 AM 6.054657 10.94845 5.999981 14.49996 10:19:26 AM -111.461 -108.614 Right

10:19:27 AM 6.064 11.04171 5.999981 14.4998 10:19:27 AM -108.184 -110.313 Left

10:19:28 AM 6.07516 11.04531 5.999981 14.49997

10:19:29 AM 6.08661 11.05206 5.999981 14.49985 10:19:29 AM -110.842 -115.878 Left

10:19:30 AM 6.094765 11.06024 5.999981 14.4998 10:19:30 AM -110.859 -127.694 Left

10:19:31 AM 6.087481 11.07464 5.999981 14.49996 10:19:31 AM -111.013 -131.725 Left

10:19:32 AM 6.03866 11.10422 5.999981 14.4998 10:19:32 AM -111.532 -133.07 Left

10:19:33 AM 5.994674 11.12609 5.999981 14.49996 10:19:33 AM -112.063 -136.432 Left

10:19:35 AM 5.930947 11.14635 5.999981 14.49985 10:19:35 AM -113.141 -143.286 Left

10:19:36 AM 5.893939 11.15053 5.999981 14.4998 10:19:36 AM -110.257 -144.56 Left

10:19:37 AM 5.840425 11.15012 5.999981 14.49985 10:19:37 AM -110.728 -146.857 Left

10:19:38 AM 5.791776 11.15446 5.999981 14.49985 10:19:38 AM -107.782 -149.879 Left

10:19:39 AM -108.587 -144.693 Left

10:19:40 AM 5.741306 11.16485 5.999981 14.4998 10:19:40 AM -108.794 -139.969 Left

10:19:41 AM 5.692986 11.18245 5.999981 14.49996 10:19:41 AM -106.53 -140.074 Left

10:19:42 AM 5.649341 11.20973 5.999981 14.49996 10:19:42 AM -106.784 -136.214 Left

10:19:43 AM 5.609469 11.23792 5.999981 14.49997

10:19:44 AM 5.569406 11.27635 5.999981 14.49985 10:19:44 AM -106.668 -134.867 Left

10:19:45 AM 5.53048 11.32803 5.999981 14.49985 10:19:45 AM -107.426 -133.316 Left

10:19:46 AM 5.50492 11.36169 5.999981 14.49997 10:19:46 AM #NAME? #NAME? Mid

10:19:47 AM -112.825 -126.878 Left

10:19:48 AM 5.479592 11.41774 5.999981 14.49996 10:19:48 AM -112.049 -126.211 Left

10:19:49 AM 5.467192 11.46103 5.999981 14.4998 10:19:49 AM -107.627 -124.836 Left

10:19:50 AM 5.450711 11.51385 5.999981 14.49997 10:19:50 AM -107.473 -127.507 Left

10:19:51 AM 5.42677 11.56558 5.999981 14.49985 10:19:51 AM -109.242 -126.877 Left

10:19:52 AM 5.402543 11.61871 5.999981 14.49997 10:19:52 AM -108.853 -131.572 Left

10:19:53 AM 5.385859 11.65936 5.999981 14.4998 10:19:53 AM -108.143 -130.831 Left

10:19:54 AM 5.365652 11.7017 5.999981 14.49996 10:19:54 AM -109.436 -130.831 Left

10:19:55 AM 5.358759 11.73643 5.999981 14.49985

10:19:56 AM 5.347197 11.77669 5.999981 14.49996 10:19:56 AM -109.317 -126.808 Left

10:19:57 AM 5.336686 11.83338 5.999981 14.49996 10:19:57 AM -105.844 -129.257 Left

10:19:58 AM -111.388 -131.501 Left

10:19:59 AM 5.334013 11.8877 5.999981 14.49996 10:19:59 AM -111.453 -125.751 Left

10:20:00 AM 5.333127 11.92123 5.999981 14.49996 10:20:00 AM -109.896 -125.511 Left

10:20:01 AM 5.331847 11.92002 5.999981 14.49997 10:20:01 AM -111.903 -122.525 Left

10:20:02 AM 5.330778 11.92065 5.999981 14.49985 10:20:02 AM -111.511 -122.629 Left

10:20:03 AM 5.32952 11.92112 5.999981 14.49985 10:20:03 AM -113.588 -120.628 Left

10:20:04 AM 5.329747 11.92111 5.999981 14.49997 10:20:05 AM -115.97 -118.351 Left

10:20:05 AM 5.329026 11.92123 5.999981 14.49985 10:20:06 AM -115.939 -118.424 Left
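The motion path graphs in this appendix plot the X-Z trajectories of CARMI and the child taken from the Referee Log columns of the corresponding combined log. A minimal sketch of how such a plot could be regenerated, assuming matplotlib and the hypothetical parse_row helper from the earlier sketch, is shown below.

```python
import matplotlib.pyplot as plt


def plot_motion_path(rows, title="Motion Path Graph"):
    """Plot CARMI and child X-Z trajectories from parsed combined-log rows.

    `rows` is an iterable of (RefereeRecord, PathDeciderRecord) tuples as
    returned by parse_row(); rows without a Referee record are skipped.
    Axis units follow the simulation's world coordinates.
    """
    referee = [r for r, _ in rows if r is not None]
    plt.plot([r.carmi_x for r in referee], [r.carmi_z for r in referee],
             marker=".", label="CARMI")
    plt.plot([r.child_x for r in referee], [r.child_z for r in referee],
             marker="x", label="Child")
    plt.xlabel("X")
    plt.ylabel("Z")
    plt.title(title)
    plt.legend()
    plt.show()
```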




Non-Uniform Obstruction with Left Scatter Sample 1

Motion Path Graph - Non-Uniform obstruction with Left Scatter – Sample 1.

[Motion path plot (label 29-6-10-25): X axis 5 to 7, Z axis 9 to 15; series: CARMI, Child.]


Combined logs – Non-Uniform obstruction with Left Scatter – Sample 1.

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:25:03 AM 6 9.5 6 14.5

10:25:04 AM 6 9.5 6 14.5

10:25:05 AM 6 9.5 6 14.5

10:25:06 AM 5.995187 9.501067 6.000019 14.49996

10:25:07 AM 5.987306 9.485507 6.000019 14.4998

10:25:08 AM 5.970524 9.462602 6.000019 14.49997

10:25:10 AM 5.939333 9.420695 6.000019 14.4998

10:25:11 AM 5.91065 9.380818 6.000019 14.49996

10:25:12 AM 5.86494 9.347048 6.000019 14.49985

10:25:13 AM 5.80814 9.307255 6.000019 14.4998

10:25:14 AM 5.771358 9.301133 6.000019 14.49997

10:25:15 AM 5.74946 9.292454 6.000019 14.49996 10:25:15 AM -151.439 -115.071 Right

10:25:16 AM 5.792623 9.304709 6.000019 14.49996 10:25:16 AM -148.92 -111.255 Right

10:25:17 AM 5.839953 9.324904 6.000019 14.49996

10:25:18 AM 5.876538 9.342795 6.000019 14.49985 10:25:18 AM -143.523 -116.122 Right

10:25:19 AM 5.920836 9.373268 6.000019 14.4998 10:25:19 AM -143.865 -116.753 Right

10:25:20 AM 5.956855 9.401843 6.000019 14.4998 10:25:20 AM -145.216 -113.429 Right

10:25:21 AM 5.984492 9.431147 6.000019 14.49996 10:25:21 AM -141.544 -111.869 Right

10:25:22 AM -140.688 -112.214 Right

10:25:23 AM 6.016403 9.478471 6.000019 14.49996

10:25:24 AM 6.038756 9.539188 6.000019 14.4998 10:25:24 AM -140.277 -112.204 Right

10:25:25 AM 6.057558 9.596663 6.000019 14.49997 10:25:25 AM -136.584 -112.203 Right

10:25:26 AM 6.065561 9.649698 6.000019 14.49996 10:25:26 AM -134.498 -110.22 Right

10:25:27 AM 6.06628 9.650797 6.000019 14.4998 10:25:27 AM -133.348 -111.044 Right

10:25:28 AM 6.065044 9.650555 6.000019 14.49997 10:25:28 AM -134.632 -119.773 Right

10:25:29 AM 6.067335 9.771054 6.000019 14.49996 10:25:29 AM -137.994 -111.852 Right

10:25:31 AM 6.067936 9.771342 6.000019 14.49985 10:25:31 AM -135.837 -112.944 Right

10:25:32 AM 6.068365 9.842364 6.000019 14.49997 10:25:32 AM -134.666 -114.553 Right

10:25:33 AM 6.068262 10.22044 6.000019 14.49997 10:25:33 AM -138.822 -115.072 Right

10:25:34 AM 6.068174 10.54075 6.000019 14.49985 10:25:34 AM -136.755 -114.54 Right

10:25:35 AM 6.068077 10.89782 6.000019 14.49985 10:25:35 AM -129.916 -112.009 Right

10:25:36 AM 6.078161 11.04485 6.000019 14.49996 10:25:36 AM -126.484 -112.288 Right

10:25:37 AM 6.085254 11.04809 6.000019 14.4998 10:25:37 AM -110.247 -109.924 Right

10:25:38 AM 6.091435 11.05302 6.000019 14.49996 10:25:38 AM -109.28 -121.522 Left

10:25:39 AM -110.316 -124.628 Left

10:25:40 AM 6.103316 11.06274 6.000019 14.49985 10:25:40 AM -110.355 -124.468 Left

10:25:41 AM 6.107357 11.06751 6.000019 14.49996 10:25:41 AM -106.815 -107.2 Left

10:25:42 AM 6.090384 11.07509 6.000019 14.49985 10:25:42 AM -107.014 -107.66 Left

10:25:43 AM 6.085586 11.06713 6.000019 14.4998

10:25:44 AM 6.078386 11.06144 6.000019 14.4998 10:25:44 AM -110.973 -107.608 Right

10:25:45 AM 6.069264 11.05665 6.000019 14.49997 10:25:45 AM -111.599 -107.683 Right

10:25:46 AM 6.074354 11.03458 6.000019 14.49996 10:25:46 AM -107.894 -112.265 Left

10:25:47 AM 6.09642 10.98835 6.000019 14.4998 10:25:47 AM -114.033 -118.694 Left

10:25:48 AM 6.119619 10.94732 6.000019 14.49997 10:25:48 AM -110.08 -119.56 Left

10:25:49 AM -109.253 -119.629 Left

10:25:50 AM 6.13285 10.94548 6.000019 14.49997 10:25:50 AM -110.986 -123.237 Left

10:25:51 AM 6.098702 10.98643 6.000019 14.4998 10:25:51 AM -109.596 -127.516 Left

10:25:52 AM 6.058999 11.02136 6.000019 14.49985 10:25:52 AM -110.28 -132.313 Left

10:25:53 AM 6.028584 11.04479 6.000019 14.49997 10:25:53 AM -110.774 -134.306 Left

10:25:54 AM 5.978067 11.07056 6.000019 14.49997 10:25:54 AM -110.927 -136.701 Left

10:25:55 AM 5.937562 11.09239 6.000019 14.49985 10:25:55 AM -111.41 -140.266 Left

10:25:56 AM 5.89775 11.10438 6.000019 14.49997 10:25:56 AM -109.803 -145.509 Left

10:25:57 AM 5.841621 11.11016 6.000019 14.49985

10:25:58 AM 5.802022 11.11083 6.000019 14.49985 10:25:58 AM -111 -147.54 Left

10:25:59 AM -108.824 -146.444 Left

10:26:00 AM 5.745181 11.11665 6.000019 14.49985 10:26:00 AM -109.196 -143.629 Left

10:26:01 AM 5.686121 11.13708 6.000019 14.4998 10:26:01 AM -106.548 -144.468 Left

10:26:02 AM 5.6387 11.15968 6.000019 14.4998 10:26:02 AM -106.622 -139.037 Left

10:26:03 AM 5.598574 11.18115 6.000019 14.49997 10:26:03 AM -106.313 -136.889 Left

10:26:04 AM 5.555991 11.21827 6.000019 14.4998 10:26:04 AM -106.821 -135.055 Left

10:26:05 AM 5.525547 11.24757 6.000019 14.49996 10:26:05 AM -107.028 -133.555 Left

10:26:06 AM 5.496115 11.29134 6.000019 14.49985 10:26:06 AM -111.213 -131.152 Left

10:26:07 AM 5.475601 11.34074 6.000019 14.49997 10:26:07 AM -108.231 -121.882 Left

10:26:08 AM 5.453451 11.38008 6.000019 14.4998 10:26:08 AM -111.383 -128.165 Left

10:26:09 AM 5.4383 11.43201 6.000019 14.49996 10:26:09 AM -107.701 -127.036 Left

10:26:10 AM 5.415946 11.48621 6.000019 14.49997 10:26:10 AM -107.315 -127.025 Left

10:26:11 AM -109.429 -125.322 Left

10:26:12 AM 5.39477 11.54291 6.000019 14.49985 10:26:12 AM -109.724 -130.219 Left

10:26:13 AM 5.371273 11.60596 6.000019 14.49997

10:26:14 AM 5.356312 11.63996 6.000019 14.49996 10:26:14 AM -108.485 -130.413 Left

10:26:15 AM 5.33979 11.67916 6.000019 14.49985 10:26:15 AM -109.427 -131.244 Left

10:26:16 AM 5.324664 11.719 6.000019 14.49997 10:26:16 AM -108.508 -128.93 Left

10:26:17 AM 5.311745 11.76943 6.000019 14.49997 10:26:17 AM -108.462 -133.452 Left

10:26:18 AM 5.306375 11.81803 6.000019 14.49997 10:26:18 AM -110.356 -128.779 Left

10:26:19 AM 5.301338 11.86718 6.000019 14.4998 10:26:19 AM -110.715 -123.74 Left

10:26:20 AM 5.306932 11.92381 6.000019 14.4998 10:26:20 AM -109.361 -129.663 Left

10:26:21 AM -109.629 -126.896 Left

10:26:22 AM 5.307473 11.94227 6.000019 14.4998 10:26:22 AM -114.044 -120.074 Left

10:26:23 AM 5.308829 11.94139 6.000019 14.49997 10:26:23 AM -111.106 -119.55 Left

10:26:24 AM 5.308735 11.94078 6.000019 14.49996 10:26:24 AM -110.797 -122.485 Left

10:26:25 AM 5.308735 11.94078 6.000019 14.49996 10:26:25 AM -112.035 -121.291 Left

10:26:26 AM 5.308735 11.94078 6.000019 14.49985

10:26:27 AM -110.459 -122.867 Left




Non-Uniform Obstruction with Left Scatter Sample 2

Motion Path Graph - Non-Uniform obstruction with Left Scatter – Sample 2.

Combined logs – Non-Uniform obstruction with Left Scatter – Sample 2.

[Motion path plot (label 29-6-10-26): X axis 5 to 7, Z axis 9 to 15; series: CARMI, Child.]

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:26:35 AM 6 9.5 6 14.5

10:26:36 AM 6 9.5 6 14.5

10:26:37 AM 6 9.5 6 14.5

10:26:38 AM 6.000052 9.500901 6.000019 14.49996

10:26:40 AM 5.993961 9.501074 6.000019 14.4998 10:26:40 AM -134.064 -116.402 Right

10:26:41 AM 6.011438 9.621527 6.000019 14.49996 10:26:41 AM -137.523 -110.395 Right

10:26:42 AM 6.012232 9.6207 6.000019 14.49997 10:26:42 AM -134.233 -112.492 Right

10:26:43 AM 6.012931 9.621051 6.000019 14.49996 10:26:43 AM -135.923 -111.547 Right

10:26:44 AM 6.013783 9.620283 6.000019 14.4998 10:26:44 AM -134.181 -112.518 Right

10:26:45 AM 6.014473 9.621649 6.000019 14.4998

10:26:46 AM 6.015044 9.679293 6.000019 14.49996 10:26:46 AM -134.205 -111.837 Right

10:26:47 AM -133.27 -112.384 Right

10:26:48 AM 6.015076 10.02583 6.000019 14.49997 10:26:48 AM -138.066 -113.3 Right

10:26:49 AM 6.015108 10.37238 6.000019 14.49985 10:26:49 AM -137.388 -113.761 Right

10:26:50 AM 6.01514 10.70842 6.000019 14.49985 10:26:50 AM -130.543 -111.832 Right

10:26:51 AM 6.015638 11.04551 6.000019 14.49985 10:26:51 AM -121.811 -109.251 Right

10:26:52 AM 6.03046 11.04189 6.000019 14.49996 10:26:52 AM -110.65 -108.992 Right

10:26:53 AM 6.041857 11.04336 6.000019 14.4998 10:26:53 AM -110.405 -118.983 Left

10:26:54 AM 6.053732 11.04617 6.000019 14.4998 10:26:54 AM -109.191 -125.626 Left

10:26:55 AM -110.817 -127.316 Left

10:26:56 AM 6.058239 11.05242 6.000019 14.49997 10:26:56 AM -111.159 -131.129 Left

10:26:57 AM 6.028661 11.07939 6.000019 14.49996

10:26:58 AM 5.990767 11.10788 6.000019 14.49985 10:26:58 AM -111.934 -134.315 Left

10:26:59 AM 5.93907 11.13029 6.000019 14.49985 10:26:59 AM -112.403 -137.93 Left

10:27:00 AM 5.897036 11.14688 6.000019 14.49996 10:27:00 AM -113.38 -141.15 Left

10:27:01 AM 5.837491 11.15529 6.000019 14.4998 10:27:01 AM -110.254 -143.621 Left

10:27:02 AM -110.873 -145.801 Left

10:27:04 AM 5.776953 11.16374 6.000019 14.49985 10:27:04 AM #NAME? #NAME? Mid

10:27:05 AM 5.726399 11.17293 6.000019 14.4998

10:27:06 AM 5.680781 11.18738 6.000019 14.49985 10:27:06 AM -109.413 -139.856 Left

10:27:07 AM 5.632584 11.21202 6.000019 14.4998 10:27:07 AM -106.452 -139.859 Left

10:27:08 AM 5.598484 11.23532 6.000019 14.49996 10:27:08 AM -106.64 -137.584 Left

10:27:09 AM 5.555641 11.27266 6.000019 14.49985 10:27:09 AM -107.584 -135.636 Left

10:27:10 AM 5.523552 11.30593 6.000019 14.49985 10:27:10 AM -107.971 -133.679 Left

10:27:11 AM 5.495688 11.34421 6.000019 14.49997 10:27:11 AM -107.867 -129.107 Left

10:27:12 AM 5.47869 11.38482 6.000019 14.4998 10:27:12 AM -107.227 -126.307 Left

10:27:13 AM 5.460893 11.43135 6.000019 14.49996 10:27:13 AM -107.853 -127.133 Left

10:27:14 AM -107.379 -126.857 Left

10:27:15 AM 5.443073 11.47096 6.000019 14.49985 10:27:15 AM -107.234 -126.9 Left

10:27:16 AM 5.423987 11.51841 6.000019 14.49996 10:27:16 AM -107.796 -127.278 Left

10:27:17 AM 5.411617 11.55918 6.000019 14.4998

10:27:18 AM 5.389745 11.61686 6.000019 14.49985 10:27:18 AM -109.904 -129.671 Left

10:27:19 AM 5.374179 11.66 6.000019 14.4998 10:27:19 AM -108.863 -128.983 Left

10:27:20 AM 5.352884 11.69341 6.000019 14.49997 10:27:20 AM -108.936 -131.065 Left

10:27:21 AM 5.334223 11.74571 6.000019 14.49997 10:27:21 AM -109.879 -130.404 Left

10:27:22 AM 5.320929 11.78638 6.000019 14.49996 10:27:22 AM -110.86 -129.563 Left

10:27:23 AM 5.313086 11.84917 6.000019 14.49985 10:27:23 AM -109.63 -127.733 Left

10:27:24 AM 5.309034 11.89369 6.000019 14.49997 10:27:24 AM -108.02 -126.813 Left

10:27:25 AM 5.3077 11.91376 6.000019 14.49985 10:27:25 AM -109.424 -125.642 Left

10:27:26 AM -111.797 -120.599 Left

10:27:27 AM 5.306705 11.91388 6.000019 14.4998




Non-Uniform Obstruction with Left Scatter Sample 3

Motion Path Graph - Non-Uniform obstruction with Left Scatter – Sample 3.

Combined logs – Non-Uniform obstruction with Left Scatter – Sample 3.

[Motion path plot (label 29-6-10-27): X axis 5 to 7, Z axis 9 to 15; series: CARMI, Child.]

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:27:51 AM 6 9.5 6 14.5

10:27:52 AM 6 9.5 6 14.5

10:27:53 AM 6 9.5 6 14.5

10:27:54 AM 5.998454 9.500811 6.000019 14.49997

10:27:56 AM 5.998494 9.537737 6.000019 14.49997 10:27:56 AM -135.239 -116.065 Right

10:27:57 AM 6.017342 9.684696 6.000019 14.49996 10:27:57 AM -138.108 -110.325 Right

10:27:58 AM 6.018397 9.684046 6.000019 14.49985 10:27:58 AM -135.403 -111.447 Right

10:27:59 AM 6.019402 9.684674 6.000019 14.49997 10:27:59 AM -134.304 -113.401 Right

10:28:00 AM 6.026766 9.894994 6.000019 14.49996 10:28:00 AM -138.557 -113.6 Right

10:28:01 AM 6.038336 10.23513 6.000019 14.49985 10:28:01 AM -138.829 -114.629 Right

10:28:02 AM 6.049934 10.57411 6.000019 14.4998

10:28:03 AM 6.061596 10.91259 6.000019 14.49997 10:28:03 AM -134.814 -113.604 Right

10:28:04 AM -127.473 -110.742 Right

10:28:05 AM 6.070399 11.03259 6.000019 14.4998 10:28:05 AM -118.461 -109.984 Right

10:28:06 AM 6.078525 11.03496 6.000019 14.49997 10:28:06 AM -108.59 -111.097 Left

10:28:07 AM 6.086283 11.03974 6.000019 14.49997 10:28:07 AM -108.445 -118.76 Left

10:28:08 AM 6.092246 11.04339 6.000019 14.49996 10:28:08 AM -109.271 -122.474 Left

10:28:09 AM 6.097696 11.04694 6.000019 14.49997 10:28:09 AM -109.377 -122.747 Left

10:28:10 AM 6.110958 11.05076 6.000019 14.49997 10:28:10 AM -110.773 -125.34 Left

10:28:11 AM 6.073308 11.07943 6.000019 14.49985 10:28:11 AM -110.723 -132.627 Left

10:28:12 AM 6.031433 11.10486 6.000019 14.49996 10:28:12 AM -111.582 -134.965 Left

10:28:13 AM -112.039 -136.44 Left

10:28:14 AM 5.99231 11.12788 6.000019 14.49985

10:28:15 AM 5.950308 11.14446 6.000019 14.49997 10:28:15 AM -112.727 -140.56 Left

10:28:16 AM 5.885548 11.15042 6.000019 14.4998 10:28:16 AM -110.45 -146.169 Left

10:28:17 AM 5.836405 11.14352 6.000019 14.49996 10:28:17 AM -110.598 -149.71 Left

10:28:18 AM 5.784748 11.14578 6.000019 14.4998 10:28:18 AM -108.616 -145.864 Left

10:28:19 AM 5.738927 11.15322 6.000019 14.49996 10:28:19 AM -107.227 -143.446 Left

10:28:20 AM 5.692919 11.17212 6.000019 14.49997 10:28:20 AM -106.942 -141.51 Left

10:28:21 AM 5.650949 11.19277 6.000019 14.4998 10:28:21 AM -108.467 -137.858 Left

10:28:22 AM 5.607926 11.2202 6.000019 14.49996 10:28:22 AM -106.584 -138.086 Left

10:28:23 AM -109.344 -134.811 Left

10:28:24 AM 5.558819 11.26068 6.000019 14.49985 10:28:24 AM -108.263 -131.44 Left

10:28:25 AM 5.524682 11.2997 6.000019 14.49996 10:28:25 AM -107.905 -129.079 Left

10:28:26 AM 5.502385 11.33383 6.000019 14.4998

10:28:27 AM 5.488326 11.36887 6.000019 14.49996 10:28:27 AM -106.935 -125.873 Left

10:28:28 AM 5.469213 11.43046 6.000019 14.49997 10:28:28 AM -110.018 -125.943 Left

10:28:29 AM 5.447887 11.46063 6.000019 14.4998 10:28:29 AM -107.83 -128.344 Left

10:28:30 AM 5.427122 11.5077 6.000019 14.49997 10:28:30 AM -113.634 -121.826 Left

10:28:31 AM 5.395703 11.55596 6.000019 14.49985 10:28:31 AM -108.286 -130.209 Left

10:28:32 AM 5.380172 11.612 6.000019 14.49997 10:28:32 AM -110.945 -131.57 Left

10:28:33 AM -109.313 -129.094 Left

10:28:34 AM 5.358354 11.65795 6.000019 14.49985 10:28:34 AM -108.96 -130.161 Left

10:28:35 AM 5.337102 11.70815 6.000019 14.49996 10:28:35 AM -111.804 -131.204 Left

10:28:36 AM 5.32457 11.76071 6.000019 14.49985 10:28:36 AM -110.114 -128.007 Left

10:28:37 AM 5.309299 11.80063 6.000019 14.49997 10:28:37 AM -110.186 -131.025 Left

10:28:38 AM 5.304522 11.84407 6.000019 14.49997 10:28:38 AM -109.562 -126.029 Left

10:28:39 AM 5.305171 11.88791 6.000019 14.49985 10:28:39 AM -113.175 -121.476 Left

10:28:40 AM 5.304163 11.8889 6.000019 14.49997

10:28:41 AM 5.302428 11.88875 6.000019 14.49985 10:28:41 AM -111.232 -122.701 Left

10:28:42 AM 5.302111 11.89018 6.000019 14.49997 10:28:42 AM -111.554 -122.609 Left

10:28:43 AM 5.302137 11.89017 6.000019 14.49985 10:28:43 AM -112.03 -121.145 Left

10:28:44 AM 5.302223 11.89016 6.000019 14.4998 10:28:44 AM -113.275 -119.968 Left

10:28:45 AM -113.284 -119.785 Left




Non-Uniform Obstruction with Left Scatter Sample 4

Motion Path Graph - Non-Uniform obstruction with Left Scatter – Sample 4.

Combined logs – Non-Uniform obstruction with Left Scatter – Sample 4.

[Motion path plot (label 29-6-10-29): X axis 5 to 7, Z axis 9 to 15; series: CARMI, Child.]

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:29:02 AM 6 9.5 6 14.5

10:29:03 AM 6 9.5 6 14.5

10:29:04 AM 6 9.5 6 14.5

10:29:06 AM 5.993899 9.500897 6.000019 14.49985

10:29:07 AM 6.001026 9.552586 6.000019 14.49997 10:29:07 AM -134.623 -117.155 Right

10:29:08 AM 6.001791 9.552605 6.000019 14.49997 10:29:08 AM -136.954 -111.392 Right

10:29:09 AM 6.002508 9.552366 6.000019 14.49996 10:29:09 AM -135.503 -111.929 Right

10:29:10 AM 6.003298 9.552692 6.000019 14.49985 10:29:10 AM -135.892 -111.842 Right

10:29:11 AM 6.00343 9.554021 6.000019 14.4998 10:29:11 AM -135.956 -111.422 Right

10:29:12 AM 6.004618 9.554567 6.000019 14.49997 10:29:12 AM -135.579 -111.32 Right

10:29:13 AM 6.005475 9.554563 6.000019 14.49996 10:29:13 AM -134.125 -111.692 Right

10:29:14 AM 6.006477 9.554731 6.000019 14.4998 10:29:14 AM -133.504 -111.174 Right

10:29:15 AM 6.013505 9.755704 6.000019 14.4998 10:29:15 AM -132.764 -113.254 Right

10:29:16 AM -138.87 -114.332 Right

10:29:17 AM 6.0254 10.10101 6.000019 14.49996 10:29:17 AM -138.776 -114.908 Right

10:29:18 AM 6.037297 10.44524 6.000019 14.4998 10:29:18 AM -133.915 -113.251 Right

10:29:19 AM 6.049037 10.78276 6.000019 14.4998 10:29:19 AM -126.403 -109.987 Right

10:29:20 AM 6.060687 11.03611 6.000019 14.49997

10:29:21 AM 6.074476 11.03667 6.000019 14.49985 10:29:21 AM -110.917 -110.393 Right

10:29:22 AM 6.086651 11.0353 6.000019 14.49997 10:29:22 AM -108.5 -115.974 Left

10:29:23 AM 6.108089 11.03732 6.000019 14.49997 10:29:23 AM -109.07 -121.546 Left

10:29:24 AM 6.112276 11.04187 6.000019 14.49997 10:29:24 AM -110.547 -124.69 Left

10:29:25 AM -110.488 -131.279 Left

10:29:26 AM 6.106983 11.05216 6.000019 14.49997 10:29:26 AM -110.847 -132.425 Left

10:29:27 AM 6.063187 11.08296 6.000019 14.49996 10:29:27 AM -111.246 -133.773 Left

10:29:28 AM 6.019441 11.11028 6.000019 14.49996 10:29:28 AM -112.018 -138.849 Left

10:29:29 AM 5.9763 11.12583 6.000019 14.4998 10:29:29 AM -112.37 -143.224 Left

10:29:30 AM 5.930936 11.13321 6.000019 14.4998 10:29:30 AM -109.969 -145.554 Left

10:29:31 AM 5.882591 11.12912 6.000019 14.4998 10:29:31 AM -109.882 -147.993 Left

10:29:32 AM 5.824316 11.12665 6.000019 14.49996 10:29:32 AM -106.531 -151.692 Left

10:29:33 AM 5.776063 11.13215 6.000019 14.49996

10:29:34 AM 5.734162 11.14516 6.000019 14.4998 10:29:34 AM -108.16 -143.569 Left

10:29:35 AM 5.691587 11.16363 6.000019 14.49997 10:29:35 AM -108.524 -138.639 Left

10:29:36 AM -106.406 -140.587 Left

10:29:37 AM 5.65205 11.18499 6.000019 14.49985 10:29:37 AM -108.121 -137.836 Left

10:29:38 AM 5.606458 11.2185 6.000019 14.49997 10:29:38 AM -106.897 -133.821 Left

10:29:39 AM 5.569025 11.25131 6.000019 14.49997 10:29:39 AM -106.656 -131.33 Left

10:29:40 AM 5.545821 11.27992 6.000019 14.4998 10:29:40 AM -108.524 -130.364 Left

10:29:41 AM 5.522609 11.3123 6.000019 14.49985

10:29:42 AM 5.508795 11.35174 6.000019 14.49997 10:29:42 AM -119.03 -129.514 Left

10:29:43 AM 5.496674 11.39544 6.000019 14.49985 10:29:43 AM -112.295 -122.14 Left

10:29:44 AM 5.473576 11.4423 6.000019 14.49997 10:29:44 AM -111.806 -126.558 Left

10:29:45 AM -110.352 -128.026 Left

10:29:46 AM 5.458086 11.48223 6.000019 14.49996 10:29:46 AM -109.969 -128.075 Left

10:29:47 AM 5.436299 11.53727 6.000019 14.49997 10:29:47 AM -109.281 -127.894 Left

10:29:48 AM 5.419587 11.58112 6.000019 14.49997 10:29:48 AM -109.301 -128.106 Left

10:29:49 AM 5.398224 11.63745 6.000019 14.49985 10:29:49 AM -109.305 -129.052 Left

10:29:50 AM 5.38126 11.67622 6.000019 14.49996 10:29:50 AM -108.783 -131.326 Left

10:29:51 AM 5.361076 11.72224 6.000019 14.4998 10:29:51 AM -110.035 -132.788 Left

10:29:52 AM 5.342669 11.78299 6.000019 14.49996 10:29:52 AM -110.785 -129.515 Left

10:29:53 AM 5.334421 11.82183 6.000019 14.49997 10:29:53 AM -110.064 -123.765 Left

10:29:54 AM 5.332245 11.88556 6.000019 14.49985 10:29:54 AM -109.403 -126.081 Left

10:29:55 AM 5.331328 11.92441 6.000019 14.4998

10:29:56 AM 5.329638 11.93354 6.000019 14.49997 10:29:56 AM -109.148 -127.626 Left

10:29:57 AM -109.639 -123.619 Left

10:29:58 AM 5.329639 11.93354 6.000019 14.49996 10:29:58 AM -113.845 -119.096 Left

10:29:59 AM 5.329639 11.93354 6.000019 14.49996 10:29:59 AM -112.018 -120.831 Left

10:30:00 AM 5.329639 11.93354 6.000019 14.4998 10:30:00 AM -112.023 -120.812 Left

10:30:01 AM 5.329639 11.93354 6.000019 14.49996 10:30:01 AM -112.023 -120.789 Left

10:30:02 AM 5.329639 11.93354 6.000019 14.49997 10:30:02 AM -112.023 -120.814 Left




Non-Uniform Obstruction with Left Scatter Sample 5

Motion Path Graph - Non-Uniform obstruction with Left Scatter – Sample 5.

Combined logs – Non-Uniform obstruction with Left Scatter – Sample 5.

[Motion path plot (label 29-6-10-31): X axis 5.2 to 6.1, Z axis 0 to 16; series: CARMI, Child.]

Referee Log: TimeStamp, CARMI X, CARMI Z, Child X, Child Z | PathDecider Log: TimeStamp, Tendencies L, R, Verdict

10:31:49 AM 6 9.5 6 14.5

10:31:50 AM 6 9.5 6 14.5

10:31:51 AM 6 9.5 6 14.5

10:31:52 AM 6 9.5 6 14.5

10:31:54 AM 5.992441 9.500944 6.000019 14.49985

10:31:55 AM 6.015962 9.651443 6.000019 14.49996 10:31:55 AM -134.028 -117.627 Right

10:31:56 AM 6.017711 9.651225 6.000019 14.49997 10:31:56 AM -138.898 -111.76 Right

10:31:57 AM 6.018039 9.652341 6.000019 14.49997 10:31:57 AM -135.641 -110.615 Right

10:31:58 AM 6.019086 9.652852 6.000019 14.49985 10:31:58 AM -136.477 -112.367 Right

10:31:59 AM -136.42 -111.494 Right

10:32:00 AM 6.020158 9.652794 6.000019 14.49985 10:32:00 AM -134.61 -112.603 Right

10:32:01 AM 6.020701 9.652414 6.000019 14.49997 10:32:01 AM -134.423 -112.868 Right

10:32:02 AM 6.021148 9.653369 6.000019 14.49996 10:32:02 AM -134.427 -112.663 Right

10:32:03 AM 6.022253 9.652926 6.000019 14.4998 10:32:03 AM -134.311 -111.208 Right

10:32:04 AM 6.019351 9.800809 6.000019 14.49985 10:32:04 AM -135.271 -114.655 Right

10:32:05 AM 6.011848 10.14206 6.000019 14.4998 10:32:05 AM -138.447 -114.434 Right

10:32:06 AM 6.004345 10.48331 6.000019 14.4998 10:32:06 AM -135.086 -113.539 Right

10:32:07 AM 5.996724 10.82981 6.000019 14.49985 10:32:07 AM -125.475 -110.912 Right

10:32:08 AM 5.998251 11.03976 6.000019 14.49997

10:32:09 AM 6.007415 11.04215 6.000019 14.49997 10:32:09 AM -109.141 -111.038 Left

10:32:10 AM -109.4 -114.326 Left

10:32:11 AM 6.016801 11.04697 6.000019 14.49985 10:32:11 AM -111.038 -121.661 Left

10:32:12 AM 6.031215 11.04751 6.000019 14.49996 10:32:12 AM -110.616 -127.104 Left

10:32:13 AM 5.999884 11.07943 6.000019 14.49996 10:32:13 AM -111.077 -129.548 Left

10:32:14 AM 5.962042 11.11023 6.000019 14.49997 10:32:14 AM -111.982 -133.921 Left

10:32:15 AM 5.918005 11.13874 6.000019 14.49997 10:32:15 AM -112.83 -136.566 Left

10:32:16 AM 5.869604 11.15947 6.000019 14.49996 10:32:16 AM -114.13 -139.255 Left

10:32:17 AM 5.817402 11.17077 6.000019 14.49996 10:32:17 AM -110.474 -143.333 Left

10:32:18 AM 5.767788 11.1756 6.000019 14.49997 10:32:18 AM -106.671 -147.982 Left

10:32:19 AM 5.720627 11.18741 6.000019 14.4998 10:32:19 AM -106.796 -142.741 Left

10:32:20 AM -107.334 -139.671 Left

10:32:21 AM 5.668927 11.21321 6.000019 14.49997

10:32:22 AM 5.627196 11.23389 6.000019 14.49985 10:32:22 AM -106.302 -138.98 Left

10:32:23 AM 5.582974 11.26388 6.000019 14.49985 10:32:23 AM -106.696 -137.111 Left

10:32:24 AM 5.554739 11.29339 6.000019 14.49996 10:32:24 AM -107.714 -133.328 Left

10:32:25 AM 5.528443 11.32741 6.000019 14.4998 10:32:25 AM -108.426 -131.465 Left

10:32:26 AM 5.501231 11.36947 6.000019 14.49985 10:32:26 AM -108.066 -129.399 Left

10:32:27 AM 5.484584 11.41826 6.000019 14.49997 10:32:27 AM -111.852 -126.552 Left

10:32:28 AM 5.463676 11.47809 6.000019 14.49996 10:32:28 AM -110.017 -126.358 Left

10:32:29 AM 5.439849 11.52605 6.000019 14.49997 10:32:29 AM -110.59 -129.566 Left

10:32:30 AM 5.420736 11.57261 6.000019 14.49985 10:32:30 AM -111.504 -131.287 Left

10:32:31 AM -108.406 -127.863 Left

10:32:32 AM 5.403268 11.61649 6.000019 14.4998 10:32:32 AM -109.13 -129.686 Left

10:32:33 AM 5.383748 11.65424 6.000019 14.49997 10:32:33 AM -108.987 -130.957 Left

10:32:34 AM 5.361908 11.69757 6.000019 14.49997 10:32:34 AM -112.006 -128.823 Left

10:32:35 AM 5.352765 11.74458 6.000019 14.49997 10:32:35 AM -110.421 -127.427 Left

10:32:36 AM 5.343889 11.80588 6.000019 14.4998

10:32:37 AM 5.327412 11.84401 6.000019 14.49996 10:32:37 AM -110.991 -130.026 Left

10:32:38 AM 5.327576 11.8968 6.000019 14.49985 10:32:38 AM -110.08 -129.711 Left

10:32:39 AM 5.326814 11.92074 6.000019 14.4998 10:32:39 AM -111.935 -122.264 Left

10:32:40 AM 5.326798 11.92074 6.000019 14.49996 10:32:40 AM -111.33 -119.963 Left

10:32:41 AM 5.32645 11.92021 6.000019 14.49997 10:32:41 AM -110.89 -120.137 Left

10:32:42 AM 5.325131 11.9213 6.000019 14.49985 10:32:42 AM -111.223 -122.454 Left

10:32:43 AM -110.026 -124.116 Left

10:32:44 AM 5.324026 11.92062 6.000019 14.4998 10:32:44 AM -111.433 -124.272 Left

10:32:45 AM 5.322851 11.91928 6.000019 14.49985 10:32:45 AM -111.565 -122.669 Left

10:32:46 AM 5.322619 11.91943 6.000019 14.49996 10:32:46 AM -115.466 -119.337 Left



Page 311: Modelling a real-time multi-sensor fusion-based navigation ...

293

Non-Uniform Obstruction with Left Scatter Sample 6

Motion Path Graph – Non-Uniform obstruction with Left Scatter – Sample 6.

Combined logs – Non-Uniform obstruction with Left Scatter – Sample 6.

[Figure data omitted: motion-path plot of CARMI and Child on the X–Z plane (X ≈ 5–7, Z ≈ 9–15), series label 29-6-10-35.]
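Where such a chart needs to be regenerated, a hedged sketch is given below; it reuses the illustrative parse_row helper from the earlier sketch, and the matplotlib usage and log file name are assumptions rather than part of the thesis tooling.

import matplotlib.pyplot as plt

def plot_motion_paths(log_path: str) -> None:
    # Re-plot the CARMI and Child X-Z trajectories recorded in a combined log file.
    carmi, child = [], []
    with open(log_path) as f:
        for line in f:
            if not line.strip():
                continue  # skip blank separator lines
            row = parse_row(line)  # illustrative helper from the earlier sketch
            if row.carmi_x is not None:
                carmi.append((row.carmi_x, row.carmi_z))
                child.append((row.child_x, row.child_z))
    plt.plot([x for x, _ in carmi], [z for _, z in carmi], label="CARMI")
    plt.plot([x for x, _ in child], [z for _, z in child], label="Child")
    plt.xlabel("X")
    plt.ylabel("Z")
    plt.legend()
    plt.show()

# Example call (placeholder file name): plot_motion_paths("29-6-10-35.log")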

Referee Log (TimeStamp, CARMI X/Z, Child X/Z) | PathDecider Log (TimeStamp, Tendencies L/R, Verdict)

10:35:42 AM 6 9.5 6 14.5

10:35:43 AM 6 9.5 6 14.5

10:35:44 AM 6 9.5 6 14.5

10:35:45 AM 5.997726 9.500861 6.000019 14.49985

10:35:47 AM 5.998831 9.538358 6.000019 14.49996 10:35:47 AM -131.762 -116.799 Right

10:35:48 AM 6.021124 9.687458 6.000019 14.49985 10:35:48 AM -139.042 -112.463 Right

10:35:49 AM 6.021504 9.688312 6.000019 14.49996 10:35:49 AM -136.254 -109.694 Right

10:35:50 AM 6.021977 9.688683 6.000019 14.4998 10:35:50 AM -134.902 -113.562 Right

10:35:51 AM 6.033209 9.825398 6.000019 14.49996 10:35:51 AM -137.137 -114.924 Right

10:35:52 AM 6.060241 10.16522 6.000019 14.49985 10:35:52 AM -139.954 -117.075 Right

10:35:53 AM 6.068151 10.24783 6.000019 14.49985 10:35:53 AM -139.076 -114.805 Right

10:35:54 AM 6.069379 10.24768 6.000019 14.49985 10:35:54 AM -136.638 -115.614 Right

10:35:55 AM 6.069839 10.24837 6.000019 14.49997 10:35:55 AM -135.859 -115.493 Right

10:35:56 AM 6.073844 10.41429 6.000019 14.4998 10:35:56 AM -138.212 -115.434 Right

10:35:57 AM 6.081593 10.75526 6.000019 14.4998 10:35:57 AM -133.549 -113.438 Right

10:35:59 AM 6.089834 11.01756 6.000019 14.49997 10:35:59 AM -127.751 -111.639 Right

10:36:00 AM 6.099893 11.01826 6.000019 14.4998 10:36:00 AM -108.259 -110.335 Left

10:36:01 AM 6.107203 11.02236 6.000019 14.4998 10:36:01 AM -108.686 -116.945 Left

10:36:02 AM 6.116633 11.02734 6.000019 14.49985 10:36:02 AM -108.901 -122.28 Left

10:36:03 AM 6.125064 11.03553 6.000019 14.4998 10:36:03 AM -110.376 -125.011 Left

10:36:04 AM 6.119187 11.04694 6.000019 14.4998 10:36:04 AM -110.353 -131.277 Left

10:36:05 AM 6.076388 11.07811 6.000019 14.4998 10:36:05 AM -111.141 -135.267 Left

10:36:06 AM 6.031554 11.09907 6.000019 14.49996 10:36:06 AM -111.604 -135.651 Left

10:36:07 AM 5.973724 11.12075 6.000019 14.49985 10:36:07 AM -113.38 -139.666 Left

10:36:08 AM 5.922245 11.13306 6.000019 14.4998 10:36:08 AM -112.208 -145.839 Left

10:36:09 AM -110.46 -148.786 Left

10:36:10 AM 5.871832 11.13128 6.000019 14.49985 10:36:10 AM -108.121 -151.425 Left

10:36:11 AM 5.823204 11.12848 6.000019 14.49996

10:36:12 AM 5.77553 11.13321 6.000019 14.4998 10:36:12 AM -109.121 -144.983 Left

10:36:13 AM 5.721715 11.1478 6.000019 14.49996 10:36:13 AM -110.377 -139.231 Left

10:36:14 AM 5.668785 11.16832 6.000019 14.49996 10:36:14 AM -107.699 -141.931 Left

10:36:15 AM 5.620876 11.19143 6.000019 14.4998 10:36:15 AM -106.352 -141.758 Left

10:36:16 AM 5.582611 11.21489 6.000019 14.49997 10:36:16 AM -106.704 -137.384 Left

10:36:17 AM 5.545479 11.24505 6.000019 14.49997 10:36:17 AM -107.339 -135.923 Left

10:36:18 AM 5.502849 11.28734 6.000019 14.49985 10:36:18 AM -106.431 -133.28 Left

10:36:19 AM -109.908 -132.686 Left

10:36:20 AM 5.473536 11.33265 6.000019 14.49997 10:36:20 AM -108.542 -128.464 Left

10:36:21 AM 5.451928 11.38219 6.000019 14.49985

10:36:22 AM 5.432741 11.42766 6.000019 14.4998 10:36:22 AM -109.107 -124.747 Left

10:36:23 AM 5.4143 11.47338 6.000019 14.49997 10:36:23 AM -109.315 -125.552 Left

10:36:24 AM 5.401236 11.51884 6.000019 14.4998 10:36:24 AM -107.486 -126.888 Left

10:36:25 AM 5.382052 11.55947 6.000019 14.49996 10:36:25 AM -106.629 -127.941 Left

10:36:26 AM 5.365375 11.60686 6.000019 14.4998 10:36:26 AM -109.273 -130.118 Left

10:36:27 AM -108.856 -130.367 Left

10:36:28 AM 5.344848 11.651 6.000019 14.49996 10:36:28 AM -108.969 -131.193 Left

10:36:29 AM 5.325261 11.69282 6.000019 14.4998 10:36:29 AM -107.763 -133.266 Left

10:36:30 AM 5.30911 11.73516 6.000019 14.49997 10:36:30 AM -113.472 -125.353 Left

10:36:31 AM 5.301552 11.77855 6.000019 14.49985 10:36:31 AM -109.109 -124.53 Left

10:36:32 AM 5.29265 11.82998 6.000019 14.49996 10:36:32 AM -110.135 -126.948 Left

10:36:33 AM 5.285671 11.88521 6.000019 14.49985 10:36:33 AM -108.693 -126.548 Left

10:36:34 AM 5.287668 11.90834 6.000019 14.4998 10:36:34 AM -110.894 -118.906 Left

10:36:35 AM 5.287676 11.90833 6.000019 14.49985 10:36:35 AM -109.864 -119.268 Left


Non-Uniform Obstruction with Right Scatter Sample 1

Motion Path Graph – Non-Uniform obstruction with Right Scatter – Sample 1.

[Figure data omitted: motion-path plot of CARMI and Child on the X–Z plane (X ≈ 5–7, Z ≈ 9–15), series label 29-6-10-45.]


Combined logs – Non-Uniform obstruction with Right Scatter – Sample 1.

Referee Log (TimeStamp, CARMI X/Z, Child X/Z) | PathDecider Log (TimeStamp, Tendencies L/R, Verdict)

10:45:30 AM 6 9.5 6 14.5

10:45:31 AM 6 9.5 6 14.5

10:45:32 AM 6 9.5 6 14.5

10:45:33 AM 6 9.5 6 14.5

10:45:34 AM 5.991169 9.498323 6.000019 14.49996

10:45:36 AM 5.980903 9.470992 6.000019 14.4998

10:45:37 AM 5.951832 9.432722 6.000019 14.49996

10:45:38 AM 5.918835 9.405048 6.000019 14.4998

10:45:39 AM 5.889732 9.392619 6.000019 14.4998

10:45:40 AM 5.857801 9.382647 6.000019 14.49985

10:45:41 AM 5.805346 9.375422 6.000019 14.4998

10:45:42 AM 5.766459 9.363517 6.000019 14.49996

10:45:43 AM 5.758475 9.361457 6.000019 14.4998

10:45:44 AM 5.805631 9.362635 6.000019 14.49997 10:45:44 AM -153.253 -119.599 Right

10:45:45 AM 5.850259 9.371466 6.000019 14.49997 10:45:45 AM -151.4 -117.636 Right

10:45:46 AM -145.025 -120.262 Right

10:45:47 AM 5.898289 9.38806 6.000019 14.49985 10:45:47 AM -143.613 -122.579 Right

10:45:48 AM 5.950254 9.410283 6.000019 14.4998 10:45:48 AM -144.469 -118.267 Right

10:45:49 AM 5.993522 9.438661 6.000019 14.49996 10:45:49 AM -142.009 -117.252 Right

10:45:50 AM 6.033807 9.473278 6.000019 14.4998 10:45:50 AM -139.094 -119.684 Right

10:45:51 AM 6.068832 9.515831 6.000019 14.49997 10:45:51 AM -138.598 -120.25 Right

10:45:52 AM 6.098202 9.559823 6.000019 14.49997 10:45:52 AM -138.117 -118.864 Right

10:45:53 AM 6.126612 9.618297 6.000019 14.4998 10:45:53 AM -135.211 -118.057 Right

10:45:54 AM 6.138898 9.670525 6.000019 14.49997 10:45:54 AM -131.223 -121.282 Right

10:45:55 AM 6.146984 9.787813 6.000019 14.4998 10:45:55 AM -132.739 -123.703 Right

10:45:56 AM 6.166361 10.10063 6.000019 14.49996 10:45:56 AM -132.337 -126.144 Right

10:45:57 AM -131.709 -126.399 Right

10:45:58 AM 6.166996 10.10195 6.000019 14.4998

10:45:59 AM 6.167827 10.10053 6.000019 14.4998 10:45:59 AM -129.633 -128.143 Right

10:46:00 AM 6.168312 10.10058 6.000019 14.4998 10:46:00 AM -129.589 -128.083 Right

10:46:01 AM 6.168487 10.12768 6.000019 14.49985 10:46:01 AM -128.695 -128.366 Right

10:46:02 AM 6.172882 10.45613 6.000019 14.49997 10:46:02 AM -125.826 -126.719 Left

10:46:03 AM 6.177131 10.80173 6.000019 14.49985 10:46:03 AM -117.256 -124.154 Left

10:46:04 AM 6.1734 11.0282 6.000019 14.49997 10:46:04 AM -109.48 -123.021 Left

10:46:05 AM 6.160651 11.0334 6.000019 14.4998 10:46:05 AM -112.477 -116.749 Left

10:46:06 AM 6.155008 11.03693 6.000019 14.49985 10:46:06 AM -123.527 -118.376 Right

10:46:07 AM -127.453 -121.532 Right

10:46:08 AM 6.149774 11.04185 6.000019 14.49996 10:46:08 AM -127.757 -111.029 Right

10:46:09 AM 6.143581 11.0509 6.000019 14.49997 10:46:09 AM -134.996 -108.71 Right

10:46:10 AM 6.14938 11.06598 6.000019 14.4998

10:46:11 AM 6.194893 11.08014 6.000019 14.49996 10:46:11 AM -142.535 -111.03 Right

10:46:12 AM 6.240885 11.0917 6.000019 14.49985 10:46:12 AM -146.783 -107.979 Right

10:46:13 AM 6.291157 11.09692 6.000019 14.49997 10:46:13 AM -146.949 -108.898 Right

10:46:14 AM 6.348597 11.09426 6.000019 14.49996 10:46:14 AM -151.145 -109.481 Right

10:46:15 AM 6.401205 11.08679 6.000019 14.4998 10:46:15 AM -155.375 -108.422 Right

10:46:16 AM 6.449315 11.0744 6.000019 14.49996 10:46:16 AM -157.406 -108.263 Right

10:46:17 AM -160.04 -109.135 Right

10:46:18 AM 6.498727 11.05199 6.000019 14.4998 10:46:18 AM -160.04 -108.423 Right

10:46:19 AM 6.541477 11.0306 6.000019 14.49996 10:46:19 AM 121 Recenter-Right

10:46:20 AM 6.56611 11.02435 6.000019 14.49996 10:46:20 AM -159.806 -107.807 Right

10:46:21 AM 6.56626 11.02353 6.000019 14.4998 10:46:21 AM -158.956 -107.945 Right

10:46:22 AM 6.566901 11.02335 6.000019 14.4998 10:46:22 AM -159.122 -107.475 Right

10:46:23 AM 6.567548 11.02321 6.000019 14.49996

10:46:24 AM -158.734 -107.302 Right

10:46:25 AM 6.567902 11.02105 6.000019 14.49997 10:46:25 AM -157.335 -107.629 Right

10:46:26 AM 6.567883 11.02042 6.000019 14.49985 10:46:26 AM -157.421 -107.581 Right

10:46:27 AM 6.568752 11.01934 6.000019 14.49997 10:46:27 AM -157.56 -106.949 Right

10:46:28 AM 6.569962 11.01767 6.000019 14.49985 10:46:28 AM -156.513 -107.091 Right

10:46:29 AM 6.56933 11.01732 6.000019 14.49996 10:46:29 AM -155.849 -107.5 Right

10:46:30 AM 6.569633 11.01657 6.000019 14.49996 10:46:30 AM -156.09 -107.31 Right

10:46:31 AM 6.570367 11.01589 6.000019 14.49996 10:46:31 AM -156.096 -106.878 Right

10:46:32 AM 6.570165 11.01528 6.000019 14.4998 10:46:32 AM -154.892 -106.888 Right

10:46:33 AM 6.570065 11.0145 6.000019 14.49997 10:46:33 AM -154.838 -106.856 Right

10:46:34 AM -155.12 -106.82 Right

10:46:35 AM 6.569612 11.01409 6.000019 14.49985 10:46:35 AM -154.389 -106.844 Right

10:46:36 AM 6.569494 11.01345 6.000019 14.4998 10:46:36 AM -153.744 -106.875 Right

10:46:37 AM 6.569689 11.01248 6.000019 14.49996

10:46:38 AM 6.569174 11.01228 6.000019 14.49997 10:46:38 AM -151.94 -107.435 Right

10:46:39 AM 6.56943 11.0114 6.000019 14.49985 10:46:39 AM -152.071 -107.06 Right

10:46:40 AM 6.570269 11.01109 6.000019 14.49985 10:46:40 AM -152.299 -106.911 Right

10:46:41 AM 6.570642 11.01048 6.000019 14.49996 10:46:41 AM -152.339 -106.956 Right

10:46:42 AM 6.571862 11.01055 6.000019 14.49997 10:46:42 AM -152.588 -106.869 Right

10:46:43 AM 6.571805 11.00985 6.000019 14.49985 10:46:43 AM -152.231 -107.164 Right

10:46:44 AM 6.571815 11.00929 6.000019 14.4998 10:46:44 AM -150.656 -106.878 Right

10:46:45 AM -150.598 -106.937 Right

10:46:46 AM 6.571054 11.00822 6.000019 14.49996 10:46:46 AM -150.701 -106.953 Right

10:46:47 AM 6.570355 11.00682 6.000019 14.49985 10:46:47 AM -150.279 -107.942 Right

10:46:48 AM 6.569802 11.00642 6.000019 14.4998 10:46:48 AM -149.358 -107.257 Right

10:46:49 AM 6.570101 11.00573 6.000019 14.4998 10:46:49 AM -149.345 -107.75 Right

10:46:50 AM 6.569285 11.00417 6.000019 14.49997

10:46:51 AM 6.571017 11.00324 6.000019 14.49997 10:46:51 AM -148.566 -109.288 Right

10:46:52 AM 6.571182 11.00173 6.000019 14.49997 10:46:52 AM -147.382 -108.46 Right

10:46:53 AM 6.571203 11.00113 6.000019 14.49985 10:46:53 AM -146.68 -108.069 Right

10:46:54 AM 6.57149 11.0007 6.000019 14.4998 10:46:54 AM -146.69 -108.233 Right

10:46:55 AM 6.572985 11.00078 6.000019 14.49985 10:46:55 AM -146.67 -109.069 Right

10:46:56 AM 6.573534 11 6.000019 14.49997 10:46:56 AM -146.225 -110.048 Right

10:46:57 AM -145.07 -109.098 Right

10:46:58 AM 6.572688 10.99891 6.000019 14.49996 10:46:58 AM -145.046 -109.714 Right

10:46:59 AM 6.571851 10.99755 6.000019 14.4998

10:47:00 AM 6.573171 10.99592 6.000019 14.49996 10:47:00 AM -144.327 -112.133 Right

10:47:01 AM 6.573877 10.99536 6.000019 14.4998 10:47:01 AM -143.394 -110.754 Right

10:47:02 AM 6.574798 10.99486 6.000019 14.49996 10:47:02 AM -142.292 -109.33 Right

10:47:03 AM 6.575389 10.99266 6.000019 14.4998 10:47:03 AM -142.515 -111.384 Right

10:47:04 AM 6.57592 10.99097 6.000019 14.49985 10:47:04 AM -141.775 -114.416 Right

10:47:05 AM 6.576553 10.98968 6.000019 14.49997 10:47:05 AM -140.203 -113.669 Right

10:47:06 AM 6.577266 10.9883 6.000019 14.49997 10:47:06 AM -139.523 -112.823 Right

10:47:07 AM 6.578104 10.98692 6.000019 14.49985 10:47:07 AM -139.225 -115.469 Right

10:47:08 AM -137.852 -117.125 Right

10:47:09 AM 6.579064 10.98588 6.000019 14.49996 10:47:09 AM -136.94 -115.426 Right

10:47:10 AM 6.580445 10.98445 6.000019 14.49996 10:47:10 AM -136.564 -116.943 Right

10:47:11 AM 6.581757 10.98374 6.000019 14.4998

10:47:12 AM 6.583086 10.98255 6.000019 14.49996 10:47:12 AM -135.306 -117.861 Right

10:47:13 AM 6.584021 10.98164 6.000019 14.49985 10:47:13 AM -134.285 -116.528 Right

10:47:14 AM 6.585552 10.98039 6.000019 14.49997 10:47:14 AM -134.32 -117.56 Right

10:47:15 AM 6.587013 10.97934 6.000019 14.49996 10:47:15 AM -132.897 -119.445 Right

10:47:16 AM 6.588536 10.97813 6.000019 14.4998 10:47:16 AM -131.702 -118.339 Right

10:47:17 AM 6.590101 10.97822 6.000019 14.49997 10:47:17 AM -131.055 -118.816 Right


Non-Uniform Obstruction with Right Scatter Sample 2

Motion Path Graph – Non-Uniform obstruction with Right Scatter – Sample 2.

Combined logs – Non-Uniform obstruction with Right Scatter – Sample 2.

[Figure data omitted: motion-path plot of CARMI and Child on the X–Z plane (X ≈ 5–9, Z ≈ 9–15), series label 29-6-10-48.]

Referee Log (TimeStamp, CARMI X/Z, Child X/Z) | PathDecider Log (TimeStamp, Tendencies L/R, Verdict)

10:48:28 AM 6 9.5 6 14.5

10:48:29 AM 6 9.5 6 14.5

10:48:30 AM 6 9.5 6 14.5

10:48:31 AM 5.99592 9.500831 6.000019 14.4998

10:48:32 AM 6.009068 9.629572 6.000019 14.4998

10:48:33 AM -132.71 -121.923 Right

10:48:34 AM 6.018241 9.722364 6.000019 14.49996 10:48:34 AM -134.793 -120.342 Right

10:48:35 AM 6.018951 9.722609 6.000019 14.49997 10:48:35 AM -132.379 -120.669 Right

10:48:36 AM 6.019325 9.72325 6.000019 14.49997 10:48:36 AM -131.267 -122.602 Right

10:48:37 AM 6.020663 9.723968 6.000019 14.49996 10:48:37 AM -132.982 -121.241 Right

10:48:38 AM 6.021132 9.724427 6.000019 14.4998 10:48:38 AM -132.61 -121.392 Right

10:48:39 AM 6.022233 9.724319 6.000019 14.49996 10:48:39 AM -130.948 -122.342 Right

10:48:40 AM 6.022468 9.724886 6.000019 14.4998 10:48:40 AM -130.999 -121.761 Right

10:48:41 AM 6.022222 9.767302 6.000019 14.49985 10:48:41 AM -130.463 -121.9 Right

10:48:42 AM 6.016362 10.10283 6.000019 14.49985 10:48:42 AM -130.08 -125.872 Right

10:48:43 AM -125.861 -126.63 Left

10:48:44 AM 6.00977 10.48014 6.000019 14.49996 10:48:44 AM -117.647 -125.408 Left

10:48:45 AM 6.00391 10.81541 6.000019 14.49985 10:48:45 AM -108.618 -122.762 Left

10:48:46 AM 5.987459 11.06953 6.000019 14.49996 10:48:46 AM -109.612 -123.527 Left

10:48:47 AM 5.973427 11.06703 6.000019 14.49985 10:48:47 AM -116.899 -114.422 Right

10:48:48 AM 5.96084 11.07514 6.000019 14.49985

10:48:49 AM 5.952713 11.08417 6.000019 14.49997 10:48:49 AM -125.39 -112.528 Right

10:48:50 AM 5.985577 11.10983 6.000019 14.4998 10:48:50 AM -134.342 -108.657 Right

10:48:51 AM 6.028952 11.12534 6.000019 14.49985 10:48:51 AM -140.865 -110.024 Right

10:48:52 AM 6.087961 11.13871 6.000019 14.49996 10:48:52 AM -148.442 -114.168 Right

10:48:53 AM 6.13534 11.14533 6.000019 14.4998 10:48:53 AM -146.531 -109.406 Right

10:48:54 AM 6.191909 11.13982 6.000019 14.49997 10:48:54 AM -149.177 -109.559 Right

10:48:55 AM -155.428 -109.39 Right

10:48:56 AM 6.240949 11.13314 6.000019 14.49997 10:48:56 AM -158.071 -108.683 Right

10:48:57 AM 6.282056 11.11869 6.000019 14.49997 10:48:57 AM -159.71 -107.75 Right

10:48:58 AM 6.332334 11.09471 6.000019 14.4998 10:48:58 AM -159.748 -107.205 Right

10:48:59 AM 6.392854 11.07143 6.000019 14.49985 10:48:59 AM -158.088 -107.451 Right

10:49:00 AM 6.441599 11.0538 6.000019 14.4998 10:49:00 AM -158.056 -108.785 Right

10:49:01 AM 6.488751 11.03888 6.000019 14.49997 10:49:01 AM -159.709 -108.889 Right

10:49:02 AM 6.535773 11.02227 6.000019 14.49996 10:49:02 AM -160.04 -108.38 Right

10:49:03 AM 6.588978 11.00392 6.000019 14.4998

10:49:04 AM 6.636698 10.98684 6.000019 14.4998 10:49:04 AM -159.712 -109.505 Right

10:49:05 AM 6.680408 10.96762 6.000019 14.4998 10:49:05 AM -159.402 -108.284 Right

10:49:06 AM 6.72623 10.95802 6.000019 14.49997 10:49:06 AM -159.332 -107.888 Right

10:49:07 AM 6.77289 10.94839 6.000019 14.49997 10:49:07 AM -158.034 -108.114 Right

10:49:08 AM -157.738 -108.273 Right

10:49:09 AM 6.828001 10.93324 6.000019 14.49985 10:49:09 AM -157.738 -108.324 Right

10:49:10 AM 6.892042 10.92458 6.000019 14.49996 10:49:10 AM -157.022 -107.175 Right

10:49:11 AM 6.935096 10.91855 6.000019 14.49985 10:49:11 AM -156.743 -108.717 Right

10:49:12 AM 6.980083 10.90996 6.000019 14.4998 10:49:12 AM -157.483 -108.025 Right

10:49:13 AM 7.017516 10.90872 6.000019 14.49997 10:49:13 AM -155.416 -109.184 Right

10:49:14 AM 7.063556 10.90722 6.000019 14.49997 10:49:14 AM -155.407 -108.072 Right

10:49:15 AM 7.118571 10.89903 6.000019 14.4998 10:49:15 AM -156.236 -109.088 Right

10:49:16 AM 7.174786 10.89298 6.000019 14.4998 10:49:16 AM -156.419 -109.466 Right

10:49:17 AM 7.215933 10.88938 6.000019 14.49996 10:49:17 AM -156.236 -110.046 Right

10:49:18 AM 7.264478 10.89126 6.000019 14.49985 10:49:18 AM -155.73 -111.292 Right

10:49:20 AM 7.317478 10.88759 6.000019 14.49985 10:49:20 AM -156.758 -110.175 Right

10:49:21 AM 7.370412 10.88661 6.000019 14.4998 10:49:21 AM -156.802 -109.903 Right

10:49:22 AM 7.431563 10.88682 6.000019 14.49996 10:49:22 AM -156.407 -109.974 Right

10:49:23 AM 7.481662 10.88469 6.000019 14.49985 10:49:23 AM -156.403 -111.606 Right

10:49:24 AM 7.534972 10.89207 6.000019 14.49997 10:49:24 AM -157.08 -110.23 Right

10:49:25 AM 7.587667 10.88978 6.000019 14.4998 10:49:25 AM -156.744 -112.294 Right

10:49:26 AM 7.630478 10.89282 6.000019 14.49996 10:49:26 AM -157.09 -112.263 Right

10:49:27 AM 7.681082 10.89302 6.000019 14.49985 10:49:27 AM -157.731 -110.933 Right

10:49:28 AM 7.729269 10.88848 6.000019 14.4998 10:49:28 AM -159.713 -110.795 Right

10:49:29 AM 7.784951 10.88866 6.000019 14.4998 10:49:29 AM -159.737 -111.53 Right

10:49:30 AM -158.392 -110.936 Right

10:49:31 AM 7.830725 10.88566 6.000019 14.4998 10:49:31 AM -158.401 -110.239 Right

10:49:32 AM 7.890501 10.89056 6.000019 14.49985 10:49:32 AM -158.374 -110.504 Right

10:49:33 AM 7.939007 10.88898 6.000019 14.49997 10:49:33 AM -159.051 -111.423 Right

10:49:34 AM 7.986757 10.88636 6.000019 14.49985 10:49:34 AM -159.405 -111.284 Right

10:49:35 AM 8.039586 10.88745 6.000019 14.49996 10:49:35 AM -158.385 -109.397 Right

10:49:36 AM 8.094811 10.88987 6.000019 14.49985

10:49:37 AM 8.131251 10.89508 6.000019 14.49997 10:49:37 AM -158.523 -110.836 Right

10:49:38 AM 8.176563 10.90366 6.000019 14.49997 10:49:38 AM -155.523 -111.281 Right

10:49:39 AM 8.231293 10.91576 6.000019 14.49985 10:49:39 AM -152.779 -110.483 Right

10:49:40 AM 8.289344 10.94089 6.000019 14.49997 10:49:40 AM -153.349 -112.228 Right

10:49:41 AM -150.696 -110.81 Right

10:49:42 AM 8.345519 10.96112 6.000019 14.49996 10:49:42 AM -151.937 -110.94 Right

10:49:43 AM 8.389972 10.98054 6.000019 14.49985 10:49:43 AM -148.683 -111.759 Right

10:49:44 AM 8.431435 11.00328 6.000019 14.49996 10:49:44 AM -145.437 -111.531 Right

10:49:45 AM 8.469328 11.03376 6.000019 14.49985 10:49:45 AM -143.15 -109.378 Right

10:49:46 AM 8.503297 11.06351 6.000019 14.49997 10:49:46 AM -143.65 -110.372 Right

10:49:47 AM 8.532343 11.09614 6.000019 14.49996 10:49:47 AM -139.962 -113.601 Right

10:49:48 AM 8.55698 11.13955 6.000019 14.4998 10:49:48 AM -136.877 -107.859 Right

10:49:49 AM 8.586799 11.1834 6.000019 14.49997 10:49:49 AM -136.966 -110.191 Right

10:49:50 AM 8.60596 11.22488 6.000019 14.49996 10:49:50 AM -134.525 -112.419 Right

10:49:51 AM 8.620514 11.27305 6.000019 14.49997 10:49:51 AM -129.555 -106.039 Right

10:49:52 AM 8.636838 11.32127 6.000019 14.49996 10:49:52 AM -135.688 -106.03 Right

10:49:53 AM -138.026 -106.087 Right

10:49:54 AM 8.660669 11.37096 6.000019 14.49985

10:49:55 AM 8.680889 11.4193 6.000019 14.49997 10:49:55 AM -138.332 -105.745 Right

10:49:56 AM 8.696641 11.45739 6.000019 14.49985 10:49:56 AM -138.194 -105.767 Right

10:49:57 AM 8.714813 11.50523 6.000019 14.49997 10:49:57 AM -137.605 -105.56 Right

10:49:58 AM 8.733751 11.55114 6.000019 14.49997 10:49:58 AM -138.131 -105.691 Right

10:49:59 AM 8.753076 11.59394 6.000019 14.49985 10:49:59 AM -138.673 -105.829 Right

10:50:00 AM 8.768402 11.63455 6.000019 14.49985 10:50:00 AM -137.724 -105.804 Right

10:50:01 AM 8.777872 11.67525 6.000019 14.49996 10:50:01 AM -134.969 -106.888 Right

10:50:02 AM 8.787521 11.71886 6.000019 14.49996 10:50:02 AM -134.061 -107.205 Right

10:50:03 AM 8.797791 11.76655 6.000019 14.4998 10:50:03 AM -133.896 -106.367 Right

10:50:04 AM 8.798552 11.81849 6.000019 14.49996 10:50:04 AM -133.149 -107.397 Right

10:50:05 AM -131.328 -107.105 Right

10:50:06 AM 8.801516 11.87918 6.000019 14.49996 10:50:06 AM -128.457 -108.196 Right

10:50:07 AM 8.792281 11.92729 6.000019 14.49996 10:50:07 AM -125.662 -109.687 Right

10:50:08 AM 8.782799 11.97552 6.000019 14.4998 10:50:08 AM -125.944 -108.996 Right

10:50:09 AM 8.772814 12.01702 6.000019 14.49997 10:50:09 AM -120.094 -109.701 Right

10:50:10 AM 8.753618 12.06511 6.000019 14.4998 10:50:10 AM -120.897 -109.682 Right

10:50:11 AM 8.738868 12.10582 6.000019 14.49997

10:50:12 AM 8.71206 12.1451 6.000019 14.4998 10:50:12 AM -119.367 -109.212 Right

10:50:13 AM 8.68756 12.1848 6.000019 14.4998 10:50:13 AM -118.161 -111.606 Right

10:50:14 AM 8.671518 12.20762 6.000019 14.49997 10:50:14 AM -109.393 -112.875 Left

10:50:15 AM 8.502569 12.34295 6.000019 14.49996 10:50:15 AM -111.599 -112.478 Left

10:50:16 AM 8.243203 12.55266 6.000019 14.49996 10:50:16 AM -109.938 -114.899 Left

10:50:17 AM 8.059642 12.70112 6.000019 14.49996 10:50:17 AM -108.31 -115.247 Left

10:50:18 AM 8.059722 12.70107 6.000019 14.4998 10:50:18 AM -109.198 -115.031 Left

10:50:19 AM -109.188 -115.021 Left

10:50:20 AM 8.059722 12.70107 6.000019 14.49985 10:50:20 AM -109.185 -115.018 Left

10:50:21 AM 8.059722 12.70107 6.000019 14.49996 10:50:21 AM -109.191 -115.031 Left

10:50:22 AM 8.059722 12.70107 6.000019 14.49997


Non-Uniform Obstruction with Right Scatter Sample 3

Motion Path Graph – Non-Uniform obstruction with Right Scatter – Sample 3.

Combined logs – Non-Uniform obstruction with Right Scatter – Sample 3.

[Figure data omitted: motion-path plot of CARMI and Child on the X–Z plane (X ≈ 5–9, Z ≈ 9–15), series label 29-6-10-52.]

Referee Log (TimeStamp, CARMI X/Z, Child X/Z) | PathDecider Log (TimeStamp, Tendencies L/R, Verdict)

10:52:40 AM 6 9.5 6 14.5

10:52:41 AM 6 9.5 6 14.5

10:52:42 AM 6 9.5 6 14.5

10:52:43 AM 5.990528 9.498259 6.000019 14.49985

10:52:44 AM 5.983474 9.48295 6.000019 14.49985

10:52:45 AM 5.964692 9.447685 6.000019 14.49997

10:52:46 AM -135.351 -112.688 Right

10:52:47 AM 5.976819 9.472078 6.000019 14.49997

10:52:48 AM 5.995047 9.528976 6.000019 14.49996 10:52:48 AM -132.694 -112.963 Right

10:52:49 AM 6.001012 9.544574 6.000019 14.49997 10:52:49 AM -129.037 -123.193 Right

10:52:50 AM 5.999889 9.54507 6.000019 14.49996 10:52:50 AM -130.817 -122.891 Right

10:52:51 AM 6.004016 9.72869 6.000019 14.4998 10:52:51 AM -133.053 -120.466 Right

10:52:52 AM 6.003926 9.733294 6.000019 14.49997 10:52:52 AM -131.333 -121.208 Right

10:52:53 AM 6.003814 9.891059 6.000019 14.49985 10:52:53 AM -130.561 -123.944 Right

10:52:54 AM 6.003565 10.23235 6.000019 14.49985 10:52:54 AM -130.185 -125.26 Right

10:52:55 AM 6.003317 10.57364 6.000019 14.49997 10:52:55 AM -123.365 -125.353 Left

10:52:56 AM 6.003062 10.92462 6.000019 14.4998 10:52:56 AM -114.681 -123.52 Left

10:52:57 AM 5.995727 11.01945 6.000019 14.49996 10:52:57 AM -108.929 -123.517 Left

10:52:58 AM -116.685 -114.294 Right

10:52:59 AM 5.987431 11.02194 6.000019 14.4998

10:53:00 AM 5.979954 11.02625 6.000019 14.49996 10:53:00 AM -125.151 -121.906 Right

10:53:01 AM 5.980797 11.04088 6.000019 14.49985 10:53:01 AM -126.948 -122.591 Right

10:53:02 AM 5.97635 11.04935 6.000019 14.49997 10:53:02 AM -129.48 -110.527 Right

10:53:03 AM 5.972887 11.05684 6.000019 14.4998 10:53:03 AM -135.907 -110.954 Right

10:53:04 AM 6.014514 11.08062 6.000019 14.4998 10:53:04 AM -140.432 -112.371 Right

10:53:05 AM 6.063323 11.10123 6.000019 14.4998 10:53:05 AM -142.056 -109.402 Right

10:53:06 AM 6.107279 11.10958 6.000019 14.49997 10:53:06 AM -146.553 -109.286 Right

10:53:07 AM 6.159662 11.10649 6.000019 14.49985 10:53:07 AM -155.076 -109.011 Right

10:53:08 AM 6.212564 11.09899 6.000019 14.49985 10:53:08 AM -155.751 -108.187 Right

10:53:09 AM -157.718 -109.506 Right

10:53:10 AM 6.25217 11.08564 6.000019 14.4998 10:53:10 AM -158.383 -108.117 Right

10:53:11 AM 6.29176 11.06707 6.000019 14.49997 10:53:11 AM -158.377 -106.647 Right

10:53:12 AM 6.333875 11.05068 6.000019 14.49996 10:53:12 AM -158.384 -107.564 Right

10:53:13 AM 6.381655 11.03246 6.000019 14.49996 10:53:13 AM -158.394 -108.101 Right

10:53:14 AM 6.417175 11.01487 6.000019 14.49996 10:53:14 AM -158.387 -109.07 Right

10:53:15 AM 6.461335 10.99534 6.000019 14.4998

10:53:16 AM 6.505663 10.97607 6.000019 14.49997 10:53:16 AM -158.054 -106.582 Right

10:53:17 AM 6.553511 10.96292 6.000019 14.49996 10:53:17 AM -158.308 -107.949 Right

10:53:18 AM 6.598693 10.9533 6.000019 14.49985 10:53:18 AM -157.61 -107.732 Right

10:53:19 AM 6.638103 10.94376 6.000019 14.49997 10:53:19 AM -156.833 -107.427 Right

10:53:20 AM 6.720123 10.93359 6.000019 14.49985 10:53:20 AM -157.147 -107.164 Right

10:53:21 AM -156.413 -109.544 Right

10:53:22 AM 6.757442 10.92128 6.000019 14.49997 10:53:22 AM -157.677 -107.981 Right

10:53:23 AM 6.813384 10.91371 6.000019 14.49985 10:53:23 AM -156.376 -108.387 Right

10:53:24 AM 6.875503 10.90206 6.000019 14.49996 10:53:24 AM -155.914 -109.041 Right

10:53:25 AM 6.929533 10.90117 6.000019 14.49985 10:53:25 AM -156.186 -110.265 Right

10:53:26 AM 6.986537 10.89872 6.000019 14.4998 10:53:26 AM -155.74 -109.148 Right

10:53:27 AM 7.038148 10.89448 6.000019 14.49997 10:53:27 AM -154.433 -109.864 Right

10:53:28 AM 7.083647 10.89595 6.000019 14.49985 10:53:28 AM -154.601 -112.082 Right

10:53:29 AM 7.128892 10.88597 6.000019 14.49985 10:53:29 AM -156.872 -109.209 Right

10:53:30 AM 7.179657 10.88296 6.000019 14.4998 10:53:30 AM -156.306 -109.468 Right

10:53:31 AM 7.23559 10.88592 6.000019 14.49996

10:53:32 AM 7.29436 10.88703 6.000019 14.49985 10:53:32 AM -155.111 -110.351 Right

10:53:33 AM -154.422 -109.447 Right

10:53:34 AM 7.348278 10.88949 6.000019 14.4998 10:53:34 AM -155.486 -112.373 Right

10:53:35 AM 7.405777 10.89291 6.000019 14.49996 10:53:35 AM -156.363 -107.649 Right

10:53:36 AM 7.458394 10.89381 6.000019 14.49997 10:53:36 AM -156.399 -108.95 Right

10:53:37 AM 7.506371 10.8902 6.000019 14.49996 10:53:37 AM -157.064 -111.3 Right

10:53:38 AM 7.558899 10.8893 6.000019 14.4998 10:53:38 AM -158.081 -110.418 Right

10:53:39 AM 7.601651 10.88891 6.000019 14.4998 10:53:39 AM -158.058 -109.881 Right

10:53:40 AM 7.653618 10.88245 6.000019 14.49996 10:53:40 AM -158.054 -111.747 Right

10:53:41 AM 7.704514 10.88886 6.000019 14.49997 10:53:41 AM -157.73 -110.094 Right

10:53:42 AM 7.750939 10.8865 6.000019 14.49996 10:53:42 AM -157.74 -111.849 Right

10:53:43 AM 7.807344 10.8842 6.000019 14.49997 10:53:43 AM -158.396 -111.206 Right

10:53:44 AM 7.859224 10.88212 6.000019 14.4998 10:53:44 AM -158.389 -111.773 Right

10:53:45 AM -158.397 -110.244 Right

10:53:46 AM 7.911872 10.89171 6.000019 14.49997 10:53:46 AM -159.069 -113.59 Right

10:53:47 AM 7.966067 10.88703 6.000019 14.4998 10:53:47 AM -159.71 -111.984 Right

10:53:48 AM 8.021511 10.88443 6.000019 14.49996 10:53:48 AM -159.763 -110.955 Right

10:53:49 AM 8.072015 10.88268 6.000019 14.4998

10:53:50 AM 8.120579 10.88816 6.000019 14.49985 10:53:50 AM -158.223 -110.914 Right

10:53:51 AM 8.165326 10.89625 6.000019 14.4998 10:53:51 AM -154.928 -111.881 Right

10:53:52 AM 8.218029 10.90933 6.000019 14.49985 10:53:52 AM -153.377 -111.077 Right

10:53:53 AM 8.261042 10.92638 6.000019 14.49985 10:53:53 AM -154.02 -110.843 Right

10:53:54 AM 8.298958 10.93854 6.000019 14.49997 10:53:54 AM -151.259 -110.928 Right

10:53:55 AM 8.348174 10.96123 6.000019 14.49985 10:53:55 AM -150.58 -111.4 Right

10:53:56 AM -149.91 -110.882 Right

10:53:57 AM 8.388549 10.98051 6.000019 14.4998 10:53:57 AM -148.548 -112.013 Right

10:53:58 AM 8.427267 11.00776 6.000019 14.49985 10:53:58 AM -144.717 -112.751 Right

10:53:59 AM 8.473227 11.04639 6.000019 14.49997 10:53:59 AM -143.321 -110.777 Right

10:54:00 AM 8.506916 11.08338 6.000019 14.49985 10:54:00 AM -142.628 -111.792 Right

10:54:01 AM 8.542924 11.12478 6.000019 14.49997 10:54:01 AM -139.227 -110.664 Right

10:54:02 AM 8.568088 11.15876 6.000019 14.49996 10:54:02 AM -136.317 -109.043 Right

10:54:03 AM 8.591011 11.2053 6.000019 14.49985 10:54:03 AM -135.81 -106.855 Right

10:54:04 AM 8.610962 11.25364 6.000019 14.49997 10:54:04 AM -135.477 -106.463 Right

10:54:05 AM 8.623846 11.30002 6.000019 14.49996 10:54:05 AM -135.518 -105.044 Right

10:54:06 AM 8.646067 11.35177 6.000019 14.4998 10:54:06 AM -136.213 -106.328 Right

10:54:08 AM 8.662441 11.39136 6.000019 14.49997 10:54:08 AM -136.133 -104.911 Right

10:54:09 AM 8.676679 11.43199 6.000019 14.4998 10:54:09 AM -138.061 -104.817 Right

10:54:10 AM 8.694728 11.47722 6.000019 14.49997 10:54:10 AM -138.703 -105.996 Right

10:54:11 AM 8.715306 11.52419 6.000019 14.49985 10:54:11 AM -138.371 -105.558 Right

10:54:12 AM 8.732287 11.57507 6.000019 14.49997 10:54:12 AM -136.94 -104.998 Right

10:54:13 AM 8.754978 11.62936 6.000019 14.49985 10:54:13 AM -135.752 -107.284 Right

10:54:14 AM 8.767001 11.67349 6.000019 14.4998 10:54:14 AM -137.311 -105.438 Right

10:54:15 AM 8.779992 11.72279 6.000019 14.49996 10:54:15 AM -132.107 -107.746 Right

10:54:16 AM 8.787507 11.77128 6.000019 14.49996 10:54:16 AM -131.524 -105.908 Right

10:54:17 AM 8.791326 11.81702 6.000019 14.49996 10:54:17 AM -132.436 -107.511 Right

10:54:18 AM 8.793837 11.86645 6.000019 14.49996 10:54:18 AM -130.448 -107.309 Right

10:54:19 AM -128.027 -108.854 Right

10:54:20 AM 8.786958 11.91652 6.000019 14.49985 10:54:20 AM -127.804 -108.292 Right

10:54:21 AM 8.782507 11.96116 6.000019 14.49997 10:54:21 AM -127.025 -109.036 Right

10:54:22 AM 8.769834 12.00736 6.000019 14.49996 10:54:22 AM -117.526 -111.993 Right

10:54:23 AM 8.755476 12.04625 6.000019 14.49985 10:54:23 AM -121.104 -107.875 Right

10:54:24 AM 8.738588 12.08933 6.000019 14.49997 10:54:24 AM -123.031 -110.738 Right

10:54:25 AM 8.716022 12.14155 6.000019 14.49996 10:54:25 AM -113.126 -112.385 Right

10:54:26 AM 8.68796 12.18189 6.000019 14.49985

10:54:27 AM 8.681116 12.19664 6.000019 14.49997 10:54:27 AM -110.794 -113.232 Left

10:54:28 AM 8.621801 12.25184 6.000019 14.49996 10:54:28 AM -113.744 -114.925 Left

10:54:29 AM 8.373569 12.49205 6.000019 14.4998 10:54:29 AM -116.175 -111.088 Right

10:54:30 AM 8.140544 12.71758 6.000019 14.49996 10:54:30 AM -117.775 -111.673 Right

10:54:31 AM -110.326 -111.802 Left

10:54:32 AM 8.139504 12.71844 6.000019 14.49985 10:54:32 AM -110.243 -111.927 Left

10:54:33 AM 8.139413 12.71875 6.000019 14.49997 10:54:33 AM -110.108 -111.913 Left

10:54:34 AM 8.139585 12.71886 6.000019 14.49985 10:54:34 AM -110.14 -111.911 Left
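
Note on post-processing these combined logs: each row interleaves a Referee-log record (timestamp plus the CARMI and Child X–Z positions) with, where available, a PathDecider-log record (timestamp, left/right tendency values and the verdict); rows where one logger missed a tick carry only the other logger's fields. The short Python sketch below shows one way such a row could be split back into its two records. It is an illustrative reading aid based only on the column layout visible above; the regular expressions and function name are assumptions, not part of the original MRDS tooling, and rows logged during a "Recenter" manoeuvre (a single heading value instead of an L/R pair) would need a further pattern.

import re

# Column layout assumed from the combined logs above:
#   Referee log     : TimeStamp, CARMI X, CARMI Z, Child X, Child Z
#   PathDecider log : TimeStamp, tendency L, tendency R, Verdict
TIME = r"\d{1,2}:\d{2}:\d{2} [AP]M"
NUM  = r"-?\d+(?:\.\d+)?"

BOTH_LOGS = re.compile(
    rf"^(?P<ref_t>{TIME}) (?P<carmi_x>{NUM}) (?P<carmi_z>{NUM}) (?P<child_x>{NUM}) (?P<child_z>{NUM})"
    rf"(?: (?P<pd_t>{TIME}) (?P<l>{NUM}) (?P<r>{NUM}) (?P<verdict>\S+))?$"
)
PATHDECIDER_ONLY = re.compile(
    rf"^(?P<pd_t>{TIME}) (?P<l>{NUM}) (?P<r>{NUM}) (?P<verdict>\S+)$"
)

def parse_combined_line(line):
    """Split one combined-log row into its named fields; None if the row does not match."""
    line = " ".join(line.split())  # collapse irregular spacing from the PDF extraction
    match = BOTH_LOGS.match(line) or PATHDECIDER_ONLY.match(line)
    return match.groupdict() if match else None

# Example, using a row copied verbatim from the listing above:
print(parse_combined_line(
    "10:52:48 AM 5.995047 9.528976 6.000019 14.49996 10:52:48 AM -132.694 -112.963 Right"))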


Non-Uniform Obstruction with Right Scatter Sample 4

Motion Path Graph – Non-Uniform obstruction with Right Scatter – Sample 4.

Combined logs – Non-Uniform obstruction with Right Scatter – Sample 4.

[Motion path graph – axis ticks removed; plotted series: CARMI and Child (run 29-6-10-55)]

Referee Log                                                     PathDecider Log
TimeStamp    CARMI X    CARMI Z    Child X    Child Z    TimeStamp    Tendency L    Tendency R    Verdict

10:55:01 AM 6 9.5 6 14.5

10:55:02 AM 6 9.5 6 14.5

10:55:03 AM 6 9.5 6 14.5

10:55:05 AM 5.99464 9.501786 6.000019 14.49996

10:55:06 AM 6.006845 9.621773 6.000019 14.4998 10:55:06 AM -132.732 -120.382 Right

10:55:07 AM 6.00764 9.621551 6.000019 14.49997 10:55:07 AM -133.257 -119.024 Right

10:55:08 AM 6.00829 9.62191 6.000019 14.4998 10:55:08 AM -131.037 -119.924 Right

10:55:09 AM 6.015332 9.805927 6.000019 14.49996 10:55:09 AM -131.831 -120.684 Right

10:55:10 AM 6.027221 10.14706 6.000019 14.49985 10:55:10 AM -132.337 -124.046 Right

10:55:11 AM -127.567 -125.826 Right

10:55:12 AM 6.039108 10.48818 6.000019 14.4998 10:55:12 AM -119.081 -124.387 Left

10:55:13 AM 6.050994 10.82931 6.000019 14.49997 10:55:13 AM -109.898 -121.835 Left

10:55:14 AM 6.053479 11.04907 6.000019 14.4998 10:55:14 AM -111.485 -115.142 Left

10:55:15 AM 6.042463 11.05197 6.000019 14.49996 10:55:15 AM -120.429 -110.739 Right

10:55:16 AM 6.014875 11.0544 6.000019 14.4998

10:55:17 AM 6.008541 11.06881 6.000019 14.49997 10:55:17 AM -132.948 -108.783 Right

10:55:18 AM 6.058201 11.09011 6.000019 14.4998 10:55:18 AM -141.23 -109.531 Right

10:55:19 AM 6.106407 11.10443 6.000019 14.49996 10:55:19 AM -144.5 -110.117 Right

10:55:20 AM 6.154459 11.11225 6.000019 14.49985 10:55:20 AM -144.966 -107.954 Right

10:55:21 AM -151.79 -109.332 Right

10:55:22 AM 6.201403 11.11016 6.000019 14.49985 10:55:22 AM -155.436 -109.908 Right

10:55:23 AM 6.257267 11.10237 6.000019 14.4998 10:55:23 AM -156.397 -108.47 Right

10:55:24 AM 6.311574 11.07715 6.000019 14.49996 10:55:24 AM 121 Recenter-Right

10:55:25 AM 6.33921 11.06219 6.000019 14.49997 10:55:25 AM -159.71 -108.207 Right

10:55:26 AM 6.339808 11.06081 6.000019 14.49996 10:55:26 AM -155.983 -107.931 Right

10:55:27 AM 6.340755 11.05966 6.000019 14.4998 10:55:27 AM -155.426 -108.139 Right

10:55:28 AM 6.340847 11.05806 6.000019 14.49997 10:55:28 AM -155.548 -108.232 Right

10:55:29 AM 6.341109 11.05728 6.000019 14.49985 10:55:29 AM -155.989 -107.883 Right

10:55:30 AM 6.340553 11.05637 6.000019 14.4998

10:55:31 AM 6.341751 11.0553 6.000019 14.49997 10:55:31 AM -156.481 -107.382 Right

10:55:32 AM 6.342431 11.05495 6.000019 14.49985 10:55:32 AM -156.302 -107.033 Right

10:55:33 AM -155.435 -107.111 Right

10:55:34 AM 6.342547 11.05361 6.000019 14.4998 10:55:34 AM -154.012 -107.586 Right

10:55:35 AM 6.342362 11.05349 6.000019 14.49996 10:55:35 AM -154.3 -107.257 Right

10:55:36 AM 6.342402 11.05256 6.000019 14.49996 10:55:36 AM -154.509 -107.1 Right

10:55:37 AM 6.342904 11.05222 6.000019 14.49985 10:55:37 AM -154.401 -107.262 Right

10:55:38 AM 6.342176 11.05195 6.000019 14.49997 10:55:38 AM -153.941 -107.888 Right

10:55:39 AM 6.344245 11.05095 6.000019 14.49997 10:55:39 AM -152.166 -107.229 Right

10:55:40 AM 6.344777 11.05007 6.000019 14.49985 10:55:40 AM -152.037 -107.299 Right

10:55:41 AM 6.345498 11.04932 6.000019 14.49996 10:55:41 AM -152.316 -107.51 Right

10:55:42 AM 6.346025 11.0486 6.000019 14.49985 10:55:42 AM -152.249 -107.796 Right

10:55:43 AM 6.346815 11.04809 6.000019 14.4998 10:55:43 AM -152.238 -108.433 Right

10:55:44 AM 6.347118 11.04744 6.000019 14.49996 10:55:44 AM -151.463 -108.472 Right

10:55:45 AM -150.587 -107.691 Right

10:55:46 AM 6.347539 11.04639 6.000019 14.49996 10:55:46 AM -150.645 -108.084 Right

10:55:47 AM 6.348439 11.0453 6.000019 14.49997

10:55:48 AM 6.348692 11.04462 6.000019 14.49985 10:55:48 AM -150.529 -108.861 Right

10:55:49 AM 6.348737 11.04416 6.000019 14.49997 10:55:49 AM -150.049 -109.66 Right

10:55:50 AM 6.348502 11.04354 6.000019 14.49997 10:55:50 AM -148.899 -108.117 Right

10:55:51 AM 6.349028 11.04311 6.000019 14.49997 10:55:51 AM -148.956 -108.633 Right

10:55:52 AM 6.350311 11.04244 6.000019 14.49985 10:55:52 AM -148.994 -109.037 Right

10:55:53 AM 6.350255 11.04188 6.000019 14.4998 10:55:53 AM -148.675 -110.084 Right

10:55:54 AM 6.350607 11.04123 6.000019 14.49997 10:55:54 AM -148.408 -110.438 Right

10:55:55 AM 6.351353 11.04047 6.000019 14.49985 10:55:55 AM -146.97 -109.437 Right

10:55:56 AM -146.982 -109.519 Right

10:55:57 AM 6.352343 11.04001 6.000019 14.49997 10:55:57 AM -147.318 -110.058 Right

10:55:58 AM 6.352273 11.03846 6.000019 14.49996 10:55:58 AM -147.105 -110.726 Right

10:55:59 AM 6.352028 11.03798 6.000019 14.4998 10:55:59 AM -146.174 -111.632 Right

10:56:00 AM 6.351964 11.03701 6.000019 14.4998 10:56:00 AM -145.251 -109.969 Right

10:56:01 AM 6.350976 11.03564 6.000019 14.49996 10:56:01 AM -145.345 -110.616 Right

10:56:02 AM 6.351787 11.03488 6.000019 14.4998 10:56:02 AM -145.292 -111.425 Right

10:56:03 AM 6.35277 11.03427 6.000019 14.49997 10:56:03 AM -144.874 -112.449 Right

10:56:04 AM 6.352327 11.03283 6.000019 14.49996

10:56:05 AM 6.352884 11.03106 6.000019 14.49985 10:56:05 AM -143.681 -111.995 Right

10:56:06 AM 6.353895 11.02998 6.000019 14.49997 10:56:06 AM -143.014 -111.851 Right

10:56:07 AM 6.354368 11.0292 6.000019 14.49997 10:56:07 AM -143.284 -112.98 Right

10:56:08 AM -142.688 -114.614 Right

10:56:09 AM 6.354547 11.02813 6.000019 14.49997 10:56:09 AM -141.341 -114.004 Right

10:56:10 AM 6.354764 11.02725 6.000019 14.49996 10:56:10 AM -141.029 -113.367 Right

10:56:11 AM 6.355392 11.02581 6.000019 14.49997 10:56:11 AM -140.585 -116.503 Right

10:56:12 AM 6.356277 11.02415 6.000019 14.49996
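
As a quick sanity check on the Referee-log columns, the CARMI-to-child separation at any row is simply the planar distance between the two X–Z pairs. The snippet below (an illustrative calculation only; distance units are whatever the simulation scene uses, assumed here to be metres) evaluates it for the final row of this sample, giving roughly 3.49 units.

from math import hypot

# Final Referee-log row of this sample, copied from the listing above:
carmi_x, carmi_z = 6.356277, 11.02415
child_x, child_z = 6.000019, 14.49996

# Planar (X-Z) separation between CARMI and the child target.
separation = hypot(child_x - carmi_x, child_z - carmi_z)
print(f"CARMI-child separation: {separation:.3f} units")  # -> about 3.494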


Local Minima Test Result Dataset

Combined logs – Local Minima scenario.

Referee Log                                                     PathDecider Log
TimeStamp    CARMI X    CARMI Z    Child X    Child Z    TimeStamp    Tendency L    Tendency R    Verdict

1:51:19 AM 6 9.5 6 14.5

1:51:20 AM 6 9.5 6 14.5

1:51:22 AM 6 9.5 6 14.5

1:51:23 AM 6 9.5 6 14.5

1:51:24 AM 5.997074 9.500979 6.000019 14.49985

1:51:25 AM 5.998401 9.538817 6.000019 14.49985 1:51:25 AM -29.0057 -15.1729 Right

1:51:26 AM 6.00367 9.555366 6.000019 14.49985 1:51:26 AM -33.5997 -17.3274 Right

1:51:27 AM 6.006526 9.557668 6.000019 14.49997 1:51:27 AM -22.5675 -17.3554 Right

1:51:28 AM 6.012127 9.648153 6.000019 14.49985 1:51:28 AM -19.8175 -15.9363 Right

1:51:29 AM 6.029491 9.987835 6.000019 14.4998

1:51:30 AM 6.046899 10.32726 6.000019 14.49997 1:51:30 AM -21.6316 -18.2782 Right

1:51:31 AM 6.064361 10.66666 6.000019 14.49996 1:51:31 AM -26.7912 -23.9104 Right

1:51:32 AM 6.081869 11.00602 6.000019 14.49996 1:51:32 AM -39.3131 -38.2815 Right

1:51:33 AM -101.506 -108.018 Left

1:51:34 AM 6.073533 11.05243 6.000019 14.49985 1:51:34 AM -68.2009 -153.093 Left

1:51:35 AM 6.064908 11.0612 6.000019 14.4998 1:51:35 AM -99.0617 -176.442 Left

1:51:36 AM 6.058323 11.07219 6.000019 14.49996 1:51:36 AM -107.03 -206.2 Left

1:51:37 AM 6.053562 11.08579 6.000019 14.4998 1:51:37 AM -110.555 -230.716 Left

1:51:38 AM 6.064655 11.10001 6.000019 14.49997 1:51:38 AM -112.048 -231.451 Left

1:51:39 AM 6.1245 11.1245 6.000019 14.49985 1:51:39 AM -120.252 -232.122 Left

1:51:40 AM 6.171119 11.13704 6.000019 14.49997 1:51:40 AM -132.594 -218.977 Left

1:51:41 AM 6.232832 11.13502 6.000019 14.49985

1:51:42 AM 6.28772 11.12419 6.000019 14.49997 1:51:42 AM -144.502 -147.204 Left

1:51:43 AM 6.341294 11.10684 6.000019 14.49997 1:51:43 AM -157.182 -99.5754 Right

1:51:44 AM -165.578 -30.0441 Right

1:51:45 AM 6.390291 11.08646 6.000019 14.49985 1:51:45 AM -166.838 -75.06 Right

1:51:46 AM 6.427298 11.0713 6.000019 14.4998 1:51:46 AM -166.784 -77.3409 Right

1:51:47 AM 6.473382 11.0588 6.000019 14.49985 1:51:47 AM -162.965 -68.358 Right

1:51:48 AM 6.52168 11.03084 6.000019 14.4998 1:51:48 AM 121 Recenter-Right

1:51:49 AM 6.548649 11.01481 6.000019 14.49985 1:51:49 AM 124 Recenter-Right

1:51:50 AM 6.547762 11.0135 6.000019 14.49996 1:51:50 AM -156.974 -71.7713 Right

1:51:51 AM 6.548935 11.00974 6.000019 14.49996 1:51:51 AM -151.843 -81.6761 Right

1:51:52 AM 6.551022 11.0071 6.000019 14.49996 1:51:52 AM -151.541 -84.7626 Right

1:51:53 AM 6.551851 11.00519 6.000019 14.49985 1:51:53 AM -151.526 -90.6611 Right

1:51:54 AM 6.554352 11.00197 6.000019 14.49997 1:51:54 AM -146.308 -97.8878 Right

1:51:55 AM 6.55461 10.99957 6.000019 14.49996 1:51:55 AM -140.979 -101.76 Right

1:51:56 AM 6.556025 10.99797 6.000019 14.49997 1:51:56 AM -141.889 -105.246 Right

1:51:57 AM -138.373 -105.58 Right

1:51:58 AM 6.557708 10.99454 6.000019 14.49997

1:51:59 AM 6.557666 10.99328 6.000019 14.49997 1:51:59 AM -131.936 -109.814 Right

1:52:00 AM 6.559336 10.99023 6.000019 14.49985 1:52:00 AM -134.938 -109.825 Right

1:52:01 AM 6.561836 10.98822 6.000019 14.49997 1:52:01 AM -131.356 -109.878 Right

1:52:02 AM 6.563061 10.98617 6.000019 14.49996 1:52:02 AM -127.084 -114.562 Right

1:52:03 AM 6.564486 10.98434 6.000019 14.49997 1:52:03 AM -125.99 -110.029 Right

1:52:04 AM 6.566859 10.98199 6.000019 14.4998 1:52:04 AM -129.545 -113.996 Right

1:52:05 AM 6.569016 10.98002 6.000019 14.49997 1:52:05 AM -125.631 -114.956 Right

1:52:06 AM 6.570813 10.97804 6.000019 14.49996 1:52:06 AM -123.292 -116.024 Right

1:52:07 AM 6.572488 10.9769 6.000019 14.4998 1:52:07 AM -125.25 -113.466 Right

1:52:08 AM 6.574524 10.97521 6.000019 14.49996 1:52:08 AM -122.625 -115.393 Right

1:52:09 AM -118.959 -112.457 Right

1:52:10 AM 6.576846 10.97412 6.000019 14.49985 1:52:10 AM -117.667 -114.244 Right

1:52:11 AM 6.579556 10.97285 6.000019 14.49985 1:52:11 AM -118.379 -110.556 Right

1:52:12 AM 6.58041 10.97136 6.000019 14.49997 1:52:12 AM -114.753 -111.195 Right

1:52:13 AM 6.583171 10.97096 6.000019 14.49996 1:52:13 AM -113.136 -109.027 Right

1:52:14 AM 6.585874 10.96978 6.000019 14.49985

1:52:15 AM 6.589977 10.96939 6.000019 14.4998 1:52:15 AM -113.693 -106.551 Right

1:52:16 AM 6.5924 10.96931 6.000019 14.4998 1:52:16 AM -106.56 -104.324 Right

1:52:17 AM 6.596512 10.96926 6.000019 14.49985 1:52:17 AM -102.795 -102.229 Right

1:52:18 AM 6.598096 10.96913 6.000019 14.4998 1:52:18 AM -103.265 -99.5557 Right

1:52:19 AM 6.598648 10.96869 6.000019 14.49997 1:52:19 AM -96.7862 -99.5568 Left

1:52:20 AM 6.600674 10.96803 6.000019 14.49985 1:52:20 AM -96.794 -99.5349 Left

1:52:21 AM -98.1552 -98.1942 Left

1:52:22 AM 6.602701 10.96889 6.000019 14.4998 1:52:22 AM -94.9518 -95.8136 Left

1:52:23 AM 6.60552 10.96992 6.000019 14.49996 1:52:23 AM -93.0473 -93.5202 Left

1:52:24 AM 6.607229 10.96921 6.000019 14.49996 1:52:24 AM -95.7424 -89.3737 Right

1:52:25 AM 6.609035 10.96995 6.000019 14.49996 1:52:25 AM -91.5863 -87.4561 Right

1:52:26 AM 6.610603 10.97059 6.000019 14.49996 1:52:26 AM -94.3097 -85.8247 Right

1:52:27 AM 6.613195 10.97157 6.000019 14.4998 1:52:27 AM -93.4186 -85.3743 Right

1:52:28 AM 6.616239 10.97251 6.000019 14.49985 1:52:28 AM -90.5772 -80.9013 Right

1:52:29 AM 6.618249 10.9736 6.000019 14.49985 1:52:29 AM -94.829 -77.6625 Right

1:52:30 AM 6.620165 10.97572 6.000019 14.4998

1:52:31 AM 6.621376 10.97638 6.000019 14.49985 1:52:31 AM -95.8926 -74.8453 Right

1:52:32 AM 6.62289 10.97761 6.000019 14.4998 1:52:32 AM -91.158 -73.7209 Right

1:52:33 AM 6.624775 10.97787 6.000019 14.4998 1:52:33 AM -94.1889 -71.2993 Right

1:52:34 AM -95.6216 -69.0873 Right

1:52:35 AM 6.626234 10.97903 6.000019 14.49997 1:52:35 AM -93.9236 -67.5178 Right

1:52:36 AM 6.63363 10.98716 6.000019 14.49996 1:52:36 AM -111.039 -47.8742 Right

1:52:37 AM 6.643041 10.9958 6.000019 14.49985 1:52:37 AM -123.153 -66.4864 Right

1:52:38 AM 6.648644 11.00886 6.000019 14.49996 1:52:38 AM -137.975 -83.6072 Right

1:52:39 AM 6.651331 11.02271 6.000019 14.4998 1:52:39 AM -131.944 -79.5927 Right

1:52:40 AM 6.61236 11.04557 6.000019 14.49985 1:52:40 AM -146.173 -98.9936 Right

1:52:41 AM 6.572775 11.05748 6.000019 14.4998 1:52:41 AM -138.816 -105.585 Right

1:52:42 AM 6.526691 11.06667 6.000019 14.4998 1:52:42 AM -126.642 -112.235 Right

1:52:43 AM 6.478302 11.06718 6.000019 14.49997 1:52:43 AM -97.7547 -123.433 Left

1:52:44 AM 6.417551 11.0624 6.000019 14.4998 1:52:44 AM -68.2804 -135.3 Left

1:52:45 AM 6.349144 11.03582 6.000019 14.49997

1:52:46 AM -56.2662 -147.393 Left

1:52:47 AM 6.306427 11.01443 6.000019 14.49985 1:52:47 AM -52.1288 -149.342 Left

1:52:48 AM 6.260409 11.0026 6.000019 14.49985 1:52:48 AM -56.1821 -143.883 Left

1:52:49 AM 6.201631 10.97643 6.000019 14.4998 1:52:49 AM -50.8526 -143.024 Left

1:52:50 AM 6.168406 10.96397 6.000019 14.49996 1:52:50 AM -55.2349 -147.046 Left

1:52:51 AM 6.11854 10.95276 6.000019 14.4998 1:52:51 AM -54.1744 -139.75 Left

1:52:52 AM 6.076892 10.94891 6.000019 14.49985 1:52:52 AM -51.6418 -138.903 Left

1:52:53 AM 6.0286 10.94015 6.000019 14.4998 1:52:53 AM -57.3512 -139.283 Left

1:52:54 AM 5.958687 10.92706 6.000019 14.49985 1:52:54 AM -56.1437 -139.51 Left

1:52:55 AM 5.903365 10.91479 6.000019 14.4998 1:52:55 AM -58.0218 -140.735 Left

1:52:56 AM 5.850225 10.91505 6.000019 14.49997 1:52:56 AM -54.0321 -136.25 Left

1:52:57 AM 5.780518 10.9062 6.000019 14.49996

1:52:58 AM -57.0427 -137.634 Left

1:52:59 AM 5.732518 10.90266 6.000019 14.49985 1:52:59 AM -54.6161 -137.817 Left

1:53:00 AM 5.685184 10.90153 6.000019 14.49985 1:53:00 AM -53.7877 -138.206 Left

1:53:01 AM 5.636422 10.89761 6.000019 14.4998 1:53:01 AM -55.3123 -141.307 Left

1:53:02 AM 5.585737 10.89337 6.000019 14.4998 1:53:02 AM -55.7246 -139.886 Left

1:53:03 AM 5.52403 10.8875 6.000019 14.49997 1:53:03 AM -60.6025 -140.763 Left

1:53:04 AM 5.478683 10.88959 6.000019 14.4998 1:53:04 AM -54.0615 -140.834 Left

1:53:05 AM 5.432499 10.88104 6.000019 14.49996 1:53:05 AM -60.4018 -148.573 Left

1:53:06 AM 5.382638 10.88239 6.000019 14.49985

1:53:07 AM 5.334692 10.88439 6.000019 14.4998 1:53:07 AM -62.5412 -144.644 Left

1:53:08 AM 5.287015 10.88934 6.000019 14.49985 1:53:08 AM -61.6562 -145.279 Left

1:53:09 AM 5.219532 10.89126 6.000019 14.4998 1:53:09 AM -60.454 -146.228 Left

1:53:10 AM 5.15249 10.88888 6.000019 14.49997 1:53:10 AM -62.9963 -153.653 Left

1:53:11 AM -65.9307 -155.404 Left

1:53:12 AM 5.093613 10.8905 6.000019 14.49996 1:53:12 AM -74.3723 -156.662 Left

1:53:13 AM 5.04456 10.89211 6.000019 14.49985 1:53:13 AM -87.4061 -162.053 Left

1:53:14 AM 4.996026 10.88948 6.000019 14.4998 1:53:14 AM -126 Recenter-Left

1:53:15 AM 4.991217 10.88767 6.000019 14.49996 1:53:15 AM -130 Recenter-Left

1:53:16 AM 4.990352 10.88381 6.000019 14.49996 1:53:16 AM -124 Recenter-Left

1:53:17 AM 4.99016 10.88246 6.000019 14.4998 1:53:17 AM -121 Recenter-Left

1:53:18 AM 4.989205 10.88002 6.000019 14.49996 1:53:18 AM -102.304 -177.617 Left

1:53:19 AM 4.988693 10.87982 6.000019 14.49996 1:53:19 AM -102.17 -31.9602 Right

1:53:20 AM 4.980302 10.88846 6.000019 14.49997 1:53:20 AM -105.96 -28.6357 Right

1:53:21 AM 4.974521 10.8948 6.000019 14.49985 1:53:21 AM -99.0141 -15.0242 Right

1:53:22 AM 4.985075 10.88896 6.000019 14.49985

1:53:23 AM -104.962 -27.5926 Right

1:53:24 AM 4.990876 10.87805 6.000019 14.49997 1:53:24 AM -99.4879 -33.9593 Right

1:53:25 AM 4.992637 10.86631 6.000019 14.49997 1:53:25 AM -92.1063 -36.4425 Right

1:53:26 AM 4.99206 10.85928 6.000019 14.4998 1:53:26 AM -82.3122 -56.3929 Right

1:53:27 AM 4.987722 10.84992 6.000019 14.49997 1:53:27 AM -78.8108 -71.4661 Right

1:53:28 AM 4.980475 10.84083 6.000019 14.49996 1:53:28 AM -73.5009 -83.6646 Left

1:53:29 AM 4.975312 10.82903 6.000019 14.4998 1:53:29 AM -68.0154 -90.709 Left

1:53:30 AM 4.968103 10.81765 6.000019 14.49996 1:53:30 AM -55.5609 -93.1641 Left

1:53:31 AM 4.95587 10.81778 6.000019 14.49996 1:53:31 AM -43.5889 -97.0433 Left

1:53:32 AM 4.959801 10.85633 6.000019 14.4998 1:53:32 AM -43.7389 -106.511 Left

1:53:33 AM 4.976748 10.92349 6.000019 14.49996 1:53:33 AM -47.392 -112.999 Left

1:53:34 AM 4.982579 10.94744 6.000019 14.4998 1:53:34 AM -39.3973 -119.776 Left

1:53:35 AM -51.5808 -104.772 Left

1:53:36 AM 5.0058 10.98978 6.000019 14.49997 1:53:36 AM -69.5573 -117.347 Left

1:53:37 AM 5.014694 11.00246 6.000019 14.49985 1:53:37 AM -78.1664 -152.557 Left

1:53:38 AM 5.059988 11.04233 6.000019 14.49985

1:53:39 AM 5.100826 11.07482 6.000019 14.4998 1:53:39 AM -84.2212 -194.002 Left

1:53:40 AM 5.149418 11.10312 6.000019 14.4998 1:53:40 AM -81.3581 -200.172 Left

1:53:41 AM 5.20497 11.11898 6.000019 14.49997 1:53:41 AM -95.2547 -200.254 Left

1:53:42 AM 5.255413 11.12372 6.000019 14.49997 1:53:42 AM -112.726 -187.776 Left

1:53:43 AM 5.306905 11.12421 6.000019 14.49985 1:53:43 AM -119.38 -128.332 Left

1:53:44 AM 5.371315 11.10587 6.000019 14.4998 1:53:44 AM -137.001 -15.3879 Right

1:53:45 AM 5.439966 11.08478 6.000019 14.49996 1:53:45 AM -141.609 -73.4359 Right

1:53:46 AM 5.493051 11.06112 6.000019 14.49996 1:53:46 AM -143.465 -13.569 Right

1:53:47 AM 5.538978 11.04218 6.000019 14.4998 1:53:47 AM -143.472 -13.8734 Right

1:53:48 AM -143.501 -14.3689 Right

1:53:49 AM 5.597509 11.01698 6.000019 14.49996 1:53:49 AM -144.594 -52.0097 Right

1:53:50 AM 5.649308 10.99419 6.000019 14.4998 1:53:50 AM -143.828 -51.0071 Right

1:53:51 AM 5.70221 10.97442 6.000019 14.49996 1:53:51 AM -142.183 -54.9048 Right

1:53:52 AM 5.744753 10.96664 6.000019 14.49997 1:53:52 AM -138.585 -53.8142 Right

1:53:53 AM 5.807028 10.94724 6.000019 14.4998

1:53:54 AM 5.846809 10.93905 6.000019 14.49996 1:53:54 AM -138.739 -52.7229 Right

1:53:55 AM 5.924026 10.91969 6.000019 14.49985 1:53:55 AM -136.974 -50.4903 Right

1:53:56 AM 5.978089 10.91452 6.000019 14.49997 1:53:56 AM -134.698 -56.2263 Right

1:53:57 AM 6.043155 10.90553 6.000019 14.49985 1:53:57 AM -133.75 -51.0146 Right

1:53:58 AM 6.087174 10.90328 6.000019 14.4998 1:53:58 AM -134.019 -55.0196 Right

1:53:59 AM 6.14807 10.89994 6.000019 14.49985 1:53:59 AM -134.2 -54.074 Right

1:54:00 AM -134.435 -51.9361 Right

1:54:01 AM 6.194035 10.88963 6.000019 14.4998 1:54:01 AM -136.688 -57.2412 Right

1:54:02 AM 6.249125 10.89418 6.000019 14.49996 1:54:02 AM -135.816 -54.8457 Right

1:54:03 AM 6.309861 10.88464 6.000019 14.49996 1:54:03 AM -136.301 -56.5503 Right

1:54:04 AM 6.355663 10.88891 6.000019 14.4998 1:54:04 AM -136.455 -53.5466 Right

1:54:05 AM 6.402854 10.88107 6.000019 14.49997 1:54:05 AM -139.855 -58.6116 Right

1:54:06 AM 6.464009 10.88349 6.000019 14.49985 1:54:06 AM -140.498 -59.3407 Right

1:54:07 AM 6.511384 10.8805 6.000019 14.49996 1:54:07 AM -140.995 -58.4003 Right

1:54:08 AM 6.564715 10.88709 6.000019 14.4998 1:54:08 AM -142.051 -53.5457 Right

1:54:09 AM 6.607385 10.88495 6.000019 14.49997 1:54:09 AM -145.704 -59.4671 Right

1:54:10 AM 6.657025 10.8846 6.000019 14.49996 1:54:10 AM -146.667 -60.2929 Right

1:54:11 AM 6.702734 10.88588 6.000019 14.49996

1:54:12 AM -145.24 -59.9032 Right

1:54:13 AM 6.763561 10.88866 6.000019 14.4998 1:54:13 AM -149.631 -62.8854 Right

1:54:14 AM 6.806781 10.88923 6.000019 14.49997 1:54:14 AM -154.306 -62.5465 Right

1:54:15 AM 6.859155 10.88654 6.000019 14.49985 1:54:15 AM -156.043 -65.5184 Right

1:54:16 AM 6.907557 10.88963 6.000019 14.49997 1:54:16 AM -158.798 -73.7818 Right

1:54:17 AM 6.955798 10.88258 6.000019 14.49996 1:54:17 AM -169.409 -82.7007 Right

1:54:18 AM 6.997734 10.87064 6.000019 14.49985 1:54:18 AM 126 Recenter-Right

1:54:19 AM 6.997715 10.86932 6.000019 14.49996 1:54:19 AM 127 Recenter-Right

1:54:20 AM 6.996756 10.86789 6.000019 14.49997 1:54:20 AM 122 Recenter-Right
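
A note for readers tracing the Local Minima log: the logged Verdict appears to coincide with whichever of the two tendency values is the greater (less negative) of the L/R pair, while "Recenter-Left"/"Recenter-Right" rows carry a single signed heading value instead of an L/R pair. The snippet below reproduces that observed relationship purely as a reading aid for the log columns; it is an inference from the logged values, not the thesis's actual PathDecider implementation.

def verdict_from_tendencies(left, right):
    """Reading aid only: reproduce the apparent L/R-to-Verdict relationship seen in the logs."""
    return "Right" if right > left else "Left"

# Rows copied from the Local Minima log above:
print(verdict_from_tendencies(-29.0057, -15.1729))   # Right (matches the logged verdict)
print(verdict_from_tendencies(-101.506, -108.018))   # Left  (matches the logged verdict)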


APPENDIX F – BENCHMARK SCENARIOS SIMULATION RESULTS

Benchmark 1 - Sample 1

Benchmark 1 - Sample 2

Benchmark 1 - Sample 3

Benchmark 1 - Sample 4

Benchmark 1 - Sample 5

Benchmark 2 - Sample 1

Benchmark 2 - Sample 2

Benchmark 2 - Sample 3

Benchmark 2 - Sample 4

Benchmark 2 - Sample 5

Benchmark 3 - Sample 1

Benchmark 3 - Sample 2

Benchmark 3 - Sample 3

Benchmark 3 - Sample 4

Benchmark 3 - Sample 5