
USAGE OF GESTURE CONTROL TECHNOLOGY ON REMOTE-OPERATED UNMANNED GROUND VEHICLES

Phee Yong Han 1, Sandheep Ransilu Piyasanka 1, Tay Kai Jun 2, Toh Ya Wei 2, Bryan Ng Kin Loong 3, Sharon Ang Mei Ling 3

1 Victoria School, 2 Siglap Link, Singapore 448880

2 NUS High School of Mathematics and Science, 20 Clementi Avenue 1, Singapore 129957

3 Defence Science and Technology Agency, 1 Depot Road, Singapore 109679

___________________________________________________________________________

ABSTRACT

Recent developments in gesture control technology have the potential to allow a soldier to control his devices while keeping his hands free to react to dangerous situations. The objective of this project is to find out whether a soldier can employ this technology to his advantage by controlling an unmanned ground vehicle (UGV) with gesture control. This study made use of the Myo armband by Thalmic Labs to develop a prototype for a concept demonstration of the technology and its potential applications. A study on the use of gesture control technology with commercial devices was conducted, and from it the design considerations for the use of gesture control were determined. The results suggest that, while gesture control technology was not as effective as conventional means of controlling the UGV, it freed up the hands of the soldier and improved the soldier's response time to alerts.

INTRODUCTION

The project aimed to demonstrate the potential of gesture-based wearable technology to improve the intuitiveness and convenience of controlling a UGV, compared to present-day remote control options. Convenience, compatibility and reactiveness to combat situations were evaluated in this study.

A UGV was built using a commercially available robotic kit and an Internet Protocol (IP) camera. Two separate Android applications that controlled the UGV over a Wi-Fi network were created. One simulated a conventional remote controller with on-screen buttons to serve as the control experiment, while the other allowed the Myo armband to communicate with the UGV through the application. Both applications have the live video stream from the IP camera embedded.

Tests were carried out to compare both forms of control (gesture-based control versus traditional remote control with on-screen buttons) by measuring how long it took the UGV to complete a pre-determined course consisting of numerous turns under varying circumstances. Stress testing of the accuracy of the Myo armband was also conducted to identify the success-fail rate of gesture recognition for the armband.

MATERIALS

Rover 5

The Rover 5 from Dagu Robot, with a motor rated voltage of 7.2 V and a motor stall current of 2.5 A, was selected as the rolling chassis for this project. The tracked robot chassis, of dimensions 24.5 cm x 22.5 cm x 7.4 cm (as shown in Figure 1.1) and weight 0.72 kg [i] (without the addition of other devices), is affixed with a 17.7 cm x 10.4 cm acrylic board via the mounting holes. This platform allowed us to secure other electronic devices, including but not limited to the batteries, shields, camera and router, onto the vehicle (as shown in Figure 1.2 and Figure 1.3).

Figure 1.1: Dimensions of Rover 5. Figure obtained from Dagu Robot, accessed on 21/12/2015. <https://www.dropbox.com/s/2zrl5hcrstc0gat/Rover%205%20Introduction.pdf?dl=0>

Figure 1.2: Top profile of UGV


Figure 1.3: Side profile of UGV

Arduino Uno, Wi-Fi and Motor shields

The development board acts as the brain of the UGV to collect and send appropriate

information between the controller and the UGV in order to execute the desired control

actions.

a. Arduino Uno SMD

The Arduino Uno SMD is a microcontroller board based on the ATmega328P chip, and comes with 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a USB connection, a power jack, an ICSP header and a reset button. The recommended input voltage is 7-12 V [ii].

Figure 2.1.1: Schematic diagram of Arduino Uno SMD. Figure obtained from Arduino, accessed on 21/12/2015. <https://www.arduino.cc/en/uploads/Main/arduino-uno-smd-schematic.pdf>


Figure 2.1.2: Arduino Uno SMD. Figure obtained from Arduino, accessed on 21/12/2015. <https://www.arduino.cc/en/uploads/Main/ArduinoUnoSMDFront.jpg>

b. Arduino Wi-Fi Shield

The Arduino Wi-Fi Shield allows an Arduino board to connect to the Internet using the 802.11 wireless specification (Wi-Fi). It is based on the HDG104 Wireless LAN 802.11b/g System in-Package. An AT32UC3 provides a network (IP) stack capable of both TCP and UDP. The Wi-Fi shield connects to the Arduino Uno SMD using long wire-wrap headers which extend through the shield, keeping the pin layout intact and allowing the motor shield to be stacked on top. The Arduino Uno SMD communicates with both the Wi-Fi shield's processor and SD card using the SPI bus (through the ICSP header), which is on digital pins 11, 12 and 13 on the Uno; pin 10 is used to select the HDG104 and pin 4 the SD card. Note that digital pin 7 is used as a handshake pin between the Wi-Fi shield and the Uno SMD, and should not be used by the motor shield. The shield has an operating voltage of 5 V [iii].
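As an illustration of how the shield can be used in this role, the fragment below is a minimal Arduino sketch that joins a Wi-Fi network and listens for single-byte commands from the controller application over TCP. It is a hedged sketch rather than the project's actual firmware: the network name, password, port number and single-character command protocol are assumptions made for illustration.

    #include <SPI.h>
    #include <WiFi.h>

    // Placeholder credentials and port for the UGV's network (assumptions).
    char ssid[] = "UGV-NET";
    char pass[] = "password";
    WiFiServer server(8888);   // TCP server the controller app connects to

    void setup() {
      // Keep trying to associate with the access point, then start the server.
      while (WiFi.begin(ssid, pass) != WL_CONNECTED) {
        delay(1000);
      }
      server.begin();
    }

    void loop() {
      WiFiClient client = server.available();    // connected controller, if any
      if (client && client.available()) {
        char command = client.read();            // one byte per drive action
        // ...hand the command byte to the motor-control code (next section)
      }
    }

Note that pin 7 is deliberately left untouched in this sketch, since it serves as the handshake pin described above.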

Figure 2.2.1: Arduino Wi-Fi Shield. Figure obtained from Arduino, accessed on 22/12/2015. <https://www.arduino.cc/en/Main/ArduinoWiFiShield>

c. Arduino Compatible Motor Shield (2A)

The 55 mm x 55 mm Arduino-compatible motor shield (2 A), with an L298P chip, is able to drive four DC motors (ranging from 7 to 12 V) with a maximum current output of 2 A per channel. The speed of the vehicle is controlled from the Arduino's PWM output pins 5 and 6, whilst the enable-disable function of the motor control is signalled by Arduino digital pins 4 and 7; however, as pin 7 is the handshake pin for the Wi-Fi shield, pin 8 is used instead. The shield features a logic control voltage of 5 V and a motor drive voltage ranging from 4.8 to 35 V [iv].
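The fragment below sketches how the pin assignments above might translate into drive functions on the Arduino: PWM speed on pins 5 and 6, with pins 4 and 8 as the control pins for the two channels. The function names, the interpretation of the logic levels on pins 4 and 8 (treated here as direction signals) and the turning behaviour are assumptions for illustration, not the project's actual code.

    // Pin assignments from the description above: pins 5 and 6 carry the PWM
    // speed signals; pin 4 and pin 8 (replacing reserved pin 7) control the
    // two motor channels.
    const int PWM_LEFT   = 5;
    const int PWM_RIGHT  = 6;
    const int CTRL_LEFT  = 4;
    const int CTRL_RIGHT = 8;

    void setupMotors() {
      pinMode(PWM_LEFT, OUTPUT);
      pinMode(PWM_RIGHT, OUTPUT);
      pinMode(CTRL_LEFT, OUTPUT);
      pinMode(CTRL_RIGHT, OUTPUT);
    }

    // Drive both tracks at the same speed (0-255). Which logic level means
    // "forward" depends on the wiring, so HIGH here is only an assumption.
    void drive(int speed, bool forward) {
      digitalWrite(CTRL_LEFT,  forward ? HIGH : LOW);
      digitalWrite(CTRL_RIGHT, forward ? HIGH : LOW);
      analogWrite(PWM_LEFT,  speed);
      analogWrite(PWM_RIGHT, speed);
    }

    // A tracked chassis can turn on the spot by running the tracks in
    // opposite directions (again an assumption about the wiring).
    void turnLeft(int speed) {
      digitalWrite(CTRL_LEFT,  LOW);
      digitalWrite(CTRL_RIGHT, HIGH);
      analogWrite(PWM_LEFT,  speed);
      analogWrite(PWM_RIGHT, speed);
    }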

Figure 2.3.1: Motor shield. Figure obtained from SG Botic, accessed on 22/12/2015. <http://www.sgbotic.com/index.php?dispatch=products.view&product_id=1749>

Wireless N Network Camera - D-Link DCS-920L

The D-Link DCS-920L allows the user to access the live feed easily once it is connected to the Wi-Fi router. It is small and portable, which allows it to be mounted on the platform of the Rover 5. It has an operating voltage of 5 V.

Router - TP-Link TL-WR702N

The TP-Link TL-WR702N nano router connects the IP camera to a wireless network so that the live feed can be sent to the applications. It is small and portable, which allows it to be mounted on the platform of the Rover 5. It has an operating voltage of 5 V.

Power sources

An onboard power source is needed to supply power to the UGV system. The table below summarizes the voltage required by each part of the UGV.

Component       Required voltage
UNO SMD         12 V
Wi-Fi shield    5 V (obtained from UNO SMD)
Motor shield    5 V (obtained from UNO SMD)
IP camera       5 V
Router          5 V

Ten GP rechargeable NiMH LSD AA 2700 mAh batteries (1.2 V each, connected in series to give 10 x 1.2 V = 12 V) were used to meet the 12 V needed by the Arduino UNO SMD, Wi-Fi shield and motor shield. The IP camera and router are connected in parallel to a 16800 mAh power bank (output of 5 V and 2.0 A).


Myo gesture control armband

The Myo armband by Thalmic Labs was used for the concept demonstration. With 8 electromyography (EMG) sensors and a 9-axis inertial measurement unit (IMU), it is able to measure the electrical activity produced by muscles whilst sensing the motion and orientation of the forearm in order to detect the gesture made [v]. Out of the box, the Myo can detect 5 gestures: double tap, making a fist, fingers spread, wave in and wave out.

Figure 3.1: Gestures detected by the Myo armband. Figure obtained from Thalmic Labs, accessed on 5/1/2016. <https://www.myo.com/techspecs>

The Myo armband transmits this information over a Bluetooth connection to communicate with various applications and devices.

Android application

Two applications were created. One simulated the conventional remote controller with on-screen buttons, while the other allowed the Myo armband to communicate with the UGV. Both applications have the live video stream from the IP camera embedded. The conventional remote control application has 5 buttons: forward, backward, left, right and stop.

The Myo armband application offered two sets of controls: standard gesture control and orientation-based gesture control. A sketch of how these gestures map to drive commands follows the two lists below.

The gestures in standard control are:

a) Wave in to turn left
b) Wave out to turn right
c) Fist to move forward
d) Fingers spread to move backward
e) Double tap to stop

The gestures used in orientation control are:

1. Fist and rotate left to turn left
2. Fist and rotate right to turn right
3. Wave in to move forward
4. Wave out to move backward
5. Fingers spread to stop
6. Fist in the "neutral" position also stops the robot
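As a hedged illustration (not the application's actual code, which ran on Android), the C++ fragment below shows how the standard-control mapping above could be expressed as a translation from a detected pose to a single command byte, such as the Wi-Fi sketch under Materials might receive. The pose enumeration, function name and command characters are assumptions made for this sketch, not the Myo SDK's API or the project's protocol.

    // Hypothetical pose type standing in for whatever the gesture SDK reports.
    enum class Pose { WaveIn, WaveOut, Fist, FingersSpread, DoubleTap, Rest };

    // Standard-control mapping from the list above, one command byte per
    // action ('F' forward, 'B' backward, 'L' left, 'R' right, 'S' stop).
    char standardControlCommand(Pose pose) {
      switch (pose) {
        case Pose::WaveIn:        return 'L';  // wave in  -> turn left
        case Pose::WaveOut:       return 'R';  // wave out -> turn right
        case Pose::Fist:          return 'F';  // fist -> move forward
        case Pose::FingersSpread: return 'B';  // fingers spread -> move backward
        case Pose::DoubleTap:     return 'S';  // double tap -> stop
        default:                  return 'S';  // no recognised pose -> stop
      }
    }

Orientation control would additionally use the IMU's orientation reading while the fist pose is held, which is how "fist and rotate left" can be distinguished from "fist and rotate right".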

METHODS, RESULTS AND DISCUSSIONS

Experiment 1

The aim of Experiment 1 is to investigate the effectiveness of the Myo armband compared to remote control in a 'comfortable' seated position, and also in extenuating circumstances where other variables such as the user's posture and the user's baggage are involved.


Experiment 1 is split into two parts: Experiment 1.1 and Experiment 1.2. The details and procedures of Experiment 1 are given in the appendix.

In Experiment 1.1, a pre-determined route is marked out with tape. In a comfortable sitting position, the user maneuvers the UGV using the remote control application with on-screen buttons. The user is not allowed to view the UGV directly; instead, he or she is to use the live feed stream available in the same application. The user repeats the course 3 times, and the time taken to finish each run is recorded. The time taken serves as the baseline against which the other modes of control are evaluated; for example, if a particular mode of control is less responsive and/or causes the user to make more mistakes, the time taken to complete the course will be longer. Upon completion with the remote control application, the user repeats the experiment using the Myo armband.

Experiment 1.2 is similar to Experiment 1.1. The only difference lies in the posture of the user: the user is now in a crouching position, carrying both a weighted backpack and a mock rifle. This aims to simulate real-life circumstances, where the added items may lead to differing results for either mode of control.

As shown in Table 1.1 below (tabulated results for Experiment 1.1), less time is needed to complete the route when using remote control, with a difference of 14.5 seconds compared to standard gesture control. This shows that it was more effective to control the UGV using the conventional remote control than with gesture control. Besides the pre-set gestures made available by Thalmic Labs, the experiment also introduced an alternative set of controls on the Myo armband (orientation-based gesture control, as described above under Materials).

As shown in Table 1.2 below (tabulated results for Experiment 1.2), remote control allows for a faster completion of the course, by 23.5 seconds.


Comparing the results from Experiment 1.1 and Experiment 1.2, it can be observed that, even with the change in posture, remote control results in a faster completion time.

Experiment 2

Experiment 2 aims to investigate the amount of time a person takes to be in a ready position when using the Myo armband compared to remote control. In this experiment, the user starts in a crouching position while carrying a weighted bag and a mock rifle, and maneuvers the UGV with either mode of control. Upon hearing the buzzer, the user is to stop the UGV and then get into a standing position with both hands on the mock rifle. The time taken for the user to get into this "ready position" after the buzzer is sounded is recorded. The details and procedures of Experiment 2 are given in the appendix.

As seen in Table 2, gesture control allowed the user to react approximately 0.31 seconds faster than with conventional remote control. The average human reaction time to a visual stimulus is about 0.271 seconds [vi]; since the improvement exceeds this, the 0.31-second reduction in reaction time may prove to be useful in combat situations. The improvement may be attributed to the hands-free experience offered by the Myo armband, enabling users to quickly position themselves with the mock rifle.

Experiment 3

Experiment 3 aims to investigate how accurately the Myo armband detects the five pre-set gestures, by measuring how many true and false readings the armband returns when each user repeats each gesture a specific number of times. The details and procedures of Experiment 3 are given in the appendix.

The number of times the Myo armband correctly detects a gesture is recorded in Table 3. The recognition rate varies widely across users and gestures. This shows that the Myo armband is still not 100% accurate and reliable, and that there is room for improvement in the armband.

Table 3: Gesture recognition rate (%)

Gesture       YH    SR    YW    Average
Wave out      100   90    60    83.3
Wave in       100   50    60    70.0
Fist          90    70    90    83.3
Spread        60    80    20    53.3
Double tap    80    50    100   76.7
Average       86    68    66    73.3

CONCLUSIONS

In this study, the Myo armband by Thalmic Labs was used to develop a prototype for a concept demonstration of gesture control technology and its potential applications. The results from Experiment 1 showed that, given the current state of gesture control technology, conventional remote control is still more effective for controlling UGVs. Despite this, the results of Experiment 2 indicate that there is some value in this mode of interaction: while the difference was merely an improvement of 0.31 seconds in response time to the alert, the quicker response could be what a soldier needs to get out of harm's way should he be attacked by an adversary.

Whilst the Myo armband may not be the better-suited option for controlling the UGV for now, there is potential for improvement. The results from Experiment 3 indicate that the device is unable to accurately capture all of the gesture inputs, which could have contributed to the poorer performance of the gesture mode of control in Experiments 1 and 2. With further improvement to the accuracy of such devices, better performance may be achievable in the near future.

AREAS OF IMPROVEMENTS / FUTURE WORKS

1. Replicating ‘real-life’ scenario

Greater attempts to mimic a real-life scenario could be made. For example, field tests could be carried out with actual soldiers over a variety of operational scenarios. Similarly, the use of actual military equipment instead of commercial equipment for the trial (e.g. carrying actual operational loads) may yield more representative results.

2. Methods

The gesture control experience could be optimized by defining customized gestures that suit each individual (instead of using the pre-set gestures available), given that the electrical activity produced by each person's muscles is different. Similarly, this study should be repeated with a larger group of people to obtain more readings and properly evaluate the compatibility and efficiency of the armband at large. Besides quantitative data, qualitative data may also be obtained by means of surveys.


ACKNOWLEDGEMENTS

This study was done under the Young Defence Scientists Programme. We would like to thank our mentors Bryan Ng Kin Loong and Sharon Ang Mei Ling from the Defence Science and Technology Agency (DSTA), who provided insight and expertise that greatly assisted our learning experience.


APPENDIX

Experiment 1

Aim:

To investigate the effectiveness of the Myo armband compared to remote control in a 'comfortable' seated position, and also in extenuating circumstances where other variables such as the user's posture and the user's baggage are involved.

Experiment 1.1 covers the situation whereby the user is in a 'comfortable' seated position, whereas Experiment 1.2 covers the situation whereby other extenuating factors are involved.

Approach:

The time taken to navigate the robot through a pre-defined course is measured and compared for remote control and gesture control in the various postures. The difference in time taken to maneuver the robot serves as the measure of effectiveness.

Constants:

Terrain and route

Controller (Subject)

Posture of user (constant within Experiment 1.1 and within Experiment 1.2, considered separately)

Equipment and baggage (constant within Experiment 1.1 and within Experiment 1.2, considered separately)

Independent Variable:

Mode of control (remote control or gesture control)

Dependent Variable:

Time taken for robot to maneuver through the route

Apparatus:

1. Course/route to be taken by UGV

a. Tape (to mark out the course/route)

b. Mock rifle (for simulation)

c. Bag with weights (for simulation)

2. Robot and controllers

a. Rover 5

b. Android phone (OnePlus 2)

c. Phone applications (MyoControlledwithLiveStream)

d. Myo armband

e. Power bank (16800 mAh, output: 5.0 V and 2.0 A)

f. Rechargeable batteries (GP 2700 series)

g. IP Camera (D-Link: DCS-930LB)

h. Router (TP-Link: TL-WR702N)

3. Recording Device - For observations

a. Phone camera (iPhone 6+)

b. Laptop (Microsoft Excel spreadsheet)


Procedures:

1. Using tape, mark out the course/route that the robot is to complete.
2. Note that there are two postures: seated comfortably, or crouched while holding a mock rifle and carrying a bag. The user starts seated comfortably.
3. Place the robot at the start line and control it with the app in remote control mode. The user is not allowed to view the UGV directly; instead, he or she is to use the live feed made available in the app.
4. Upon hearing "start", the user begins to maneuver the vehicle around the course in order to reach the finishing line.
5. The time taken to maneuver the vehicle from start to end is recorded.
6. Each person repeats the test at least 3 times, and the average is calculated.
7. Repeat steps 3 to 6 in the crouching position.
8. Repeat steps 2 to 7 in the Myo mode.

EXPERIMENT 2

Aim:

To investigate the amount of time a person takes to be in a ready position when using the

Myo armband compared to remote control in extenuating circumstances.

Approach:

The time taken for a person to stop the UGV, pick up the rifle with both hands and return to a standing position is measured and compared when using the Myo armband versus remote control.

Constants:

1. Type of stopwatch

2. Posture of user (crouching position while carrying a bag and a mock rifle)

3. Type of weighted bag and mock rifle carried by user

Independent Variable:

Mode of control (Myo Armband or Remote Control)

Dependent Variable:

Time taken for the person to be in the ready position

Apparatus:

1. Robot and controllers

a. Rover 5

b. Android phone (OnePlus 2)

c. Phone applications (MyoControlledwithLiveStream)

d. Myo armband


e. Power bank (16800 mAh, output: 5.0 V and 2.0 A)

f. Rechargeable batteries (GP 2700 series)

g. IP Camera (D-Link: DCS-930LB)

h. Router (TP-Link: TL-WR702N)

2. Recording Device - For observations

a. Phone camera (iPhone 6+)

b. Laptop (Microsoft Excel spreadsheet)

Procedures:

1. The subject starts in a crouching position while carrying a bag and a mock rifle. The subject also holds the remote control to control the Rover.
2. The user then starts maneuvering the Rover.
3. A stopwatch and a timer are started at the same time. Once the timer reaches 0, its buzzer sounds; the subject must then stop the Rover and get into a standing position with both hands holding the rifle by its trigger and handle.

4. Once the user is in the ready position, the person recording the time will stop the

stopwatch.

5. The time on the timer and the reaction time of the recorder (refer to Table 2) are deducted from the time on the stopwatch to obtain the reaction time of the user, i.e. user reaction time = stopwatch time - timer duration - recorder reaction time.

6. Repeat steps 1-5 to obtain the average time taken by a person.

7. Repeat steps 1-6 while using Gesture Control.

EXPERIMENT 3

Aim

To investigate how accurately the Myo armband detects the various pre-set gestures, by evaluating the success-fail rate of gesture recognition.

Approach

The number of true and false readings the Myo armband returns is measured while each user repeats each gesture a specific number of times.

Dependent Variable

Number of times the Myo armband successfully recognizes the intended gesture made by the user

Apparatus

1. Myo armband

2. Laptop (to see the values returned by the armband, and to record the number of times the armband successfully identified the gestures made)

Procedures

1. The Myo armband is connected wirelessly to the laptop via the Bluetooth adapter with a USB Type-A plug (which came with the armband).
2. The user wears the armband and syncs it.
3. There are five pre-set gestures: wave left, wave right, double tap, fist and fingers spread.
4. The user performs the 'wave left' gesture 10 times, alternating with other gestures.
5. The gesture recognized by the armband is returned on the app that we have coded.


6. The number of times the Myo armband correctly (and wrongly) identifies the gesture made is recorded.
7. Once done with the first gesture, repeat steps 4 to 6 with the other gestures (wave right, double tap, fist, fingers spread).


REFERENCES

[i] Dagu Robot, Rover 5 manual. Accessed on 21/12/2015. https://www.dropbox.com/s/2zrl5hcrstc0gat/Rover%205%20Introduction.pdf?dl=0

[ii] Arduino, Arduino Uno SMD. Accessed on 21/12/2015. https://www.arduino.cc/en/Main/ArduinoBoardUnoSMD

[iii] Arduino, Arduino Wi-Fi Shield. Accessed on 22/12/2015. https://www.arduino.cc/en/Main/ArduinoWiFiShield

[iv] SG Botic, 2A Motor Shield. Accessed on 22/12/2015. http://www.sgbotic.com/index.php?dispatch=products.view&product_id=1749

[v] Thalmic Labs, Tech Specs. Accessed on 5/1/2016. https://www.myo.com/techspecs

[vi] Human Benchmark, Reaction Time Statistics. Accessed on 10/02/2016. http://www.humanbenchmark.com/tests/reactiontime/statistics