
Performance Analysis of Brain Control Interface in Drone Applications

Sarah North, Ahmad Alissa, Josh Cooper, Adnan Rashied, Eric Rawls, Jason Walters, Utku “Victor” Sahin, Kade Randell and Cheyenne Sancho

Department of Computer Science, College of Computing and Software Engineering

Kennesaw State University

ABSTRACT

The main objective of this study is to find efficient methods of utilizing a brain control interface in conjunction with aerial drones. The study investigates how effective the EPOC+ is by challenging users of diverse genders and ages to complete tasks using mental commands or facial expressions to control a Parrot AR-Drone 2.0. After a mental- and facial-command calibration phase, the designed experiments were conducted with randomly selected participants (n=20). Preliminary analysis of the collected data indicated no significant difference between the pre- and post-experiment difficulty ratings for mental and facial commands. Furthermore, the study showed that more individuals in the participant group had greater difficulty controlling the mental and facial commands than they originally expected.

Keywords: User Interface, Brain Control Interface, Drones, Mental-Controlled Commands, Facial-Controlled Commands, EPOC+, Parrot AR-Drone 2.0.

1. INTRODUCTION

Brain- or mind-controlled technology offers a new and dynamic type of human-computer interaction. This technology can make the user feel more immersed than other input devices such as a keyboard, mouse, or controller. During the past decade, drone usage has grown significantly in the military and other federal agencies; this growth has also extended to some local law-enforcement agencies. Growth in drone usage is not limited to government agencies; it can also be seen in the commercial, private, and recreational sectors. Due to this substantial growth in popularity and the large number of drones being acquired, the Federal Aviation Administration (FAA) and many state governments have taken steps to increase safety where drones are concerned. The purpose of this study was to investigate innovative usages for a brain-controlled drone and how effectively such drones might operate. The study also tested the bounds of human connectivity with a mind-controlled drone.

1.1 Concise Literature Review

An extensive review of research on this topic reveals two prominent articles that report results from studies conducted on online (Internet) and offline (non-Internet) interaction, the emotions involved, and the role those emotions play in human interaction. Anderson and McOwan's [2] summary article reviews extensive research on the Emotiv EPOC+ headset and its correlation with emotion, revealing that emotions play an important role in interactions between humans. Human emotion is fundamental to human experience, influencing cognition, perception, and even rational decision-making. Therefore, the study of emotion with the Emotiv neuroheadset is indispensable [2]. The aim of the article "Emotion Recognition Using Emotiv EPOC+ Device" was to find the relationship between electroencephalogram (EEG) signals and human emotions. That study's primary focus was on emotion-recognition experiments conducted using the commercial Emotiv EPOC+ headset to record EEG signals while participants watched a variety of emotional movies [1, 2, 4, 5]. The need for, and importance of, automatic emotion recognition from EEG signals has grown with the increasing role of brain-computer interface applications and the development of new forms of human-centric and human-driven interaction with digital media.

Lin et al. [9] state that, as time goes on, more and more studies are focusing specifically on emotion and its impact on users. The continued use of Brain-Computer Interfaces (BCIs) such as the Emotiv neuroheadset with mobile devices mandates further development of efficient EEG data-processing methods. In conclusion, documentation of existing BCI design processes continues to move toward the design of brain-computer interfaces for human emotion recognition and the refinement of existing ones using additional feature sets. A real-time, cost-efficient, and portable EEG-based affective BCI using the Emotiv EEG neuroheadset has been proposed [6].

Research Question: Is there a significant difference in difficulty between mental commands and facial expressions when utilizing a brain control interface in conjunction with aerial drones?

Null Hypothesis: There is no significant difference in difficulty between the user's ability to control the drone using mental commands and facial commands when utilizing a brain control interface in conjunction with aerial drones.

2. METHODOLOGY


2.1 Participants

Twenty participants (n=20), ten (10) male and ten (10) female, all aged eighteen (18) or older, took part in the experiments. Participants were randomly selected from a User Interface Engineering class as well as other computer science courses. The initial phase was conducted by having each user control a virtual cube on an interactive graphical user interface to assess their abilities and to calibrate the systems.

2.2 Apparatus

The experiments used an array of laptops and other electronic devices, but our main assessments were completed with the EPOC+, designed by Emotiv (see Figure 1), and the Parrot AR-Drone 2.0 [3, 7, 8]. A dedicated laptop was used to display and collect the assessment results in real time.

Emotiv EPOC+ (highlight specifications):
- Channels: 14 (AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, AF4)
- Sampling: sequential, single ADC
- Resolution: 14 bits, 1 LSB = 0.51 µV (16-bit ADC)
- Bandwidth: 0.2–43 Hz, digital notch filters at 50 Hz and 60 Hz
- Filtering: built-in digital 5th-order Sinc filter
- Dynamic range (input referred): 8400 µV(pp)
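A quick arithmetic check ties two of these figures together, assuming the quoted LSB is simply the input-referred dynamic range divided across the 2^14 quantization levels:

    # 8400 uV(pp) spread over 2**14 levels gives roughly 0.51 uV per LSB,
    # matching the quoted resolution.
    dynamic_range_uV = 8400
    lsb_uV = dynamic_range_uV / 2**14
    print(f"1 LSB = {lsb_uV:.2f} uV")   # -> 1 LSB = 0.51 uV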

Parrot AR-Drone 2.0 (highlight specifications):
- 4 "inrunner" brushless motors, 14.5 W and 28,500 rev/min
- Nylatron gears
- Bronze self-lubricating ball bearings
- ARM Cortex-A8 1 GHz 32-bit processor with 800 MHz TMS320DMC64x DSP video
- Linux 2.6.32 operating system
- 1 GB DDR2 RAM at 200 MHz
- Gyroscope: 3 axes, accuracy of 2,000°/second

Figure 1. The saturated EPOC+ nodes on a test subject's head.

2.3 Instruments for Experiments

The experimental instruments included a pre-experiment survey and a post-experiment survey. The specific questions administered to the participants for each experiment are listed below:

Pre-Experiment Survey:
- Specify your age bracket.
- Gender.
- How do you feel about the usage of drone technology?
- How would you rate your knowledge of drones?
- Have you ever flown a drone controlled by a handheld device?
- Have you ever flown a drone controlled through mental commands?
- How much control do you think you will have of the drone while using the EPOC+ (mental-command) device?
- How easy do you think it will be to control the drone overall using the EPOC+ device (headset)?
- What is your current stress level?
- Do you believe flying the drone will increase your current stress level?
- How much difficulty do you believe you will have flying the drone using mental commands?
- How much difficulty do you believe you will have flying the drone using facial commands?

Post-Experiment Survey:
- How would you rate the control level you had of the drone while using the EPOC+ device?
- How would you rate your stress level after flying the drone?
- Do you believe flying the drone changed your stress level?
- How difficult was flying the drone using the mental commands?
- How much difficulty did you have flying the drone using the facial commands?

2.4 Procedure

The first step was to soak the nodes that allow the EPOC+ device to establish a connection in multipurpose solution and attach them to the device. The next step was to make certain each node was receiving a good connection. This was determined by viewing the user interface and making sure the nodes displayed a green signal on the screen; a black, yellow, or red node meant the connection needed to be reevaluated. Once a proper connection was established, the EPOC+ was calibrated. Calibration occurred by connecting the EPOC+ to the Xavier Composer interface (Figure 2 and Figure 3). The user was then subjected to the "virtual cube experiment," the first stage of testing, during which the user followed a set of commands and imagined moving the virtual cube with their mind in a specific order. For example, the user was prompted to "push, pull, lift, shift to the right, shift to the left, and lower" the virtual cube. During the next part of the experiment, the EPOC+ was connected to the Parrot AR-Drone 2.0, and the user attempted to replicate the results of the virtual cube experiment, only this time with a tangible object: the Parrot AR-Drone 2.0. The following stage of the experiment was completed in the same manner as the preceding one, but with facial commands: the participant used facial movements such as a blink, smile, or clench to make the drone perform certain movements.
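To make this command mapping concrete, below is a minimal Python sketch, not the project's actual software. The Emotiv side is stubbed out: on_mental_command is a hypothetical callback standing in for however the Emotiv SDK delivers its detections. The drone side follows the documented Parrot AR.Drone 2.0 protocol, in which plain-text AT commands are sent as UDP datagrams to port 5556, and each axis of a progressive movement command (AT*PCMD) is encoded as the 32-bit integer sharing the bit pattern of a float in [-1, 1].

    import socket
    import struct

    DRONE_IP, AT_PORT = "192.168.1.1", 5556   # AR.Drone 2.0 defaults
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 0

    def send_at(command: str, args: str) -> None:
        """Send one AT command; the drone requires an increasing sequence number."""
        global seq
        seq += 1
        sock.sendto(f"AT*{command}={seq},{args}\r".encode(), (DRONE_IP, AT_PORT))

    def f32(x: float) -> int:
        """Encode a float in [-1, 1] as the int with the same 32-bit pattern."""
        return struct.unpack("<i", struct.pack("<f", x))[0]

    def move(roll=0.0, pitch=0.0, gaz=0.0, yaw=0.0) -> None:
        """Issue one progressive command; flag=1 enables the tilt/speed arguments."""
        send_at("PCMD", f"1,{f32(roll)},{f32(pitch)},{f32(gaz)},{f32(yaw)}")

    # Hypothetical mapping from the mental commands trained in the
    # virtual-cube phase to drone motion (negative pitch = nose down = forward).
    ACTIONS = {
        "push":  lambda: move(pitch=-0.2),  # forward
        "pull":  lambda: move(pitch=0.2),   # backward
        "lift":  lambda: move(gaz=0.3),     # ascend
        "lower": lambda: move(gaz=-0.3),    # descend
        "left":  lambda: move(roll=-0.2),   # bank left
        "right": lambda: move(roll=0.2),    # bank right
    }

    def on_mental_command(label: str) -> None:
        """Hypothetical entry point called whenever the headset reports a detection."""
        ACTIONS.get(label, lambda: None)()

The same dispatch loop could route facial-expression detections (blink, smile, clench) through a second table, which is one way the two command modes stay interchangeable in an experiment of this kind.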

With the EPOC+ and Parrot AR-Drone 2.0 set aside in an isolated room, individuals were randomly selected and asked whether they would be willing to participate in the research. A pre-experiment survey was administered to all participants. The participants were given thirty (30) minutes to complete all experimental phases of the project. Once the participants completed the experiments, they were asked to complete a post-experiment survey questionnaire. The data were then collected and run through a variety of statistical analysis tests.


The entire experimental phase was monitored, and the participants' reactions, expressions, and emotions were recorded.

Figure 2. Emotiv Xavier Control Panel connecting to individual nodes to establish connection.

Figure 3. Test subject's emotion graph being recorded in the Emotiv Xavier Control Panel.

3. RESULTS

Before analyzing the collected data, it was critical that the participants complete a pre-experiment survey so that the data analysis could take into account their general knowledge of drones and any previous exposure to them. The last two questions on the pre-experiment survey were used for comparison with the final two questions on the post-experiment survey.

Out of twenty participants, 65% reported that they had never flown a drone before, and 100% reported no prior experience using the EPOC+ device to control a drone. These conditions allowed for the collection of more precise raw data. Many participants originally stated that they believed the drone would be relatively easy to control using mental commands; however, the surveys and observations showed that controlling the drone was actually more difficult for them. Figures 4a and 4b show that, on average, participants rated flying the drone with mental commands as more difficult than they originally expected, and Figures 5a and 5b show the same pattern for facial commands. Comparing Figures 4a and 5a with Figures 4b and 5b, participants tended to report an increased level of difficulty once they had flown the drone: there was a 20% increase in participants rating the difficulty as "High" and a 5% increase in "Very High."

Figure 4a. Chart shows the overall average user response in difficulty before flying the drone using the EPOC+ device.

Figure 4b. Chart shows the overall average user response in difficulty after flying the drone using the EPOC+ device.

Figure 5a. Chart shows the overall average user response in difficulty before flying the drone using facial commands.

Figure 5b. Chart shows the overall average user response in difficulty after flying the drone using facial commands.


Overall, Figure 6 depicts all participants' responses to the final two questions of the post-experiment survey, while Table 1 compares the level of difficulty users expected with what they reported afterward. Single-factor ANOVA was applied to the collected data for all survey questions. The computation resulted in an f-ratio of 7.5147 with a p-value of 0.009, indicating that the difference in rated difficulty between mental and facial commands is significant at p < 0.05; Table 2 shows that facial commands received the higher difficulty ratings. To further analyze the differences between the pre- and post-test data, a series of t-tests was conducted for each set of data (minimal code sketches of these computations follow Tables 2 and 3). The t-test between the pre-test and post-test mental commands yielded 0.4266 with a p-value of 0.2133, showing no statistically significant difference at p < 0.05 (Table 3). The t-test on the pre-test versus post-test facial commands yielded 0.0095 with a p-value of 0.0047, showing a statistically significant difference at p < 0.05 (Table 3). The third t-test, on the mental-command pre-test versus the facial-command pre-test, yielded 0.5217 with a p-value of 0.2608, showing no statistically significant difference at p < 0.05 (Table 3). The final t-test, on the mental-command post-test versus the facial-command post-test, yielded 0.0075 with a p-value of 0.0037, showing a statistically significant difference at p < 0.05 (Table 3). This exploratory compound analysis showed that facial commands were perceived to be more difficult than mental commands, with facial commands showing a higher mean difficulty rating.

Questions   f-ratio   p-value   Difference Analysis
I           1.131     0.301     Insignificant
II          9.255     0.070     Significant
III         0.841     0.371     Insignificant
IV          5.106     0.036     Significant
V           0.548     0.468     Insignificant

Table 1. Single-factor ANOVA analysis between the correlated pre-survey and post-survey questions.

ANOVA: Single-Factor Analysis

SUMMARY
Groups             Sum   Average   Variance
Mental Commands    48    2.40      1.30526
Facial Commands    69    3.45      1.62894

ANOVA
Source of Variation   SS       MS      F
Between Groups        11.023   7.621   7.5147
Within Groups         55.750   1.467
Total                 66.773

Table 2. Exploratory ANOVA statistical analysis of post-experiment data on the difficulty of mental and facial commands.
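For readers who want to reproduce this style of analysis, the single-factor ANOVA in Table 2 can be computed with SciPy. The ratings below are illustrative placeholders on the study's 1-5 difficulty scale (the raw responses are not published here), so the printed values will not match the table; only the method is shown.

    from scipy import stats

    # Placeholder post-experiment difficulty ratings (1-5) for 20 participants;
    # illustrative only -- these are NOT the study's raw data.
    mental_post = [2, 3, 1, 2, 4, 2, 3, 2, 1, 3, 2, 4, 2, 3, 1, 2, 3, 2, 4, 2]
    facial_post = [4, 3, 5, 3, 4, 2, 3, 5, 4, 3, 2, 4, 3, 5, 3, 4, 2, 3, 4, 3]

    # One-way (single-factor) ANOVA across the two command types.
    f_ratio, p_value = stats.f_oneway(mental_post, facial_post)
    print(f"f-ratio = {f_ratio:.4f}, p = {p_value:.4f}")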

Comparisons                              t-test   Significant Difference
Pre-test Mental vs. Post-test Mental     0.4266   Negative
Pre-test Facial vs. Post-test Facial     0.0095   Affirmative
Pre-test Mental vs. Pre-test Facial      0.5217   Negative
Post-test Mental vs. Post-test Facial    0.0075   Affirmative

Table 3. Compound t-test analysis for each set of drone commands.
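The pre/post comparisons in Table 3 pair two ratings from the same twenty participants, so a paired t-test (scipy.stats.ttest_rel) is the matching tool. Again, the arrays below are placeholder ratings for illustration, not the study's data.

    from scipy import stats

    # Placeholder 1-5 difficulty ratings; each index is one participant.
    pre_mental  = [2, 2, 3, 1, 2, 3, 2, 2, 1, 3, 2, 3, 2, 2, 1, 2, 3, 2, 3, 2]
    post_mental = [2, 3, 3, 2, 2, 3, 2, 3, 1, 3, 2, 4, 2, 2, 1, 2, 3, 2, 4, 2]
    pre_facial  = [2, 2, 3, 2, 3, 2, 3, 2, 2, 3, 2, 3, 3, 2, 2, 3, 2, 3, 2, 3]
    post_facial = [4, 3, 5, 3, 4, 2, 3, 5, 4, 3, 2, 4, 3, 5, 3, 4, 2, 3, 4, 3]

    comparisons = [
        ("Pre-test Mental vs. Post-test Mental",  pre_mental,  post_mental),
        ("Pre-test Facial vs. Post-test Facial",  pre_facial,  post_facial),
        ("Pre-test Mental vs. Pre-test Facial",   pre_mental,  pre_facial),
        ("Post-test Mental vs. Post-test Facial", post_mental, post_facial),
    ]
    for label, a, b in comparisons:
        t, p = stats.ttest_rel(a, b)   # paired: same participants rated twice
        print(f"{label}: t = {t:.4f}, p = {p:.4f}")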

Figure 6. Graph showing the participants' overall response in difficulty after flying the drone using both mental and facial commands (difficulty with mental commands is shown in the left, green bars; difficulty with facial commands in the right, blue bars).

4. CONCLUSIONS

Allowing for some trouble with the hardware, the toughest part of running the experiment was maintaining a consistent connection between the computer and the drone. Despite some connectivity issues, a good number of participants managed to fly the drone using the facial commands, while a slightly smaller number could manipulate the drone with mental commands once it was in the air. Female participants tended to have better control over both the virtual-cube experiment in the MindDrone app and the drone itself using the EPOC+ headset. The male subjects consistently succeeded in passing the virtual-cube experiment but could not translate that thought control into manipulating the drone.

Anyone can learn to train his or her brain. As the pre- and post-experiment survey data showed, training one’s brain is most definitely a learned skill. Most of the participants expressed surprise at the ease with which they were able to control the virtual cube on the MindDrone App, having never tried any sort of brain-control activity previously, or even having any experience with drones whatsoever.

While currently limited, the results of this experiment and research reveal an important aspect of the difficulty of mental versus facial commands. From the observations and the analysis of the collected data, it can be tentatively concluded that there are some differences in difficulty between mental and facial commands. This research shows that, among the group of participants, more individuals had greater difficulty controlling the mental and facial commands than they originally expected. At this time we can state that "there is no significant difference between the rating of difficulty before and after between the mental and facial commands." It is entirely likely that additional future research will provide more in-depth detail about what impacts the level of difficulty of mental and facial commands.


5. ACKNOWLEDGMENT

Special acknowledgment and thanks to the CS 4712 class for their contribution to several phases of this research. In addition, the authors express their appreciation for the time and efforts of all participants who graciously took part in every phase of the experiments. Congratulations to the team, who won 1st place at the College of Computing and Software Engineering (CCSE)/Computing Showcase event (C-Day) on November 30, 2017 (http://kennesaw.meritpages.com/). Major funding for this project was provided by the Computer Science department for the advancement of technology and research in undergraduate senior project activities. Furthermore, many thanks to Ahmad Alissa and Josh Cooper, who developed the manual for this project: http://whateven.com/ardrone/tutorials/

6. REFERENCES

[1] Alissa, A., Cooper, J., & Rashied, A. (2017, November 6). Epoc Emotiv Manual. Retrieved on November 28, 2017 from http://whateven.com/ardrone/tutorials/index.php?page=1

[2] Anderson, K., & McOwan, P. W. (2006). A real-time automated system for the recognition of human facial expressions. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 36(1), 96–105.

[3] AR.Drone 2.0, Tutorial video #1: Setup [Video file]. Retrieved on July 27, 2017 from https://www.parrot.com/us/drones/parrot-ardrone-20-power-edition#ar-drone-20-power-edition.

[4] EMOTIV. MindDrone [Software]. Retrieved on August 5, 2017 from https://www.emotiv.com/product/minddrone/.

[5] EMOTIV. Xavier: EMOTIV Control Panel [Software]. Retrieved on September 29, 2017 from https://www.emotiv.com/product/xavier-control-panel/.

[6] Nakisa, B., Rastgoo, M. N., Tjondronegoro, D., & Chandran, V. (2018). Review: Evolutionary computation algorithms for feature selection of EEG-based emotion recognition using mobile sensors. Expert Systems with Applications, 93, 143–155. doi:10.1016/j.eswa.2017.09.062

[7] Parrot S.A. (2016). AR.FreeFlight (Version 2.4.15) [Software]. Retrieved on September 11, 2017 from https://play.google.com/store/apps/details?id=com.parrot.freeflight.

[8] Parrot S.A. (2016). Parrot AR.Drone 2.0 User Guide. Retrieved on August 3, 2017 from https://static.bhphotovideo.com/lit_files/121124.pdf.

[9] Lin, Y. P., Wang, C. H., Wu, T. L., Jeng, S. K., & Chen, J. H. (2009). EEG-based emotion recognition in music listening: A comparison of schemes for multiclass support vector machine. In ICASSP 2009 IEEE International Conference on Acoustics, Speech and Signal Processing (pp. 489–492).