AUTOMATIC MUSIC SELECTION BASED ON

FACIAL RECOGNITION SYSTEM

PHANG KE FEI

BACHELOR OF COMPUTER SCIENCE

(INTERNET COMPUTING)

UNIVERSITI SULTAN ZAINAL ABIDIN

2018


AUTOMATIC MUSIC SELECTION BASED ON

FACIAL RECOGNITION SYSTEM

PHANG KE FEI

Bachelor of Computer Science (Internet Computing)

Faculty of Informatics and Computing

Universiti Sultan Zainal Abidin, Terengganu, Malaysia

AUGUST 2018


DECLARATION

I hereby declare that this thesis is produced based on my original work, except for quotations and citations, which have been duly acknowledged. I also declare that it has not been previously or concurrently submitted for any other degree at Universiti Sultan Zainal Abidin or other institutions.

________________________________

Name : ..................................................

Date : ..................................................


CONFIRMATION

This is to confirm that:

The research conducted and the writing of this report were under my supervision.

________________________________

Name : ..................................................

Date : ..................................................


DEDICATION

To my beloved mother and father


ABSTRACT

This project introduces an approach that uses a facial recognition system to detect a person's facial expression and determine their current emotion. It aims to address emotional problems, such as depression, anxiety, and anger, encountered by the end user.

Alexithymia and anger management issues are among the most common psychological or emotional problems that people have nowadays. Alexithymia is a personality construct characterized by the sub-clinical inability to identify and describe emotions in the self. [1] People with this psychiatric condition are unable to identify or verbally describe their feelings. Anger management is the ability to control one's anger and behaviour when getting angry. A person who is poor at anger management cannot think, speak, or act logically, and their ability to make judgements or decisions is affected too.

Music is believed to have a soothing effect that can help people express how they feel inside when they cannot find suitable words to say it. For example, for someone who has an anger management issue, a list of relaxing music can help him or her calm down, soothe his or her mood, and even change his or her state of mind.

Keywords: facial recognition, Affectiva SDK, emotion detection, music


ABSTRAK

Projek ini memperkenalkan pendekatan yang menggunakan sistem pengenalan wajah untuk mengesan ekspresi wajah seseorang dan menentukan emosi semasa mereka. Projek ini bertujuan untuk menyelesaikan masalah emosi, seperti kemurungan, kecemasan dan kemarahan yang dihadapi oleh pengguna akhir.

Isu alexithymia dan pengurusan kemarahan adalah masalah psikologi atau emosi yang paling biasa pada masa sekarang. Alexithymia adalah pembinaan peribadi yang dicirikan oleh ketidakupayaan subklinikal untuk mengenal pasti dan menggambarkan emosi dalam diri. Orang dengan keadaan psikiatri ini tidak dapat mengenal pasti atau secara lisan menggambarkan perasaannya. Pengurusan kemarahan adalah keupayaan seseorang mengawal kemarahan dan kelakuannya apabila mereka marah. Apabila seseorang itu gagal dalam pengurusan kemarahan, mereka tidak dapat berfikir, bercakap atau bertindak secara logik. Keupayaan untuk membuat keputusan atau sebarang pertimbangan turut terjejas.

Muzik dipercayai mempunyai kesan menenangkan yang boleh membantu untuk menyatakan bagaimana perasaan orang di dalam apabila mereka tidak dapat mencari perkataan yang sesuai untuk mengatakannya. Sebagai contoh, bagi seseorang yang mempunyai masalah pengurusan kemarahan, senarai muzik yang santai dapat membantu dia untuk menenangkan diri dan moodnya dan juga mengubah fikirannya.

Kata kunci: pengenalan wajah, Affectiva SDK, pengesanan emosi, muzik


CONTENTS

DECLARATION
CONFIRMATION
DEDICATION
ABSTRACT
ABSTRAK
CONTENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
LIST OF APPENDICES

CHAPTER I INTRODUCTION
1.0 Introduction
1.1 Problem Statement
1.2 Objectives
1.3 Scope
1.4 Limitation of Works
1.5 Summary

CHAPTER II LITERATURE REVIEW
2.0 Introduction
2.1 Overview of Facial Recognition System
2.2 Facial Expression
2.3 Facial Action Coding System
2.4 Affectiva SDK
2.5 Summary

CHAPTER III METHODOLOGY
3.0 Introduction
3.1 Problem Identification
3.2 System Requirement
3.2.1 Software
3.2.2 Hardware
3.2.3 Emulator
3.3 Process Model
3.3.1 Context Diagram
3.3.2 Data Flow Diagram
3.4 Data Model
3.4.1 Entity Relationship Diagram
3.4.2 SQLite
3.5 Technology
3.5.1 Android Framework
3.5.2 Affectiva SDK
3.5.3 UML Class Diagram
3.6 Setup
3.6.1 Gradle Setup
3.6.2 Set Dependencies
3.6.3 Update Manifest
3.7 Summary

CHAPTER IV IMPLEMENTATION AND RESULTS
4.0 Introduction
4.1 Implementation of Application
4.1.1 Main Interface
4.1.2 Mood Tracker
4.1.3 Emotion Recognition
4.1.4 Customize Music Playlist
4.2 Results
4.2.1 Mood Tracker
4.2.2 Emotion Recognition
4.2.2.1 No Face Detected
4.2.2.2 Emotion: Angry
4.2.2.3 Emotion: Smile
4.2.3 Customize Music Playlist
4.3 Summary

CHAPTER V CONCLUSION
5.0 Introduction
5.1 Contribution
5.2 Weakness and Limitation
5.3 Future Works
5.4 Overall Conclusion

REFERENCES


LIST OF TABLES

Chapter 2: Literature Review
Table 2.1: List of combined AUs related to the six basic facial expressions

Chapter 3: Methodology
Table 3.1: Minimum system requirement for Windows
Table 3.2: Application project structure
Table 3.3: Application minimum system requirement
Table 3.4: Application deployment device
Table 3.5: List of functions of the Option class
Table 3.6: List of functions of the MainActivity class
Table 3.7: List of functions of the Emo1 class
Table 3.8: List of functions of the Music class
Table 3.9: List of functions of the activity_playMusic class
Table 3.10: List of functions of the activity_music class
Table 3.11: List of functions of the DatabaseHelper_Face class
Table 3.12: List of functions of the DatabaseHelper_music class


LIST OF FIGURES

Chapter 2: Literature Review
Figure 2.1: Six basic facial expressions

Chapter 3: Methodology
Figure 3.1: Android Platform Version Cumulative Distribution
Figure 3.2: Select Deployment Target window
Figure 3.3: Project CD
Figure 3.4: Project DFD
Figure 3.5: Project ERD
Figure 3.6: SQLiteOpenHelper class for Mood DB
Figure 3.7: SQLiteOpenHelper class for Customise Music DB
Figure 3.8: Android Activity Lifecycle
Figure 3.9: Affectiva Android SDK Algorithm
Figure 3.10: Affdex SDK Java Documentation
Figure 3.11: Class relationship diagram for emotion module
Figure 3.12: Class relationship diagram for music module
Figure 3.13: UML Class Diagram
Figure 3.14: Application Gradle setup
Figure 3.15: Application dependencies setup
Figure 3.16: Application Manifest file

Chapter 4: Implementation and Results
Figure 4.1: The main interface
Figure 4.2: Steps occurred in retrieving mood history
Figure 4.3: Steps occurred in mood detection, music playing and music playlist customization
Figure 4.4: Steps occurred in viewing the customized music playlist
Figure 4.5: Result: Mood Tracker
Figure 4.6: Result: No face detected
Figure 4.7: Result: Angry
Figure 4.8: Result: Smile
Figure 4.9: Result: Music Customization


LIST OF ABBREVIATIONS / TERMS / SYMBOLS

AI Artificial Intelligence

API Application Programming Interface

APK Android Package Kit

CD Context Diagram

DB Database

DFD Data Flow Diagram

ERD Entity Relationship Diagram

JDK Java Development Kit

OS Operating System

SDK Software Development Kit


LIST OF APPENDICES

APPENDIX

A Source Code
A.1 Emo1 Class
A.2 Music Class
A.3 Emotion Detection Function
A.4 Select Music From Device
A.5 Emotion Database
A.6 Music Database


CHAPTER I

INTRODUCTION

1.0 INTRODUCTION

There are several steps involved in recognizing an object. Information is received through the retina in the form of light. Visual processing then organizes the data by determining size, shape, contoured edges, and surface, so that the information can be compared with other representations of objects in memory until recognition occurs (Robinson-Riegler & Robinson-Riegler, 2008). [2]

Facial recognition, sometimes known as face recognition, is the process of recognizing an individual from their facial attributes. It is an integral part of biometric systems and an active field in image processing, pattern recognition, computer vision, and many other disciplines (Parmar and Metha, 2013).

A human can recognize up to thousands of faces in spite of changes in age, gender, skin colour, and other possible factors. This type of information plays an important role in visual communication among humans: identifying and interpreting faces and facial expressions in real time based on facial information when interacting with other people.

Emotion is a complex state of feeling that may cause physical or mental changes, which affect a person's behaviour and thought. It can range across anger, disgust, fear, happiness, sadness, surprise, and so on. Moods are feelings that last longer than emotions, and they can be either positive or negative.


1.1 PROBLEM STATEMENT

The proposed application in this project aims to solve problems encountered by the end user. Some users may find it hard to express their feelings in words, while others may be poor at managing their emotions. A person who is emotionally driven may not be able to make correct decisions and judgements.

1.2 OBJECTIVES

1. To design a prototype on mobile devices with which a user can boost their mood anywhere and anytime.
2. To implement a facial recognition system in the mobile application using the Software Development Kit (SDK) developed by Affectiva for emotion detection.
3. To study how music can affect a person's mood and emotions.

1.3 SCOPE

This proposed system involves a facial recognition system that is able to detect and analyse the user's emotion based on their facial expression.

The target users for this application are end users who own a smartphone that is able to run Android-based applications.

The functions of the proposed application involve facial recognition, emotion detection, and automatic music selection to boost and soothe the user's mood.


1.4 LIMITATION OF WORKS

Due to time constraints and limited resources, this proposed application uses the Software Development Kit (SDK) developed by Affectiva, an emotion measurement technology company, to analyse the user's facial expression and detect the user's emotions in real time. Therefore, the process of reading and analysing facial attributes for mood detection is beyond the scope of this research and is not discussed.

1.5 SUMMARY

This project aims to use facial recognition software to detect the user's current emotion from their facial expression. Based on the detected emotion, a list of suitable music will be selected and played to boost the user's mood.


CHAPTER II

LITERATURE REVIEW

2.0 INTRODUCTION

There are several published studies concerning facial recognition systems, mood detection through facial expression, and emotion recognition APIs or SDKs.

2.1 OVERVIEW OF FACIAL RECOGNITION SYSTEM

A facial recognition system is a biometric method that is able to provide a superior solution for identifying an individual by comparing a live capture or digital image with the stored record for that person. The system detects faces in images, quantifies their features, and then matches them against stored templates in a database.

One of the advantages of this system is that it does not require any physical contact with the user and can be easily deployed on the user's mobile device, so it is more acceptable to end users.


2.2 FACIAL EXPRESSION

Facial expressions exhibit a non-linear structure in face detection and recognition tasks. Every person has an expression on his or her face used for non-verbal communication. It is the key to determining and describing a person's current emotion and feelings. There are six basic facial expressions: Happiness, Sadness, Surprise, Fear, Disgust, and Anger.

Figure 2.1: Six basic facial expressions

2.3 FACIAL ACTION CODING SYSTEM (FACS)

FACS is a system for taxonomizing human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö (Hjortsjö, 1969). It was developed further by Paul Ekman and Wallace Friesen in 1978. The FACS used today is the version updated in 2002 by Ekman, Friesen, and Joseph C. Hager.

Action Units (AUs) are used to determine emotion. Different AUs indicate different movements of facial muscles, and combinations of these AUs determine the emotions, as shown in Table 2.1.

Emotion     Action Units (AUs)    Description
Happiness   6+12                  Cheek Raiser, Lip Corner Puller
Sadness     1+4+15                Inner Brow Raiser, Brow Lowerer, Lip Corner Depressor
Surprise    1+2+5B+26             Inner Brow Raiser, Outer Brow Raiser, Upper Lid Raiser, Jaw Drop
Fear        1+2+4+5+7+20+26       Inner Brow Raiser, Outer Brow Raiser, Brow Lowerer, Upper Lid Raiser, Lid Tightener, Lip Stretcher, Jaw Drop
Disgust     9+15+16               Nose Wrinkler, Lip Corner Depressor, Lower Lip Depressor
Anger       4+5+7+23              Brow Lowerer, Upper Lid Raiser, Lid Tightener, Lip Tightener

Source: Bryn Farnsworth, Ph.D., 2016

Table 2.1: List of combined AUs related to the six basic facial expressions
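As a minimal sketch, the mapping in Table 2.1 can be encoded as a simple lookup structure. The class name and the decision to store AUs as plain integers (reducing the intensity grade "5B" to AU 5) are illustrative assumptions, not part of FACS itself.

import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: Table 2.1 encoded as a lookup from emotion
// to its AU combination. Intensity grades such as "5B" are reduced to
// the bare AU number (5) for simplicity.
public class FacsTable {
    static final Map<String, int[]> EMOTION_AUS = new HashMap<>();
    static {
        EMOTION_AUS.put("Happiness", new int[]{6, 12});
        EMOTION_AUS.put("Sadness",   new int[]{1, 4, 15});
        EMOTION_AUS.put("Surprise",  new int[]{1, 2, 5, 26});
        EMOTION_AUS.put("Fear",      new int[]{1, 2, 4, 5, 7, 20, 26});
        EMOTION_AUS.put("Disgust",   new int[]{9, 15, 16});
        EMOTION_AUS.put("Anger",     new int[]{4, 5, 7, 23});
    }
}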

2.4 AFFECTIVA'S SDK

In this project, Affectiva's SDK is applied in the development of the application to analyse and detect users' emotions in real time. It supports both iOS and Android platforms, is able to identify emotions, expressions, and emoji, includes classifiers for age, gender, and ethnicity, and can detect emotion on individual faces as well as in groups of 20 or more.

This Emotion SDK is designed to analyse the spontaneous facial expressions that people show in their daily interactions. It works with any optical sensor, device camera, or standard camera.

Computer vision algorithms identify the key landmarks on the face, and machine learning algorithms (classifiers) analyse pixels in those face regions to classify facial expressions. Landmarks of the face can be the corners of the eyebrows, the tip of the nose, or the corners of the mouth. Affectiva relies on the work of Paul Ekman, a research psychologist who built a convincing body of evidence that there are at least six universal human emotions, expressed identically on everyone's face regardless of gender, age, or cultural upbringing. He decoded these expressions, breaking them down into combinations of forty-six individual movements called "action units" (AUs). From these AUs, he compiled the Facial Action Coding System (FACS), a 500-page taxonomy of facial movements.

This SDK has highly accurate classifiers, which have been trained and tested using Affectiva's massive emotion data repository, the world's largest emotion database, with more than 6.5 million faces from 87 countries analysed.

Affdex, a piece of software built on the Affectiva SDK, is marketed as a tool that can make reliable inferences about people's emotions by tapping into the unconscious. This software has been used to test new shows, such as at Television City, a Las Vegas laboratory, and during the 2012 presidential election. A team used the software to track more than 200 people watching clips of the Obama-Romney debates and concluded that it was able to predict voting preference with 73% accuracy.


2.5 SUMMARY

Facial recognition systems continue to develop towards approaches with higher accuracy, simplicity, greater effectiveness, efficiency, and lower computational time. There are more than 20 types of emotion recognition APIs and SDKs on the market, which help developers produce more emotion recognition related applications and systems. These APIs and SDKs use facial detection, eye tracking, and facial expression to determine the subject's emotion and mood.


CHAPTER III

METHODOLOGY

3.0 INTRODUCTION

The development of this application is carried out by applying the SDK developed by Affectiva in the Android Studio environment. The Java programming language is used throughout the development of this application.

3.1 PROBLEM IDENTIFICATION

This application aims to boost the user's mood by using a facial recognition system to detect the user's current emotion. The main function of this application is to detect the user's emotion, then select and play a list of suitable music to relax, boost, or soothe his or her mood and emotion. However, the proposed application is expected to be used by Android users only.

3.2 SYSTEM REQUIREMENT

This section describes the system requirements for the development and installation of this application.


3.2.1 SOFTWARE

Android Studio is the official integrated development environment (IDE) for Google's Android OS, built on JetBrains' IntelliJ IDEA software and designed specifically for Android development. [6] In this project, the development machine used to build this application is an Asus laptop running Windows 10.

The table below shows the minimum system requirements of the Android development platform for machines running Windows.

Criteria               Description
Platform               Microsoft Windows 7/8/10 (32- or 64-bit); 64-bit is required for native debugging
RAM                    Minimum: 3 GB; Recommended: 8 GB; plus 1 GB for the Android Emulator
Available Disk Space   Minimum: 2 GB; Recommended: 4 GB (500 MB for IDE + 1.5 GB for Android SDK and emulator system image)
Java Version           JDK 8
Screen Resolution      Minimum: 1280 x 800

Source: Android Studio Official Website, October 26, 2017.

Table 3.1: Minimum system requirement for Windows


Android Studio has a flexible Gradle-based build system, which builds Android packages (APK files) by managing the dependencies and build logic. APK is the package file format used by the Android OS for the distribution and installation of mobile apps. Android Studio generates and builds the APK file automatically when the developer chooses to build and run the app. Table 3.2 shows the project structure for this app.

Criteria                     Description
Gradle Version               4.4
Android Plugin Version       3.1.3
Android Plugin Repository    google(), jcenter()
Default Library Repository   google(), jcenter(), 'http://maven.affectiva.com'

Table 3.2: Application project structure

http://maven.affectiva.com is the repository for the Affectiva SDK library, which is discussed in this chapter under Section 3.6.

3.2.2 HARDWARE

This application is able to run on any mobile device or tablet, provided the device meets the minimum system requirements for installation. The minimum Android platform required to run the emotion SDK is version 4.4 (API level 19) or above. Therefore, this application is able to run on approximately 90.1% of devices, according to the Android Developers Distribution Dashboard (refer to Figure 3.1).


The table below shows the minimum system requirements for this application.

Criteria       Description
Platform       OS: Android 4.4 (KitKat) and above
RAM            1 GB (recommended)
Processor      Quad-core 1.5 GHz Cortex-A53
Memory         Internal: Yes; External: Yes
Camera         Front-facing camera: Yes
Connectivity   Internet access: No; USB: Yes
Screen         Touchscreen: Yes; Multi-touch: Yes
Sound          Loudspeaker: Yes; 3.5 mm jack: Yes
Audio          Media type: MP3

Table 3.3: Application minimum system requirement


Figure 3.1: Android Platform Version Cumulative Distribution

3.2.3 EMULATOR

This application was test run on a real device. The details of the deployment device are stated in Table 3.4.

Perform the following steps to configure and set up your device in Developer Mode.

Real mobile device: Developer Mode

1. Connect your device to your development machine with a USB cable.
2. Open the Settings app and select About Phone.
3. Scroll to the bottom and tap Build number 7 times.
4. Return to the previous screen and select Developer options.
5. Enable Developer Mode on your device.
6. Scroll down and enable USB debugging.


Perform the following steps to run the app on a real device.

Android Studio: Run the project on a real device

1. Select Run in the toolbar.
2. In the Select Deployment Target window, select your device and click OK (refer to Figure 3.2).

Figure 3.2: Select Deployment Target window

Enabling USB debugging allows Android Studio and other SDK tools to recognize your device when it is connected via USB. It also allows Android Studio to install the APK file on the connected device and launch it immediately once the file is installed successfully.


The table below shows the details of the deployment device for this application.

Device            Description
Device Name       Galaxy Note 3
Model Number      SM-N9005
Android Version   5.0
Internal Memory   32 GB
Developer Mode    Yes
USB Debugging     Yes

Table 3.4: Application deployment device


3.3 PROCESS MODEL

This section describes the process model used in developing the project.

3.3.1 CONTEXT DIAGRAM

The figure below generalizes the function of the entire application in relation to the user entity and the Affectiva SDK entity.

Figure 3.3: Project CD


3.3.2 DATA FLOW DIAGRAM

The figure below illustrates how data and information flow between the elements of the application.

Figure 3.4: Project DFD


3.4 DATA MODEL

This section describes the data model of the application.

3.4.1 ENTITY RELATIONSHIP DIAGRAM

The figure below describes the database relationships between the entities: user, mood_tracker, emotion_recognition, music_playlist, and customize_music.

Figure 3.5: Project ERD


3.4.2 SQLite

SQLite is an open-source, lightweight, standalone SQL database that does not require network access. A subclass of the SQLiteOpenHelper class is needed to manage database creation and version management. Two methods are provided by the class: onCreate(SQLiteDatabase db) and onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion). The main function of this class is to create the database if it does not exist, open it if it exists, and upgrade it if required. Figure 3.6 and Figure 3.7 show the SQLiteOpenHelper classes for the Mood DB and the Customise Music DB.


Figure 3.6: SQLiteOpenHelper class for Mood DB

Figure 3.7: SQLiteOpenHelper class for Customise Music DB.
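Since Figures 3.6 and 3.7 are screenshots, a minimal sketch of such a helper is given below; the database, table, and column names are assumptions for illustration, not necessarily those used in the project.

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

// Minimal sketch of an SQLiteOpenHelper subclass for the Mood DB.
// The database, table, and column names here are illustrative assumptions.
public class MoodDbHelper extends SQLiteOpenHelper {

    private static final String DATABASE_NAME = "mood.db";
    private static final String TABLE_NAME = "mood_tracker";

    public MoodDbHelper(Context context) {
        super(context, DATABASE_NAME, null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // Called only when the database does not exist yet.
        db.execSQL("CREATE TABLE " + TABLE_NAME
                + " (ID INTEGER PRIMARY KEY AUTOINCREMENT,"
                + " score REAL, description TEXT, timestamp TEXT)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // Simple upgrade policy: drop and recreate the table.
        db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME);
        onCreate(db);
    }
}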


3.5 TECHNOLOGY

This section describes the technologies used in developing the project.

3.5.1 ANDROID FRAMEWORK

This project is fully developed using Android Studio to create a mobile application that can be used on many tablets and mobile phones. Android is one of the most popular mobile operating systems, developed by Google. It is a Linux-based open-source OS written in Java, C, and C++. In the Android OS, there are three common entities: Activities, Services, and Widgets. Each has to be defined in the application manifest (refer to Figure 3.16). This project only utilizes the Activity entity, the most common view, which allows a user to interact with the application immediately after launching it. An Activity contains different types of controls such as buttons, labels, and pictures. Figure 3.8 illustrates the lifecycle of an Activity.

Figure 3.8: Android Activity Lifecycle


3.5.2 AFFECTIVA SDK

Affectiva is an emotion measurement technology company that grew out of MIT's Media Lab. This project uses the SDK developed by this company to determine and detect the user's facial expression and emotion. The SDK processes emotion data on-device, and the library is lightweight and fast enough for real-time processing.

Figure 3.9 illustrates the SDK algorithm as it runs on an Android device.

Figure 3.9: Affectiva Android SDK Algorithm

Affectiva has prepared SDK Java documentation to help developers use the SDK to build projects related to emotion or facial detection. This project focuses on the movement of the user's mouth muscles, so only two methods are used for expression detection: getSmile() and getLipTighten(). Thus, the expressions can be detected and distinguished easily, allowing the user to view the result immediately.


Figure 3.10: Affdex SDK Java Documentation
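As a rough sketch of how the detector is wired up, based on Affectiva's published Android examples: the per-expression setter names below are assumptions and should be checked against the SDK documentation shown in Figure 3.10.

// Sketch only, assuming the Affdex Android SDK API as published by Affectiva;
// verify class and setter names against the documentation in Figure 3.10.
CameraDetector detector = new CameraDetector(this,
        CameraDetector.CameraType.CAMERA_FRONT, cameraImageView);
detector.setDetectSmile(true);       // enable the smile classifier
detector.setDetectLipTighten(true);  // enable the lip tighten classifier (assumed setter)
detector.setImageListener(this);     // onImageResults() receives each processed frame
detector.start();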

3.5.3 UML CLASS DIAGRAM

This project is written in the Java programming language, which is object-oriented. Figure 3.11 and Figure 3.12 illustrate the class relationship diagrams for the emotion module and the music module.


Figure 3.11: Class relationship diagram for emotion module

Figure 3.12: Class relationship diagram for music module


The following UML class diagrams represent the classes implemented in the system.

Option
~ buttonM : Button
~ buttonR : Button
~ buttonMu : Button
# onCreate (savedInstanceState: Bundle) : void
# onDestroy () : void
+ onClick (view: View) : void

MainActivity
~ cameraDetector : CameraDetector
~ CameraImageView : SurfaceView
~ textView : TextView
~ maxProcessingRate : int
~ m_DatabaseHelper_Face : DatabaseHelper_Face
- detect_emotion : String
- emotion_score : double
- saveCurrentMood : OnClickListener
# onCreate (savedInstanceState: Bundle) : void
# onPause () : void
+ onImageResults (faces: List<Face>, frame: Frame, timestamp: float) : void
+ savedCurrentMoodBtnClicked () : void
+ AddData (score: double, desc: String) : void
- toastMessage (msg: String) : void

Emo1
- e_id : int
- e_score : double
- e_description : String
- e_timestamp : String
+ Emo1 (id: int, score: double, desc: String, timestamp: String)
+ getE_id () : int
+ setE_id (id: int) : void
+ getE_score () : double
+ setE_score (score: double) : void
+ getE_desc () : String
+ setE_desc (desc: String) : void
+ getE_timestamp () : String
+ setE_timestamp (timestamp: String) : void

Music
- cm_id : int
- cm_path : String
+ Music ()
+ Music (id: int, audio_path: String)
+ getCm_id () : int
+ setCm_id (id: int) : void
+ getCm_path () : String
+ setCm_path (audio_path: String) : void

activity_playMusic
~ play : Button
~ pause : Button
~ stop : Button
~ button_cm : Button
~ mediaPlayer : MediaPlayer
~ pauseCurrentPosition : int
# onCreate (savedInstanceState: Bundle) : void
+ onClickListener (view: View) : void

activity_music
- final TAG : String
~ final RQS_OPEN_AUDIO_MP3 : int
~ m_DatabaseHelper_music : DatabaseHelper_music
- btnView : Button
- btnSelect : Button
# onCreate (savedInstanceState: Bundle) : void
# onActivityResult (requestCode: int, resultCode: int, data: Intent) : void
+ AddDataPath (newEntry: String) : void
- toastMessage (msg: String) : void

DatabaseHelper_Face
- final TAG : String
- final TABLE_NAME : String
- final col1 : String
- final col2 : String
- final col3 : String
- final col4 : String
+ DatabaseHelper_Face (context: Context)
+ onCreate (db: SQLiteDatabase) : void
+ onUpgrade (db: SQLiteDatabase, i: int, i1: int) : void
+ addData (score: double, desc: String) : boolean
+ getDataTime () : String
+ getEmotion (id: long) : Emo1
+ getAllEmotion () : List<Emo1>
+ getEmoCount () : int

DatabaseHelper_music
- final TAG : String
- final TABLE_NAME : String
- final col1 : String
- final col2 : String
+ DatabaseHelper_music (context: Context)
+ onCreate (db: SQLiteDatabase) : void
+ onUpgrade (db: SQLiteDatabase, i: int, i1: int) : void
+ addDataPath (music_path: String) : boolean
+ getMusic (id: long) : Music
+ getAllMusic () : List<Music>
+ getMusicCount () : int

Figure 3.13: UML Class Diagram

The Option class represents the main interface of this application. Three buttons displayed on the screen direct the user to different activities. The functions in the Option class are as follows:

Function                                        Description
onCreate (savedInstanceState: Bundle) : void    Initialize the activity
onDestroy () : void                             The activity stops running when the user closes the application
onClick (view: View) : void                     The selected button directs the user to a different activity

Table 3.5: List of functions of the Option class

The MainActivity class implements the main function of this application: it handles face detection and emotion recognition. The user touches the screen (Surface View) to save the current emotion in the database. The functions in the MainActivity class are as follows:

Function                                                                    Description
onCreate (savedInstanceState: Bundle) : void                                Initialize the activity
onPause () : void                                                           The activity pauses when the user navigates to another activity
onImageResults (faces: List<Face>, frame: Frame, timestamp: float) : void   Face detection, emotion and expression recognition; the timestamp is recorded
savedCurrentMoodBtnClicked () : void                                        Directs to a different activity once clicked
AddData (score: double, desc: String) : void                                Save the detected emotion and score in the database
toastMessage (msg: String) : void                                           Inform the user whether the emotion was saved successfully

Table 3.6: List of functions of the MainActivity class

The Emo1 class sets and returns emotion details, such as the id, description, and score, for other classes in the application. The functions are stated as follows:

Function                                                         Description
Emo1 (id: int, score: double, desc: String, timestamp: String)   The constructor
getE_id () : int                                                 Return the emotion id
setE_id (id: int) : void                                         Set the emotion id to the given value
getE_score () : double                                           Return the emotion score
setE_score (score: double) : void                                Set the emotion score to the given value
getE_desc () : String                                            Return the emotion description
setE_desc (desc: String) : void                                  Set the emotion description to the given text
getE_timestamp () : String                                       Return the timestamp
setE_timestamp (timestamp: String) : void                        Set the timestamp to the given value

Table 3.7: List of functions of the Emo1 class

The Music class sets and returns customized music details, such as the id and audio file path, for other classes in the application. The functions are stated as follows:

Function                                 Description
Music (id: int, audio_path: String)      The constructor
getCm_id () : int                        Return the customized music id
setCm_id (id: int) : void                Set the customized music id to the given value
getCm_path () : String                   Return the customized music file path
setCm_path (audio_path: String) : void   Set the customized music file path to the given value

Table 3.8: List of functions of the Music class

The user is directed to the activity_playMusic class once their emotion is successfully detected and saved in the database. This class enables the user to play, pause, and stop the music and to customize the music playlist, as sketched below. The functions in the activity_playMusic class are stated as follows:

Function                                       Description
onCreate (savedInstanceState: Bundle) : void   Initialize the activity
onClick (view: View) : void                    The selected button directs the user to a different activity

Table 3.9: List of functions of the activity_playMusic class
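As a minimal sketch of the play/pause/stop behaviour this class provides, using Android's MediaPlayer: the method bodies below are illustrative assumptions, with the resume position kept in pauseCurrentPosition as in the UML class above.

import android.media.MediaPlayer;
import java.io.IOException;

// Illustrative sketch of play/pause/stop handling with Android's MediaPlayer;
// field names mirror the activity_playMusic UML class above.
void playMusic(String path) {
    if (mediaPlayer == null) {
        mediaPlayer = new MediaPlayer();
        try {
            mediaPlayer.setDataSource(path); // e.g. a file path from the playlist DB
            mediaPlayer.prepare();
        } catch (IOException e) {
            return; // invalid or unreadable audio file
        }
    }
    mediaPlayer.seekTo(pauseCurrentPosition); // resume from where it was paused
    mediaPlayer.start();
}

void pauseMusic() {
    if (mediaPlayer != null && mediaPlayer.isPlaying()) {
        pauseCurrentPosition = mediaPlayer.getCurrentPosition();
        mediaPlayer.pause();
    }
}

void stopMusic() {
    if (mediaPlayer != null) {
        mediaPlayer.stop();
        mediaPlayer.release();
        mediaPlayer = null;
        pauseCurrentPosition = 0;
    }
}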

The activity_music class handles music customization in this application. The user selects an audio file from their mobile device, and the file path is saved in the database. The functions in the activity_music class are stated as follows:

Function                                                                    Description
onCreate (savedInstanceState: Bundle) : void                                Initialize the activity
onActivityResult (requestCode: int, resultCode: int, data: Intent) : void   Allow the user to choose an audio file from their mobile device
AddDataPath (newEntry: String) : void                                       Save the audio file path in the database
toastMessage (msg: String) : void                                           Inform the user whether the audio file path was saved successfully

Table 3.10: List of functions of the activity_music class

The DatabaseHelper_Face class is a database helper for saving and retrieving all the information related to emotion. The functions are stated as follows:

Function                                                 Description
DatabaseHelper_Face (context: Context)                   The constructor
onCreate (db: SQLiteDatabase) : void                     Initialize and create the database if it does not exist
onUpgrade (db: SQLiteDatabase, i: int, i1: int) : void   Upgrade the database if it exists
addData (score: double, desc: String) : boolean          Insert the emotion score and description into the database
getDataTime () : String                                  Return the timestamp
getEmotion (id: long) : Emo1                             Return the Emo1 record with the given id
getAllEmotion () : List<Emo1>                            Return all emotion records as Emo1 objects
getEmoCount () : int                                     Return the number of emotions recorded

Table 3.11: List of functions of the DatabaseHelper_Face class

The DatabaseHelper_music class is a database helper for saving and retrieving all the information related to music customization. The functions are stated as follows:

Function                                                 Description
DatabaseHelper_music (context: Context)                  The constructor
onCreate (db: SQLiteDatabase) : void                     Initialize and create the database if it does not exist
onUpgrade (db: SQLiteDatabase, i: int, i1: int) : void   Upgrade the database if it exists
addDataPath (music_path: String) : boolean               Insert the file path into the database
getMusic (id: long) : Music                              Return the Music record with the given id
getAllMusic () : List<Music>                             Return all music records as Music objects
getMusicCount () : int                                   Return the number of tracks saved

Table 3.12: List of functions of the DatabaseHelper_music class


3.6 SETUP

This section shows the setup and connection of the Affectiva SDK in Android Studio.

3.6.1 GRADLE SETUP

Add Affectiva's repository, http://maven.affectiva.com, to the app's root build.gradle file.

Figure 3.14: Application Gradle setup
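Since Figure 3.14 is a screenshot, the corresponding build.gradle entry would look roughly like the following sketch, consistent with the repositories listed in Table 3.2.

allprojects {
    repositories {
        google()
        jcenter()
        // Affectiva SDK repository
        maven { url 'http://maven.affectiva.com' }
    }
}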

3.6.2 SET DEPENDENCIES

Add the dependency declaration to the app's build.gradle file. This allows the application to pick up the most recent bug-fix release in the series. In this application, version 3.2 is used.

Figure 3.15: Application dependencies setup
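A sketch of such a declaration is shown below; the artifact coordinates are an assumption based on Affectiva's published setup instructions and should be checked against Figure 3.15.

dependencies {
    // '3.2+' picks up the most recent bug-fix release in the 3.2 series;
    // the artifact coordinates are assumed from Affectiva's instructions
    implementation 'com.affectiva.android:affdexsdk:3.2+'
}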


3.6.3 UPDATE MANIFEST

There are some necessary declarations that need to be made in the app's manifest file.

The SDK requires access to external storage on the Android device, internet access (when available) to communicate anonymized usage data, and permission to access the camera.

Permission to access media content and to read from or write to external storage is required for the music player and music customization.

Figure 3.16: Application Manifest file
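Since Figure 3.16 is a screenshot, the permission declarations it contains would look roughly like this sketch.

<!-- Camera access for the SDK's live detection -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Anonymized usage data is sent when internet access is available -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Storage access for the SDK and for the music player/customization -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />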


3.7 SUMMARY

The Android Studio environment is used throughout the development of the application. To detect the user's face and expression, the Affectiva SDK is applied in this application using the Java programming language. The Gradle file, dependencies, and manifest file are set up to allow the application to access the Affdex SDK.


CHAPTER IV

IMPLEMENTATION AND RESULTS

4.0 INTRODUCTION

This system contains four interfaces: the main interface and three separate interfaces for emotion recognition, the music player, and the music playlist (an additional function).

4.1 IMPLEMENTATION OF APPLICATION

This section illustrates the activities that run between the interfaces.

4.1.1 MAIN INTERFACE

When the application is launched, the main interface is shown first (refer to Figure 4.1). The user has three options to choose from:

1. Select Mood to track their emotion history.
2. Select Emotion Recognition to detect their current emotion.
3. Select Music Playlist to view their customized music playlist.


Figure 4.1: The main interface

4.1.2 MOOD TRACKER

The figure below illustrates the steps taken to track mood history.

Figure 4.2: Steps occurred in retrieving mood history


4.1.3 EMOTION RECOGNITION

Perform the following steps for emotion detection and music playing.

1. Select Emotion Recognition.
2. Choose to smile (Smile) or tighten your lips (Anger).
3. Your face should be displayed on the Surface View. Once you are ready, tap the Surface View to record your current emotion, and you will be directed to the Music Player.
4. Select a button to play, pause, or stop the music.
5. Select to customize your own music playlist.
6. Tap the blue file icon to select a music file (MP3) from your mobile device storage. The file is saved to the music playlist DB once you have finished selecting it.
7. Select to view your playlist.

Figure 4.3: Steps occurred in mood detection, music playing and music playlist customization


4.1.4 CUSTOMIZE MUSIC PLAYLIST (ADD-ON FUNCTION)

The figure below illustrates the steps taken to view the customized music playlist.

Figure 4.4: Steps occurred in viewing the customized music playlist


4.2 RESULTS

This section describes the application's results after the test run on a real mobile device, a Samsung Galaxy Note 3.

4.2.1 MOOD TRACKER

The emotion is recorded and can be retrieved successfully. The user is able to track their mood history and check their emotions from time to time. Details containing the date, time, and emotion are displayed in a vertically scrollable list.

Figure 4.5: Result: Mood Tracker


4.2.2 EMOTION RECOGNITION

The emotion recognition works well in detecting the user's emotion in real time. The user is able to see the detected emotion and the emotion score displayed in the text box.

4.2.2.1 NO FACE DETECTED

This may occur when the device's front camera fails to launch or is blocked, causing the emotion SDK to fail to detect any face displayed on the Surface View. The recognition function is then unable to detect the user's emotion in real time, so the text box shows "No face detected".

Figure 4.6: Result: No face detected


4.2.2.2 EMOTION: ANGRY

According to the Cambridge Dictionary, someone who is tight-lipped is pressing his or her lips together, avoiding showing anger, or refusing to speak about something. [7]

The device's front camera works well, enabling the emotion SDK to detect the face displayed on the Surface View. The recognition function detects that the user has tightened his or her lips; therefore the text box shows "Angry". The number is the user's lip tighten score, which can range from 0 to 100.

Figure 4.7: Result: Angry


4.2.2.3 EMOTION: SMILE

According to the Cambridge Dictionary, a smile is a happy or friendly expression on the face in which the ends of the mouth curve up slightly, often with the lips moving apart so that the teeth can be seen. [7]

The device's front camera works well, enabling the emotion SDK to detect the face displayed on the Surface View. The recognition function detects that the user is smiling; therefore the text box shows "Smile". The number is the user's smile score, which can range from 0 to 100.

Figure 4.8: Result: Smile


4.2.3 CUSTOMIZE MUSIC PLAYLIST (ADD-ON FUNCTION)

The music file path is saved and can be retrieved successfully. The user is able to view the saved tracks' file paths in a vertically scrollable list. At this stage, the user can only view the list and is not able to play the music.

Figure 4.9: Result: Music Customization


4.3 SUMMARY

The emotion SDK works well on the tested device. All the stated functions run smoothly, allowing the user to detect emotion in real time, trace back their emotion history, and play music to soothe or boost their mood. The music customization, which is an additional function in this application, only allows the user to save and view the playlist.


CHAPTER V

CONCLUSION

5.0 INTRODUCTION

This chapter concludes the documentation of this project in terms of concept, algorithms, design, implementation, and testing.

5.1 CONTRIBUTION

During the past several years, face recognition has received significant attention in different fields. In 2016, Facebook acquired the emotion detection startup FacioMetrics and started to implement facial recognition for emotion detection in its app. This allows a user to add a Like, or one of the Wow/Haha/Angry/Sad emoji reactions, by showing that emotion with his or her face.

This paper has discussed an approach to real-time emotion detection anywhere and anytime. The user is able to use this mobile application to detect their current emotion, and suitable music is played to soothe or boost their mood.


5.2 WEAKNESS AND LIMITATION

Face recognition is not perfect and cannot perform well under certain conditions. According to Woody Bledsoe, one of the pioneers of automated facial recognition:

"This recognition problem is made difficult by the great variability in head rotation and tilt, lighting intensity and angle, facial expression, aging, etc. Some other attempts at face recognition by machine have allowed for little or no variability in these quantities. Yet the method of correlation (or pattern matching) of unprocessed optical data, which is often used by some researchers, is certain to fail in cases where the variability is great. In particular, the correlation is very low between two pictures of the same person with two different head rotations." [8]

In this project, the application only works on mobile devices or tablets running the Android OS. A user with a device running another mobile OS, such as iOS, is not able to install and use this application. As the facial detection and emotion recognition rely on the Affdex Java SDK, the version of the SDK may need to be updated from time to time.

5.3 FUTURE WORKS

The SDK implemented in this project can indeed provide accurate facial detection and emotion recognition. Research on facial and emotion recognition should continue to be carried out, in order to give people who have difficulty expressing their emotions a way to tell others what they feel deep inside. Further research into this technology may help developers build more applications with gesture-based controls that perform related actions.

To improve the user-friendliness of this application, the music customization module should be improved to allow the user to customize and play their own emotion-release music.

5.4 OVERALL CONCLUSION

Face recognition is one of the advanced AI technologies that has the ability to sense and detect human faces and emotions. This project aims to provide a user-friendly approach by combining emotion detection and a media player, developed using Android Studio. Using the emotion SDK, the user's face and emotion can be detected in real time, and the media player enables a user to play music and add audio files to the database. Extreme Programming is used for the development of the application, which is written in Java.


REFERENCES

[1] Sifneos, P. E. (1973). "The prevalence of 'alexithymic' characteristics in psychosomatic patients". Psychotherapy and Psychosomatics, 22(2), 255-262. doi:10.1159/000286529. PMID 4770536.

[2] Jennifer L. Black, "The Importance of Facial Recognition", Owlcation.com/social-sciences, February 4, 2017.

[3] Daniel McDuff, Rana El Kaliouby, Thibaud Senechal, May Amr, Jeffrey F. Cohn, Rosalind Picard, "Affectiva-MIT Facial Expression Dataset (AM-FED): Naturalistic and Spontaneous Facial Expressions Collected In-the-Wild".

[4] Daniel McDuff, Abdelrahman Mahmoud, Mohammad Mavadati, May Amr, Jay Turcot, Rana el Kaliouby, "AFFDEX SDK: A Cross-Platform Real-Time Multi-Face Expression Recognition Toolkit", CHI'16 Extended Abstracts, May 7-12, 2016, San Jose, CA, USA. ACM 978-1-4503-4082-3/16/05.

[5] Thomas Oropeza, Beste Filiz Yuksel, "Getting Started with the Affectiva SDK", CS 686/486 Affective Computing, University of San Francisco.

[6] Ducrohet, Xavier; Norbye, Tor; Chou, Katherine (May 15, 2013). "Android Studio: An IDE built for Android". Android Developers Blog. Google. Retrieved May 16, 2013.

[7] Cambridge University Press. (2008). Cambridge Dictionary Online. Retrieved April 23, 2008.

[8] Dr. S. B. Thorat, S. K. Nayak, Miss Jyoti P. Dandale, "Facial Recognition Technology: An Analysis with Scope in India", (IJCSIS) International Journal of Computer Science and Information Security, Vol. 8, No. 1, 2010.

[9] Raffi Khatchadourian, "We Know How You Feel", The New Yorker, January 19, 2015 issue.

[10] "Facial recognition system", Wikipedia, March 11, 2017.

[11] "Emotion recognition", Wikipedia, June 28, 2018.

[12] Josh Constine, "Like by smiling? Facebook acquires emotion detection startup FacioMetrics", TechCrunch, November 17, 2016.


APPENDIX

A. SOURCE CODE

A.1 EMO1 CLASS

public class Emo1 {
    // Model class for one detected emotion record: id, score, label and timestamp.
    private int e_id;
    private double e_score;
    private String e_description;
    private String e_timestamp;

    public Emo1() { }

    public Emo1(int id, double score, String desc, String timestamp) {
        this.e_id = id;
        this.e_score = score;
        this.e_description = desc;
        this.e_timestamp = timestamp;
    }

    public int getE_id() { return this.e_id; }
    public void setE_id(int id) { this.e_id = id; }

    public double getE_score() { return this.e_score; }
    public void setE_score(double score) { this.e_score = score; }

    public String getE_desc() { return this.e_description; }
    public void setE_description(String desc) { this.e_description = desc; }

    public String getE_timestamp() { return this.e_timestamp; }
    public void setE_timestamp(String timestamp) { this.e_timestamp = timestamp; }
}


A.2 MUSIC CLASS

public class Music {
    // Model class for one stored track: id and audio file path.
    private int cm_id;
    private String cm_path;

    public Music() { }

    public Music(int id, String audio_path) {
        this.cm_id = id;
        this.cm_path = audio_path;
    }

    public int getCm_id() { return this.cm_id; }
    public void setCm_id(int id) { this.cm_id = id; }

    public String getCm_path() { return this.cm_path; }
    public void setCm_path(String audio_path) { this.cm_path = audio_path; }
}

A.3 EMOTION DETECTION FUNCTION

@Override
public void onImageResults(List<Face> faces, Frame frame, float timeStamp) {
    // Affdex Detector.ImageListener callback, invoked for each processed camera frame.
    if (faces.size() == 0) {
        textView.setText("No face detected");
        return;
    }
    Face face = faces.get(0);
    // Affdex expression scores range from 0 to 100.
    if (face.expressions.getSmile() >= 50) {
        emotion_score = face.expressions.getSmile();
        detect_emotion = "Smile";
        textView.setText(String.format("Smile: %.2f", emotion_score));
    } else {
        emotion_score = face.expressions.getLidTighten();
        detect_emotion = "Anger";
        textView.setText(String.format("Anger: %.2f", emotion_score));
    }
}
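The listing above updates detect_emotion and emotion_score but does not show the readings being saved. A minimal follow-up, assuming the activity holds a DatabaseHelper_Face field named m_DatabaseHelper_face (the field name is hypothetical; the class itself appears in Appendix A.5), could persist each reading after either branch has run:

// Hypothetical persistence step; m_DatabaseHelper_face is an assumed field name.
if (m_DatabaseHelper_face.addData(emotion_score, detect_emotion)) {
    Log.d("EmotionDetect", "saved " + detect_emotion + ": " + emotion_score);
}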

A.4 SELECT MUSIC FROM DEVICE

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_music);
    btnView = (Button) findViewById(R.id.btnView);
    btnSelect = (Button) findViewById(R.id.btnSelect);
    m_DatabaseHelper_music = new DatabaseHelper_music(this);

    // Open the system file chooser, restricted to audio files.
    btnSelect.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            Intent intent = new Intent();
            intent.setType("audio/*");
            intent.setAction(Intent.ACTION_GET_CONTENT);
            // RQS_OPEN_AUDIO_MP3 is a request-code constant declared elsewhere in this activity.
            startActivityForResult(Intent.createChooser(intent, "Select Music"),
                    RQS_OPEN_AUDIO_MP3);
        }
    });

    // Show the list of tracks already stored in the database.
    btnView.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View view) {
            Intent intent = new Intent(activity_music.this, activity_musicDisplay.class);
            startActivity(intent);
        }
    });
}
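The onActivityResult handler that receives the chooser's result is not included in the listing. A minimal sketch, assuming the selected item is stored as its content-URI string through addDataPath from Appendix A.6 (playback code would then open it with setDataSource(Context, Uri)), might look like this:

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    // Hypothetical handler: store the chosen audio item's URI string.
    if (requestCode == RQS_OPEN_AUDIO_MP3 && resultCode == RESULT_OK && data != null) {
        m_DatabaseHelper_music.addDataPath(data.getData().toString());
    }
}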


A.5 EMOTION DATABASE

import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.util.Log;

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Locale;

public class DatabaseHelper_Face extends SQLiteOpenHelper {
    private static final String TAG = "DatabaseHelper_Face";
    private static final String TABLE_NAME = "Mood_Tracker";
    private static final String col1 = "m_id";
    private static final String col2 = "m_score";
    private static final String col3 = "m_desc";
    private static final String col4 = "m_time";

    public DatabaseHelper_Face(Context context) {
        super(context, TABLE_NAME, null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        String createTable = "CREATE TABLE IF NOT EXISTS " + TABLE_NAME + " ("
                + col1 + " INTEGER PRIMARY KEY AUTOINCREMENT,"
                + col2 + " DOUBLE,"
                + col3 + " TEXT,"
                + col4 + " DATETIME DEFAULT CURRENT_TIMESTAMP"
                + ")";
        db.execSQL(createTable);
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME);
        onCreate(db);
    }

    // Current date and time, formatted for the m_time column.
    public String getDateTime() {
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss",
                Locale.getDefault());
        return dateFormat.format(new Date());
    }

    // Insert one emotion reading; returns false if the insert failed.
    public boolean addData(double score, String desc) {
        SQLiteDatabase db = this.getWritableDatabase();
        ContentValues contentValues = new ContentValues();
        contentValues.put(col2, score);
        contentValues.put(col3, desc);
        contentValues.put(col4, getDateTime());
        Log.d(TAG, "add score: " + score + " to " + TABLE_NAME);
        Log.d(TAG, "add desc: " + desc + " to " + TABLE_NAME);
        Log.d(TAG, "add timestamp: " + getDateTime() + " to " + TABLE_NAME);
        long result = db.insert(TABLE_NAME, null, contentValues);
        return result != -1;  // db.insert() returns -1 on error
    }

    // Fetch a single emotion record by id.
    public Emo1 getEmo(long id) {
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.query(TABLE_NAME,
                new String[]{col1, col2, col3, col4},
                col1 + "=?", new String[]{String.valueOf(id)}, null, null, null, null);
        cursor.moveToFirst();
        Emo1 emo1 = new Emo1(
                cursor.getInt(cursor.getColumnIndex(col1)),
                cursor.getDouble(cursor.getColumnIndex(col2)),
                cursor.getString(cursor.getColumnIndex(col3)),
                cursor.getString(cursor.getColumnIndex(col4)));
        cursor.close();
        return emo1;
    }

    // Fetch every stored emotion record.
    public List<Emo1> getAllEmo1() {
        List<Emo1> emoList = new ArrayList<>();
        String selectQuery = "SELECT * FROM " + TABLE_NAME;
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.rawQuery(selectQuery, null);
        if (cursor.moveToFirst()) {
            do {
                Emo1 emo1 = new Emo1();
                emo1.setE_id(cursor.getInt(cursor.getColumnIndex(col1)));
                emo1.setE_score(cursor.getDouble(cursor.getColumnIndex(col2)));
                emo1.setE_description(cursor.getString(cursor.getColumnIndex(col3)));
                emo1.setE_timestamp(cursor.getString(cursor.getColumnIndex(col4)));
                emoList.add(emo1);
            } while (cursor.moveToNext());
        }
        cursor.close();
        db.close();
        return emoList;
    }

    // Number of stored emotion records.
    public int getEmoCount() {
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.rawQuery("SELECT * FROM " + TABLE_NAME, null);
        int count = cursor.getCount();
        cursor.close();
        return count;
    }
}

A.6 MUSIC DATABASE

import android.content.ContentValues;
import android.content.Context;
import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.util.Log;

import java.util.ArrayList;
import java.util.List;

public class DatabaseHelper_music extends SQLiteOpenHelper {
    private static final String TAG = "DatabaseHelper_Music";
    private static final String TABLE_NAME = "customize_music";
    private static final String col1 = "cm_id";
    private static final String col2 = "cm_path";

    public DatabaseHelper_music(Context context) {
        super(context, TABLE_NAME, null, 1);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        String createTable = "CREATE TABLE IF NOT EXISTS " + TABLE_NAME + " ("
                + col1 + " INTEGER PRIMARY KEY AUTOINCREMENT,"
                + col2 + " TEXT)";
        db.execSQL(createTable);
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        db.execSQL("DROP TABLE IF EXISTS " + TABLE_NAME);
        onCreate(db);
    }

    // Insert one audio file path; returns false if the insert failed.
    public boolean addDataPath(String music_path) {
        SQLiteDatabase db = this.getWritableDatabase();
        ContentValues contentValues = new ContentValues();
        contentValues.put(col2, music_path);
        Log.d(TAG, "add music file path: " + music_path + " to " + TABLE_NAME);
        long result = db.insert(TABLE_NAME, null, contentValues);
        return result != -1;  // db.insert() returns -1 on error
    }

    // Fetch a single track by id.
    public Music getMusic(long id) {
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.query(TABLE_NAME,
                new String[]{col1, col2},
                col1 + "=?", new String[]{String.valueOf(id)}, null, null, null, null);
        cursor.moveToFirst();
        Music music = new Music(
                cursor.getInt(cursor.getColumnIndex(col1)),
                cursor.getString(cursor.getColumnIndex(col2)));
        cursor.close();
        return music;
    }

    // Fetch every stored track.
    public List<Music> getAllMusic() {
        List<Music> musicList = new ArrayList<>();
        String selectQuery = "SELECT * FROM " + TABLE_NAME;
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.rawQuery(selectQuery, null);
        if (cursor.moveToFirst()) {
            do {
                Music music = new Music();
                music.setCm_id(cursor.getInt(cursor.getColumnIndex(col1)));
                music.setCm_path(cursor.getString(cursor.getColumnIndex(col2)));
                musicList.add(music);
            } while (cursor.moveToNext());
        }
        cursor.close();
        db.close();
        return musicList;
    }

    // Number of stored tracks.
    public int getMusicCount() {
        SQLiteDatabase db = this.getReadableDatabase();
        Cursor cursor = db.rawQuery("SELECT * FROM " + TABLE_NAME, null);
        int count = cursor.getCount();
        cursor.close();
        return count;
    }
}
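activity_musicDisplay itself, which A.4 launches to show the stored tracks, is not listed in the appendix. A minimal sketch, assuming it extends AppCompatActivity and that the layout file and ListView id (activity_music_display, R.id.listMusic) are named as guessed here, could render the stored paths like this:

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.widget.ArrayAdapter;
import android.widget.ListView;
import java.util.ArrayList;
import java.util.List;

public class activity_musicDisplay extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_music_display);   // assumed layout name

        // Collect the stored file paths from the database (Appendix A.6).
        DatabaseHelper_music db = new DatabaseHelper_music(this);
        List<String> paths = new ArrayList<>();
        for (Music m : db.getAllMusic()) {
            paths.add(m.getCm_path());
        }

        // Display them in a simple list.
        ListView listView = (ListView) findViewById(R.id.listMusic);  // assumed id
        listView.setAdapter(new ArrayAdapter<>(this,
                android.R.layout.simple_list_item_1, paths));
    }
}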