MULTI-OBJECTIVE EVOLUTIONARY ALGORITHMS OF
SPIKING NEURAL NETWORKS
ABDULRAZAK YAHYA SALEH
UNIVERSITI TEKNOLOGI MALAYSIA
MULTI-OBJECTIVE EVOLUTIONARY ALGORITHMS OF
SPIKING NEURAL NETWORKS
ABDULRAZAK YAHYA SALEH
A thesis submitted in fulfilment of the
requirements for the award of the degree of
Doctor of Philosophy (Computer Science)
Faculty of Computing
Universiti Teknologi Malaysia
JUNE 2015
To my beloved parents, wife, children, brothers and my sisters
ACKNOWLEDGEMENT
First, all my praise and thanks are owed to Allah, who granted me the health and
persistence needed to complete this work.
I am very grateful to my main supervisor, Prof. Dr. Siti Mariyam
Shamsuddin. I wish to express my sincere appreciation to her for all her kind
guidance and inspiration to make this research possible. Her personality, enthusiasm,
patience and intellectual spirit made her a great supervisor and invaluable role model
for my professional career.
I am also grateful to my co-supervisor, Dr. Haza Nuzly Bin Abdull Hamed, for
his precious advice, comments and knowledge sharing in spiking neural
networks. Special thanks for his generous help throughout the duration of this study.
Many thanks to the Ministry of Higher Education (MOHE), which partially
supported this study under LRGS Grant R.J130000.7828.4L805 - MULTI-
VARIATE PATTERN ANALYSIS FOR FORENSIC IDENTIFICATION AND
VERIFICATION.
In addition, I am extremely grateful to my family for unlimited support and
encouragement during this research. My sincere appreciation also extends to UTM
Big Data Center, Soft Computing Research Group (SCRG) and all my colleagues for
the support and incisive comments in making this study a success. Their views and
tips are useful indeed. Unfortunately, it is not possible to list all of them in this
limited space.
ABSTRACT
The spiking neural network (SNN) is considered the third generation of artificial
neural networks. Although there are many models of SNN, the Evolving Spiking Neural
Network (ESNN) is widely used in many recent research works. Among the important
issues that need to be explored in ESNN are determining the optimal pre-synaptic neurons
and parameter values for a given data set. Moreover, previous studies have not investigated
the performance of the multi-objective approach with ESNN. In this study, the aim is to find
the optimal pre-synaptic neurons and parameter values for ESNN simultaneously by
proposing several integrations between ESNN and differential evolution (DE). The proposed
algorithms applied to address these problems include DE with evolving spiking neural
network (DE-ESNN) and DE for parameter tuning with evolving spiking neural network
(DEPT-ESNN). This study also utilized the multi-objective optimization (MOO) approach
with ESNN for a better learning structure and classification accuracy. Harmony Search (HS)
and a memetic approach were used to improve the performance of MOO with ESNN.
Consequently, Multi-Objective Differential Evolution with Evolving Spiking Neural
Network (MODE-ESNN), Harmony Search Multi-Objective Differential Evolution with
Evolving Spiking Neural Network (HSMODE-ESNN) and Memetic Harmony Search
Multi-Objective Differential Evolution with Evolving Spiking Neural Network
(MEHSMODE-ESNN) were applied to improve the ESNN structure and accuracy rates.
The hybrid methods were tested using seven benchmark data sets from the machine
learning repository. The performance was evaluated using different criteria such as accuracy
(ACC), geometric mean (GM), sensitivity (SEN), specificity (SPE), positive predictive
value (PPV), negative predictive value (NPV) and average site performance (ASP) under
k-fold cross-validation. Evaluation analysis shows that the proposed methods demonstrated
better classification performance than the standard ESNN, especially in the case of
imbalanced data sets. The findings revealed that the MEHSMODE-ESNN method
statistically outperformed all the other methods across the different data sets and evaluation
criteria, and the proposed multi-objective methods proved the best for most of the data sets
used in this study. The findings demonstrate that the proposed algorithms attained the
optimal pre-synaptic neurons and parameter values and that the MOO approach is
applicable to the ESNN.
ABSTRAK
Rangkaian Neural Pakuan (SNN) dianggap sebagai generasi ketiga
rangkaian neural buatan. Walaupun terdapat banyak model SNN, Rangkaian Neural
Pakuan Berevolusi (ESNN) telah digunakan secara meluas di dalam kajian terkini.
Antara isu-isu penting yang perlu dikaji di dalam ESNN adalah menentukan bilangan
optimal pra-sinaptik neuron dan bilangan parameter bagi data set yang telah
diberikan. Selain itu, kajian sebelum ini tidak menekankan prestasi pendekatan
pelbagai objektif bersama ESNN. Tujuan utama kajian ini adalah untuk mencari
nilai optimum pra-sinaptik neuron dan parameter ESNN secara serentak dengan
mencadangkan beberapa integrasi antara ESNN dan Evolusi Pembezaan (DE).
Pelbagai algoritma telah dicadangkan iaitu DE bersama Rangkaian Neural Pakuan
Berevolusi (DE-ESNN) dan DE bagi penalaan parameter dengan Rangkaian Neural
Pakuan Berevolusi (DEPT-ESNN). Kajian ini juga menggunakan pendekatan multi-
objektif (MOO) bersama ESNN, bagi menerangkan struktur pembelajaran yang lebih
baik dan ketepatan pengelasan. Carian Harmoni (HS) dan pendekatan memetik
digunakan untuk meningkatkan prestasi MOO bersama ESNN. Oleh itu, Multi-
Objektif Evolusi Pembezaan bersama Rangkaian Neural Pakuan Berevolusi (MODE-
ESNN), Carian Harmoni Multi-Objektif Evolusi Pembezaan bersama Rangkaian
Neural Pakuan Berevolusi (HSMODE-ESNN) dan Multi-Objektif Evolusi
Pembezaan bersandarkan memetik harmoni bersama Rangkaian Neural Pakuan
Berevolusi (MEHSMODE-ESNN) digunakan bagi memperbaiki struktur ESNN dan
ketetapan kadar. Kaedah hibrid telah diuji dengan menggunakan tujuh penanda aras
data set dari repositori pembelajaran mesin. Prestasi kaedah yang dicadangkan telah
dinilai menggunakan kriteria yang berbeza seperti ketepatan (ACC), min geometri
(GM), kepekaan (SEN), kekhususan (SPE), nilai ramalan positif (PPV), nilai ramalan
negatif (NPV) dan tapak purata prestasi (ASP) menggunakan pengesahan silang k
kali ganda. Analisis penilaian menunjukkan bahawa kaedah yang dicadangkan
menunjukkan prestasi klasifikasi yang lebih baik berbanding dengan standard ESNN
terutama dalam kes data set yang tidak seimbang. Penilaian ini mendedahkan bahawa
kaedah MEHSMODE-ESNN secara statistik mengatasi dengan telak semua kaedah lain
yang menggunakan data set dan kriteria penilaian yang berbeza. Hasil kajian
mendapati bahawa MEHSMODE-ESNN yang dicadangkan telah terbukti sebagai
kaedah terbaik bagi kebanyakan data set yang digunakan dalam kajian ini. Hasil
kajian telah membuktikan bahawa algoritma yang dicadangkan mencapai optimum
neuron pra-sinaptik dan parameter nilai dan pendekatan MOO terpakai untuk ESNN.
TABLE OF CONTENTS
CHAPTER TITLE PAGE

DECLARATION v
DEDICATION vi
ACKNOWLEDGEMENT vii
ABSTRACT viii
ABSTRAK ix
TABLE OF CONTENTS x
LIST OF TABLES xiv
LIST OF FIGURES xviii
LIST OF ABBREVIATIONS xxiv
LIST OF APPENDICES xxvii

1 INTRODUCTION 1
  1.1 Overview 1
  1.2 Problem Background 3
  1.3 Problem Statement 6
  1.4 Research Aim 8
  1.5 Research Objectives 9
  1.6 Research Scope 9
  1.7 Research Methodology Overview 10
  1.8 Summary of Research Contributions 12
  1.9 Thesis Outline 13

2 LITERATURE REVIEW 15
  2.1 Overview of Spiking Neural Network 17
    2.1.1 Introduction to SNN 17
    2.1.2 Neuron models 18
      2.1.2.1 Hodgkin-Huxley model 19
      2.1.2.2 Izhikevich SNN model 20
      2.1.2.3 Leaky-integrate-and-fire model 22
      2.1.2.4 Spike response model (SRM) 22
      2.1.2.5 Thorpe model 23
    2.1.3 Neuronal coding 23
      2.1.3.1 Temporal coding 23
      2.1.3.2 Rate coding 24
      2.1.3.3 Population coding 24
    2.1.4 Learning methods 25
      2.1.4.1 Unsupervised learning 25
      2.1.4.2 Supervised learning 26
    2.1.5 SNN types/architectures 26
      2.1.5.1 ESNN 26
      2.1.5.2 SpikeProp 29
      2.1.5.3 Spike-timing dependent plasticity (STDP) 30
      2.1.5.4 Spatio-temporal pattern recognition 30
  2.2 Multi-Objective Evolutionary Algorithms (EAs) Optimization 31
    2.2.1 Multi-objective optimization (MOO) 31
      2.2.1.1 Definition of a MOO problem 32
      2.2.1.2 Search and decision making 33
      2.2.1.3 Methods of MOO algorithms 34
    2.2.2 EA algorithms 35
      2.2.2.1 Differential Evolution (DE) 36
      2.2.2.2 Harmony Search (HS) 39
      2.2.2.3 Memetic Technique 44
    2.2.3 Discussion of classification enhancement 45
  2.3 Related Work and Discussion 45
  2.4 Summary 53

3 RESEARCH METHODOLOGY 55
  3.1 Introduction 55
  3.2 General Research Framework 55
    3.2.1 Phase 1: Research Design 58
      3.2.1.1 Improvement of Overall Research Plan 58
      3.2.1.2 Improvement of operational framework and algorithms for proposed methods 58
    3.2.2 Phase 2: Experimental Design 63
      3.2.2.1 Data set Preparation 63
      3.2.2.2 Learning Phase 69
    3.2.3 Phase 3: Results Evaluation and Comparison 69
      3.2.3.1 Performance Measures 69
      3.2.3.2 Statistical test 72
  3.3 Summary 72

4 PROPOSED HYBRID METHODS FOR EVOLVING SPIKING NEURAL NETWORK 73
  4.1 Introduction 73
  4.2 Enhancement of ESNN Structure by Differential Evolution (DE-ESNN) for Pre-Synaptic Neurons 74
  4.3 Parameter Optimization of DE-ESNN (DEPT-ESNN) 77
  4.4 Experimental Study 80
    4.4.1 Experimental design 80
    4.4.2 Analysis of the proposed methods using k-fold cross-validation 81
      4.4.2.1 Results and analysis of ESNN 82
      4.4.2.2 Results and analysis of DE-ESNN 85
      4.4.2.3 Results and analysis of DEPT-ESNN 87
  4.5 Analysis of the Hybrid Proposed Methods Based on the Convergence Time 91
  4.6 Statistical Analysis of the Proposed Hybridisation Models with ESNN 98
  4.7 Summary 99

5 MULTI-OBJECTIVE HYBRID METHODS FOR EVOLVING SPIKING NEURAL NETWORK 101
  5.1 Introduction 101
  5.2 Multi-Objective DE based ESNN (MODE-ESNN) 102
  5.3 Enhancement of Multi-Objective Based HS (HSMODE-ESNN) 104
  5.4 Enhancement of Multi-Objective Based Memetic Method (MEHSMODE-ESNN) 108
  5.5 Experimental Study 110
    5.5.1 Experimental design 111
    5.5.2 Analysis of the proposed methods using k-fold cross-validation 111
      5.5.2.1 Results and Analysis of MODE-ESNN 112
      5.5.2.2 Results and analysis of HSMODE-ESNN 115
      5.5.2.3 Results and analysis of MEHSMODE-ESNN 118
  5.6 Analysis of the MOO Methods with ESNN Based on the Convergence Time 123
  5.7 Statistical Analysis of the Proposed Hybridisation Models with ESNN 129
  5.8 Summary 130

6 COMPARATIVE STUDY OF THE PROPOSED METHODS 132
  6.1 Introduction 132
  6.2 Analysis of the Proposed Methods on the Single and Multi-Objective Hybrid 133
  6.3 Analysis of the Proposed Methods with Several Data Mining Algorithms 153
  6.4 Analysis of the Proposed Methods with Various Classifiers 158
  6.5 Analysis of the Proposed Methods Based on Computational Time and Convergence Time 168
    6.5.1 Analysis of the proposed methods based on computational time 169
    6.5.2 Comparison with related works based on computational time 170
  6.6 Statistical Analysis of the Proposed Hybridisation Models with ESNN 172
  6.7 Summary 173

7 CONCLUSION AND FUTURE WORK 174
  7.1 Introduction 174
  7.2 Thesis Summary 174
  7.3 Research Contributions 176
  7.4 Future Work 177

REFERENCES 179
Appendix A 191-198
LIST OF TABLES
TABLE NO. TITLE PAGE
2.1 Hodgkin-Huxley parameters (Gerstner and Kistler, 2002) 20
2.2 Description of parameters of Izhikevich model 21
2.3 Description of parameters of the HSA algorithm 41
2.4 Summary of popular approaches in SNN research 46
2.5 Summary of review on SNNs 50
2.6 Framework of the study 57
3.1 Overall research plan 59
3.2 Summary of the data sets used in this study 66
4.1 Parameter settings for the proposed algorithms 81
4.2 Results of training error, testing error and value of parameters
for ESNN 82
4.2 (continued) 83
4.3 Results of SEN, SPE, GM and ACC for ESNN 83
4.4 Results of NPV, PPV and ASP for ESNN 84
4.5 Results of training error, testing error and value of parameters
for DE-ESNN 85
4.6 Results of SEN, SPE, GM and ACC for DE-ESNN 86
4.7 Results of NPV, PPV and ASP for DE-ESNN 87
4.8 Results of training error, testing error and value of parameters
for DEPT-ESNN 88
4.9 Results of SEN, SPE, GM and ACC for DEPT-ESNN 89
4.10 Results of NPV, PPV and ASP for DEPT-ESNN 90
4.11 Friedman descriptive statistics of the hybrid proposed
algorithms 99
4.12 Friedman test statistics of the hybrid proposed algorithms 99
5.1 Parameter settings for MOO proposed algorithms 111
5.2 Results of training error, testing error and number of
parameters for MODE-ESNN 112
5.2 (continued) 113
5.3 Results of SEN, SPE, GM and ACC for MODE-ESNN 114
5.4 Results of NPV, PPV and ASP for MODE-ESNN 115
5.5 Results of training error, testing error and number of
parameters for HSMODE-ESNN 116
5.6 Results of SEN, SPE, GM and ACC for HSMODE-ESNN 117
5.7 Results of NPV, PPV and ASP for HSMODE-ESNN 118
5.8 Results of training error, testing error and number of
parameters for MEHSMODE-ESNN 119
5.9 Results of SEN,SPE,GM and ACC for MEHSMODE-ESNN 120
5.10 Results of NPV, PPV and ASP for MEHSMODE-ESNN 121
5.11 Friedman descriptive statistics of MOO proposed algorithms 130
5.12 Friedman test statistics of the MOO proposed algorithms 130
6.1 Comparison of results of all proposed algorithms in terms of
the modulation factor parameter (Mod) for ten-fold cross-
validation 134
6.2 Comparison of results of all proposed algorithms in terms of
the similarity value parameter (Sim) for ten-fold cross-
validation 135
6.3 Comparison of results of all proposed algorithms in terms of
the proportion factor parameter (Threshold) for ten-fold
cross-validation 137
6.4 Comparison of results of all proposed algorithms in terms of
the ESNN structure (pre-synaptic neurons) for ten-fold cross-
validation 138
6.5 Sensitivity analysis of all proposed algorithms for ten-fold
cross-validation 139
6.6 SPE analysis for all proposed methods for ten-fold cross-
validation 141
6.7 GM analysis for all proposed methods for ten-fold cross-
validation 143
6.8 Accuracy analysis for all proposed methods for ten-fold
cross-validation 144
6.9 NPV analysis for all proposed methods for ten-fold cross-
validation 145
6.10 PPV analysis for all proposed methods for ten-fold cross-
validation 146
6.11 ASP analysis for all proposed methods for ten-fold cross-
validation 147
6.12 Summary analysis of all proposed methods 148
6.12 (Continued) 149
6.13 In-depth description of the data sets used in this study 152
6.14 Summary of performance evaluation of the proposed
methods for each data set 152
6.15 Description of KEEL data mining algorithm (Alcalá et al.,
2011) 154
6.16 Results of accuracy of training performance 154
6.17 Results of accuracy of testing performance 155
6.18 Results of different measures of classification performance of
all proposed methods with some standard classifiers 159
6.18 (Continued) 160
6.18 (Continued) 161
6.19 Computational time (in seconds) for training the proposed
algorithms 169
6.20 Computational time (in seconds) for testing the proposed
algorithms 169
6.21 Summary of computational time of the proposed and
existing methods 170
6.21 (Continued) 171
6.22 Friedman descriptive statistics of the hybrid proposed
algorithms 173
6.23 Friedman test statistics of the hybrid proposed algorithms 173
LIST OF FIGURES
FIGURE NO. TITLE PAGE
1.1 Scenario guides to the research problem 5
1.2 Flow of research methodology phases 11
1.3 Summary of research contributions 12
2.1 A general overview of the literature review of this study 16
2.2 A schematic representation of SNN. Redrawn from (Gerstner
and Kistler, 2002) 18
2.3 Diagram of Hodgkin-Huxley model (Gerstner and Kistler,
2002) 19
2.4 Parameters of Izhikevich model (Izhikevich, 2010). 21
2.5 Diagram of the LIF model (Bishop and Maass, 1999) 22
2.6 GRF encoding. (Meftah et al., 2010) 25
2.7 A simplified architecture of ESNN (Hamed et al., 2009a) 28
2.8 Flowchart for training of DE (Storn and Price, 1997) 38
2.9 Pseudo-code of standard HSA 40
2.10 Flowchart for training of HS 43
2.11 Pseudo-code of MA (Elbeltagi et al., 2005) 44
3.1 Framework of the study 57
3.2 Schematic of the proposed methods 62
3.3 N-fold cross-validation 68
4.1 Visual summary of the hybrid proposed methods 74
4.2 Schematic representation of hybrid DE-ESNN learning 74
4.3 Population encoding method (Schliebs et al., 2009b) 75
4.4 The DE candidate structure in DE-ESNN framework 76
4.5 Process of hybrid DEPT-ESNN learning 78
4.6 DE candidate structure in DEPT-ESNN framework 79
4.7 Comparison of the DEPT-ESNN for ten-fold cross-validation
in terms of parameter analysis 88
4.8 Comparison of the proposed methods in terms of
convergence time for the appendicitis data set: (a) DE-
ESNN, (b) DEPT-ESNN. 91
4.9 Comparison of the proposed methods in terms of
convergence time for the Haberman data set: (a) DE-ESNN,
(b) DEPT-ESNN. 92
4.10 Comparison of the proposed methods in terms of
convergence time for the heart data set: (a) DE-ESNN, (b)
DEPT-ESNN. 94
4.11 Comparison of the proposed methods in terms of
convergence time for the hepatitis data set: (a) DE-ESNN, (b)
DEPT-ESNN. 94
4.12 Comparison of the proposed methods in terms of
convergence time for the ionosphere data set: (a) DE-ESNN,
(b) DEPT-ESNN. 96
4.13 Comparison of the proposed methods in terms of
convergence time for the iris data set: (a) DE-ESNN, (b)
DEPT-ESNN. 97
4.14 Comparison of the proposed methods in terms of
convergence time for the liver data set: (a) DE-ESNN, (b)
DEPT-ESNN. 98
5.1 Summary of the proposed MOO methods 102
5.2 Schematic representation of the proposed MODE-ESNN 103
5.3 Flow chart of the hybrid HSMODE-ESNN 107
5.4 Schematic representation of the proposed MEHSMODE-
ESNN 110
5.5 Comparison of the MODE-ESNN for ten-fold cross-
validation in terms of parameter analysis 113
5.6 Comparison of the HSMODE-ESNN for ten-fold cross-
validation in terms of parameter analysis 116
5.7 Comparison of the MEHSMODE-ESNN for ten-fold cross-
validation in terms of parameter analysis 119
5.8 Evolution of accuracy and pre-synaptic neurons on (a)
Appendicitis, (b) Haberman, (c) Iris from the proposed
method MODE-ESNN. 122
5.9 Comparison of MOO methods with ESNN in terms of
convergence time for the appendicitis data set 124
5.10 Comparison of MOO methods with ESNN in terms of
convergence time for the Haberman data set 126
5.11 Comparison of MOO methods with ESNN in terms of
convergence time for the heart data set 126
5.12 Comparison of MOO methods with ESNN in terms of
convergence time for the hepatitis data set 126
5.13 Comparison of MOO methods with ESNN in terms of
convergence time for the ionosphere data set 126
5.14 Comparison of MOO methods with ESNN in terms of
convergence time for the iris data set 127
5.15 Comparison of MOO methods with ESNN in terms of
convergence time for the liver data set 129
6.1 A comparison of the proposed methods for 10-fold cross-
validation in terms of Mod parameter analysis 134
6.2 Comparison of the proposed methods for 10-fold cross-
validation in terms of Sim parameter analysis 136
6.3 Comparison of the proposed methods for ten-fold cross-
validation in terms of Threshold parameter analysis 137
6.4 Comparison of the proposed methods for ten-fold cross-
validation in terms of pre-synaptic neurons analysis 138
6.5 Comparison of the proposed methods for ten-fold cross-
validation in terms of sensitivity analysis 140
6.6 Comparison of the proposed methods for ten-fold cross-
validation in terms of SPE analysis 142
6.7 Comparison of the proposed methods for ten-fold cross-
validation in terms of GM analysis 143
6.8 Comparison of the proposed methods for ten-fold cross-
validation in terms of accuracy analysis 144
6.9 Comparison of the proposed methods for ten-fold cross-
validation in terms of NPV analysis 145
6.10 Comparison of the proposed methods for ten-fold cross-
validation in terms of PPV analysis 146
6.11 Comparison of the proposed methods for ten-fold cross-
validation in terms of ASP analysis 147
6.12 Average training accuracy of proposed methods with data
mining methods 155
6.13 Average testing accuracy of proposed methods with data
mining methods 156
6.14 Average training accuracy of proposed methods with various
classifiers and data mining methods 157
6.15 Average testing accuracy of proposed methods with various
classifiers and data mining methods 157
6.16 Average classification measure performance of the proposed
methods with some standard classifiers for the appendicitis
data set 162
6.17 Average classification measure performance of the proposed
methods with some standard classifiers for the Haberman
data set 163
6.18 Average classification measure performance of the proposed
methods with some standard classifiers for the heart data set 164
6.19 Average classification measure performance of the proposed
methods with some standard classifiers for the hepatitis data
set 165
6.20 Average classification measure performance of the proposed
methods with some standard classifiers for the ionosphere
data set 166
6.21 Average classification measure performance of the proposed
methods with some standard classifiers for the iris data set 167
6.22 Average classification measure performance of the proposed
methods with some standard classifiers for the liver data set 168
6.23 Performance comparisons of the proposed and existing
methods 171
7.1 Schematic representation of the research objectives achieved 175
LIST OF ABBREVIATIONS
ACC - Accuracy
ANNs - Artificial Neural Networks
ASP - Average Site Performance
BP - Back-Propagation
CM - Current method (ESNN)
CPSO - Cooperative Particle Swarm Optimization
DE - Differential Evolution
DE-ESNN - Differential Evolution with Evolving Spiking
Neural Network
DEPT-ESNN - Differential Evolution for Parameter Tuning with
Evolving Spiking Neural Network
DM - Decision Making
EAs - Evolutionary Algorithms
ErrR - Error Rate
ESNN - Evolving Spiking Neural Network
FN - False Negative
FNR - False Negative Rate
FP - False Positive
FPR - False Positive Rate
GA - Genetic Algorithm
GM - Geometric Mean
GRF - Gaussian Receptive Fields
HM - Harmony Memory
HMCR - Harmony Memory Considering Rate
HMS - Harmony Memory Size
HS - Harmony Search algorithm
HSA - Harmony Search Algorithm
HSMODE-ESNN - Harmony Search Multi-Objective Differential
Evolution with Evolving Spiking Neural Network
LTD - Long Term Depression
LTP - Long Term Potentiation
MEHSMODE-ESNN - Memetic Harmony Search Multi-Objective
Differential Evolution with Evolving Spiking
Neural Network
ML - Machine Learning
MLP - Multilayer Perceptron Network
Mod - Modulation Factor
MODE-ESNN - Multi-Objective Differential Evolution with
Evolving Spiking Neural Network
MOEAs - Multi-objective evolutionary algorithms
MOGA - Multi objective genetic algorithm
MOO - Multi-Objective Optimization
MuSpiNN - Multi-Spiking Neural Network
NPV - Negative Predictive Value
NRU - No right to use
PAR - Pitch Adjusting Rate
PM - Proposed Method
PNNs - Probabilistic Neural Networks
PPV - Positive Predictive Value
PSO - Particle Swarm Optimization
QiPSO - Quantum-inspired Particle Swarm Optimization
raccept - Accepting Rate
SA - Simulated Annealing
SEN - Sensitivity
SI - Swarm Intelligence
Sim - Neuron Similarity Value
SNN - Spiking Neural Network
SO - Single Objective
SOM-AC - Self-Organizing Map with modified adaptive
coordinates
SPE - Specificity
SRM - Spike Response Model
STDP - Spike Time Dependent Plasticity
SVM - Support Vector Machine
SWRNN - Spiking Wavelet Radial Basis Neural Network
Threshold - Proportion Factor
TN - True Negative
TNR - True Negative Rate
TP - True Positive
TPR - True Positive Rate
TS - Tabu Search
LIST OF APPENDICES
APPENDIX TITLE
PAGE
A Data distribution 191
CHAPTER 1
INTRODUCTION
1.1 Overview
Classification of patterns is vital for several data mining processes.
Classification is one of the most common processing tasks for a decision
support system (Ahmed et al., 2013a). Many areas require classification,
such as medical diagnosis, medicine, science, industry, speech
recognition and handwritten character recognition. Among feasible classifiers,
artificial neural network (ANN) classifiers have proved to be among the most robust
classification systems; their ability to deal with noisy input patterns and to handle
both noisy and continuous data makes them an important tool for
classification (Mitchell and Michell, 1997).
ANNs are amongst the most well-known brain computational models and can
solve problems that are difficult to address with standard algorithmic techniques.
ANNs can be utilized in pattern recognition, generalization, perception and
non-linear control. Action potentials, or spikes, are responsible for all
communication between biological neurons; in ANN models, however, individual
spikes are averaged out over time and all interactions are described by the mean
firing rate of the neurons. Spiking models, in contrast, are computationally more
powerful than ANNs that use mean firing rates (Maass and Bishop, 2001).
Due to its effectiveness in ANNs, the sigmoidal neuron model is considered
one of the best models of the biological neuron. Several vital applications of
ANNs have been built on rate modeling, in which a single biological neuron
releases action potentials (spikes) as a monotonically increasing function of input
match. From another point of view, the spiking nature of biological neurons has
prompted explorations of the computational power of single spikes.
Bohte et al. (2002a) showed that more powerful computation can be achieved
through individual spike times than through sigmoidal activation functions.
Spiking neural networks (SNNs), the third generation of ANNs, play an
essential role in biological information processing (Gerstner and Kistler, 2002).
Compared with ANNs, which use rate coding to represent neuronal activity,
spiking models provide an in-depth description of biological neuronal behavior.
Computations with real neurons can exploit more information than the average
firing rate alone; in particular, differences in firing times may be used instead of
rate coding (Belatreche et al., 2006).
Although there are many models of SNN, the evolving spiking neural
network (ESNN) is widely used in recent research. The ESNN has several
advantages (Schliebs et al., 2009c): it is a simple, efficient neural model and is
trained by a fast one-pass learning algorithm. Owing to its evolving nature, the
model can be updated whenever new data become available, with no need to
retrain on earlier samples. However, the ESNN model is sensitive to the choice
of parameters; correct parameter selection allows the network to evolve
towards the best structure, thus guaranteeing the best output. For this
reason, an optimizer is needed to find the best combination of parameters.
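The one-pass learning idea and the role of the parameters this thesis optimizes (the modulation factor Mod, the threshold proportion factor C and the neuron similarity value Sim) can be sketched as follows. This is a minimal illustration of the rank-order (Thorpe-model) learning rule commonly described for ESNN, not the exact implementation used in this thesis; all function and variable names are illustrative.

```python
# Minimal sketch of ESNN one-pass, rank-order learning (Thorpe model).
# mod, c and sim are the ESNN parameters discussed in the text.

def train_esnn(samples, mod=0.9, c=0.7, sim=0.05):
    """Each sample is a list of pre-synaptic neuron indices ordered by
    spike time (earliest first). Returns the evolved output neurons."""
    neurons = []  # each neuron: {"w": weights, "theta": threshold, "count": merges}
    for spike_order in samples:
        # Weight each input by its firing rank: earlier spikes weigh more.
        w = {}
        for order, j in enumerate(spike_order):
            w[j] = mod ** order
        # Maximum PSP reached if the same pattern is presented again.
        psp_max = sum(w[j] * mod ** order
                      for order, j in enumerate(spike_order))
        theta = c * psp_max
        # Merge with an existing neuron if the weight vectors are similar,
        # otherwise add a new output neuron (the "evolving" step).
        merged = False
        for n in neurons:
            keys = set(w) | set(n["w"])
            dist = sum(abs(w.get(k, 0) - n["w"].get(k, 0)) for k in keys)
            if dist / len(keys) < sim:
                n["count"] += 1
                for k in keys:  # running average of weights and threshold
                    n["w"][k] = (n["w"].get(k, 0) * (n["count"] - 1)
                                 + w.get(k, 0)) / n["count"]
                n["theta"] = (n["theta"] * (n["count"] - 1) + theta) / n["count"]
                merged = True
                break
        if not merged:
            neurons.append({"w": w, "theta": theta, "count": 1})
    return neurons
```

Presenting the same spike pattern twice merges into one output neuron, while dissimilar patterns create separate neurons, which is why Sim directly shapes the evolved network size.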
Optimization has been used to enhance the ESNN algorithm. Choosing a
good optimization algorithm is necessary for real-world applications, especially
for finding optimal ESNN solutions. Evolutionary algorithms (EAs), mainly
differential evolution (DE), are common competitors in optimization problems
because of the following characteristics: simple implementation, good
performance, very few control parameters and low space complexity (Abbass,
2001; Das and Suganthan, 2011). Therefore, DE is used here to enhance ESNN
algorithms. However, many real-world optimization problems involve several
conflicting objectives. Rather than single-objective optimization, multi-objective
optimization (MOO) can be used to produce a set of optimal solutions to such
problems. Every MOO solution represents a different trade-off between the
objectives. The key objective of MOO here is to find ESNN solutions that are
optimal in both structure and classification accuracy. In addition, optimizing
both accuracy and complexity leads to better generalization.
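The DE operators referred to above can be sketched as follows, using the classic DE/rand/1/bin scheme (mutation, binomial crossover, greedy selection). The cost function, bounds and control-parameter defaults are placeholders for illustration, not the settings used in this study.

```python
import random

# Sketch of DE/rand/1/bin: few control parameters (population size,
# scale factor F, crossover rate CR), simple implementation.
# f is any cost function, e.g. an ESNN validation error.

def de_minimize(f, bounds, np_=20, f_w=0.5, cr=0.9, gens=100):
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            # Pick three distinct vectors other than the target.
            a, b, c = random.sample([j for j in range(np_) if j != i], 3)
            j_rand = random.randrange(dim)
            trial = []
            for j in range(dim):
                if random.random() < cr or j == j_rand:
                    # Mutation: base vector plus scaled difference.
                    v = pop[a][j] + f_w * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)
                else:
                    v = pop[i][j]  # inherit from the target vector
                trial.append(v)
            ct = f(trial)
            if ct <= cost[i]:  # greedy one-to-one selection
                pop[i], cost[i] = trial, ct
    best = min(range(np_), key=cost.__getitem__)
    return pop[best], cost[best]
```

On a simple bowl-shaped test function this converges quickly, which illustrates why DE is attractive for tuning a handful of continuous ESNN parameters.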
The MOO approach is preferred over traditional learning algorithms for a
number of reasons. First, MOO can yield good performance from these
learning algorithms (Abbass, 2003b). Second, various objectives can be taken
into consideration when generating multiple learning models, such as
accuracy and complexity (Igel, 2005; Jin, 2006; Jin et al., 2004), interpretability
and accuracy (Jin et al., 2005), and multiple error measures (Fieldsend and
Singh, 2005). Third, models produced using MOO are well suited to building
learning ensembles (Abbass, 2003a; Chandra and Yao, 2004; Jin et al., 2004).
The important goal of a MOO algorithm is to find a set of solutions from which
the best one can be chosen. According to Tan et al. (2001), the search ability of
EAs makes them a natural choice for MOO problems: because they are
population-based, EAs can explore different parts of the optimal set
simultaneously.
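The trade-off idea rests on Pareto dominance, which can be stated compactly. The two objectives shown (error, network size) mirror the accuracy/structure trade-off targeted in this thesis, but the code is an illustrative sketch rather than the thesis's algorithm.

```python
# Pareto dominance for minimization: u dominates v if u is at least as
# good on every objective and strictly better on at least one.

def dominates(u, v):
    return (all(a <= b for a, b in zip(u, v))
            and any(a < b for a, b in zip(u, v)))

def pareto_front(points):
    """Keep only the non-dominated solutions (the MOO trade-off set)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

For example, among candidate ESNNs scored as (error, size), a network that is both less accurate and larger than another is discarded; every survivor represents a distinct trade-off between the objectives.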
Moreover, one of the EAs, the harmony search (HS) algorithm, was
utilized to overcome the convergence-rate problems of DE in finding the global
minimum (Gao et al., 2014; Purnomo and Wee, 2014; Wang and
Guo, 2013). Subsequently, backpropagation (BP) was used to speed up
convergence, in what is known as a memetic approach.
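A single HS improvisation step, using the HMCR (harmony memory considering rate) and PAR (pitch adjusting rate) parameters named in the abbreviation list, might look like the following sketch; the bandwidth `bw` and all default values are illustrative assumptions, not the thesis's settings.

```python
import random

# One harmony-search improvisation step: each decision variable is
# drawn from harmony memory (prob. HMCR), possibly pitch-adjusted
# (prob. PAR), or sampled at random otherwise.

def improvise(memory, bounds, hmcr=0.9, par=0.3, bw=0.05):
    new = []
    for j, (lo, hi) in enumerate(bounds):
        if random.random() < hmcr:          # memory consideration
            v = random.choice(memory)[j]
            if random.random() < par:       # pitch adjustment
                v += random.uniform(-bw, bw) * (hi - lo)
        else:                               # random consideration
            v = random.uniform(lo, hi)
        new.append(min(max(v, lo), hi))
    return new
```

In a full HS loop the new harmony replaces the worst member of the memory if it scores better; hybridized with DE, this injection of memory-guided randomness is what helps escape slow convergence toward the global minimum.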
1.2 Problem Background
In SNN, the behavior achieved in the topology is like that of Kohonen's
self-organizing map and can be used effectively in character recognition
(Buonomano and Merzenich, 1999), classification and dynamic path planning
(Yang and Luo, 2004). Consequently, SNNs have been utilized as a vital method
for classification, and many types of SNN are used for classification problems.
Bohte et al. (2002a) proposed a supervised learning algorithm, spike
backpropagation (SpikeProp), using spike-time encoding based on error BP,
which is used for solving classification problems. Schrauwen et al. (2004)
proposed several learning rules that extend SpikeProp for better learning of
spike times. Ghosh-Dastidar et al. (2009) introduced a multi-SpikeProp for
supervised learning of spike patterns with multiple-synapse transmission (Bohte
et al., 2002b). Ahmed et al. (2013a) proposed several methods for classification
problems based on a SpikeProp improved by particle swarm optimization (PSO)
and an angle-driven dependency learning rate.
Ahmed et al. (2014) mentioned that the most important challenge is to find out
efficient learning rules that might take advantage of the specific features of SNNs
while keeping the advantageous properties (general-purpose, easy-to-use, available
simulators etc.) of traditional connectionist models.
There have been many attempts to develop improved SNN models. Wysoski et
al. (2006c) proposed a new model type, the ESNN. Recently, a few studies on
hybridization of the ESNN algorithm have been implemented; a novel supervised
learning algorithm combining PSO with the ESNN model was introduced by
Hamed et al. (2011a).
The most significant problem facing these recent studies is determining the
optimal number of pre-synaptic neurons for a given data set (Hamed et al., 2011a).
The number of pre-synaptic neurons is required before the ESNN structure can be
constructed, a problem similar to identifying the number of hidden nodes in a
multilayer perceptron (MLP). Based on the work by Hamed (2012), a smaller
number of pre-synaptic neurons causes fewer input spikes to be generated and may
subsequently reduce learning accuracy, while a larger number increases
computational time. Evolving processes are difficult to model as there might be no
prior knowledge for some parameters (Kasabov, 2003). Figure 1.1 illustrates the
scenario that leads to the problem addressed by this research, showing the
challenges of the ESNN model and the limitations of existing models.
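To make the pre-synaptic neuron trade-off concrete: in ESNN-style models the pre-synaptic neurons commonly arise from population encoding, where each real-valued input feature is projected onto overlapping Gaussian receptive fields and each field contributes one spike. The sketch below is an illustration under that common encoding assumption (the function name, parameter conventions and the linear firing-time mapping are illustrative choices, not the thesis implementation); it shows how the chosen number of receptive fields directly sets the number of input spikes.

```python
import math

def encode_feature(x, n_fields, x_min=0.0, x_max=1.0, beta=1.5, t_max=1.0):
    """Population encoding: map one real-valued feature onto the firing
    times of n_fields pre-synaptic neurons with Gaussian receptive fields.
    Illustrative sketch; parameter conventions follow common ESNN usage,
    not necessarily the thesis implementation."""
    # Evenly spaced centres; width controlled by the overlap factor beta
    width = (x_max - x_min) / (beta * max(n_fields - 1, 1))
    times = []
    for i in range(n_fields):
        centre = x_min + i * (x_max - x_min) / max(n_fields - 1, 1)
        excitation = math.exp(-((x - centre) ** 2) / (2 * width ** 2))
        # Strong excitation -> early spike; weak excitation -> late spike
        times.append(t_max * (1.0 - excitation))
    return times

# More receptive fields => more pre-synaptic neurons => more input spikes
print(len(encode_feature(0.3, 5)))   # 5 spike times
print(len(encode_feature(0.3, 20)))  # 20 spike times
```

A small field count thus literally produces fewer input spikes, while a large count multiplies the work done per input sample, which is the accuracy/time trade-off described above.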
[Figure 1.1 contents]
Evolving spiking neural network (ESNN):
1. ESNN cannot identify the number of pre-synaptic neurons, which is required
before the ESNN structure is constructed.
2. ESNN cannot identify the optimum values of its parameters.
3. It is difficult to achieve an optimization trade-off between accuracy and the
ESNN structure.
What do we need? A solution that guarantees the optimum number of pre-synaptic
neurons, optimum values of the ESNN parameters and high classification accuracy.
Limitations of existing ESNN models:
1. Predetermined number of pre-synaptic neurons.
2. Predetermined values of ESNN parameters.
3. High classification accuracy is not ensured.
Challenges of the ESNN model:
1. A fixed number of pre-synaptic neurons affects ESNN performance.
2. ESNN performance is influenced by fixed values of its parameters.
3. Single-objective optimization has an impact on classification accuracy.
Required solution should guarantee:
1. High accuracy performance.
2. Improved ESNN structure (optimum number of pre-synaptic neurons).
Figure 1.1 Scenario leading to the research problem
The scenario shown in Figure 1.1 draws attention to the most important
requirements that should be taken into consideration during the design of the
solution.
According to Watts (2009), the automatic selection of evolving connectionist
systems (ECOS) training parameters would be a significant advantage; it is
therefore worth choosing an optimization technique to carry out this parameter
adaptation. Among the various optimization techniques, EAs have been used to solve
learning problems when applied to the ESNN model. The significance of EAs lies in
their ability to adapt to a varying environment (Fernandez Caballero et al., 2010),
which is why they are common optimizers in many classification models such as ANNs
(da Silva et al., 2010; Mineu et al., 2010), wavelet neural networks (Dheeba and
Selvi, 2012) and support vector machines (SVMs) (Zhou et al., 2007). Given its
particular advantages, DE is an attractive method for optimizing the pre-synaptic
neurons and for finding trade-off solutions to the problems of MOO.
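As a point of reference for the discussion above, the core DE step (Storn and Price, 1997) can be sketched as follows. This is a generic DE/rand/1/bin implementation minimizing a stand-in objective; it is not the DE-ESNN code, and the function names are illustrative.

```python
import random

random.seed(1)  # reproducible illustration

def de_step(population, fitness, f=0.5, cr=0.9, rng=random):
    """One generation of classic DE/rand/1/bin; `fitness` is minimised."""
    n, dim = len(population), len(population[0])
    new_pop = []
    for i in range(n):
        # Pick three distinct individuals different from the target
        a, b, c = rng.sample([x for j, x in enumerate(population) if j != i], 3)
        # Mutation: donor vector v = a + F * (b - c)
        donor = [a[d] + f * (b[d] - c[d]) for d in range(dim)]
        # Binomial crossover with one guaranteed parameter from the donor
        j_rand = rng.randrange(dim)
        trial = [donor[d] if (rng.random() < cr or d == j_rand) else population[i][d]
                 for d in range(dim)]
        # Greedy selection keeps whichever vector is fitter
        new_pop.append(trial if fitness(trial) <= fitness(population[i])
                       else population[i])
    return new_pop

# Minimise the sphere function as a stand-in for an ESNN error measure
sphere = lambda v: sum(x * x for x in v)
pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
for _ in range(100):
    pop = de_step(pop, sphere)
print(min(sphere(v) for v in pop))  # best objective value found
```

The same loop applies unchanged whether the vector encodes continuous ESNN parameters or, with a suitable encoding, the pre-synaptic structure.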
A few studies have evaluated multi-objective evolutionary algorithms
(MOEAs) with SNNs, such as a multi-objective genetic algorithm (MOGA)
with SpikeProp, and showed that this combination performs well (Jin et al., 2007a). As
Yee and Teo (2013) suggested, multi-objective techniques could generate better
solutions in SNNs. Therefore, this research develops new hybrid methods for other
types of SNNs, in particular ESNN with MOEAs: multi-objective differential
evolution with evolving spiking neural network (MODE-ESNN), harmony search
multi-objective differential evolution with evolving spiking neural network
(HSMODE-ESNN) and memetic harmony search multi-objective differential
evolution with evolving spiking neural network (MEHSMODE-ESNN). The
proposed methods aim to achieve a trade-off between the structure of the ESNN and
its accuracy on testing data.
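The trade-off these methods pursue is usually formalized through Pareto dominance: one ESNN configuration dominates another only if it is no worse on every objective and strictly better on at least one. A minimal sketch, assuming exactly two minimized objectives, classification error and number of pre-synaptic neurons (the names and values below are illustrative, not thesis results):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimised):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Non-dominated subset: the trade-off set an MOEA approximates."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# (classification error, network size) pairs
candidates = [(0.1, 40), (0.05, 60), (0.1, 60), (0.2, 30)]
print(sorted(pareto_front(candidates)))  # → [(0.05, 60), (0.1, 40), (0.2, 30)]
```

Only (0.1, 60) is eliminated: it is beaten on size by (0.1, 40) at equal error. The three survivors are mutually non-dominated, which is exactly the accuracy-versus-structure trade-off described above.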
1.3 Problem Statement
Issue 1: Recently, ESNNs have attracted extensive research attention
because of the multiple advantages they offer compared to other models (Batllori et
al., 2011; Kasabov, 2012; Kasabov et al., 2014; Mohemmed et al., 2013; Murli et
al., 2014; Nuntalid et al., 2011a; Schliebs and Kasabov, 2013). Among the many real
issues that need to be explored in ESNN, determining the optimal number of pre-
synaptic neurons for a given data set is the most important one (Hamed, 2012;
Kasabov et al., 2014). The number of pre-synaptic neurons is required before the
ESNN structure can be constructed, a problem similar to identifying the number of
hidden nodes in an MLP. Fewer pre-synaptic neurons cause fewer input spikes to be
generated, which may subsequently reduce learning accuracy, while more
pre-synaptic neurons increase computational time. Additionally, each of the methods
has a number of parameters that are currently set by hand based on performance
on the training data set; automating the parameter-selection process is therefore
another challenge (Kasabov, 2012; Kita, 2011; Pears et al., 2013; Yu et al., 2014).
Issue 2: Another real issue for the ESNN is achieving an optimized balance
between accuracy and network structure. Several integrations of EAs and
swarm intelligence (SI) strategies with ESNN have been performed (Hamed
et al., 2009a; Schliebs et al., 2009b; Schliebs et al., 2010a). However, GA has
shortcomings such as a larger number of predefined parameters, competing
conventions and the premature convergence problem (Kim et al., 2005; Sahab et al.,
2005). Moreover, no single algorithm can achieve the best performance on all
problems, as stated by the 'no free lunch' theorem (Wolpert and Macready, 1997).
On the other hand, the many advantages of DE over PSO and GA, which include
being much simpler to implement, better performance, very few control parameters
and low space complexity (Abbass, 2001; Das and Suganthan, 2011), motivate
research into this hybridization.
Therefore, in this thesis, all the proposed hybrid methods are presented:
differential evolution with evolving spiking neural network (DE-ESNN), differential
evolution for parameter tuning with evolving spiking neural network (DEPT-ESNN),
multi-objective differential evolution with evolving spiking neural network
(MODE-ESNN), harmony search multi-objective differential evolution with
evolving spiking neural network (HSMODE-ESNN) and memetic harmony search
multi-objective differential evolution with evolving spiking neural network
(MEHSMODE-ESNN).
Based on the issues mentioned in Section 1.3, the main research question is:
Are the proposed hybrid methods combining ESNN with different meta-heuristics
and MOEAs, namely DE-ESNN, DEPT-ESNN, MODE-ESNN, HSMODE-ESNN
and MEHSMODE-ESNN, beneficial for evolving learning of ESNN in terms of
structure (pre-synaptic neurons) and accuracy?
Thus, the following issues need to be addressed:
1. How can the structure of ESNN (the pre-synaptic neurons) be optimized
using the proposed method DE-ESNN, and the ESNN parameters using the
proposed method DEPT-ESNN?
2. How can a multi-objective method be developed to optimize the ESNN's
pre-synaptic neurons as well as its parameters simultaneously using
MODE-ESNN?
3. How effective are harmony search (HS) and memetic techniques in
enhancing the multi-objective method (MODE-ESNN) through HSMODE-
ESNN and MEHSMODE-ESNN?
4. Would classification accuracy and other classification performance
measures improve when all the previously proposed methods are
implemented?
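Regarding question 3, the HS component improvises each new candidate vector from a harmony memory of previous solutions. A minimal sketch of the classic improvisation step (Geem et al., 2001) is given below; the parameter names (HMCR, PAR, bandwidth) follow the standard HS literature, and the function itself is illustrative rather than the HSMODE-ESNN code.

```python
import random

def hs_improvise(memory, hmcr=0.9, par=0.3, bw=0.05,
                 bounds=(0.0, 1.0), rng=random):
    """One harmony-search improvisation: each decision variable is drawn
    from the harmony memory with probability HMCR, then pitch-adjusted
    with probability PAR; otherwise it is sampled at random."""
    lo, hi = bounds
    dim = len(memory[0])
    new = []
    for d in range(dim):
        if rng.random() < hmcr:
            value = rng.choice(memory)[d]          # memory consideration
            if rng.random() < par:
                value += rng.uniform(-bw, bw)      # pitch adjustment
        else:
            value = rng.uniform(lo, hi)            # random selection
        new.append(min(hi, max(lo, value)))        # keep within bounds
    return new

memory = [[0.2, 0.5, 0.8], [0.3, 0.4, 0.9]]
harmony = hs_improvise(memory)
print(len(harmony) == 3 and all(0.0 <= v <= 1.0 for v in harmony))  # True
```

In a memetic variant, a local search (e.g., a small hill-climbing refinement) would additionally be applied to promising harmonies before they replace the worst member of the memory, which is the role the memetic technique plays in MEHSMODE-ESNN.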
1.4 Research Aim
This research aims to enhance hybrid learning of the evolving spiking neural
network (ESNN) with the proposed methods to obtain a simple (lowest number of
pre-synaptic neurons) and accurate ESNN model.
1.5 Research Objectives
In order to find the answers to the above questions, the objectives of this
study have been identified as:
1. To enhance evolving spiking neural network (ESNN) learning by
proposing hybrid methods using a differential evolution (DE)
algorithm to optimize the pre-synaptic neurons and the parameters of
ESNN.
2. To develop a multi-objective hybrid method, multi-objective
differential evolution with evolving spiking neural network (MODE-
ESNN), to optimize the pre-synaptic neurons as well as the parameters
simultaneously.
3. To enhance the proposed hybrid methods using HS and memetic
techniques.
1.6 Research Scope
To accomplish the above objectives, the scope of this study is restricted to the
following:
1. Data sets on both binary and multi-class classification problems are
essential for evaluating the proposed methods: Appendicitis, Iris,
Hepatitis, Ionosphere, Liver, Haberman and Pima heart.
2. Focus is on the proposed methods of DE-ESNN, DEPT-ESNN, MODE-
ESNN, HSMODE-ESNN and MEHSMODE-ESNN for learning, which
includes training and testing in classification problems.
3. Performance is tested based on structure (number of pre-synaptic
neurons), classification accuracy (ACC), geometric mean (GM),
sensitivity (SEN), specificity (SPE), positive predictive value (PPV),
negative predictive value (NPV) and average site performance (ASP).
4. The programs are customized, improved and applied to the learning of
ESNN using Microsoft Visual C++ 10.0 and Matlab.
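Most of the measures in scope item 3 derive from the binary confusion matrix. The sketch below computes them under common definitions; in particular, GM is taken here as the geometric mean of sensitivity and specificity, a common convention, and ASP is omitted because its definition is not standard. The example counts are purely illustrative, not thesis results.

```python
import math

def binary_metrics(tp, tn, fp, fn):
    """Confusion-matrix metrics used in the evaluation scope.
    Common definitions assumed; the thesis's exact formulas may differ."""
    sen = tp / (tp + fn)              # sensitivity (true positive rate)
    spe = tn / (tn + fp)              # specificity (true negative rate)
    return {
        "ACC": (tp + tn) / (tp + tn + fp + fn),
        "SEN": sen,
        "SPE": spe,
        "GM":  math.sqrt(sen * spe),  # geometric mean of SEN and SPE
        "PPV": tp / (tp + fp),        # positive predictive value
        "NPV": tn / (tn + fn),        # negative predictive value
    }

m = binary_metrics(tp=30, tn=45, fp=5, fn=20)
print(round(m["ACC"], 2), round(m["GM"], 2))  # 0.75 0.73
```

GM is useful alongside ACC on the imbalanced medical data sets listed above, since a classifier that ignores the minority class can still score a high ACC but will have a low sensitivity and hence a low GM.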
1.7 Research Methodology Overview
This study consists of five phases: research planning, data set preparation,
research design, implementation and analysis. Research planning is the key to
success in guiding the research direction. Additionally, algorithm performance
depends on the nature of the data set. The data sets were used as inputs for the
proposed methods in the research design and implementation phases. The
proposed algorithms were trained, tested and validated using quantitative
measurements on classification problems. Finally, statistical test analysis was
applied. Figure 1.2 shows the research methodology of the study.
Figure 1.2 Flow of research methodology phases
[Figure 1.2 contents]
Phase 1: Research planning (milestone, Gantt chart)
Phase 2: Data-set preparation (data collection, data preprocessing)
Phase 3: Research design (DE-ESNN, DEPT-ESNN, MODE-ESNN,
HSMODE-ESNN and MEHSMODE-ESNN)
Phase 4: Research implementation (parameter setting, training, testing, validation)
Phase 5: Quantitative analysis (classification analysis, statistical test)
1.8 Summary of Research Contributions
The contributions of the study can be summarized in the following points, as also
illustrated in Figure 1.3:
1. A hybrid method known as DE-ESNN, used to optimize the pre-synaptic
neurons.
2. A hybrid method known as DEPT-ESNN, used to optimize the parameters
(Mod, Sim, Threshold) of ESNN.
3. A multi-objective method known as MODE-ESNN, used to optimize the pre-
synaptic neurons and the parameters simultaneously.
4. A hybrid method known as HSMODE-ESNN, used to enhance the MODE-
ESNN method.
5. A hybrid method known as MEHSMODE-ESNN, used to enhance the
MODE-ESNN method.
Figure 1.3 Summary of research contributions
1.9 Thesis Outline
This thesis contains seven chapters, including the introduction chapter. The
second chapter describes the background and the earlier work in the field of SNNs,
evolving spiking neural network and MOEAs. The third chapter describes the
research methodology for the work. The fourth and fifth chapters present the
proposed methods used in this study, together with their algorithmic details and
results. Finally, the last two chapters present the performance evaluation,
discussion, conclusion and future extensions of the study.
Chapter 2, Literature Review, provides a general overview of the
literature relevant to this study. Fundamental concepts of SNNs, ESNN, the EAs
used in this thesis and multi-objective optimization are introduced.
Chapter 3, Research Methodology, illustrates the methodology used
in this study. The research methodology is presented as a flow chart diagram that
explains briefly how each step is utilized.
Chapter 4, Hybrid Proposed Methods, explains in detail how EAs can
optimize the ESNN model for classification. Furthermore, this chapter describes
the implementation of the algorithms which are used. Moreover, the results based
on performance measures are illustrated for all proposed methods. Last but not
least, statistical analysis is carried out.
Chapter 5, MOO Proposed Methods, describes how multi-objective
algorithms can optimize the ESNN model for classification. Additionally, this
chapter explains the implementation of MOO algorithms that are used.
Moreover, the results based on performance measures are illustrated for all
proposed methods. Finally, statistical analysis is carried out.
Chapter 6, Comparative Study of the Proposed Methods, presents
the results based on performance measures and illustrates the comparative
analysis among all proposed methods. Moreover, a comparison is conducted with
various classifiers and data mining algorithms. Last but not least, statistical
analysis is carried out.
Chapter 7, Conclusion and Future Work, concludes the study and
highlights the contributions and findings of the research work. In addition,
Chapter 7 provides recommendations and suggestions for future work.
Finally, a summary is reported for the whole study.
REFERENCES
Abbass, H. A. (2001), A memetic pareto evolutionary approach to artificial neural
networks. In AI 2001: Advances in Artificial Intelligence. (1-12). Springer.
Abbass, H. A. (2003b). Speeding up backpropagation using multiobjective
evolutionary algorithms. Neural computation. 15 (11), 2705-2726.
Abdolalipour, A. and Alibabaee, A. (2012). Harmony Search algorithm. Int. J. Acad.
Res. Appl. Sci. 1 (3), 13-16.
Adrian, E. D. and Zotterman, Y. (1926). The impulses produced by sensory nerve-
endings Part II. The response of a Single End-Organ. The Journal of
physiology. 61 (2), 151-171.
Ahmed, F. Y., Shamsuddin, S. M. and Hashim, S. Z. M. (2013a). Improved
SpikeProp for using Particle Swarm Optimization (PSO).
Ahmed, F. Y., Shamsuddin, S. M. and Hashim, S. Z. M. (2013b). Improved
SpikeProp for using particle swarm optimization. Mathematical Problems in
Engineering. 2013
Ahmed, F. Y., Yusob, B. and Hamed, H. N. A. (2014). Computing with Spiking
Neuron Networks A Review. International Journal of Advances in Soft
Computing & Its Applications. 6 (1),
Alcalá, J., Fernández, A., Luengo, J., Derrac, J., García, S., Sánchez, L. and Herrera,
F. (2011). Keel data-mining software tool: Data set repository, integration of
algorithms and experimental analysis framework. Journal of Multiple-Valued
Logic and Soft Computing. 17 255-287.
Asuncion, A. and Newman, D. (2007). UCI Machine Learning Repository
[http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA:
University of California, School of Information and Computer Science.
Bäck, T. and Schwefel, H.-P. (1993). An overview of evolutionary algorithms for
parameter optimization. Evolutionary computation. 1 (1), 1-23.
Batllori, R., Laramee, C., Land, W. and Schaffer, J. (2011). Evolving spiking neural
networks for robot control. Procedia Computer Science. 6 329-334.
Belatreche, A., Maguire, L. P., Mcginnity, M. and Wu, Q. X. (2006). Evolutionary
design of spiking neural networks. New Mathematics and Natural
Computation. 2 (03), 237-253.
Belatreche, A., Maguire, L. P. and McGinnity, M. (2007). Advances in design and
application of spiking neural networks. Soft Computing. 11 (3), 239-248.
Bi, G.-q. and Poo, M.-m. (2001). Synaptic modification by correlated activity:
Hebb's postulate revisited. Annual review of neuroscience. 24 (1), 139-166.
Bishop, C. M. and Maass, W. (1999). Pulsed neural networks. MIT Press
Cambridge, MA.
Bliss, T. V. and Gardner-Medwin, A. (1973). Long-lasting potentiation of synaptic
transmission in the dentate area of the unanaesthetized rabbit following
stimulation of the perforant path. The Journal of physiology. 232 (2), 357.
Bodyanskiy, Y. and Dolotov, A. (2010). Hybrid systems of computational
intelligence evolved from self-learning spiking neural network. Methods and
Instruments of Artificial Intelligence. 17.
Bodyanskiy, Y., Dolotov, A. and Vynokurova, O. (2014). Evolving spiking wavelet-
neuro-fuzzy self-learning system. Applied Soft Computing. 14 252-258.
Bohte, S. M., La Poutré, H., Kok, J. N. and Informatica, C. v. W. e. (2000). Error-
backpropagation in temporally encoded networks of spiking neurons.
Centrum voor Wiskunde en Informatica.
Bohte, S. M., Kok, J. N. and La Poutre, H. (2002a). Error-backpropagation in
temporally encoded networks of spiking neurons. Neurocomputing. 48 (1),
17-37.
Bohte, S. M., La Poutré, H. and Kok, J. N. (2002b). Unsupervised clustering with
spiking neurons by sparse temporal coding and multilayer RBF networks.
Neural Networks, IEEE Transactions on. 13 (2), 426-435.
Bohte, S. M. and Kok, J. N. (2005). Applications of spiking neural networks.
Information Processing Letters. 95 (6), 519-520.
Brette, R., Rudolph, M., Carnevale, T., Hines, M., Beeman, D., Bower, J. M.,
Diesmann, M., Morrison, A., Goodman, P. H. and Harris Jr, F. C. (2007).
Simulation of networks of spiking neurons: a review of tools and strategies.
Journal of computational neuroscience. 23 (3), 349-398.
Buonomano, D. V. and Merzenich, M. (1999). A neural network model of temporal
code generation and position-invariant pattern recognition. Neural
computation. 11 (1), 103-116.
Chandra, A. and Yao, X. (2004), DIVACE: Diverse and accurate ensemble learning
algorithm. In Intelligent Data Engineering and Automated Learning–IDEAL
2004. (619-625). Springer.
Chandra, B. and Naresh Babu, K. (2014). Classification of gene expression data
using spiking wavelet radial basis neural network. Expert systems with
applications. 41 (4), 1326-1330.
Charnes, A. and Cooper, W. W. (1957). Management models and industrial
applications of linear programming. Management Science. 4 (1), 38-91.
Cohon, J. L. (2010). Multiobjective programming and planning, 1978. Academic,
New York.
da Silva, A. J., Mineu, N. L. and Ludermir, T. B. (2010). Evolving Artificial Neural
Networks Using Adaptive Differential Evolution. Advances in Artificial
Intelligence - Iberamia 2010. 6433 396-405.
Das, S. and Suganthan, P. N. (2011). Differential evolution: A survey of the state-of-
the-art. Evolutionary Computation, IEEE Transactions on. 15 (1), 4-31.
Delorme, A. and Thorpe, S. J. (2003). SpikeNET: an event-driven simulation
package for modelling large networks of spiking neurons. Network:
Computation in Neural Systems. 14 (4), 613-627.
Dheeba, J. and Selvi, S. T. (2012). An Improved Decision Support System for
Detection of Lesions in Mammograms Using Differential Evolution
Optimized Wavelet Neural Network. Journal of Medical Systems. 36 (5),
3223-3232.
Do, K.-A. and Ambroise, C. (2004). Analyzing microarray gene expression data.
Wiley. 14 1080-1087.
Drazin, S. and Montag, M. (2012). Decision tree analysis using WEKA. Machine
Learning-Project II, University of Miami. 1-3.
Elbeltagi, E., Hegazy, T. and Grierson, D. (2005). Comparison among five
evolutionary-based optimization algorithms. Advanced engineering
informatics. 19 (1), 43-53.
Fawcett, T. (2006). An introduction to ROC analysis. Pattern recognition letters. 27
(8), 861-874.
Fernandez Caballero, J. C., Martínez, F. J., Hervás, C. and Gutiérrez, P. A. (2010).
Sensitivity versus accuracy in multiclass problems using memetic Pareto
evolutionary neural networks. Neural Networks, IEEE Transactions on. 21
(5), 750-770.
Fieldsend, J. E. and Singh, S. (2005). Pareto evolutionary neural networks. Neural
Networks, IEEE Transactions on. 16 (2), 338-354.
Gao, L.-q., Li, S., Kong, X. and Zou, D.-x. (2014). On the iterative convergence of
harmony search algorithm and a proposed modification. Applied Mathematics
and Computation. 247 1064-1095.
García, S., Luengo, J. and Herrera, F. (2015), A Data Mining Software Package
Including Data Preparation and Reduction: KEEL. In Data Preprocessing in
Data Mining. (285-313). Springer.
Geem, Z. W., Kim, J. H. and Loganathan, G. (2001). A new heuristic optimization
algorithm: harmony search. Simulation. 76 (2), 60-68.
Gerstner, W. and Kistler, W. M. (2002). Spiking neuron models: Single neurons,
populations, plasticity. Cambridge university press.
Ghosh-Dastidar, S. and Adeli, H. (2009). A new supervised learning algorithm for
multiple spiking neural networks with application in epilepsy and seizure
detection. Neural Networks. 22 (10), 1419-1431.
Goli, B. and Govindan, G. (2012). WEKA - A powerful free software for
implementing bio-inspired algorithms. Technical Trends, CSI
Communications. 9-12.
Gong, W., Cai, Z. and Ling, C. X. (2010). DE/BBO: a hybrid differential evolution
with biogeography-based optimization for global numerical optimization. Soft
Computing. 15 (4), 645-665.
Hamed (2012). Novel Integrated Methods of Evolving Spiking Neural Network and
Particle Swarm Optimisation. PhD thesis, Auckland University of
Technology.
Hamed, A., Nuzly, H. and Kasabov, N. (2011a). Quantum-inspired particle swarm
optimisation for integrated feature and parameter optimisation of evolving
spiking neural networks. International Journal of Artificial Intelligence. 7
(A11), 114-124.
Hebb, D. O. (1949a). The organization of behavior: A neuropsychological approach.
John Wiley & Sons.
Hebb, D. O. (1949b). The organization of behavior: Aneuropsychological theory.
Wiley.
Ho, W., Xu, X. and Dey, P. K. (2010). Multi-criteria decision making approaches for
supplier evaluation and selection: A literature review. European Journal of
Operational Research. 202 (1), 16-24.
Hopfield, J. J. (1995). Pattern recognition computation using action potential timing
for stimulus representation. Nature. 376 (6535), 33-36.
Howard, G., Bull, L., de Lacy Costello, B., Gale, E. and Adamatzky, A. (2014).
Evolving spiking networks with variable resistive memories. Evolutionary
computation. 22 (1), 79-103.
Hwang, C.-L., Masud, A. S. M., Paidy, S. R. and Yoon, K. P. (1979). Multiple
objective decision making, methods and applications: a state-of-the-art
survey. Springer Berlin.
Igel, C. (2005), Multi-objective Model Selection for Support Vector Machines. In
Coello Coello,Hernández Aguirre and Zitzler. Evolutionary Multi-Criterion
Optimization. (534-546). Springer Berlin Heidelberg.
Ijiri, Y. (1965). Management goals and accounting for control. North Holland Pub.
Co.
Ilonen, J., Kamarainen, J.-K. and Lampinen, J. (2003). Differential evolution training
algorithm for feed-forward neural networks. Neural Processing Letters. 17
(1), 93-105.
Ishibuchi, H., Nakashima, T. and Murata, T. (1999). Performance evaluation of fuzzy
classifier systems for multidimensional pattern classification problems.
Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE Transactions on.
29 (5), 601-618.
Izhikevich, E. M. (2010). Hybrid spiking models. Philosophical Transactions of the
Royal Society A: Mathematical, Physical and Engineering Sciences. 368
(1930), 5061-5070.
Jin, Y. (2006). Multi-objective machine learning. Springer.
Jin, Y., Wen, R. and Sendhoff, B. (2007b), Evolutionary multi-objective optimization
of spiking neural networks. In Artificial Neural Networks–ICANN 2007. (370-
379). Springer.
Kasabov, N. (2012), Evolving spiking neural networks and neurogenetic systems for
spatio-and spectro-temporal data modelling and pattern recognition. In
Advances in Computational Intelligence. (234-260). Springer.
Kasabov, N., Dhoble, K., Nuntalid, N. and Indiveri, G. (2013). Dynamic evolving
spiking neural networks for on-line spatio-and spectro-temporal pattern
recognition. Neural Networks. 41 188-201.
Kasabov, N., Feigin, V., Hou, Z.-G., Chen, Y., Liang, L., Krishnamurthi, R.,
Othman, M. and Parmar, P. (2014). Evolving spiking neural networks for
personalised modelling, classification and prediction of spatio-temporal
patterns with a case study on stroke. Neurocomputing. 134 269-279.
Kasabov, N. K. (2003). Evolving connectionist systems: Methods and applications in
bioinformatics, brain study and intelligent machines. Springer.
Kasabov, N. K. (2014). NeuCube: A spiking neural network architecture for
mapping, learning and understanding of spatio-temporal brain data. Neural
Networks. 52 62-76.
Kim, D., Kim, H. and Chung, D. (2005), A modified genetic algorithm for fast
training neural networks. In Advances in Neural Networks–ISNN 2005. (660-
665). Springer.
Kita, E. (2011). Evolutionary algorithms. InTech.
Kohavi, R. and John, G. H. (1997). Wrappers for feature subset selection. Artificial
intelligence. 97 (1), 273-324.
Lee, K. S. and Geem, Z. W. (2005). A new meta-heuristic algorithm for continuous
engineering optimization: harmony search theory and practice. Computer
methods in applied mechanics and engineering. 194 (36), 3902-3933.
Maass, W. (1997). Networks of spiking neurons: the third generation of neural
network models. Neural Networks. 10 (9), 1659-1671.
Maass, W. and Bishop, C. M. (2001). Pulsed neural networks. MIT press.
Markram, H., Lübke, J., Frotscher, M. and Sakmann, B. (1997). Regulation of
synaptic efficacy by coincidence of postsynaptic APs and EPSPs. Science.
275 (5297), 213-215.
Marler, R. T. and Arora, J. S. (2004). Survey of multi-objective optimization
methods for engineering. Structural and multidisciplinary optimization. 26
(6), 369-395.
Meftah, B., Lezoray, O. and Benyettou, A. (2010). Segmentation and edge detection
based on spiking neural network model. Neural Processing Letters. 32 (2),
131-146.
Mineu, N. L., Ludermir, T. B., Almeida, L. M. and Ieee (2010), Topology
Optimization for Artificial Neural Networks using Differential Evolution. In
2010 International Joint Conference on Neural Networks Ijcnn 2010. New
York: Ieee.
Moh’d Alia, O. and Mandava, R. (2011). The variants of the harmony search
algorithm: an overview. Artificial Intelligence Review. 36 (1), 49-68.
Mohemmed, A., Schliebs, S., Matsuda, S. and Kasabov, N. (2011b), Method for
training a spiking neuron to associate input-output spike trains. In
Engineering Applications of Neural Networks. (219-228). Springer.
Mohemmed, A., Schliebs, S., Matsuda, S. and Kasabov, N. (2012). Span: Spike
pattern association neuron for learning spatio-temporal spike patterns.
International Journal of Neural Systems. 22 (04),
Mohemmed, A., Schliebs, S., Matsuda, S. and Kasabov, N. (2013). Training spiking
neural networks to associate spatio-temporal input–output spike patterns.
Neurocomputing. 107 3-10.
Moscato, P. (1989). On evolution, search, optimization, genetic algorithms and
martial arts: Towards memetic algorithms. Caltech concurrent computation
program, C3P Report. 826 1989.
Moslehpour, S. (2011). Transmission Network Expansion Planning Based on
Hybridization Model of Probabilistic Neural Networks and Harmony Search
Algorithm.
Neri, F. and Cotta, C. (2012). Memetic algorithms and memetic computing
optimization: A literature review. Swarm and Evolutionary Computation. 2 1-
14.
Noman, N. and Iba, H. (2008). Accelerating differential evolution using an adaptive
local search. Evolutionary Computation, IEEE Transactions on. 12 (1), 107-
125.
Pears, R., Widiputra, H. and Kasabov, N. (2013). Evolving integrated multi-model
framework for on line multiple time series prediction. Evolving Systems. 4
(2), 99-117.
Philipson, R. and Ravindran, A. (1978). Application of goal programming to
machinability data optimization. Journal of Mechanical Design. 100 (2), 286-
291.
Phung, S. L., Bouzerdoum, A. and Nguyen, G. H. (2009). Learning pattern
classification tasks with imbalanced data sets.
Platel, M. D., Schliebs, S. and Kasabov, N. (2009). Quantum-inspired evolutionary
algorithm: a multimodel EDA. Evolutionary Computation, IEEE
Transactions on. 13 (6), 1218-1232.
Ponulak, F. and Kasinski, A. (2010). Supervised learning in spiking neural networks
with ReSuMe: sequence learning, classification, and spike shifting. Neural
computation. 22 (2), 467-510.
Purnomo, H. D. and Wee, H.-M. (2014). Maximizing production rate and workload
balancing in a two-sided assembly line using Harmony Search. Computers &
Industrial Engineering. 76 222-230.
Qasem, S. N. and Shamsuddin, S. M. (2011). Memetic elitist pareto differential
evolution algorithm based radial basis function networks for classification
problems. Applied Soft Computing. 11 (8), 5565-5581.
Ranjith, J. and Muniraj, N. (2014). High Performance Independent Component
Analysis. Asian Journal of Scientific Research. 7 (4),
Razfar, M. R., Zinati, R. F. and Haghshenas, M. (2011). Optimum surface roughness
prediction in face milling by using neural network and harmony search
algorithm. The International Journal of Advanced Manufacturing
Technology. 52 (5-8), 487-495.
Russell, A., Orchard, G., Dong, Y., Mihalas, S., Niebur, E., Tapson, J. and Etienne-
Cummings, R. (2010). Optimization methods for spiking neurons and
networks. Neural Networks, IEEE Transactions on. 21 (12), 1950-1962.
Sahab, M., Ashour, A. and Toropov, V. (2005). A hybrid genetic algorithm for
reinforced concrete flat slab buildings. Computers & structures. 83 (8), 551-
559.
Schliebs, S., Defoin-Platel, M. and Kasabov, N. (2009a), Integrated Feature and
Parameter Optimization for an Evolving Spiking Neural Network. In
Köppen,Kasabov and Coghill. Advances in Neuro-Information Processing.
(1229-1236). Springer Berlin Heidelberg.
Schliebs, S., Defoin-Platel, M. and Kasabov, N. (2009b), Integrated feature and
parameter optimization for an evolving spiking neural network. In Advances
in Neuro-Information Processing. (1229-1236). Springer.
Schliebs, S., Defoin-Platel, M., Worner, S. and Kasabov, N. (2009c). Integrated
feature and parameter optimization for an evolving spiking neural network:
Exploring heterogeneous probabilistic models. Neural Networks. 22 (5), 623-
632.
Schliebs, S., Kasabov, N. and Defoin-Platel, M. (2010a). On the probabilistic
optimization of spiking neural networks. International Journal of Neural
Systems. 20 (06), 481-500.
Schliebs, S., Nuntalid, N. and Kasabov, N. (2010b), Towards spatio-temporal
pattern recognition using evolving spiking neural networks. In Neural
Information Processing. Theory and Algorithms. (163-170). Springer.
Schliebs, S. and Kasabov, N. (2013). Evolving spiking neural network—a survey.
Evolving Systems. 4 (2), 87-98.
Sheskin, D. J. (2000). Parametric and nonparametric statistical procedures. Boca
Raton: CRC.
Soltani, M. A. Z., Haghighat, A. T., Rashidi, H. and Chegini, T. G. (2011). A Couple
of Algorithms for K-Coverage Problem in Visual Sensor Networks.
Steuer, R. (1986). Multiple criteria optimization: theory, computation, and
application. 1986. Willey, New York.
Storn, R. and Price, K. (1997). Differential evolution–a simple and efficient heuristic
for global optimization over continuous spaces. Journal of global
optimization. 11 (4), 341-359.
Storn, R. (2008), Differential evolution research–trends and open questions. In
Advances in differential evolution. (1-31). Springer.
Tan, K. C., Lee, T. H. and Khor, E. F. (2001). Evolutionary algorithms with dynamic
population size and local exploration for multiobjective optimization.
Evolutionary Computation, IEEE Transactions on. 5 (6), 565-588.
Tanwani, A. K. and Farooq, M. (2009), The role of biomedical data set in
classification. In Artificial Intelligence in Medicine. (370-374). Springer.
Theodoridis, Y. and Nascimento, M. A. (2000). Generating spatiotemporal data sets
on the WWW. ACM SIGMOD Record. 29 (3), 39-43.
Thorpe, S., Delorme, A. and Van Rullen, R. (2001). Spike-based strategies for rapid
processing. Neural Networks. 14 (6), 715-725.
Triantaphyllou, E. (2000). Multi-criteria decision making methods a comparative
study. Springer.
Vázquez, R. A. and Garro, B. A. (2011), Training spiking neurons by means of
particle swarm optimization. In Advances in Swarm Intelligence. (242-249).
Springer.
Verstraeten, D., Schrauwen, B., d’Haene, M. and Stroobandt, D. (2007). An
experimental unification of reservoir computing methods. Neural Networks.
20 (3), 391-403.
Wang, G. and Guo, L. (2013). A novel hybrid bat algorithm with harmony search
for global numerical optimization. Journal of Applied Mathematics. 2013
Wang, J., Belatreche, A., Maguire, L. and McGinnity, T. (2014). An Online
Supervised Learning Method for Spiking Neural Networks with Adaptive
Structure. Neurocomputing.
Watts, M. J. (2009). A decade of Kasabov's evolving connectionist systems: a
review. Systems, Man, and Cybernetics, Part C: Applications and Reviews,
IEEE Transactions on. 39 (3), 253-269.
Weiss, G. M. and Provost, F. (2003). Learning when training data are costly: the
effect of class distribution on tree induction. Journal of Artificial Intelligence
Research. 19, 315-354.
Witten, I. H. and Frank, E. (2005). Data Mining: Practical machine learning tools
and techniques. Morgan Kaufmann.
Wolpert, D. H. and Macready, W. G. (1997). No free lunch theorems for
optimization. Evolutionary Computation, IEEE Transactions on. 1 (1), 67-82.
Wu, X., Kumar, V., Quinlan, J. R., Ghosh, J., Yang, Q., Motoda, H., McLachlan, G.
J., Ng, A., Liu, B. and Philip, S. Y. (2008). Top 10 algorithms in data mining.
Knowledge and Information Systems. 14 (1), 1-37.
Wysoski, S., Benuskova, L. and Kasabov, N. (2006b). On-line learning with
structural adaptation in a network of spiking neurons for visual pattern
recognition. Artificial Neural Networks–ICANN 2006. 61-70.
Wysoski, S. G., Benuskova, L. and Kasabov, N. (2006d), On-line learning with
structural adaptation in a network of spiking neurons for visual pattern
recognition. In Artificial Neural Networks–ICANN 2006. (61-70). Springer.
Yang, S. X. and Luo, C. (2004). A neural network approach to complete coverage
path planning. Systems, Man, and Cybernetics, Part B: Cybernetics, IEEE
Transactions on. 34 (1), 718-724.
Yang, X.-S. (2009a), Harmony search as a metaheuristic algorithm. In Music-
inspired harmony search algorithm. (1-14). Springer.
Yang, X.-S. (2009b), Harmony search as a metaheuristic algorithm. In Geem, Z. W.
(Ed.), Music-inspired harmony search algorithm. (1-14). Springer.
Yee, E. E. and Teo, J. (2013). Evolutionary Spiking Neural Networks as Racing Car
Controllers. International Journal of Computer Information Systems and
Industrial Management Applications. 5 (2013), 365-372
Yen, G. G. and Lu, H. (2003). Dynamic multiobjective evolutionary algorithm:
adaptive cell-based rank and density estimation. Evolutionary Computation,
IEEE Transactions on. 7 (3), 253-274.
Yu, Q., Tang, H., Tan, K. C. and Yu, H. (2014). A brain-inspired spiking neural
network model with temporal encoding and learning. Neurocomputing. 138,
3-13.
Zhou, S. W., Wu, L. H., Yuan, X. F. and Wen, T. (2007). Parameters
selection of SVM for function approximation based on Differential Evolution
- art. no. 1270. Paris: Atlantis Press.
Zitzler, E. (1999). Evolutionary algorithms for multiobjective optimization: Methods
and applications. Shaker Ithaca.
Zitzler, E., Deb, K. and Thiele, L. (2000). Comparison of multiobjective evolutionary
algorithms: Empirical results. Evolutionary computation. 8 (2), 173-195.