
UNIVERSITY OF MARIBOR

A comprehensive review of bat

algorithms and their hybridization

by

Iztok Fister Jr.

A Thesis submitted in partial fulfillment for the

degree of Master of Computer Science and Information Technologies

at the

Faculty of Electrical Engineering and Computer Science

University of Maribor

Supervisor: Dr. Marjan Mernik (University of Maribor)

Co-Supervisor: Dr. Xin-She Yang (Middlesex University)

September 2013


“Most people never get there. They’re afraid or unwilling to demand enough of themselves and take the easy road, the path of least resistance. But struggling and suffering, as I now saw it, were the essence of a life worth living. If you’re not pushing yourself beyond the comfort zone, if you’re not constantly demanding more from yourself – expanding and learning as you go – you’re choosing a numb existence. You’re denying yourself an extraordinary trip.”

Dean Karnazes, Ultramarathon Man: Confessions of an All-Night Runner


UNIVERSITY OF MARIBOR

Abstract

Faculty of Electrical Engineering and Computer Science

Smetanova 17, 2000 Maribor, Slovenia

Master of Computer Science and Information Technologies

by Iztok Fister Jr.

KEYWORDS: swarm intelligence, evolutionary computation, bat algorithm, hybridization, review

UDC: 004.5:004.89(043)

Swarm intelligence is a modern and efficient mechanism for solving hard problems in computer science, engineering, mathematics, economics, medicine and optimization. Swarm intelligence is the collective behavior of decentralized and self-organized systems. This research area is a branch of artificial intelligence and can be viewed as standing in a kind of family relationship with evolutionary computation, because both communities share a lot of common characteristics. To date, a lot of swarm intelligence algorithms have been developed and applied to several real-world problems. The main focus of this thesis is devoted to the bat algorithm, a recently developed member of the swarm intelligence community. In line with this, a comprehensive analysis of papers tackling this algorithm was performed. Some hybridizations of the original algorithm were proposed, because the preliminary results of this algorithm on the optimization of benchmark functions with higher dimensions were not too promising. Extensive experiments showed that hybridizing the original bat algorithm has beneficial effects on its results. Finally, an experimental study was performed in which we investigated how the choice of randomized method affects the results of the original bat algorithm. The results of this study showed that selecting the randomized method has a crucial impact on the results of the original bat algorithm and that the bat algorithm using Levy flights is also suitable for solving the harder optimization problems.


Acknowledgements

I would like to gratefully and sincerely thank my supervisor Dr. Marjan Mernik for helping and leading me through my Master's course. I would also like to gratefully and sincerely thank my co-supervisor Dr. Xin-She Yang from Middlesex University (United Kingdom). Special thanks go to my parents, who have supported me throughout my studies. Thanks also to Karin for your support during this course.

I am also thankful to my brother for his drawing work during this thesis. Thanks also to Mr. George Yeoman for proofreading this thesis.

Finally, special thanks go to my friend Sandi Dolinar for his help in my personal transformation and for his energy support every day.


Contents

Abstract iii

Acknowledgements iv

List of Figures viii

List of Tables ix

Abbreviations xi

1 Introduction 1

2 Biological origins of swarm intelligence 7

2.1 Decentralized decision making in social insects . . . . . . . . . . . . . . . 8

2.2 Coordinated movement of animal groups . . . . . . . . . . . . . . . . . . . 9

2.3 Collective behavior in biological systems . . . . . . . . . . . . . . . . . . . 10

2.4 Biological systems and their characteristics . . . . . . . . . . . . . . . . . 12

3 Swarm intelligence algorithms 13

3.1 Ant colony optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

3.2 Artificial bee colony algorithm . . . . . . . . . . . . . . . . . . . . . . . . 16

3.3 Cuckoo search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

3.4 Firefly algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

3.5 Flower pollination algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . 21

3.6 Particle swarm optimization . . . . . . . . . . . . . . . . . . . . . . . . . . 23

4 A comprehensive review of bat algorithms 25

4.1 Biological foundations of bats . . . . . . . . . . . . . . . . . . . . . . . . . 26

4.2 The original bat algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

4.3 Applications of the bat algorithm . . . . . . . . . . . . . . . . . . . . . . . 30

4.3.1 Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

4.3.1.1 Continuous optimization . . . . . . . . . . . . . . . . . . 31

4.3.1.2 Constrained optimization . . . . . . . . . . . . . . . . . . 33

4.3.1.3 Multi-objective optimization . . . . . . . . . . . . . . . . 33

4.3.2 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

4.3.2.1 Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . 35


4.3.2.2 Neural networks . . . . . . . . . . . . . . . . . . . . . . . 35

4.3.2.3 Feature selection . . . . . . . . . . . . . . . . . . . . . . . 36

4.3.3 Engineering applications . . . . . . . . . . . . . . . . . . . . . . . . 36

4.3.3.1 Image processing . . . . . . . . . . . . . . . . . . . . . . . 36

4.3.3.2 Industrial design . . . . . . . . . . . . . . . . . . . . . . . 37

4.3.3.3 Scheduling . . . . . . . . . . . . . . . . . . . . . . . . . . 38

4.3.3.4 Electronics . . . . . . . . . . . . . . . . . . . . . . . . . . 39

4.3.3.5 Spam filtering . . . . . . . . . . . . . . . . . . . . . . . . 39

4.3.3.6 Sports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

4.4 There are so many algorithms... . . . . . . . . . . . . . . . . . . . . . . . . 40

4.5 Future research in bat algorithm . . . . . . . . . . . . . . . . . . . . . . . 43

5 Hybridization of bat algorithm 45

5.1 Hybridization using differential evolution strategies . . . . . . . . . . . . . 46

5.1.1 Differential evolution . . . . . . . . . . . . . . . . . . . . . . . . . . 46

5.1.2 Hybrid Bat Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . 47

5.2 Hybridization with random forests . . . . . . . . . . . . . . . . . . . . . . 48

5.2.1 Random Forest . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

5.2.2 Hybrid Bat Algorithm with Random Forest . . . . . . . . . . . . . 49

5.3 Randomized Bat algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . 50

5.3.1 Continuous random number distributions . . . . . . . . . . . . . . 52

5.3.1.1 Uniform distribution . . . . . . . . . . . . . . . . . . . . . 52

5.3.1.2 Normal or Gaussian distribution . . . . . . . . . . . . . . 53

5.3.1.3 Levy flights . . . . . . . . . . . . . . . . . . . . . . . . . . 54

5.3.1.4 Chaotic maps . . . . . . . . . . . . . . . . . . . . . . . . . 54

5.3.1.5 Random sampling in turbulent fractal cloud . . . . . . . 55

5.3.1.6 Characteristics of the mentioned randomized methods . . 56

5.3.2 Variants of the Randomized Bat Algorithm . . . . . . . . . . . . . 58

5.4 Experiments and results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59

5.4.1 Test suite . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60

5.4.1.1 Griewangk’s function . . . . . . . . . . . . . . . . . . . . 60

5.4.1.2 Rosenbrock’s function . . . . . . . . . . . . . . . . . . . . 60

5.4.1.3 De Jong’s sphere function . . . . . . . . . . . . . . . . . . 60

5.4.1.4 Rastrigin’s function . . . . . . . . . . . . . . . . . . . . . 61

5.4.1.5 Ackley’s function . . . . . . . . . . . . . . . . . . . . . . . 61

5.4.2 PC configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

5.4.3 Influence of hybridizations . . . . . . . . . . . . . . . . . . . . . . . 62

5.4.4 Influence of randomness . . . . . . . . . . . . . . . . . . . . . . . . 64

5.4.4.1 Influence of randomization methods on the results of RBA 64

5.4.4.2 Comparative study . . . . . . . . . . . . . . . . . . . . . 66

5.5 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

6 Conclusion 70

A List of bio-inspired and swarm intelligence algorithms 72


B Kratek povzetek v slovenskem jeziku (Short summary in Slovene) 74

C Bibliography of candidate 75

D Short biography of candidate 79

Bibliography 80


List of Figures

1.1 Evolutionary algorithms. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

1.2 SI and EC as a sister and brother. . . . . . . . . . . . . . . . . . . . . . . 6

2.1 Interactions in swarm intelligence. . . . . . . . . . . . . . . . . . . . . . . 9

2.2 Inspirations for swarm intelligence algorithms . . . . . . . . . . . . . . . . 11

3.1 Ant . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

3.2 Bee . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

3.3 Cuckoo . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

3.4 Firefly . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

3.5 Bee pollinating flowers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

3.6 PSO . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

4.1 A bat. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

4.2 Dolphins use echolocation to determine food . . . . . . . . . . . . . . . . . 26

4.3 Classification of Bat algorithms . . . . . . . . . . . . . . . . . . . . . . . . 30

4.4 Pressure vessel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33

5.1 Random forests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

5.2 Results of the various randomized methods . . . . . . . . . . . . . . . . . 57

5.3 Friedman’s non-parametric test . . . . . . . . . . . . . . . . . . . . . . . . 63

5.4 Results of the Friedman non-parametric test on different variants of RBA algorithms . . . . . . . . . 66

5.5 Results of the Friedman non-parametric test on suite of test algorithms . 68


List of Tables

1.1 Differences among evolutionary algorithms . . . . . . . . . . . . . . . . . . 4

2.1 Biological systems with their characteristics . . . . . . . . . . . . . . . . . 12

4.1 Review of papers solving the optimization problems with BA. . . . . . . . 31

4.2 Review of papers solving classification problems with BA. . . . . . . . . . 35

4.3 Review of papers tackling engineering applications with BA. . . . . . . . . 36

4.4 Experimental setups in analyzed papers tackling continuous optimization. 42

5.1 Ensemble of DE-strategies . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

5.2 Variant of the RBA algorithm . . . . . . . . . . . . . . . . . . . . . . . . . 58

5.3 The results of experiments (D=10) . . . . . . . . . . . . . . . . . . . . . . 62

5.4 Results of RBA according to various randomization methods (D=30) . . . 65

5.5 Comparing the results of RBA with the results of other algorithms (D=50) 67


List of Algorithms

1 Evolutionary algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

2 Swarm Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5

3 Pseudo code of the ACO algorithm . . . . . . . . . . . . . . . . . . . . . . 15

4 Pseudo code of the ABC algorithm . . . . . . . . . . . . . . . . . . . . . . 17

5 Original Cuckoo search algorithm . . . . . . . . . . . . . . . . . . . . . . . 19

6 Pseudo code of the basic Firefly algorithm . . . . . . . . . . . . . . . . . . 20

7 Pseudo code of the FPA algorithm . . . . . . . . . . . . . . . . . . . . . . 23

8 Pseudo code of the PSO algorithm . . . . . . . . . . . . . . . . . . . . . . 24

9 Original Bat algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

10 Modification in Hybrid Bat Algorithm (HBA) . . . . . . . . . . . . . . . . 47

11 Original RF algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

12 The ES algorithm with the ensemble DE strategy . . . . . . . . . . . . . 50

13 Ensemble Strategies functions . . . . . . . . . . . . . . . . . . . . . . . . . 51

14 RSiTFC(L,H,N, x, h, i) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56


Abbreviations

ABC Artificial Bee Colony

ACO Ant Colony Optimization

BAM Bat Algorithm with Mutation

BBA Binary Bat Algorithm

BBF Bin Bloom Filter

BBO Biogeography Based Optimization

CS Cuckoo Search

DEBA Doppler Effect Bat Algorithm

DG Distributed Generator

EA Evolutionary Algorithm

FA Firefly Algorithm

FPA Flower Pollination Algorithm

FSM Finite State Machine

GSA Gravitational Search Algorithm

HFS Hybrid Flowshop Scheduling

HS Harmony Search

IBA Improved Bat Algorithm

PSO Particle Swarm Optimization

SI Swarm Intelligence

SGA Simple Genetic Algorithm

TSP Travelling Salesman Problem

UCAV Uninhabited Combat Air Vehicle


Chapter 1

Introduction

Over recent decades, a lot of problems have arisen within computer science, engineering, economics, and medicine. A lot of these problems are NP-hard. According to algorithmic theory [83], a problem is NP-hard when increasing the problem size causes the time complexity to increase exponentially (mathematically described as O(2^n), where n denotes the problem size). In the worst case, the user could be waiting for the solution over an infinite period of time [65]. Therefore, several methods have arisen that try to solve these problems approximately. These methods look for solutions in a smart, i.e., heuristic, way instead of exhaustively enumerating all the solutions.
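To make the O(2^n) growth concrete, here is a minimal Python sketch (an illustration, not code from this thesis) that counts the candidate solutions a brute-force search would have to enumerate for a binary decision problem of size n:

from itertools import product

def brute_force_count(n):
    # Enumerate every binary decision vector of length n: 2**n candidates.
    return sum(1 for _ in product((0, 1), repeat=n))

print(brute_force_count(4))  # 16 candidates: still feasible to enumerate
for n in (10, 20, 30, 40):
    print(f"n = {n:2d}: {2 ** n:,} candidate solutions")

Already at n = 40 there are over a trillion candidates, which is why heuristic search replaces exhaustive enumeration.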

Throughout history, the more powerful heuristic mechanisms have been inspired by Nature. Let me mention just two of the more important heuristics, i.e., Evolutionary Algorithms (EA) [50] that imitate the Darwinian theory of evolution [36], and Swarm Intelligence (SI) that finds its inspiration within collective, decentralized, and self-organized natural systems [14, 15]. Although many different variants of EAs exist nowadays, the common idea behind all these techniques remains the same: within a population of individuals (solutions), environmental pressure causes natural selection (survival of the fittest), thus causing a rise in the fitness of the population [48]. Fitness in EA determines the quality of candidate solutions and can be the minimum or maximum value of the objective function connected with the problem to be solved. In this thesis, I deal with the minimum values of objective functions only, and therefore the fitness function is used as a synonym for the minimum value of the objective function. The candidate solutions undergo the action of the crossover and mutation operators during reproduction. As a result, a population of offspring is generated that enters, together with the parents, into a process of selection to struggle for a place within the new population. A pseudo-code of an EA is illustrated in Algorithm 1, from which it can be seen that EAs consist of the following components, in general:


• representation of individuals,

• fitness function evaluation,

• population scheme,

• parent selection mechanism,

• variation operators (crossover and mutation),

• survivor selection mechanism (replacement).

Algorithm 1 Evolutionary algorithm

1: initialize population with random candidate solutions;

2: eval = evaluate each candidate;

3: while termination condition not met do

4: select parents;

5: recombine pairs of parents;

6: mutate the resulting offspring;

7: eval += evaluate new candidates;

8: select individuals for the next generation;

9: end while

Additionally, an initialization method and a termination condition need to be defined in order for the algorithm to work properly.
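As a hedged illustration of Algorithm 1, the following self-contained Python sketch puts the listed components together for a real-valued minimization problem. Everything here (population size, Gaussian mutation, the sphere function as the objective) is an illustrative assumption rather than the thesis's implementation:

import random

D, NP, MAX_GEN = 10, 20, 100

def sphere(x):
    # Fitness = objective value; minimized in this thesis.
    return sum(xi * xi for xi in x)

def tournament(pop):
    # Parent selection mechanism: binary tournament.
    a, b = random.sample(pop, 2)
    return a if sphere(a) < sphere(b) else b

pop = [[random.uniform(-5, 5) for _ in range(D)] for _ in range(NP)]  # initialization
for gen in range(MAX_GEN):
    offspring = []
    for _ in range(NP):
        p1, p2 = tournament(pop), tournament(pop)
        w = random.random()
        child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]   # crossover
        child = [xi + random.gauss(0, 0.1) for xi in child]     # mutation
        offspring.append(child)
    # Survivor selection: parents and offspring struggle for NP places.
    pop = sorted(pop + offspring, key=sphere)[:NP]

print("best fitness:", sphere(pop[0]))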

EAs are divided into the following classes (Fig. 1.1), which mainly differ according to their representations:

• genetic algorithms (GA),

• evolution strategies (ES),

• genetic programming (GP),

• evolutionary programming (EP) and

• differential evolution (DE).

For example, genetic algorithms, developed by John Holland in 1975 [87, 100], use a binary representation of individuals. Evolution strategies [11, 149] are inspired by adaptation. Instead of the binary representation used in genetic algorithms, they use real numbers. Similarly, differential evolution [173, 174, 199] also works on real-valued individuals.


Figure 1.1: Evolutionary algorithms.

Instead of natural reproduction via crossover and mutation, mathematical differences between a set of real-valued vectors (i.e., individuals) are taken into account. Genetic programming [129–131] was developed in order to find computer programs in Lisp that perform certain user-defined operations. That is, an individual in genetic programming is represented as a program in Lisp and not as a binary string. Individuals in evolutionary programming [5, 72, 248] are finite-state machines (FSM) that consist of a number of states S and a number of state transitions. The differences between evolutionary algorithms are presented in Table 1.1.

These classes of algorithms are used for solving problems in function optimization, activity planning, scheduling, combinatorial optimization, and also real-world problems. Many classical variants of these algorithms have been hybridized with other methods (especially with other meta-heuristics or machine-learning methods), and also with local search methods [18, 63, 64, 90]. Hybridization allows algorithms to incorporate problem-specific knowledge about the problem to be solved. Hybridization can also improve the balance between exploration and exploitation within evolutionary algorithms [33, 34, 49]. Exploration is a process of visiting entirely new regions of a search space. Exploitation is a process of visiting regions of a search space based on previously visited points (neighborhood) [141].

At this point it is necessary to also mention memetic algorithms [160–162]. A memetic algorithm is an evolutionary algorithm, or any other population-based approach, with a separate individual learning or local improvement step for the problem [104]. Usually, if an evolutionary algorithm employs a local search method, it can be referred to as a memetic algorithm. The 'No free lunch' theorem [221] observes that there is no better search algorithm than random search if all problem families are taken into account. This means that if one algorithm is good for one family of problems, it will not necessarily be appropriate for other problem families. In order to tailor an algorithm to other problem families, we can create hybrid/memetic algorithms which can properly address the balance between exploration and exploitation.
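To illustrate the memetic idea in code, the sketch below shows a simple hill-climbing local-search step that could be embedded into any population-based algorithm; it is a minimal, assumed example of 'individual learning', not a method taken from this thesis:

import random

def local_search(x, f, step=0.05, tries=20):
    # Hill climber: accept random perturbations of x that improve f.
    best, fbest = list(x), f(x)
    for _ in range(tries):
        y = [xi + random.uniform(-step, step) for xi in best]
        fy = f(y)
        if fy < fbest:
            best, fbest = y, fy
    return best

# Hypothetical usage inside an EA loop: refine each offspring after mutation,
# e.g., child = local_search(child, sphere).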


Table 1.1: Differences among evolutionary algorithms

Feature            | Genetic algorithm              | Evolution strategy             | Genetic programming  | Evolutionary programming | Differential evolution
Representation     | bit numbers                    | real numbers                   | programs             | finite state automata    | real numbers
Search operators   | crossover, mutation            | mutation, crossover (lately)   | crossover            | mutation                 | crossover, mutation
Selection          | stochastic                     | deterministic                  | stochastic           | stochastic               | stochastic
Order of operators | selection, crossover, mutation | crossover, mutation, selection | selection, crossover | mutation, selection      | selection, crossover, mutation



The second class of nature-inspired algorithms is swarm intelligence, which can be divided into the following disciplines: ant colony optimization [42], the artificial bee colony algorithm [116], the bat algorithm [233], cuckoo search [237], the firefly algorithm [235], particle swarm optimization [121], etc. Similarly to an EA, an SI algorithm also starts with the initialization of a population consisting of individuals (particles). Then, a fitness function evaluation is needed in order to determine the quality of each particle. The central part of this algorithm is a repeat loop which moves particles towards the best evaluated individual. After this movement each solution is evaluated, and then follows the selection, where the better individuals are selected for the next generations. A pseudo-code of SI is presented in Algorithm 2.

Algorithm 2 Swarm Intelligence

1: initialize population with random candidate particles;

2: eval = evaluate each particle;

3: while termination condition not met do

4: move particles towards the best individual;

5: eval += evaluate each particle;

6: select the best individuals for the next generation;

7: end while
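A minimal Python sketch of this generic SI loop might look as follows; the noise term and the greedy selection are illustrative assumptions, not a specific algorithm from the literature:

import random

D, NP, MAX_GEN = 10, 20, 100

def sphere(x):
    return sum(xi * xi for xi in x)

swarm = [[random.uniform(-5, 5) for _ in range(D)] for _ in range(NP)]
best = min(swarm, key=sphere)
for _ in range(MAX_GEN):
    # Move every particle towards the best individual, with a little noise.
    moved = [[xi + random.random() * (bi - xi) + random.gauss(0, 0.01)
              for xi, bi in zip(x, best)] for x in swarm]
    # Selection: keep the better of the old and the moved particle.
    swarm = [m if sphere(m) < sphere(x) else x for x, m in zip(swarm, moved)]
    best = min(swarm, key=sphere)

print("best fitness:", sphere(best))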

As can be seen from the algorithm, evolutionary computation and swarm intelligence are, in general, very similar. Both kinds of algorithms are nature-inspired, but different natural phenomena influence their behavior. The main difference between them appears in the exploration and exploitation components of the search process. EC imitates the natural process of reproduction using the crossover and mutation operators, where only the better individuals survive and bear their good genetic characteristics onto the offspring. In SI, however, simple individuals using communication can move together through the search space and explore the more promising regions within it. If EC is taken metaphorically as a ”brother” in the AI family, then SI might be observed as its younger ”sister” (Fig. 1.2).

This thesis is devoted to SI algorithms, in particular to the bat algorithm (BA) developed by Yang [233]. The goal is twofold: firstly, to conduct a comprehensive review of work within the BA algorithm area that cannot be found in the existing publications, and secondly, to further improve the BA algorithm. As a matter of fact, the improvement efforts were oriented in two directions: the first dealt with hybridization of the BA algorithm, while the second analyzed the influences of various distribution methods, like Levy flights, chaos, and random sampling in a turbulent fractal cloud, on the results of BA algorithms.

Figure 1.2: SI and EC as a sister and brother.

The structure of this thesis is as follows: the next chapter introduces readers to the more important concepts of the SI method. Then, Chapter 3 deals with the more popular SI algorithms. Chapter 4 introduces the bat algorithm and presents a comprehensive review of it. Chapter 5 presents our practical work, in which the focus was on the hybridization of bat algorithms. Conclusions and directions for further work complete this thesis.


Chapter 2

Biological origins of swarm intelligence

Swarm intelligence draws inspiration for its work from Nature, more precisely from biology [71]. For millions of years many biological systems have solved complex problems by sharing information with group members [13]. For example, social insects like termites build nests which house the colony. These nests (also called mounds) are elaborate structures built from materials like soil, mud, chewed wood/cellulose, and saliva. These mounds provide living places and water conservation (through controlled condensation). They are much greater than an individual termite (also called a particle in the swarm) and cannot be created without the engagement of all individuals within the colony. Namely, building nests demands special nest-building behavior that recruits all individuals within the colony. In line with this, the individuals become a group consisting of individuals interacting with each other in order to satisfy their clearly defined goal, i.e., building nests. Practically, two abilities are needed for successful interaction: cooperation and coordination. The former is achieved by the termites' social behavior, where individuals are aware of the fact that alone (without the whole group) they cannot survive. The latter is enabled through stigmergy, which is defined as the process by which the coordination of tasks and the regulation of construction do not depend directly on the workers but on the constructions themselves [101]. Therefore, coordination during the building activities within a termite colony is not inherent within the termites themselves. Instead, the coordination mechanisms are found to be regulated by the task's environment, in this case the growing nest structure. Being located on the growing nest stimulates a termite's building behavior, thereby transforming the local nest structure, which in turn stimulates additional building behavior in the same or another termite [147].


As can be inferred from the above example, biological systems are usually very complex. They consist of particles (individuals) that are definitely more complex than molecules and atoms. These particles are capable of performing autonomous actions within an environment. On the other hand, a group of particles is capable of intelligent behavior that is appropriate for solving complex problems within nature. Therefore, it is no coincidence that these biological systems have also inspired computer scientists to imitate their intelligent behavior in order to solve complex problems in mathematics, physics, engineering, etc. Moreover, interest in researching various biological systems has increased recently. As a result, new SI algorithms are being created almost every day.

Two characteristics of biological systems are especially important when developing SI algorithms: decentralized decision making in social insects and the coordinated movements of animal groups [13]. These particular characteristics are treated in Sections 2.1 and 2.2, respectively. Section 2.3 addresses the collective behavior in biological systems. This chapter concludes by reviewing biological systems with regard to the phenomena which have served as an inspiration for developing the corresponding SI algorithms.

2.1 Decentralized decision making in social insects

The social lives of humans are regulated by norms and social laws [135]. A norm is simply an established and expected pattern of behavior. The term social law carries the same meaning, but is controlled by some authority [222]. In the case that a social law is violated, the authority can punish the humans violating it with some kind of repressive action (e.g., prison).

Social insects live together within nests in order to survive [13]. These beings have indeed developed some kind of sociality, which is regulated according to norms valid for living in swarms. However, any particle that violates these norms is punished with isolation, which means certain death. In order to act as a group, certain interactions between particles have been created which enable the particles to be informed. The particles can cooperate and coordinate with each other using interactions (Fig. 2.1). There is no central control that would direct the activities of particles towards a common goal. In contrast, the particles make decisions on the basis of local information obtained from interactions with their peers and their environment. As a matter of fact, each particle makes decisions locally, but acts globally. Therefore, these decisions are also named 'collective decisions'.


Figure 2.1: Interactions in swarm intelligence.

Social insects make collective decisions mainly when the following situations arise: where to forage, where to live, when to reproduce, and how to divide the necessary tasks among the available work force [13]. For the organization of foraging they use some kind of recruitment, which increases the number of individuals at a particular place. In general, there are two kinds of recruitment mechanisms, i.e., direct and indirect. The former demands that the recruiter is in contact with the recruited, while this requirement is not demanded by the latter.

Direct recruitment is used, for example, by honeybee swarms, where scouts use the so-called 'waggle dance' to inform unemployed foragers of the presence of food. In contrast, ants use pheromone trails in order to convey the quality of food sources: the better the quality, the more pheromone is left on the trail.

2.2 Coordinated movement of animal groups

Many animal species move in groups. The intentions behind these movements are different, e.g., seasonal migrations (birds like swallows, swans, storks, etc.) or traveling to food sources and homecoming (e.g., bees, ants, etc.). These movements are self-organized, that is, there is no hierarchical central navigation controlling them. As a result, particles make decentralized decisions, i.e., on the basis of local information obtained from another particle or from the environment.

Interestingly, such movement is also characteristic of intelligent species like humans. For example, imagine a human group celebrating Labor Day outdoors. Suddenly, it starts to hail and the group moves under a tree not far from the show ground that can provide shelter from the storm. This action is coordinated but happens without any cooperation, because each human wishes to save his/her own bacon first. However, coordination is imposed by the storm.

A decision about moving within any insect group arises, for example, because the current swarm becomes too large to house all the individuals. An example of this is honeybees, where the queen decides that it is time to create her own family; such a migration can even be genetically caused, as with migrating birds. A consensus within a group needs to be reached before migration can be started. Contributors to this consensus can be either all the individuals within the group or relatively few individuals. In the first case, the necessity to migrate is inherent within the genes of each individual. Here, migration is started as soon as the number of gathered individuals within a group reaches a certain threshold required for coordinated movement [13]. This threshold is known in theoretical physics as a 'phase transition'. In the second case, the movement is guided by a few well-informed individuals who possess information about travel directions and guide the uninformed majority [13].

2.3 Collective behavior in biological systems

Inspirations for swarm intelligence algorithms come from nature, usually from the dynamic animal world. Several animal species live in swarms, and their collective behavior helps them to survive under unpredictable environmental conditions. Therefore, they work together in swarms and help each other. Animal swarms provide a great mechanism for protection against predators and other intruders which might affect their living. Furthermore, they also serve as a good mechanism for finding food places and for securing the transportation of food into storage areas.

In order to survive, each swarm species develops a specific collective behavior, where individuals interact with each other by using a certain biological organ developed through an evolutionary process. This organ is usually based on some chemical reaction and exploits a certain physical phenomenon. Some good examples of animal swarms and their characteristics are as follows (Fig. 2.2):

• ants: these insects are well-known for their foraging and building of nests. A pheromone is used for their interaction.

• bats: bats use echolocation in order to find food.


Figure 2.2: Inspirations for swarm intelligence algorithms

• bees: the reason bees swarm in groups is twofold: when searching for food they have developed a mechanism called the 'waggle dance', while they live in swarms for protection against predators and other enemies.

• cuckoo: lays its eggs in the nests of other birds for housing.

• firefly: has a unique way of mating based on attractiveness.

• flower: a natural process of pollination is used for reproduction.

• migrating birds: a lot of bird species in Central Europe have to leave this area in order to avoid the winters. They fly in groups to warm places, e.g., to Africa, and their navigation is based on a genetic code, i.e., all the individuals within a group make collective decisions to migrate.

As a matter of fact, animals, birds, and insects are not the only inspirations for researchers, as many other natural processes serve as inspirations too. For example, natural rivers and how they find the optimal path to their destination [187–189]. Moreover, the way water forms rivers by eroding the ground and depositing sediments is also used in algorithms [175, 176]. On the other hand, physics is also a source of inspiration for researchers [179, 180, 224–226].


In future years, many new SI-inspired algorithms may be expected. Currently, inspirations for SI algorithms are being taken from domains like the animal world, physics, chemistry, social sciences, music, etc. In the future, this might be widened to include sport, because many team sports exhibit behavior that is very close to swarm intelligence.

2.4 Biological systems and their characteristics

This section presents a comparison between various biological systems. Each biological system is based on some physical phenomenon that enables interaction between individuals in a swarm, and indirectly enables collective behavior. Seven biological systems are compared, i.e., ants, bats, bees, cuckoos, fireflies, flowers, and migrating birds. The swarms are compared according to two characteristics important for the development of SI algorithms, i.e., decentralized decision making and group movement. The results of this comparison are presented in Table 2.1.

Table 2.1: Biological systems with their characteristics

Biological system | Physical phenomenon | Decentralized decision making | Group movement
Ant               | pheromone           | where to forage               | where to live (leader-directed)
Bat               | micro-wave          | where to forage               | -
Bee               | waggle dance        | where to forage               | where to live (leader-directed)
Cuckoo            | parasitic behavior  | where to live                 | -
Firefly           | luciferin           | when to reproduce             | -
Flower            | blossom             | when to reproduce             | -
Migrating bird    | genetic code        | -                             | where to live (group-directed)

As can be seen from the above table, some social insects exhibit both decentralized decision making and group movement, but in different occurrences. The former determines where to forage, while the latter determines where the new swarms will live. The pheromone is used by ants in both cases, i.e., where to forage and where to build new nests, but group movement is initialized by a few individuals. On the other hand, the 'waggle dance' is applied by bees when a new promising area rich with nectar (food) is found by scouts, while group movement is led by certain individuals that take care of the queen when flying to new locations.


Chapter 3

Swarm intelligence algorithms

Swarm intelligence is a nature-inspired mechanism that is also very useful in computing [10, 16, 52, 84, 122]. Scientists first introduced this method 20 years ago, and nowadays swarm intelligence is applied for solving very sophisticated real-world problems, optimization problems, problems in the financial industry, etc. Swarm intelligence is recognized by three main features:

• self-organization: enables the coordinated movements of animal groups,

• collective behavior: individuals within the swarm interact with each other in order to achieve a common goal,

• decentralization: there is no hierarchical control in swarm intelligence; members act in a decentralized way on the basis of local information obtained from their peers or from the environment.

Swarm intelligence is also related to multi-agent systems [222]. Multi-agent systems consist of several intelligent agents that interact with each other. Although an agent as an individual is restricted, a multi-agent system is capable of performing large amounts of work (e.g., building the Egyptian pyramids). The advantage of these multi-agent systems is that they can solve hard problems in many areas which would be difficult or even impossible to solve by only one individual agent. Among other things, multi-agent systems are useful when applied to robotics, planning, civil engineering, etc.

This chapter presents a review of the more popular SI algorithms of today. In line with this, the following algorithms are described:

• ant colony optimization (ACO),


• artificial bee colony (ABC),

• cuckoo search (CS),

• firefly algorithm (FA),

• flower pollination algorithm (FPA),

• particle swarm algorithm (PSO).

Although the mentioned algorithms represent only 10 percent of all SI algorithms in the world, we believe that this collection gives a good insight into this rapidly developing and dynamic domain of computer science today.

3.1 Ant colony optimization

Ant colony optimization is probably the most famous swarm intelligence algorithm, proposed by Marco Dorigo in his 1992 thesis [42]. It is a probabilistic method for solving problems, based on finding optimal paths in graphs.

Figure 3.1: Ant

The inspiration for using ant colonies in the computer world arose from observing ants (Fig. 3.1) and their foraging. Ants have very limited cognitive abilities, but they can forage collectively and collectively find the shortest path to food [220]. The process of ant foraging is illustrated as follows:

• ants find a food source and, when returning to the nest, leave a pheromone trail,

• other ants try to follow this trail,

• ants end up choosing the shortest route, since pheromone evaporates more along longer routes.


The basic steps of the ant colony algorithm are illustrated as a pseudo-code in Algorithm 3.

Algorithm 3 Pseudo code of the ACO algorithm

Input: Ant colony xi = (xi1, . . . , xiD)T.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).
1: set parameters;
2: initialize pheromone trails;
3: eval = evaluate the new population;
4: fmin = find the best solution(xbest);
5: while termination condition not met do
6:   for i = 1 to Np do
7:     y = generate new solution(xi);
8:     fnew = evaluate the new solution(y);
9:     xi = mark if better location(y, fnew);
10:    update pheromones;
11:    eval = eval + 1;
12:  end for
13:  fmin = find the best solution(xbest);
14: end while

The ant colony algorithm (Algorithm 3) acts as follows. It starts by setting the algorithm parameters (line 1) and generating the initial ant population (line 2). This population is then evaluated (line 3) and the best solution found (line 4). The main loop (lines 5-14) consists of the following steps. A new route is selected for each ant in the colony (line 7) and evaluated (line 8). If the new route is better, it replaces the old one (line 9). Finally, the pheromone is updated according to the evaporation rate (line 10). After this, the best global solution is searched for (line 13).

Two main issues need to be solved in order for this algorithm to be fully operable: the probability of route selection and the evaporation rate of the pheromone. Both issues, however, are influenced by the problem to be solved. The ways in which these issues have been tackled have led to the emergence of several variants of ant colony algorithms.
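As a sketch of how these two issues are commonly resolved, the standard Ant System rules for route selection and pheromone update can be written as follows (an illustration of one classical variant, not necessarily the one meant above; tau is the pheromone matrix and eta a heuristic desirability, typically the inverse distance):

import random

def select_next_city(current, unvisited, tau, eta, alpha=1.0, beta=2.0):
    # Pick city j with probability proportional to tau^alpha * eta^beta.
    weights = [(tau[current][j] ** alpha) * (eta[current][j] ** beta)
               for j in unvisited]
    r, acc = random.uniform(0, sum(weights)), 0.0
    for j, w in zip(unvisited, weights):
        acc += w
        if acc >= r:
            return j
    return unvisited[-1]

def update_pheromones(tau, tours, rho=0.5, Q=1.0):
    # Evaporate all trails, then deposit Q/length on the edges of each tour.
    n = len(tau)
    for i in range(n):
        for j in range(n):
            tau[i][j] *= (1.0 - rho)          # evaporation
    for tour, length in tours:
        for i, j in zip(tour, tour[1:] + tour[:1]):
            tau[i][j] += Q / length           # deposit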

The ant colony optimization algorithm has been used in many real-world applications. Shelokar et al. [192] applied an ant colony to clustering. Ying and Liao [250] used ants for permutation flow-shop sequencing. Dowsland and Thompson [43] colored graphs using ant colony optimization. Ant colonies have been used many times for vehicle routing (e.g., Bell and McMullen [9]).


3.2 Artificial bee colony algorithm

The artificial bee colony algorithm was introduced by Karaboga and Basturk in 2005 [115–117] after being inspired by the natural behavior of bees (Fig. 3.2). In the ABC algorithm, the artificial bee colony consists of three bee groups [113, 114, 235]: employed bees, onlookers, and scouts. The employed bees discover the food sources; in other words, only one employed bee exists for each food source. The employed bees share information about food sources with the onlooker bees within their hive. Then, the onlooker bees choose a food source to forage. The richer the food source is in nectar, the higher the probability of it being visited by an onlooker bee. Interestingly, those employed bees whose food source has been exhausted, either by the employed or the onlooker bees, become scouts.

Figure 3.2: Bee

The ABC algorithm is formally described in Algorithm 4, from which it can be seen that each cycle of the ABC search process (the statements within the while loop) consists of three functions [61]:

• send employed bees (lines 5-14): sending the employed bees towards the food sources and evaluating their nectar amounts,

• send onlooker bees (lines 17-30): sharing the information about food sources with the employed bees, selecting a proper food source and evaluating its nectar amount,

• send scouts (lines 31-38): determining the scout bees and then sending them towards possibly new food sources.

However, before this search process can take place, initialization has to be performed (function init bees, line 1). The termination condition (function termination condition not met, line 4) is responsible for terminating the search cycle. Typically, the maximum number of function evaluations MAX FE is used as the termination condition.


Algorithm 4 Pseudo code of the ABC algorithm

Input: Population of food sources xi = (xi1, . . . , xiD)T for i = 1 . . . Np, MAX FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).
1: init bees;
2: eval = evaluate the new population;
3: fmin = find the best solution(xbest);
4: while termination condition not met do
5:   for i = 1 to Np do
6:     y = generate new solution(xi);
7:     fnew = evaluate the new solution(y);
8:     eval = eval + 1;
9:     if fnew ≤ fi then
10:      xi = y; fi = fnew; // save the local best solution
11:    else
12:      triali = triali + 1;
13:    end if
14:  end for { send employed bees }
15:  pi = calc probabilities(f);
16:  i = 0; j = 0;
17:  while i < Np do
18:    if rand(0, 1) < pi then
19:      j = j + 1;
20:      y = generate new solution(xi);
21:      fnew = evaluate the new solution(y);
22:      eval = eval + 1;
23:      if fnew ≤ fi then
24:        xi = y; fi = fnew; // save the local best solution
25:      else
26:        triali = triali + 1;
27:      end if
28:    end if
29:    i = i + 1;
30:  end while { send onlooker bees }
31:  for i = 1 to Np do
32:    if (triali > limits) then
33:      xi = init bee(i);
34:      fi = evaluate the new solution(xi);
35:      eval = eval + 1;
36:      triali = 0;
37:    end if
38:  end for { send scouts }
39:  fmin = find the best solution(xbest);
40: end while
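For illustration, a common choice in the ABC literature for the calc probabilities step (an assumption here, since the exact formula is not spelled out above) maps objective values to onlooker selection probabilities like this:

def calc_probabilities(objectives):
    # Convert objective values (minimized) into 'nectar amounts':
    # the smaller f, the larger the amount, hence the higher the probability.
    amounts = [1.0 / (1.0 + f) if f >= 0 else 1.0 + abs(f) for f in objectives]
    total = sum(amounts)
    return [a / total for a in amounts]

print(calc_probabilities([0.5, 2.0, 10.0]))  # the best source gets the highest probability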

Many artificial bee colony algorithms have been created so far and applied to several problems. Some important ABC algorithms and their modifications can be found in the papers [2, 112, 163, 185, 198, 252, 255, 257].


3.3 Cuckoo search

Cuckoo search is a swarm intelligence algorithm developed by Xin-She Yang and Suash Deb in 2009 [216, 237, 239]. The inspiration for this algorithm also comes from nature: it was inspired by the obligate brood parasitism of some cuckoo species, which lay their eggs in the nests of other host birds (Fig 3.3).

Figure 3.3: Cuckoo

The cuckoo is a bird of medium size which can be found in most areas of the world. Cuckoos can also be found in all primary school books because of their special way of reproduction, i.e., the majority of species are brood parasites. This phrase means that these species lay their eggs in the nests of other birds.

Yang and Deb defined the cuckoo search algorithm as follows. Each egg in a nest represents a solution, and a cuckoo egg represents a new solution. These solutions are usually represented as vectors of real numbers. The aim of cuckoo search is to discover new, and probably better, solutions for replacing the worst solutions in the nests [166].

Three idealized rules have been derived from this idea:

1. Each cuckoo lays one egg at a time, and dumps its egg in a randomly chosen nest.

2. The best nests with high-quality eggs will carry over to the next generation.

3. The number of available host nests is fixed, and the egg laid by a cuckoo is discovered by the host bird with a certain probability. A discovered set of the worst nests is dumped from further calculations.

The pseudo-code of the cuckoo search algorithm can be seen in Algorithm 5.

Algorithm 5 Original Cuckoo search algorithm

Input: Population of nests xi = (xi1, . . . , xiD)T for i = 1 . . . Np, MAX FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).
1: generate initial host nest locations();
2: eval = 0;
3: while termination condition not met do
4:   for i = 1 to Np do
5:     xi = generate new solution(xi);
6:     fi = evaluate the new solution(xi);
7:     eval = eval + 1;
8:     j = ⌊rand(0, 1) ∗ Np + 1⌋;
9:     if fi < fj then
10:      xj = xi; fj = fi; // replace the j-th solution
11:    end if
12:    if rand(0, 1) < pa then
13:      init nest(xworst);
14:    end if
15:    if fi < fmin then
16:      xbest = xi; fmin = fi; // save the best solution
17:    end if
18:  end for
19: end while

As can be seen from the pseudo-code in Algorithm 5, the first step of the algorithm is to generate an initial population, where the host nests are positioned within the search space randomly (line 1). In the main loop that follows, the algorithm randomly obtains a new position of the i-th cuckoo using a Levy flight (line 5), and then evaluates its fitness (line 6). Then, a certain random solution j is selected, which is replaced when the i-th solution is better (lines 9-11). The worst nest can be abandoned and a new one built in its place (lines 12-14). Finally, tracking of the best solution is performed in lines 15-17.
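The Levy flight in line 5 is often implemented with Mantegna's algorithm; the following Python sketch shows one such step under that assumption (an illustration, not necessarily the exact variant used here):

import math, random

def levy_step(beta=1.5):
    # Draw one heavy-tailed step length (Mantegna's algorithm).
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def generate_new_solution(x, best, alpha=0.01):
    # Move a cuckoo from x by a Levy flight scaled by its distance to best.
    return [xi + alpha * levy_step() * (xi - bi) for xi, bi in zip(x, best)]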

A lot of research has been performed using this algorithm. Cuckoo search has been applied to numerical optimization [211], multi-objective optimization [240], and also to real-world problems [45, 81].

3.4 Firefly algorithm

The phenomenon of fireflies (Fig. 3.4) concerns the flashing lights which can be admired on clear summer nights. These lights are produced by a complicated set of chemical reactions. Firefly flashes serve to attract mating partners and act as a protection mechanism for warning off potential predators. Their light intensity I decreases as the distance r from the light source increases, according to the term I ∝ 1/r². In addition, air absorbs the light as the distance from the source increases [70].

When the flashing light is proportional to the objective function of the problem being optimized (i.e., I(s) ∝ f(s), where s represents the candidate solution), this behavior of the fireflies can represent the basis for an optimization algorithm. However, artificial fireflies obey the following rules:


Figure 3.4: Firefly

• all fireflies are unisex,

• their attractiveness is proportional to their brightness, and

• the brightness of a firefly is affected or determined by the landscape of the objective function.

These rules represent the basis on which the firefly algorithm acts [62, 66]. The firefly algorithm is population-based, where each solution denotes a point within the search space. A pseudo-code of the original firefly algorithm is presented in Algorithm 6.

Algorithm 6 Pseudo code of the basic Firefly algorithm

Input: Population of fireflies xi = (xi1, . . . , xiD)T for i = 1 . . . Np, MAX FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).
1: init fa;
2: eval = 0;
3: while termination condition not met do
4:   α = alpha New;
5:   for i = 1 to Np do
6:     fi = evaluate the new solution(xi);
7:     eval = eval + 1;
8:     if fi ≤ pBest then
9:       p = xi; pBest = fi; // save the local best solution
10:    end if
11:    xi = generate new solution(xi);
12:  end for
13:  if pBest ≤ fmin then
14:    xbest = p; fmin = pBest; // save the global best solution
15:  end if
16: end while

As can be seen from Algorithm 6, the search process of the firefly algorithm consists of the following components:


• init fa (line 1): initializing the algorithm's parameters and generating the initial firefly population,

• evaluate the new solution (line 6): evaluating the new solution,

• find the local best solution (lines 8-10): saving the population's best solution,

• generate new solution (line 11): moving the fireflies through the search space according to the attractiveness of their neighbors' solutions,

• find the global best solution (lines 13-15): determining the global best solution.
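For illustration, the movement in generate new solution usually follows the standard firefly update, sketched below in Python with illustrative parameter values (beta0, gamma, alpha are assumptions):

import math, random

def move_firefly(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
    # Move firefly i towards the brighter firefly j; the attractiveness
    # beta decays with the squared distance between them.
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)
    return [a + beta * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(xi, xj)]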

The firefly algorithm has been applied to many problems in optimization. Readers can check the more important works on the firefly algorithm in [62, 66, 80, 143, 186, 230, 247].

3.5 Flower pollination algorithm

Pollination is a natural process in flowers, where pollen from the anthers of a flower is transferred to the stigma of the same flower or another flower [200]. The process of pollination is a precondition for the second process, called fertilization, which allows flowers to develop seeds [56, 88, 128]. Two main groups of pollination exist:

• self-pollination and

• cross-pollination.

Self-pollination arises when the pollen and pistil are from the same plant, while cross-pollination appears when the pollen and pistil are from different plants. A pollinator is necessary for cross-pollination. Wind or animals can play the role of pollinators. The wind transfers pollen from one plant to another (e.g., grasses), while animals move pollen from the anthers to the stigmas of flowers. The better-known animal pollinators are bees (Fig. 3.5), birds, and many insect species.

The flower pollination algorithm mimics pollination in nature. The author has idealized the following four rules of the FPA algorithm:

• Biotic and cross-pollination are considered as global pollination processes, with pollen-carrying pollinators performing Levy flights.

• Abiotic and self-pollination are considered as local pollination.

• Flower constancy can be considered as follows: the reproduction probability is proportional to the similarity between the two flowers involved.

• Local pollination and global pollination are controlled by a switch probability p ∈ [0, 1]. Due to physical proximity and other factors such as wind, local pollination can have a significant fraction p within the overall pollination activities.

Figure 3.5: Bee pollinating flowers

The pollination of plants and flowers has also been transferred into the computer world. The flower pollination algorithm (FPA) is the newest algorithm in this thesis, and some preliminary results are presented in [236].

A pseudo-code of the FPA algorithm is presented in Algorithm 7.

Flower pollination (Algorithm 7) starts with random generation of the initial population (line 1). Then, the initial population is evaluated (line 2) and the best solution is determined (line 3). The main loop (lines 5-19) consists of two parts. In the first part (lines 6-17), a new solution is generated for each pollen i in the population, either globally by a Levy flight in the vicinity of the best solution (line 8), or locally by subtracting two randomly selected vectors in the population and adding this difference, scaled by some randomly selected value ε ∈ [0, 1], to the original i-th pollen (line 10). The switch between the global and local modifications (line 7) is regulated by the so-named switch probability p that is permanently set in line 4. The new solution is then evaluated (line 12). If the fitness value of the new solution is better than the older one, the older i-th pollen is replaced by the new solution (lines 14-16). In the second part, the best global solution is declared (line 18).


Algorithm 7 Pseudo code of the FPA algorithm

Input: Population of pollen xi = (xi1, . . . , xiD)T for i = 1 . . . Np, MAX FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).
1: init pollen;
2: eval = evaluate the new population;
3: fmin = find the best solution(xbest);
4: p = rand(0, 1);
5: while termination condition not met do
6:   for i = 1 to Np do
7:     if rand(0, 1) < p then
8:       y = generate new solution global(xi);
9:     else
10:      y = generate new solution local(xi);
11:    end if
12:    fnew = evaluate the new solution(y);
13:    eval = eval + 1;
14:    if fnew ≤ fi then
15:      xi = y; fi = fnew; // save the local best solution
16:    end if
17:  end for
18:  fmin = find the best solution(xbest);
19: end while
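The two modification rules in lines 8 and 10 can be sketched as follows (the standard FPA form from Yang's paper; the levy_step helper is assumed to be a heavy-tailed random draw like the one sketched in Section 3.3):

import random

def pollinate_global(x, best, levy_step, alpha=0.1):
    # Global (biotic) pollination: a Levy flight towards the best flower.
    return [xi + alpha * levy_step() * (bi - xi) for xi, bi in zip(x, best)]

def pollinate_local(x, xj, xk):
    # Local (abiotic) pollination: the difference of two random flowers,
    # scaled by epsilon in [0, 1], added to the current flower.
    eps = random.random()
    return [xi + eps * (a - b) for xi, a, b in zip(x, xj, xk)]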

3.6 Particle swarm optimization

Particle swarm optimization (PSO) was one of the first swarm intelligence algorithms;

it was presented at the International Conference on Neural Networks by Kennedy and

Eberhart in 1995 [121]. PSO is inspired by the social foraging behavior of some animals,

such as the flocking behavior of birds (Fig. 3.6) and the schooling behavior of fish. This

algorithm optimizes a problem by iteratively improving candidate solutions [194–196].

Figure 3.6: PSO

In PSO, there is a population of candidate solutions (particles) that move around in the

search space according to mathematical formulas. Each particle has its own best

known position and is guided towards the best known positions in the search

space, which are updated whenever the particles find better positions [22, 31, 169].


Algorithm 8 Pseudo code of the PSO algorithm

Input: PSO population of particles xi = (xi1, . . . , xiD)^T for i = 1 . . . Np, MAX_FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).

1: init particles;
2: eval = 0;
3: while termination condition not met do
4:   for i = 1 to Np do
5:     fi = evaluate the new solution(xi);
6:     eval = eval + 1;
7:     if fi ≤ pBesti then
8:       pi = xi; pBesti = fi; // save the local best solution
9:     end if
10:    if fi ≤ fmin then
11:      xbest = xi; fmin = fi; // save the global best solution
12:    end if
13:    xi = generate new solution(xi);
14:  end for
15: end while

A pseudo-code of the particle swarm optimization is presented in Algorithm 8. This

algorithm starts with random initialization of the particle velocities and their positions

(line 1). Two loops emerge after initialization. The outer loop (lines 3-15) repeats until

the termination condition is met. The following actions are performed for each

particle in the inner loop (lines 4-14). Particle i is evaluated first (line 5). If the

corresponding fitness value fi is better than its local best value pBesti, the i-th particle

is saved as the local best pi (lines 7-9). Additionally, if the fitness value fi is

also better than the global best fmin , the i-th particle also becomes the new global best

(lines 10-12). Finally, a new solution is generated according to the new local and global

best solutions (line 13).
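A minimal Python sketch of Algorithm 8 follows, assuming the classical velocity update v_i = w v_i + c1 r1 (p_i − x_i) + c2 r2 (xbest − x_i) for the generate new solution step; the coefficient values and the sphere fitness are illustrative assumptions.

import numpy as np

def sphere(x):  # illustrative fitness function (an assumption)
    return float(np.sum(x ** 2))

def pso(Np=20, D=10, max_fe=10000, lb=-5.0, ub=5.0, w=0.7, c1=1.5, c2=1.5):
    x = np.random.uniform(lb, ub, (Np, D))     # line 1: init positions ...
    v = np.zeros((Np, D))                      # ... and velocities
    p = x.copy()                               # local best positions p_i
    p_best = np.full(Np, np.inf)               # local best fitness pBest_i
    xbest, fmin = None, np.inf
    evals = 0
    while evals < max_fe:                      # outer loop (lines 3-15)
        for i in range(Np):                    # inner loop (lines 4-14)
            fi = sphere(x[i]); evals += 1      # line 5: evaluate
            if fi <= p_best[i]:                # lines 7-9: save local best
                p[i], p_best[i] = x[i].copy(), fi
            if fi <= fmin:                     # lines 10-12: save global best
                xbest, fmin = x[i].copy(), fi
            # line 13: generate new solution from local and global bests
            r1, r2 = np.random.rand(D), np.random.rand(D)
            v[i] = w * v[i] + c1 * r1 * (p[i] - x[i]) + c2 * r2 * (xbest - x[i])
            x[i] = np.clip(x[i] + v[i], lb, ub)
    return xbest, fmin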

PSO is used in numerical optimization [214], large scale global optimization [256],

multi-modal optimization [138], multi-objective optimization [105] and also in real world

applications, e.g., electromagnetics [183], antenna design [110], clustering [212], etc. PSO

is still in extensive development.


Chapter 4

A comprehensive review of bat

algorithms

Who does not know Aesop's fable 'The Bat and the Weasels', which finishes with the

moral 'It is wise to turn circumstances to good account' and depicts persons without

distinct characteristics? Similar to such people, bats (Fig. 4.1) could be seen as flying

mice (in England they are also named blind mice), although they are neither birds nor

mice. These prominent animals represent a biological system that uses decentralized

decision making as well as coordinated movement in order to survive. They usually

live in colonies. The world’s largest urban bat colony resides underneath the Ann W.

Richards Congress Avenue Bridge in Austin, Texas. These bats are migratory, spending

their summers in Austin and their winters in Mexico. Interestingly, there are more bats

than people in Austin during the summer.

Figure 4.1: A bat.

Bats use echolocation as their primary mechanism for orientation towards a surface,

although not all the species are blind. Additionally, this mechanism serves as a tool for

finding their prey and discriminating between different types of insects. As a matter of


fact, bats are not the only creatures who use echolocation because there are also other

remarkable animals who use it. For example:

• Dolphins: They use sound production and reception for navigation, communica-

tion, hunting and defense against predators in darker waters [119] (Fig. 4.2).

• Toothed whales: They have developed a remarkable sensory ability used for locating

food and for navigation underwater. They produce sounds by moving air between

air-spaces or sinuses in their head.

• Shrews: A shrew or shrew mouse is a small mammal. Shrews use echolocation only

for investigating their habitats rather than for pinpointing food.

• Oilbirds: They also use echolocation for navigation in the same way as bats, but

with a high-pitched clicking sound of around 2 kHz that is audible to humans.

Moreover, some non-sighted people also learn to use echolocation for navigation within

their surroundings. Before the bat algorithm is introduced, a short overview of the

biological basics and behavior of bats is presented. Then, the original bat algorithm

is described. Subsequently, a detailed analysis is presented of those papers that have

tackled the bat algorithm. This analysis is followed by the question: 'There are so many

algorithms, so which one is the most suitable for solving my problem?' Finally, some

directions for further development are outlined at the end of this section.

Figure 4.2: Dolphins use echolocation to locate food

4.1 Biological foundations of bats

Bats are well-known creatures, which can be found almost everywhere on planet Earth.

They are the only mammals that fly. Bats belong to the order Chiroptera, whose forelimbs


form webbed wings, making them the only mammals naturally capable of true

and sustained flight. Bats represent about 20 percent of classified mammals in the world.

There are about 1,240 bat species, which are divided into two suborders:

• the less specialized and largely fruit-eating mega-bats (flying foxes), and

• the more highly specialized and echolocating micro-bats.

To help bats find and hunt their prey in the dark, most species have developed a special

navigation system, i.e., echolocation [92]. Echolocation might be imagined as though we

were standing at the entrance to a cave and then shouting one word. We will then hear

our own voice echoing back an instant later. That is, we produce sound by rushing air

from our lungs past vibrating vocal cords. These vibrations cause fluctuations within

the rushing air, which create sound-waves. Sound-waves are often simplified to being

described in terms of sinusoidal plane waves which have the following generic properties:

frequency, wavelength, wavenumber, amplitude, sound pressure, sound intensity, speed

of sound and direction. A sound-wave is just a moving pattern of fluctuations within

air pressure. The changing air pressure pushes the surrounding air particles out and

then pulls them back in. These particles then push and pull the particles next to them,

passing on the energy and pattern of the sound. In this way, sound can travel long

distances through the air.

The first person to introduce the word echolocation was the zoologist Donald Griffin who

discovered how bats navigated. Furthermore, his work also helped during the research

and development of sonar and radar, when a lot of research was done by observing

bats, their way of life, hunting, echolocation, and much more. Some of the more prominent

works about bats and their echolocation are presented in [6, 58, 89, 91, 103, 159, 208].

4.2 The original bat algorithm

The bat algorithm (BA) was developed by Xin-She Yang in 2010 [233] [235, 243]. The

inspiration for this work were micro-bats and their echolocation. The author wanted

to mimic their natural behavior and succeeded in creating a powerful algorithm which

could be applied to almost all areas of optimization. The pseudo-code of BA is illustrated

in Algorithm 9.

Bats’ behavior is captured within the fitness function of the problem to be solved. The

actions of the original BA algorithm presented in Algorithm 9 can be described as follows.

This algorithm consists of the following components, in general:


Algorithm 9 Original Bat algorithm

Input: Bat population xi = (xi1, . . . , xiD)^T for i = 1 . . . Np, MAX_FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).

1: init bat();
2: eval = evaluate the new population;
3: fmin = find the best solution(xbest); { initialization }
4: while termination condition not met do
5:   for i = 1 to Np do
6:     y = generate new solution(xi);
7:     if rand(0, 1) > ri then
8:       y = improve the best solution(xbest);
9:     end if { local search step }
10:    fnew = evaluate the new solution(y);
11:    eval = eval + 1;
12:    if fnew ≤ fi and N(0, 1) < Ai then
13:      xi = y; fi = fnew;
14:    end if { save the best solution conditionally }
15:    fmin = find the best solution(xbest);
16:  end for
17: end while

• initialization (lines 1-3): initializing the algorithm parameters, generating the ini-

tial population, evaluating this, and finally, determining the best solution xbest in

the population,

• generate the new solution (line 6): moving the virtual bats in the search space

according to physical rules of bat echolocation,

• local search step (lines 7-9): improving the best solution using the random walk direct

exploitation (RWDE) heuristic [178],

• evaluate the new solution (line 10): evaluating the new solution,

• save the best solution conditionally (lines 12-14): saving the new best solution un-

der some probability Ai similar to simulated annealing [126],

• find the best solution (line 15): finding the current best solution.

Note that these components are denoted in the algorithm either as function names,

when the function call is performed in one line, or as a comment between two curly

brackets in the last line, when a component comprises more lines.

Initialization of the bat population is performed randomly. Generating the new solutions


is performed according to the following equations:

Q_i^{(t)} = Q_min + (Q_max − Q_min) N(0, 1),

v_i^{(t+1)} = v_i^{(t)} + (x_i^{(t)} − best) Q_i^{(t)},        (4.1)

x_i^{(t+1)} = x_i^{(t)} + v_i^{(t+1)},

where N(0, 1) is a random number drawn from a Gaussian distribution with zero mean

and a standard deviation of one. The RWDE heuristic [178] implemented in the function

improve the best solution modifies the current best solution according to the equation:

x^{(t)} = best + ε A_i^{(t)} N(0, 1),        (4.2)

where N(0, 1) denotes the random number drawn from a Gaussian distribution with

zero mean and a standard deviation of one, ε being the scaling factor, and A_i^{(t)} the

loudness. A local search is launched with the probability of pulse rate ri. As al-

ready stated, the probability of accepting the new best solution in the component

save the best solution conditionally depends on loudness Ai. Actually, the original BA

algorithm is controlled by two algorithm parameters: the pulse rate ri and the loudness

Ai. Typically, the rate of pulse emission ri increases and the loudness Ai decreases when

the population draws nearer to the local optimum. Both characteristics imitate natu-

ral bats, where the rate of pulse emission increases and the loudness decreases when a

bat finds a prey. Mathematically, these characteristics are captured using the following

equations:

A_i^{(t+1)} = α A_i^{(t)},        r_i^{(t)} = r_i^{(0)} [1 − exp(−γ t)],        (4.3)

where α and γ are constants. Actually, the α parameter plays a similar role as the

cooling factor in the simulated annealing algorithm that controls the convergence rate

of this algorithm.

In summary, the origin of BA can be found in a PSO algorithm [121] hybridized with

RWDE and simulated annealing heuristics. The former represents the local search that

directs the bat search process towards improving the best solution, while the latter

takes care of the population diversity. In other words, the local search can be connected

with exploitation, while simulated annealing using the exploration component of the bat

search process. The exploitation is controlled by the parameter r and exploration by the

parameter A. As a result, the BA algorithm tries to explicitly control the exploration

and exploitation components within its search process.
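To tie Algorithm 9 and Eqs. (4.1)-(4.3) together, a minimal Python sketch follows. The sphere fitness, the bounds, the scaling factor ε = 0.001, and the constants Qmin, Qmax, α, and γ are illustrative assumptions; updating A_i and r_i only when a new solution is accepted, and drawing the acceptance test from rand(0, 1), follow the common BA convention rather than a prescription of the text.

import numpy as np

def sphere(x):  # illustrative fitness function (an assumption)
    return float(np.sum(x ** 2))

def bat_algorithm(Np=20, D=10, max_fe=10000, lb=-5.0, ub=5.0,
                  Qmin=0.0, Qmax=2.0, alpha=0.9, gamma=0.9):
    x = np.random.uniform(lb, ub, (Np, D))                 # init bat()
    v = np.zeros((Np, D))
    f = np.apply_along_axis(sphere, 1, x)
    best = x[np.argmin(f)].copy(); fmin = float(f.min())   # initialization
    A = np.ones(Np)                                        # loudness A_i
    r0 = np.random.rand(Np); r = r0.copy()                 # pulse rate r_i
    evals, t = Np, 0
    while evals < max_fe:
        t += 1
        for i in range(Np):
            Q = Qmin + (Qmax - Qmin) * np.random.normal()  # Eq. (4.1)
            v[i] = v[i] + (x[i] - best) * Q
            y = np.clip(x[i] + v[i], lb, ub)
            if np.random.rand() > r[i]:                    # local search step
                y = np.clip(best + 0.001 * A[i] * np.random.normal(size=D),
                            lb, ub)                        # Eq. (4.2), eps = 0.001
            fnew = sphere(y); evals += 1
            if fnew <= f[i] and np.random.rand() < A[i]:   # conditional save
                x[i], f[i] = y, fnew
                A[i] *= alpha                              # Eq. (4.3)
                r[i] = r0[i] * (1.0 - np.exp(-gamma * t))
        if f.min() < fmin:                                 # find the best solution
            fmin = float(f.min()); best = x[np.argmin(f)].copy()
    return best, fmin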


4.3 Applications of the bat algorithm

Nowadays, BA and its variants are applied for solving many optimization and classi-

fication problems, as well as several engineering problems in practice. The taxonomy

of the developed BA applications is illustrated in Fig. 4.3. As can be seen from this

figure, BA has been applied to the following classes of optimization problems: continu-

ous, constrained, and multi-objective. In addition, it is used for classification problems

in: clustering, neural networks, and feature selection. Finally, BAs are used in many

branches of engineering. In this review, we focused on the following engineering areas:

image processing, industrial design, scheduling, electronics, spam filtering, and even

sport.

Figure 4.3: Classification of Bat algorithms

4.3.1 Optimization

An optimization problem can be formally defined as a quadruple 〈I, S, fo, g〉, where

• I: is a set of instances x ∈ I,

• S: is the function that to each instance x ∈ I assigns a set of feasible solutions

S(x),

• fo: is the objective function that to each feasible solution s ∈ S(x) of instance

x ∈ I assigns the value fo(s) ∈ R,

• g: is the goal that determines whether the minimum (g = min) or the maximum

(g = max) value of the objective function is to be searched for.
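For example, minimizing the D-dimensional sphere function fits this definition: an instance x ∈ I is a dimension such as D = 10, its feasible solutions are S(x) = [−5, 5]^D, the objective function is fo(s) = Σ_{j=1}^{D} s_j^2, and the goal is g = min.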


A detailed review of research papers that deal with solving optimization problems

with the BA algorithm is summarized in Table 4.1.

Table 4.1: Review of papers solving the optimization problems with BA.

Topic                          References
Continuous optimization        [233] [68] [249] [217] [223] [140]
Constrained optimization       [82]
Multi-objective optimization   [234]

In the remainder of this subsection, papers from the optimization domains where the BA

algorithm was mainly applied, i.e., continuous, constrained, and multi-objective

optimization, are presented in detail.

4.3.1.1 Continuous optimization

The original BA [233] was applied to various benchmark functions. In this paper, the

author compared his novel algorithm against the genetic algorithm and particle swarm

optimization. The obtained results in this study were exciting, and the author identified

certain issues which might be addressed in the future:

• improving the convergence rate, and

• using BA on problems of larger dimensions.

The convergence rate was improved in study [68], where the authors hybridized the

original BA with differential evolution strategies (HBA). However, applying BA to

high dimensions was still a very challenging task. In study [67], the same authors dealt

with hybridizing the BA algorithm using differential evolution strategies and a random

forests machine learning method (HBARF).

Yilmaz and Kucuksille [249] proposed the improved bat algorithm (IBA) for continuous

optimization by introducing three modifications:

• Inertia weight factor modification: Exploitation capability of BA was improved by

inserting a linear decreasing inertia weight factor [158],

• Adaptive frequency modification: Each dimension of a solution was assigned a

frequency separately,

• Scout bee modification: Because of the lack of exploration in BA, it was hybridized

using an artificial bee colony algorithm for improving exploration.


Their IBA produced better performance than the original BA.

Wang and Guo [217] hybridized a BA with a harmony search (HS/BA) [85]. The im-

provement included the addition of a pitch adjustment operation in harmony search

serving as a mutation operator during the process of bat updating, with the aim of

speeding up convergence, thus making the approach more feasible for a wider range of

real-world applications. Verification of this approach was tested on 14 standard bench-

mark functions. Experiments showed that their approach was superior compared to

ACO, BA, BBO, DE, ES, GA, HS, PSO and SGA. Moreover, some HS/BA parame-

ters were analyzed during their study. In summary, the following conclusions might be

drawn from it:

• The HS/BA attempts to combine the merits of BA and HS in order to avoid all

bats getting trapped in inferior local optimal regions,

• the HS/BA enables the bats to have more diverse exemplars to learn from, as the

bats are updated at each iteration and also form new harmonies when searching

within a larger search space,

• this new method can speed up the global convergence rate without losing the

strong robustness of the basic BA,

• authors observe that the proposed HS/BA makes good use of information from past

solutions more effectively by generating better quality solutions more frequently,

when compared to the other population-based optimization algorithms such as

ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA.

• authors concluded that the HS/BA significantly improves the performances of HS

and BA for most multi-modal and unimodal problems.

Xie et al. [223] proposed a new BA with a differential operator and Levy flights to cope

with the slow convergence rate and low accuracy of BA. The authors used a differential

operator to accelerate the convergence speed of the algorithm. On the other hand,

the Levy flights trajectory can ensure the diversity of the population against premature

convergence and prevent the algorithm from trapping in local minima. Experiments were

conducted on 14 standard benchmark functions and also on some nonlinear equations.

The results showed that the proposed algorithm was feasible and effective, and also had

superior approximation capabilities within high-dimensional spaces.

Liu et al. [140] combined the classic BA with the Doppler effect (DEBA). This new

algorithm was very efficient. The authors mentioned that further improvements would

be concentrated on convergence of the algorithm, and proper parameter settings.


4.3.1.2 Constrained optimization

Constraint problems can be divided into two different types: constraint satisfaction

problems (CSP) and constraint optimization problems (COP). In contrast, if the problem

is unconstrained, it is referred to as a free optimization problem (FOP) [48]. CSP is

defined as a pair 〈S, φ〉, where S denotes a search space and φ is a Boolean function on

S that represents a feasibility condition. In fact, this function divides the search space

S into feasible and infeasible regions. A solution of the CSP problem is each s ∈ S with

φ(s) = true. On the other hand, COP is defined as a triple 〈S, f, φ〉 where S denotes

a search space, f is a real valued fitness function, and φ is a Boolean function on S. A

solution of this problem is an s ∈ S with φ(s) = true for which f(s) is optimal.

Gandomi et al. [82] applied BA on constrained optimization problems (COBA). The

authors tested BA on classical benchmark functions and also further on real-world con-

strained optimization tasks (e.g., pressure vessel design, Fig. 4.4).

Figure 4.4: Pressure vessel

After extensive tests and experiments the authors made some interesting statements:

• BA is very efficient,

• BA is, in many cases, superior to the existing algorithms from literature,

• BA uses good features from other meta-heuristic algorithms (e.g. PSO, HS),

• PSO and HS are special cases of BA,

• BA has a better dynamic control of exploration and exploitation.

4.3.1.3 Multi-objective optimization

Multi-objective optimization is a kind of optimization where a problem is optimized

according to more than one criterion. Many real-world optimization problems do not


estimate a solution according to only one criterion, but according to several criteria.

Single-objective optimization searches for a single solution, while multi-objective

optimization searches for a set of optimal solutions, also named the Pareto front [182, 213].

There are two goals during multi-objective optimization:

• to discover solutions as close to the Pareto front as possible, and

• to find solutions as diverse as possible within the obtained non-dominated front.

Many algorithms have been developed in the past for solving multi-objective prob-

lems. Some of the more prominent algorithms are: AbYSS [157], IBEA, DEMO [182],

GDE3 [134], FastPGA [54], MOCHC [156], MOEA/D-DE [136], NSGA-II [39], PAES [127],

SPEA2 [258], VEGA. There is also a special Java framework jMetal [46] [47] that is

dedicated to multi-objective optimization. Many multi-objective algorithms are imple-

mented in this framework, and furthermore, many multi-objective benchmark problems

are available within it.

In 2011 Xin-She Yang [234] created a BA for multi-objective optimization (MOBA).

For the sake of simplicity, it uses a weighted sum in order to combine all objectives into

a single objective:

f = Σ_{k=1}^{K} w_k f_k,   and   Σ_{k=1}^{K} w_k = 1,        (4.4)

where K designates the number of objectives to be optimized. The author tested this

algorithm on a family of benchmark functions and on design optimization. The results

were promising, but further tests need to be performed in the future.
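As a minimal illustration of Eq. (4.4), the following Python sketch scalarizes two assumed objectives into a single one; the objective functions and the weight vector are illustrative assumptions, not those used in [234].

import numpy as np

def f1(x):                              # first assumed objective
    return float(np.sum(x ** 2))

def f2(x):                              # second assumed objective
    return float(np.sum((x - 2.0) ** 2))

def scalarized(x, w=(0.6, 0.4)):        # weights satisfy sum(w) = 1
    # weighted sum of Eq. (4.4): f = w1*f1 + w2*f2
    return w[0] * f1(x) + w[1] * f2(x)

# Re-running a single-objective optimizer on `scalarized` for several
# weight vectors yields different trade-off points of the Pareto front.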

4.3.2 Classification

The classification algorithm is a procedure for selecting a hypothesis from a set of al-

ternatives that best fits a set of observations or data. Usually, this kind of algorithm

appears in machine learning, data mining, and neural networks. Although classifications

can be considered as optimization, Holland [122] wrote that learning (as a component

part of classification) is viewed as a process of adaptation to a particularly unknown

environment, not as an optimization problem. Contributed papers from this area are

reviewed in the remainder of this subsection (Table 4.2).


Table 4.2: Review of papers solving classification problems with BA.

Topic              References
Clustering         [123]
Neural networks    [150] [124]
Feature selection  [154]

4.3.2.1 Clustering

Clustering is an exploratory data mining technique for the unsupervised classification of

patterns into groups, named clusters. Clustering is used in many fields, e.g., machine

learning, image analysis, bioinformatics, information retrieval [94, 107, 108]. A lot of

efficient methods for clustering have been developed in the past. Some authors have also

tried to develop algorithms for clustering using BA.

Khan et al. [123] proposed a fuzzy bat clustering method for the ergonomic screening

of office workplaces. The authors used collated data from a checklist from workplaces

in order to determine any ergonomic related health risks. They defined three clusters

with low, moderate and high ergonomic risks. The results of the experiments showed

the following advantages of this method:

• the method is fast and effective for screening workplaces with major ergonomic

problems within a company, and

• provides better performance when compared with a fuzzy c-means clustering al-

gorithm.

4.3.2.2 Neural networks

Mishra et al. [150] proposed a model using BA for updating the weights of a functional

link artificial neural network classifier. The authors compared the proposed model with

other methods, like FLANN [25, 125], PSO-FLANN [40], and after simulations concluded

that this method was superior and faster than the other two methods.

Khan et al. [124] used the BA algorithm for training standard and eLearning datasets for

classification purposes. The authors used the dataset as proposed in [171]. Experiments

were done on two gradient descent algorithms:

• back-propagation,

• Levenberg-Marquardt,


and three population based heuristics, i.e., BA, GA, and PSO. The final results showed

that BA outperformed the other algorithms in training feed forward neural networks.

4.3.2.3 Feature selection

Feature selection is a technique that aims to find the most important information from

a set of features. Feature selection belongs to a group of optimization problems. For

these problems, the authors [154] developed a new version of BA called the Binary bat

algorithm (BBA). The experiments were conducted on five public datasets, where

BBA was compared with PSO, FFA, and GSA. BBA outperformed the other algorithms

on 3 of the 5 datasets.

4.3.3 Engineering applications

Bat algorithms have become a crucial technology for solving problems in engineering

practice. Nowadays, there are applications from almost every engineering domain. Some

of these (Table 4.3) are reviewed in the remainder of this work.

Table 4.3: Review of papers tackling engineering applications with BA.

Topic              References
Image processing   [253]
Industrial design  [242] [17] [177] [205]
Scheduling         [153] [218] [146]
Electronics        [181] [244]
Spam filtering     [155]
Sports             [3]

4.3.3.1 Image processing

Zhang and Wang [253] created a novel bat algorithm with mutation (BAM). This al-

gorithm was proposed for solving the image matching problem. A new modification was

applied to mutate between bats during the process of new solutions updating. The com-

parative study encompassed BA, DE and SGA. The tests and experiments showed that

this approach was more effective and feasible than other models. The authors also wrote

that this approach can accelerate the global convergence speed.


4.3.3.2 Industrial design

Yang and Gandomi [242] presented a BA for solving engineering problems. They vali-

dated the performance of the new BA on several benchmark engineering design problems.

An excellent comparative study carried-out on seven nonlinear design tasks was pre-

sented and showed that BA performed superiorly during such optimization. The bat

algorithm was more powerful than the genetic algorithm, particle swarm optimization,

and harmony search. The authors stated that GA and PSO could be special cases of BA

algorithm after some simplifications. At the end of their article, some remarks about

control parameters were highlighted. The authors stated that some sensitivity stud-

ies needed to be done in the future in order to obtain the proper parameters for BA.

Adjustment of these parameters might help to improve the convergence rate of BA.

Bora et al. [17] applied BA to the brush-less DC wheel motor problem. The authors

solved the brush-less DC wheel motor problems twice, i.e., firstly, according to the single

criterion and secondly, according to multi-objective criteria. In the first case, the prob-

lem consisted of five design parameters, and six constraints, while in the second case, of

two objectives, five design parameters, and five constraints. The authors' experiments

showed that BA is a robust tool for optimization in this engineering domain. Moreover,

BA is also competitive with other state-of-the-art methods for multi-objective optimiza-

tion. Interestingly, the authors stated that BA merges some features of particle swarm

optimization, simulated annealing and NSGA-II algorithm, and thus provides a high

quality Pareto Front.

Ramesh et al. [177] applied BA to a multi-objective optimization problem in power sys-

tems. The application of BA in that paper was based on mathematical modeling for

solving economic, emission, and combined economic and emissions’ dispatch problems

by a single equivalent objective function. BA was applied to two realistic systems under

different load conditions. In order to prove the performance of this algorithm, the authors

compared this solution with RGA [202], SGA [202], HGA [202], and ABC [116]. The

conclusions from the comparison between these algorithms showed that BA was better

than the other algorithms in the tests. Thus, the authors found that some of the special

characteristics of the BA algorithm are:

• quality of the solutions,

• stable convergence characteristics and

• good computational efficiency.


In their book chapter, Tamiru and Hashim [205] presented the use of fuzzy systems for

modeling energy destructions in the main components of an industrial gas turbine. They

presented a model identification study that used fuzzy systems and BA.

4.3.3.3 Scheduling

Musikapun et al. [153] presented a BA based scheduling tool for solving multi-stage

multi-machine, multi-product scheduling problems. The authors designed this algorithm

to minimize a combination of earliness and tardiness penalty costs and to correctly

sequence the operations required for manufacturing components while also satisfying

assembly precedence relationships. The results of extensive experiments showed that the quality of

the results depended on the tuning of appropriate parameters. These parameters might

be identified by statistical tools. The BA algorithm using optimized settings was much

better than those using non-optimized settings.

Wang et al. [218] applied BA to uninhabited combat air vehicle (UCAV) path planning

that is a complicated high-dimensional optimization problem which mainly focuses on

optimizing the flight route by considering different kinds of constraints in complicated

battlefield environments. The authors used the original BA and furthermore

improved the original BA with mutation. This mutation modified virtual bats during

a process of bat movement. Then, the UCAV could find a safe path by connecting the

chosen coordinate nodes while avoiding the threatened areas, thus consuming

minimum fuel. This new approach accelerated the global convergence speed while still

preserving the robustness of the original BA. The experiment showed that the proposed

approach was more effective and feasible during UCAV path planning than the other

models. The authors made some remarks about BA. They stated that the BA is a sim-

ple, fast, and robust global optimization algorithm but it may lack the diversity of a bat

population. In order to avoid this problem, they added mutation during the movement

of virtual bats. The authors summarized the following remarks from the experimental

results:

• The proposed BAM approach is effective and efficient. It can solve the UCAV

path planning problem effectively.

• The overall performance of BAM is superior to or highly competitive with the

original BA.

• BAM and other population-based optimization methods were compared for dif-

ferent maximum generations and dimensions. Under the majority of conditions,

BAM is significantly better than other population-based optimization methods.


• BAM and BA algorithms were compared with the different parameters set and

under almost all conditions, BAM was far better than BA.

• BAM was insensitive to the loudness, discovery rate, weighting, and scaling

factor parameters. This means that users do not need to fine-tune these parameters.

Marichelvam et al. [146] presented a BA for solving real-world multi-stage hybrid flow

shop scheduling problems. Multiple parallel machines were considered at each stage in

the HFS. The HFS scheduling problem is known to be a strongly NP-hard problem. This

was also a first step towards solving this problem using a BA. The authors developed

BA for the HFS scheduling problem in order to minimize makespan and mean flow

time. Their experiments showed that BA produced better results than many other meta-

heuristics.

4.3.3.4 Electronics

Reddy et al. [181] presented a novel method for determining optimum capacitor locations

using the fuzzy approach and the capacitor sizing problem for loss minimization, using

BA. The authors tested this method with several bus systems and the results were

encouraging. The real power loss had been reduced significantly by installing capacitors

at all potential locations. The authors showed that bat control parameters like A, r, α,

and γ played an important role in the performance of a BA. As a result, convergence

of the algorithm depended on proper parameter settings.

Yang et al. [244] presented a BA that was applied to design problems in microelectronics,

more specifically to topology optimization problems. Their paper demonstrated that

BA was efficient when finding the optimal shapes for heat transfer design problems in

microelectronics.

4.3.3.5 Spam filtering

E-mail spam is also called junk mail. This is an identical message sent to many

recipients by electronic mail. Many spam filters have been developed in the past in order

to cope with junk e-mail. A Bloom filter is a data structure used to test whether

an element is a member of a set. The Bin Bloom filter (BBF) groups the words into

a number of bins with different false positive rates based on the weights of the spam

words. Natarajan et al. [155] performed a comparative study between cuckoo search

and BA for minimizing the total membership invalidation cost of the BBFs, by finding

the optimal false positive-rates and number of elements stored in every bin.
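To illustrate the plain Bloom filter underlying the BBF, a minimal Python sketch is given below; the bit-array size, the number of hash functions, and the double-hashing scheme are illustrative choices, and the weighted bin grouping of the BBF itself is not reproduced.

import hashlib

class BloomFilter:
    # a minimal Bloom filter: membership tests may yield false positives
    # but never false negatives
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k          # m bits, k hash functions
        self.bits = bytearray(m)

    def _indexes(self, word):
        # double hashing: derive k indexes from one SHA-256 digest
        h = hashlib.sha256(word.encode()).digest()
        a = int.from_bytes(h[:8], "big")
        b = int.from_bytes(h[8:16], "big")
        return [(a + i * b) % self.m for i in range(self.k)]

    def add(self, word):
        for idx in self._indexes(word):
            self.bits[idx] = 1

    def might_contain(self, word):
        return all(self.bits[idx] for idx in self._indexes(word))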


4.3.3.6 Sports

Akhtar et al. [3] applied BA for estimating the full-body human pose in video sequences.

Full-body human pose estimation is a very challenging problem of computer vision. It

is also a very valuable asset in many industries, e.g., movie and cartoon productions,

robotics, sports. Tracking a body in 3D is a non-linear and multi-modal optimization

problem. The BA algorithm was used by the authors for solving this problem. Each

bat represented a potential solution in the search space which might be a skeletal con-

figuration of the human body. Their tests were compared with Particle Filter, Annealed

Particle Filter and PSO. Analysis showed that BA performed better than the other three

algorithms in tests. BA appeared more robust in such problems. The authors also no-

ticed that with any increase in the bat population, the tracking accuracy also increased,

but the computational burden also increased.

4.4 There are so many algorithms...

If we refer to Chapter 3, the following questions could be raised by the readers. Do we really

need so many algorithms? Which is better? What is the future of these algorithms?

Where can certain algorithms be used most effectively? These are only some of the questions

with which a developer is confronted when he/she wants to apply some SI algorithm to

a real-world problem.

In order to find the right answers to the above questions, criteria need to be defined that

could serve as a basis on which comparisons between different algorithms could be

performed as fairly as possible. In this study, much attention was especially devoted to

algorithm performance measures, benchmark test suites, and problem dimensions. The

performance measures in SI are of a statistical nature because of the stochastic nature of SI

algorithms as such. Typically, three basic performance measures are used as proposed

in [48]:

• Success rate,

• Effectiveness (solution quality),

• Efficiency (speed).

The stochastic nature of SI algorithms means that the solution of a problem cannot

be found in every algorithm run. Frequently, this is the case in combinatorial op-

timization, where the success rate (SR) is calculated as a ratio between the successful


runs and all the runs. As an effectiveness measure, the mean best fitness (MBF) can

be defined reflecting the average of the best fitness values over all runs. Especially in

continuous optimization, measures such as the best-ever fitness, worst-ever fitness, stan-

dard deviation and median of fitness values calculated over a number of runs, are also

of interest [48]. The efficiency of an SI algorithm is typically estimated by the number

of abstract search steps, although in continuous optimization CPU times are usually

employed.
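The following Python sketch illustrates these measures, assuming that `runs` holds the best fitness value found in each independent run and that a run counts as successful when its best fitness reaches a given target within a tolerance; the names and the tolerance are illustrative.

import statistics

def success_rate(runs, target, eps=1e-8):
    # SR: ratio between the successful runs and all the runs
    return sum(f <= target + eps for f in runs) / len(runs)

def mean_best_fitness(runs):
    # MBF: average of the best fitness values over all runs (effectiveness)
    return statistics.mean(runs)

def fitness_summary(runs):
    # measures of special interest in continuous optimization
    return {
        "best": min(runs),
        "worst": max(runs),
        "mean": statistics.mean(runs),
        "stdev": statistics.stdev(runs),
        "median": statistics.median(runs),
    }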

In experimental comparisons between SI algorithms, besides defining the

performance measures, selecting the benchmark problems plays an important role. Ac-

cording to Eiben and Smith [48], there are three different approaches:

• using problem instances from academic benchmark repositories,

• using problem instances created by problem instance generators,

• using real-world problem instances.

The benchmark suite should contain high-dimensional objective functions because these

are representative of real-world problems [48]. In contrast, the low-dimensional functions

(e.g., with n = 2) can be solved optimally using traditional algorithms. Therefore, these

are not suitable representatives of real-world problems where the SI algorithms would

be applied.

Nowadays, several test suites exist for comparing the performance of various algorithms.

These test suites have been carefully designed in order to capture a characteristic va-

riety of fitness landscapes. Moreover, to avoid biasing the results towards zero

for some algorithms, the global optima of these objective functions can be moved and

rotated. Therefore, searching for the global optimum becomes more difficult for most

of the algorithms. Certain function suites are provided, for example, by BBOX [93]

or CEC’05 [203] and are standardized. Unfortunately, many authors define their own

function benchmarks which consist of standard functions such as the Ackley, Griewank, and

Rastrigin functions, but without moving and rotating options. In general, such bench-

marks, also named proprietary suites, tend to be more favorable to their own algorithms.

In order to prove our hypothesis, an experiment was performed in which all papers about

bat algorithm tackling continuous optimization were taken into account. These papers

were compared according to performance measures, the dimensions of the problems to be

solved, the used benchmark suite, and algorithms, with which the proposed algorithms

in the paper were compared. The results of this analysis are presented in Table 4.4.


Table 4.4: Experimental setups in analyzed papers tackling continuous optimization.

Alg.    Ref.    Meas.                              Dim.        Evals.     Suite     Comparison
BA      [233]   mean, stdev                        2-256       5,000G     Prop-10   GA, PSO
HBA     [68]    best, worst, mean, stdev, median   10, 20, 30  1,000D     Prop-5    BA
HBARF   [67]    best, worst, mean, stdev, median   10          1,000D     Prop-5    BA, HBA
IBA     [249]   best, worst, mean, stdev, median   10, 30, 50  200D       Prop-10   BA
BA/HS   [217]   best, mean                         20          50G        Prop-14   ACO, BA, BBO, ES, GA, HS, PSO, SGA
Levy    [223]   best, worst, mean, stdev, median   2, 5, 20    24,000     Prop-14   BA
DEBA    [140]   mean, stdev                        2-256       ≤ 10^-14   Prop-5    BA, PSO
COBA    [82]    best, worst, mean, stdev, median   2-20        20,000     Prop-13   N/A
MOBA    [234]   Error distance                     2           5,000G     Prop-4    N/A

Table 4.4 is divided into seven columns representing the algorithm names, the reference to

the paper, where these algorithms were proposed, the performance measures used in

experiments, the dimensions of functions to be solved, the permitted number of fitness

function evaluations (also termination condition), the used benchmark function suites,

and algorithms (if they existed), with which the proposed algorithm was compared.

The record ’2-256’ in column ’Meas.’ means that one study using the functions with

dimensions in a range between 2 and 256 was performed in the paper [233], whilst

the record ’10, 20, 30’ means that three studies tackling functions with dimensions

D = 10, D = 20, and D = 30 were handled in the paper [68]. In the column ’Evals’, a

character ’G’ behind the number designates the number of generations, the character ’D’

the dimension of the problem, and no character denotes the number of fitness function

evaluations. In the first case, the algorithm will be terminated when the number of

generations is exhausted; in the second case, the number is multiplied by the problem

dimension and the product determines the number of fitness function evaluations needed

to terminate the algorithm, and finally, the number denotes how many fitness function

evaluations are permitted before the algorithm is terminated. Additionally, a term

≤ 10−14 determines the termination condition as a tolerance of the best fitness value,

i.e., the DEBA algorithm terminates when a precision of the best fitness value falls under

the tolerance. A number behind a term ’Prop-’ in column ’Suite’ designates the number

of functions in the suite.

The following conclusions can be derived from the table:

• The BA algorithm is still in its growing phase.

• The number of papers tackling this algorithm is not huge.


• Most of the papers are published in journals with lower impact factors.

• In these papers, the authors use different performance measures, experiments were

conducted on non-standardized benchmark functions of various sizes, on different

dimensions, and also with different numbers of fitness function evaluations. More-

over, most of the comparative studies were performed using the original BA algorithm

only.

As a result, the quality of the results of the BA algorithm is hard to estimate, because

the algorithm quality is shown by solving the hardest problems (i.e., large scale) and

comparing the obtained results with the state-of-the-art algorithms.

4.5 Future research in bat algorithm

This survey has tried to encompass all the available literature about BAs so far. As we

can see, many new BAs have been proposed and applied to many problem domains. In

numerical optimization, the BA algorithm has shown effectiveness especially on small

dimensions. The same is also valid for engineering optimizations as well as real-world

problems. In the future, several studies will be needed to establish a theoretical background of

BA. On the other hand, many new hybridization methods await the developers

of new BAs. Let us list only the more important tasks which need to be performed

in the near future:

• more theoretical background on the BA algorithm,

• hybridization with other meta-heuristics and also machine learning techniques [69]

(e.g., extremely randomized trees [86], decision trees [19], gradient boosting [74, 75]),

• applying BA to combinatorial optimization (e.g., graph coloring, b-coloring),

• improving the efficiency of BA on problems with higher dimensions,

• better control of exploration and exploitation,

• adaptation and self-adaptation of control parameters,

• working with subpopulations,

• applying BA in industrial, real-world environments,

• etc.


This study has attempted to find some answers to these questions, for example, hy-

bridization with DE strategies in place of RWDE local search. Indeed it is expected

that the global search capabilities of BA algorithms could be improved and therefore,

premature convergence should be avoided. Furthermore, machine-learning techniques

have become a very interesting domain of research, also in SI and EC. Some of these

methods have been applied to the original BA algorithm. Finally, randomization meth-

ods play a crucial role in each stochastic algorithm. In line with this, a comparative

study of some randomization methods (e.g., Uniform and Gaussian distribution, Levy

flight, chaos maps and random sampling in turbulent fractal clouds) was performed in

order to show how the particular randomization method influences the performance of

the original BA algorithms.


Chapter 5

Hybridization of bat algorithm

As can be seen in Chapter 4, the bat algorithm has been applied to many real-world

problems. However, its performance is still subject to the 'No Free Lunch' (NFL) the-

orem [221]. It states that any two algorithms are equivalent when their performance is

compared across all possible problems. Fortunately, the performance of an algorithm

can be improved when problem-specific knowledge is incorporated into it. Typically,

this knowledge is incorporated in the form of heuristic operators, initialization proce-

dures, fitness functions, etc [48]. Combining the problem-specific heuristics with the BA

algorithm leads to a hybrid algorithm.

Actually, the bat algorithm is inherently hybridized by definition. That is, this algo-

rithm provides the local search step explicitly. Population diversity is ensured by

randomly generating new solutions, similar to simulated annealing. This phase is also

a candidate for hybridization. Problem-specific knowledge can also be incorporated by

moving the virtual bats and by evaluating the fitness function. Furthermore, the initial

bat population can be generated by incorporating solutions from existing algorithms or

by using the heuristics, local search, etc.

The practical work of this thesis focused on the hybridizations and extensions of the

original bat algorithm. Two hybridizations have been developed, in line with this. Both

refer to the local search step of the bat algorithm. The former modifies the solution using

’DE/rand/1/bin’ strategy, while the latter uses random forest regression. Additionally,

the experiments with bat algorithms using various distributions for generating random

numbers were conducted, in order to show how these distributions affect the results of

the original bat algorithm.

The structure of this chapter is as follows. Firstly, hybridizations with differential evolu-

tion and random forests are described. Then, the various possibilities for generating the


random numbers are illustrated, like Levy flights, chaos, and random sampling in turbu-

lent fractal cloud. The chapter continues with an extensive description of experiments

and results, and finishes with an analysis of the obtained results.

5.1 Hybridization using differential evolution strategies

In order to improve the bat algorithm behavior for higher-dimensional problems, the

original bat algorithm has been hybridized with differential evolution strategies. Before this hy-

brid bat algorithm (HBA) is described in detail, the differential evolution strategies are

illustrated in the next subsection.

5.1.1 Differential evolution

Differential evolution (DE) [37] is an evolutionary algorithm appropriate for continuous

and combinatorial optimization that was introduced by Storn and Price in 1995 [199].

This is a population-based algorithm that consists of Np real-valued vectors representing

the candidate solutions, as follows:

x_i^{(t)} = (x_{i1}^{(t)}, . . . , x_{iD}^{(t)}), for i = 1 . . . Np,        (5.1)

where D denotes the dimension of the problem.

The DE supports a differential mutation, a differential crossover and a differential se-

lection. In particular, the differential mutation randomly selects two solutions and adds

a scaled difference between these to the third solution. This mutation can be expressed

as follows:

u_i^{(t)} = x_{r0}^{(t)} + F · (x_{r1}^{(t)} − x_{r2}^{(t)}), for i = 1 . . . Np,        (5.2)

where F ∈ [0.1, 1.0] (Price and Storn proposed F ∈ [0.0, 2.0], but normally this interval

is used in the DE community) denotes the scaling factor as a positive real number that

scales the rate of modification, while r0, r1, and r2 are randomly selected indices from the

interval 1 . . . Np.

Uniform crossover is employed as a differential crossover by the DE. The trial vector is

built from parameter values copied from two different solutions. Mathematically, this

crossover can be expressed as follows:


w_{i,j} =
    u_{i,j}^{(t)}   if rand_j(0, 1) ≤ CR ∨ j = j_rand,
    x_{i,j}^{(t)}   otherwise,        (5.3)

where CR ∈ [0.0, 1.0] controls the fraction of the parameters that are copied to the trial

solution. Note, the relation j = jrand ensures that the trial vector is different from the

original solution x_i^{(t)}.

Mathematically, a differential selection can be expressed as follows:

x_i^{(t+1)} =
    w_i^{(t)}   if f(w_i^{(t)}) ≤ f(x_i^{(t)}),
    x_i^{(t)}   otherwise.        (5.4)

In a technical sense, crossover and mutation can be performed in several ways in dif-

ferential evolution. Therefore, a specific notation was used to describe the varieties of

these methods (also strategies) generally. For example, ’rand/1/bin’ denotes that the

base vector is randomly selected, 1 vector difference is added to it, and the number of

modified parameters in the mutant vector follows a binomial distribution [199].
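A minimal Python sketch of one 'DE/rand/1/bin' generation, following Eqs. (5.2)-(5.4), is given below; the sphere fitness and the F and CR values are illustrative assumptions.

import numpy as np

def sphere(x):  # illustrative fitness function (an assumption)
    return float(np.sum(x ** 2))

def de_rand_1_bin_generation(x, f, F=0.5, CR=0.9):
    # x: population of shape (Np, D); f: fitness values of shape (Np,)
    Np, D = x.shape
    for i in range(Np):
        # differential mutation, Eq. (5.2): three mutually different indices
        r0, r1, r2 = np.random.choice([k for k in range(Np) if k != i],
                                      3, replace=False)
        u = x[r0] + F * (x[r1] - x[r2])
        # binomial crossover, Eq. (5.3): j = jrand guarantees one change
        jrand = np.random.randint(D)
        mask = np.random.rand(D) <= CR
        mask[jrand] = True
        w = np.where(mask, u, x[i])
        # differential selection, Eq. (5.4)
        fw = sphere(w)
        if fw <= f[i]:
            x[i], f[i] = w, fw
    return x, f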

5.1.2 Hybrid Bat Algorithm

The Hybrid Bat Algorithm [68] (HBA) was obtained by hybridizing the original BA with

DE strategies, where, in place of the improve the best solution function that implements

the RWDE heuristic, an implementation of the 'DE/rand/1/bin' DE strategy is employed.

The HBA pseudo-code is illustrated in Algorithm 10.

Algorithm 10 Modification in Hybrid Bat Algorithm (HBA)

1: if rand(0, 1) > ri then
2:   r_{1,2,3} = ⌊rand(0, 1) · Np + 1⌋;
3:   n = rand(1, D);
4:   for i = 1 to D do
5:     if ((rand(0, 1) < CR) || (n == D)) then
6:       yn = x_{r1,n} + F · (x_{r2,n} − x_{r3,n});
7:       n = (n + 1) % (D + 1);
8:     end if
9:   end for { DE/rand/1/bin }
10: end if

The HBA in Algorithm 10 operates as follows. Three random solutions in a virtual bat

population are selected (line 2). Then, a trial solution y is generated using ’rand/1/bin’

DE strategy (lines 4-9) which enters into normal BA selection processing. Note that the

destiny of this trial solution is very uncertain. Although the best solution is obtained


during this variation process, this new solution can survive only conditionally, i.e.,

depending on the probability A. This process logic can prevent the BA search process

from falling into a local optimum.

5.2 Hybridization with random forests

Making decisions based on the input of multiple experts has been a common practice

in human civilization [170]. The computational intelligence (CI) and machine learning

(ML) community have studied methods that share such a joint-decision procedure. In

line with this, ensemble learning has emerged that introduces robustness and accuracy

in a decision process. Ensemble learning can be applied to many real-world applications.

In our study, ensemble learning was used for predicting a more appropriate real-valued

regression vector obtained from an ensemble of multiple DE strategies. In place of

the ordinary offspring, the predicted regression vector was entered into the original BA

selection process. Here, random forests (Figure 5.1) are used as the ensemble learning

method.

Figure 5.1: Random forests

5.2.1 Random Forest

The random forest (RF) is a machine learning method that is regarded as ensemble-

learning that generates several classifiers, and then aggregates their results [170]. The

ensemble-learning consists of many decision trees. RF was proposed by Leo Breiman in

2001 [20] and adds an additional layer of randomness to its predecessor, bagging [21]. In

bagging, the successive layers are independently constructed from a bootstrap sample of


the data and a simple majority vote is taken for prediction [139]. RF changes how the

classification or regression trees are constructed. In standard classification trees, each

node is split using the best split amongst all variables, while in RF each node is split

using the best amongst a subset of predictors randomly chosen at that node [139]. This

strategy turns out to perform well compared to many other classifiers, including support

vector machines (SVM) and neural networks (NN), and is robust against over-fitting [20].

Additionally, it is also very user-friendly because it demands only one parameter (the

number of trees in the forest, also called the number of estimators) to run.

5.2.2 Hybrid Bat Algorithm with Random Forest

Similarly to HBA, the hybrid BA with random forest (HBARF) also modifies the

local search step (lines 7-9) of the original BA algorithm, where, in place of the im-

prove the best solution function, an implementation of the random forest (RF) regression

method is applied to an ensemble of DE strategies. A pseudo-code of the HBARF algorithm is illus-

trated in Algorithm 11.

Algorithm 11 Local search step of the HBARF algorithm

Input: Bat population xi = (xi1, . . . , xiD)^T for i = 1 . . . Np, MAX_FE.
Output: The best solution xbest and its corresponding value fmin = min(f(x)).

1: if (rand(0, 1) > ri) then
2:   for j = 1 to 10 do
3:     vj = ES(j, i, best);
4:   end for
5:   t = Rand_1_Bin(i, best);
6:   clf = RandomForestRegressor(n_estimators = 10);
7:   clf = clf.fit(t, v);
8:   y = clf.predict(t);
9: end if

As can be seen from Algorithm 11, HBARF defines an ensemble of ten DE-strategies

(Table 5.1) that, using the function ES, creates ten different solutions vj from

the candidate solution xi, each obtained with a different strategy [145] (lines 2-4). For

the random forest algorithm, these ten solutions represent the training set vj . Then,

the candidate solution xi is modified by ’rand/1/bin’ strategy (line 5) that acts as a

validation set t. Based on these two sets, an RF regression with ten estimators is

launched (line 6), which on the basis of the training and validation sets (line 7) predicts the

new candidate solution y (line 8).
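The following Python sketch illustrates the idea of this local search step with scikit-learn's RandomForestRegressor. Since the exact tensor layout of the fit/predict calls is not given here, the sketch adopts one shape-consistent reading: each problem dimension is treated as one training sample, whose feature is the 'DE/rand/1/bin' value and whose ten targets are the ensemble values; the prediction is then averaged into the new candidate. The helpers ensemble_strategy and rand_1_bin stand in for the ES and Rand_1_Bin functions and are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

def hbarf_local_search(i, best, ensemble_strategy, rand_1_bin,
                       n_strategies=10):
    # lines 2-4: ten trial vectors, one per DE strategy of Table 5.1
    v = np.array([ensemble_strategy(j, i, best) for j in range(n_strategies)])
    # line 5: 'DE/rand/1/bin' trial vector acting as the validation set
    t = np.asarray(rand_1_bin(i, best))
    # line 6: RF regression with ten estimators
    clf = RandomForestRegressor(n_estimators=10)
    # line 7 (one shape-consistent reading): each dimension is one sample,
    # with the ten strategy values of that dimension as targets
    clf.fit(t.reshape(-1, 1), v.T)
    # line 8: predicted targets are averaged into the new candidate y;
    # note that no fitness evaluation is performed in this step
    y = clf.predict(t.reshape(-1, 1)).mean(axis=1)
    return y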

A pseudo-code of the ES function is presented in Algorithm 12. In fact, this function

operates as a dispatcher that, on the basis of the input parameter strat, launches the

function that implements the appropriate DE strategy. As a result, this function


Table 5.1: Ensemble of DE-strategies

Strategy            Expression
Best/1/Exp          x_{i,j}^{(t+1)} = best_j^{(t)} + F · (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})
Rand/1/Exp          x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F · (x_{r2,j}^{(t)} − x_{r3,j}^{(t)})
RandToBest/1/Exp    x_{i,j}^{(t+1)} = x_{i,j}^{(t)} + F · (best_j^{(t)} − x_{i,j}^{(t)}) + F · (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})
Best/2/Exp          x_{i,j}^{(t+1)} = best_j^{(t)} + F · (x_{r1,j}^{(t)} + x_{r2,j}^{(t)} − x_{r3,j}^{(t)} − x_{r4,j}^{(t)})
Rand/2/Exp          x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F · (x_{r2,j}^{(t)} + x_{r3,j}^{(t)} − x_{r4,j}^{(t)} − x_{r5,j}^{(t)})
Best/1/Bin          x_{i,j}^{(t+1)} = best_j^{(t)} + F · (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})
Rand/1/Bin          x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F · (x_{r2,j}^{(t)} − x_{r3,j}^{(t)})
RandToBest/1/Bin    x_{i,j}^{(t+1)} = x_{i,j}^{(t)} + F · (best_j^{(t)} − x_{i,j}^{(t)}) + F · (x_{r1,j}^{(t)} − x_{r2,j}^{(t)})
Best/2/Bin          x_{i,j}^{(t+1)} = best_j^{(t)} + F · (x_{r1,j}^{(t)} + x_{r2,j}^{(t)} − x_{r3,j}^{(t)} − x_{r4,j}^{(t)})
Rand/2/Bin          x_{i,j}^{(t+1)} = x_{r1,j}^{(t)} + F · (x_{r2,j}^{(t)} + x_{r3,j}^{(t)} − x_{r4,j}^{(t)} − x_{r5,j}^{(t)})

Algorithm 12 The ES function for the ensemble DE strategies

Input: Index of the DE strategy strat, index of the current and best solution.
Output: The trial vector s.

1: if strat == 0 then
2:   s = Best_1_Exp(current, best);
3: else if strat == 1 then
4:   . . .
5: else
6:   s = Rand_2_Bin(current, best);
7: end if
8: return s;

returns a trial solution s. The implementations of DE strategies are presented in Al-

gorithm 13. Actually, only the implementation of ’rand/1/bin’ is presented in detail

because of the paper’s length limitation. The implementations of other DE strategies

can be found in [174].
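As an additional illustration, a minimal Python sketch of the 'rand/1/bin' strategy follows; it assumes a population stored as a NumPy array of shape (Np, D), and the function name is ours, not the thesis's:

import numpy as np

def rand_1_bin(pop, i, F=0.5, CR=0.9, rng=np.random.default_rng()):
    # rand/1/bin from Table 5.1: mutation x_r1 + F*(x_r2 - x_r3), binomial crossover
    Np, D = pop.shape
    candidates = [k for k in range(Np) if k != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    trial = pop[i].copy()
    jrand = rng.integers(D)          # ensures at least one component is mutated
    for j in range(D):
        if rng.random() < CR or j == jrand:
            trial[j] = pop[r1, j] + F * (pop[r2, j] - pop[r3, j])
    return trial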

In summary, let us mention that this local search step of the HBARF does not demand any additional fitness function evaluations, because the regression vector is predicted on a purely statistical basis that does not rely on any information about fitness. On the other hand, the original BA and HBA algorithms were implemented in the C++ programming language, while the HBARF algorithm combines the implementation of the BA in C++ with the RF implementation in Python using the scikit-learn library [167]. However, the RF regression method also has a crucial impact on the time complexity of HBARF, as can be seen in the remainder of this section.

5.3 Randomized Bat algorithm

The bat algorithm is a stochastic meta-heuristic that incorporates randomness into the search process. Stochastic optimization searches for optimal solutions by involving randomness


Algorithm 13 Ensemble Strategies functions

Global: Bat population xi = (xi1, . . . , xiD)^T for i = 1, . . . , Np.
Input: Index of the current solution current and of the best solution best.
Output: Trial vector s.

1: double* Rand_1_Exp(int current, int best) {
2:   . . . }
3: . . .
4: double* Best_1_Exp(int current, int best) {
5:   r_{1...3} = rand(0, Np);
6:   n = rand(1, D);
7:   for i = 1 to D do
8:     if ((rand(0, 1) < CR) || (n == D)) then
9:       s_n = x_{r1,n} + F * (x_{r2,n} - x_{r3,n});
10:      n = (n + 1) % (D + 1);
11:    end if
12:  end for
13:  return s; }
14: . . .
15: double* Rand_2_Bin(int current, int best) {
16:   . . . }

in some constructive way [102]. In contrast, if optimization methods provide the same results when repeating the same actions, these methods are said to be deterministic [57]. If a deterministic system behaves unpredictably, we arrive at the phenomenon of chaos [57]. As a result, randomness plays a crucial role in SI algorithms, because it affects the exploration and exploitation of the search process [34]. Essentially, randomness is useful for determining the next point within the search space, and thus affects the exploration of new solutions. When the new solution is better than the original, the search process progresses, while the search process may stagnate when the original solution is no longer being improved.

In line with this, several random number distributions can be helpful. For example, the uniform distribution generates each point of the search space with the same probability. In contrast, the Gaussian distribution is biased towards the observed solution; that is, smaller modifications occur more often than larger ones [48]. Which distribution is appropriate depends on the problem to be solved, more precisely on the fitness landscape that maps each position within the search space to a fitness value. When the fitness landscape is flat, the uniform distribution is preferable for the stochastic search process, while on rougher fitness landscapes the Gaussian distribution should be more appropriate.

This section deals with a BA algorithm that uses different random number distributions (also called randomization methods): Uniform, Gaussian, Levy flights, chaotic maps, and random sampling in turbulent fractal cloud. While most of the observed randomization


methods are well-known and widely used (e.g., the uniform distribution, the Gaussian distribution, Levy flights, and chaotic maps), the random sampling in turbulent fractal cloud is taken from astronomy and, to the best of our knowledge, this is the first time it has been used for optimization purposes. Here, two well-known chaotic maps are taken into account, i.e., the Kent and the Logistic map.

5.3.1 Continuous random number distributions

Continuous random number distributions [78] are defined by a probability density function p(x), such that the probability of x occurring within the infinitesimal range x to x + dx is p(x) \cdot dx. The cumulative distribution function for the lower tail P(x) is defined as the integral

P(x) = \int_{-\infty}^{x} p(x) \cdot dx, (5.5)

which provides the probability of a variate taking a value less than x. The cumulative distribution function for the upper tail Q(x) is defined as the integral

Q(x) = \int_{x}^{+\infty} p(x) \cdot dx, (5.6)

which provides the probability of a variate taking a value greater than x.

The upper and lower cumulative distribution functions are related by P(x) + Q(x) = 1 and satisfy the constraints 0 <= P(x) <= 1 and 0 <= Q(x) <= 1. In the remainder of the thesis, the following distributions:

• Uniform,

• Gaussian,

• Levy flights,

• Chaotic maps, and

• Random sampling in turbulent fractal cloud

are illustrated in more detail.

5.3.1.1 Uniform distribution

The uniform continuous distribution has the following density function:

p(x) = \begin{cases} \frac{1}{b-a}, & a \le x \le b, \\ 0, & \text{otherwise}. \end{cases} (5.7)


Note that each possible value of the uniformly distributed random variable lies within an arbitrary interval [a, b], on which the probability of each sub-interval is proportional to its length. If a <= u < v <= b, then the following relation holds:

P(u < x < v) = \frac{v - u}{b - a}. (5.8)

Normally, the uniform distribution is obtained by a call to the random number generator [78]. Note that the discrete variate functions always return a value of type unsigned, which on most platforms means a random value from the interval [0, 2^32 - 1]. In order to obtain a randomly generated value within the interval [0, 1], the following mapping is used:

r = ((double)rand() / ((double)(RAND_MAX) + (double)(1))), (5.9)

where r is the generated random number, the function rand() is a call to the random number generator, and RAND_MAX is the maximum random value (2^32 - 1).

5.3.1.2 Normal or Gaussian distribution

The normal or Gaussian distribution is defined by the following density function:

p(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-a}{\sigma}\right)^2}. (5.10)

The distribution depends on the parameters a \in R and \sigma > 0, and is denoted as N(a, \sigma). The standardized normal distribution is obtained when the distribution has a mean of zero and a standard deviation of one, i.e., N(0, 1). In this case, the density function simplifies to:

p(x) = \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}x^2}. (5.11)

The Gaussian distribution has the property that approximately 2/3 of the samples drawn lie within one standard deviation [48]. That is, most of the modifications made to the virtual particle will be small, while there is a non-zero probability of generating very large modifications, because the tail of the distribution never reaches zero [48].
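This property is easy to check empirically; the following one-off NumPy sketch (ours, not part of the thesis code) estimates the fraction of samples within one standard deviation, which comes out at about 0.68:

import numpy as np

rng = np.random.default_rng(1)
samples = rng.normal(0.0, 1.0, size=1_000_000)   # N(0, 1) samples
print(np.mean(np.abs(samples) < 1.0))            # approx. 0.6827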


5.3.1.3 Levy flights

In reality, resources are distributed non-uniformly in Nature. This means that the behavior of a typical forager, needing to find these resources as fast as possible, does not obey the Gaussian distribution. For simulating foragers' search strategies, Levy flights are closer to their behavior [109]. They belong to a special class of alpha-stable distributions defined by a Fourier transform [78]:

p(x) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} e^{-itx - |ct|^{\alpha}} \, dt. (5.12)

Here, alpha-stable means that each randomly generated variable exhibits the same probability density distribution. This density function has two parameters: the scale c \in R and the exponent \alpha \in [0, 2]. A main characteristic of this distribution is that it has an infinite variance.

Interestingly, for \alpha = 1 the density function reduces to the Cauchy distribution, while for \alpha = 2 it is a Gaussian distribution with \sigma = \sqrt{2} c. For \alpha < 1 the tails of the distribution become extremely wide [109]. The more appropriate setting of this parameter for optimization is therefore \alpha \in (1.0, 2.0), where the distribution is non-Gaussian with no variance defined (as in the case of Levy flights) or with no mean, variance or higher moments defined (as in the case of the Cauchy distribution). Essentially, the difference between non-Gaussian and Gaussian distributions is that the tail of the former is wider than that of the latter. This means that the probability of generating very large modifications is much higher with Levy flights than with the Gaussian distribution.
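The thesis draws Levy-flight random numbers from the GSL library; in Python, a commonly used approximation is Mantegna's algorithm, sketched below (the function name is ours):

import numpy as np
from math import gamma, sin, pi

def levy_steps(alpha=1.2, size=1, rng=np.random.default_rng()):
    # Mantegna's algorithm: step = u / |v|^(1/alpha), u ~ N(0, sigma_u^2), v ~ N(0, 1)
    sigma_u = (gamma(1 + alpha) * sin(pi * alpha / 2)
               / (gamma((1 + alpha) / 2) * alpha * 2 ** ((alpha - 1) / 2))) ** (1 / alpha)
    u = rng.normal(0.0, sigma_u, size)
    v = rng.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / alpha)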

5.3.1.4 Chaotic maps

Chaos is a phenomenon encountered in science and mathematics wherein a deterministic (rule-based) system behaves unpredictably [57]. Let us assume the Logistic equation defined as a map:

x_{n+1} = r x_n (1 - x_n), (5.13)

where x_n \in [0, 1] and r is a parameter. The sequence of numbers generated by iterating the Logistic map (also called an orbit) with r = 4 is chaotic. That is, it satisfies the following propositions [57]:

• the dynamic rule of generating the sequence of numbers is deterministic,

• the orbits are aperiodic (they never repeat),

Page 68: A comprehensive review of bat algorithms and their ...

Chapter 5. Bat Algorithm 55

• the orbits are bounded (they stay between the upper and lower limits, normally,

within the interval [0, 1]),

• the sequence has a sensitive dependence on the initial condition (also called SDIC).

Similar behavior can also be observed with the Kent map [8]. The Kent map is one of the more studied chaotic maps and has been used to generate pseudo-random numbers in many applications, such as secure encryption. It is defined as follows:

x(n+1) = \begin{cases} x(n)/m, & 0 < x(n) \le m, \\ (1 - x(n))/(1 - m), & m < x(n) < 1, \end{cases} (5.14)

where 0 < m < 1. Hence, if x(0) \in [0, 1], then x(n) \in [0, 1] for all n >= 1. In accordance with the propositions in [8], m = 0.7 was used in our experiments.
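Both chaotic maps are easy to implement from scratch, as the thesis does; a direct Python transcription of Eqs. (5.13) and (5.14) might look as follows:

def logistic_map(x, r=4.0):
    # Eq. (5.13): chaotic for r = 4 on the interval [0, 1]
    return r * x * (1.0 - x)

def kent_map(x, m=0.7):
    # Eq. (5.14): piecewise-linear chaotic map with 0 < m < 1
    return x / m if x <= m else (1.0 - x) / (1.0 - m)

# Example: a short orbit of the Kent map starting from x(0) = 0.3
x, orbit = 0.3, []
for _ in range(5):
    x = kent_map(x)
    orbit.append(x)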

5.3.1.5 Random sampling in turbulent fractal cloud

The formation of stars begins with a random sampling of mass within a fractal cloud [51]. This random sampling is performed by the initial mass function (IMF) and represents the basis for a new sampling method named 'random sampling in turbulent fractal cloud' (RSiTFC).

This method can be described as follows. Let us consider a fractal cloud that is divided into a hierarchical structure consisting of several levels, with a certain number of cloud pieces each containing a certain number of sub-pieces. A sampling method then randomly samples a cloud piece from any level. The sampled pieces at the top of this hierarchical structure are denser than the pieces at the bottom. When a cloud piece is chosen, the initial mass of that piece is identified and the piece, representing the formed star, is removed from the hierarchy. This process is repeated until the whole cloud has been chosen [51]. The method is formally illustrated in Algorithm 14.

The RSiTFC algorithm takes six parameters: the scaling factor L, the number of levels H, the number of sub-pieces per piece N, the fractal cloud pieces x, the level h, and the piece number i. The scaling factor determines S = L^{-h}, as calculated from the fractal dimension expressed as D = \log N / \log L. The number of levels H determines the depth of the hierarchical structure. The number of pieces increases with the level h according to the Poisson distribution P_N(h) = N^h e^{-N} / h!. The length of the array of fractal cloud pieces x is limited by N^H, and its elements represent the initial masses of the stars to be formed. The level h denotes the current hierarchical level, while i determines the star to be formed.


Algorithm 14 RSiTFC(L, H, N, x, h, i)

Input: L scaling factor, H number of levels, N number of sub-pieces.
Output: *x fractal cloud, h current level, *i piece number.

1: if (i == 0) then
2:   x = new double[N^H];
3:   for j = 0 to N^h - 1 do
4:     x[*i] = 2 * (U(0, 1) - 0.5)/L^h + 0.5;
5:     *i = *i + 1;
6:   end for
7: else
8:   for j = 0 to N^h - 1 do
9:     x[*i] = x[*i] + 2 * (U(0, 1) - 0.5)/L^h;
10:    *i = *i + 1;
11:   end for
12: end if
13: if (h < H) then
14:   return RSiTFC(L, H, N, x, h + 1, i);
15: end if
16: return x;

For example, let N = 2 be a constant number of sub-pieces for each piece. Then there is one cloud at the top level h = 0, two pieces inside this cloud at h = 1, etc. For H = 4, the total number of pieces is 1 + 2 + 4 + 8 + 16 = 31 [51].
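An iterative Python transcription of Algorithm 14 is sketched below. Note that it sizes the array to hold all 1 + N + ... + N^H pieces (31 in the example above) and zero-initializes it, two details the pseudo-code leaves implicit; beyond that, it follows the pseudo-code as printed:

import numpy as np

def rsitfc(L, H, N, rng=np.random.default_rng()):
    total = sum(N ** h for h in range(H + 1))   # 1 + N + ... + N^H pieces
    x = np.zeros(total)                         # fractal cloud pieces
    i = 0                                       # global piece counter
    for h in range(H + 1):                      # levels 0 .. H
        for _ in range(N ** h):                 # N^h pieces on level h
            if h == 0:
                x[i] = 2 * (rng.random() - 0.5) / L ** h + 0.5
            else:
                x[i] = x[i] + 2 * (rng.random() - 0.5) / L ** h
            i += 1
    return x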

5.3.1.6 Characteristics of the mentioned randomized methods

The characteristics of the mentioned randomized methods were analyzed in this experiment. Two million random numbers were generated with each method; the results are presented in Fig. 5.2. This figure is divided into six diagrams, each presenting a histogram for a specific randomized method. Note that the histogram is a natural graphical representation of the distribution of random values. Each histogram consists of intervals plotted on the x-axis, denoting the value of the statistical variable, and their frequencies plotted on the y-axis. The range of real values [-9.9, 9.9] is divided into 101 intervals, each of width 0.2. The interval zero comprises the range [-0.1, 0.1], while the intervals -50 and 50 capture the values < -9.9 and > 9.9, respectively.

The following characteristics can be summarized from the presented plots:

[Figure 5.2: Results of the various randomized methods. Panels: (a) Uniform, (b) Gaussian, (c) Levy flights (alpha = 1.2), (d) Kent chaotic map, (e) Logistic chaotic map, (f) RSiTFC (L = 8, H = 3, N = 3).]

1. At first glance, the Kent chaotic map is similar to the uniform distribution, but in the former the frequencies of the intervals differ slightly from each other. On the other hand, the Logistic chaotic map is an inversion of both of the previously mentioned, because the border intervals 0 and 5 (i.e., the values near 0 and 1) exhibit higher frequencies than the inner ones.

2. The Gaussian distribution is more compact than Levy flights, because the latter enables the generation of random numbers that fall outside the intervals -50 and 50.

3. The RSiTFC plot exhibits several peaks. This behavior might be useful for the optimization of multi-modal problems.


In summary, three of the randomized methods generate random numbers within the interval [0, 1], while the other three generate them within a much wider interval.

5.3.2 Variants of the Randomized Bat Algorithm

The randomized BA (RBA) is based on the original BA, upgraded with the mentioned randomized methods. In place of the original Eq. 4.2, the following modified equation is used by RBA:

x(t) = \text{best} + \epsilon A_i^{(t)} R_i, (5.15)

where R_i denotes one of the randomized methods presented in Table 5.2.

Table 5.2: Variants of the RBA algorithm

Randomization method    Random generator    Implementation       RBA variant
Uniform distributed     U_i(0, 1)           Standard C library   UBA
Gaussian distributed    N_i(-inf, +inf)     GSL library          NBA
Levy flights            L_i(-inf, +inf)     GSL library          LBA
Kent chaotic map        C^K_i(0, 1)         From scratch         CBA1
Logistic chaotic map    C^L_i(0, 1)         From scratch         CBA2
RSiTFC                  F_i(-inf, +inf)     From scratch         FBA

As can be seen from Table 5.2, the six randomized methods used in the RBA algorithm also differ according to the interval of generated random values. The first group returns random values over the interval [0, 1] (the uniform distribution and both chaotic maps), while the second group returns values within the interval (-inf, +inf) (Gaussian, Levy flights, and RSiTFC). When a random value is generated within the interval [0, 1], this value is extended to the interval [-1, 1] using the formula r_i = 2(r_i - 0.5). In both cases, the generated solution value x_i is verified to lie within the valid interval x_i \in [lb_i, ub_i].

Interestingly, some implementations of the random generators can be found in the Standard C library (the standard random generator for generating uniformly distributed random values), others in the GNU Scientific Library [78] (the random generators for generating the Gaussian and Levy-flights distributed random values), while the rest were developed from scratch (the chaotic maps and RSiTFC). According to the randomized method used, six different variants of the RBA algorithm were developed, i.e., UBA, NBA, LBA, CBA1, CBA2, and FBA.
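The common shape of all six variants can be summarized by a short Python sketch of the local-search step of Eq. (5.15); the function names and the unit_interval flag are ours, not the thesis's:

import numpy as np

def local_search(best, A_i, R, lb, ub, unit_interval, eps=1.0):
    # R() draws one value from the chosen randomization method
    r = R()
    if unit_interval:                 # methods on [0, 1] are extended to [-1, 1]
        r = 2.0 * (r - 0.5)
    x = best + eps * A_i * r          # Eq. (5.15)
    return np.clip(x, lb, ub)         # keep the solution within [lb, ub]

# Example: the UBA variant draws from the uniform distribution on [0, 1]
rng = np.random.default_rng(1)
x_new = local_search(best=0.4, A_i=0.5, R=rng.random, lb=-1.0, ub=1.0,
                     unit_interval=True)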


5.4 Experiments and results

The goal of our experimental work was threefold. Firstly, to show that the BA algorithm hybridized with the 'rand/1/bin' DE strategy and with random forest regression can improve the results of the original BA algorithm. In line with this, both HBA and HBARF were developed according to the specifications described at the beginning of this chapter, and their results were compared with the results obtained by other well-known algorithms, like FA, DE and ABC. Secondly, to show how using various probability distributions affects the results of the original algorithm. For this experiment, various probability distributions were taken from standard libraries or developed from scratch and incorporated into the original BA algorithm. Thus, six BA algorithms were developed, as follows:

• UBA: using uniform distribution,

• NBA: using Gaussian distribution,

• LBA: using Levy distribution,

• CBA1: using the Kent chaos map,

• CBA2: using the logistic chaos map,

• FBA: using random sampling in turbulent fractal clouds.

Thirdly, to show that some variants of such a BA algorithm can obtain results that are comparable with the results of other well-known algorithms, like FA, DE and ABC.

The BA parameters in the experiments were set as follows: loudness A_0 = 0.5, pulse rate r = 0.5, and frequencies Q_i^{(t)} \in [0.0, 2.0], while DE with the strategy 'rand/1/bin', as well as the ensemble of DE strategies, operated with the following parameters: F = 0.5 and CR = 0.9. The number of independent runs was 25. The mentioned parameter setting represents the best values found after extensive experimental work.

The rest of this chapter is structured as follows. The next section presents the suite of five well-known functions on which the first tests were conducted. Then the PC configuration on which the experiments were performed is described. Afterwards, the results of the first experiment are presented, followed by the results of the second experiment. The section concludes with a discussion of the obtained results.


5.4.1 Test suite

For the first experimental part, a proprietary benchmark suite consisting of only five functions was taken from the literature [231], because random forests have a very high time complexity. For the second experimental part, the Black-Box (BBOX) [93] standard function benchmark suite, consisting of 24 functions, was taken into account. The task of function optimization is to find the global optimum of each function in the suite. Note that all the functions in the proprietary function benchmark suite have global optima of zero, while this optimum is shifted and rotated in the BBOX function benchmark suite. Detailed characteristics of the functions in the proprietary test suite are presented in the remainder of this section. A detailed description of the characteristics of the BBOX benchmark functions is outside the scope of this thesis, and can be found in [60].

5.4.1.1 Griewangk’s function

This function is designed to defeat optimization approaches that optimize each variable independently. The function is multi-modal, since the number of local optima increases with the dimensionality. At sufficiently high dimensions (n > 30), the multi-modality seems to disappear and the problem becomes unimodal.

f_1(\vec{x}) = -\prod_{i=1}^{n} \cos\left(\frac{x_i}{\sqrt{i}}\right) + \sum_{i=1}^{n} \frac{x_i^2}{4000} + 1, (5.16)

where -600 <= x_i <= 600. The function has a global minimum of 0.

5.4.1.2 Rosenbrock’s function

The Rosenbrock function, similarly to Rastrigin's, has the value 0 at its global minimum. The global optimum is located inside a parabolic, narrow-shaped flat valley. The variables are strongly dependent on each other, which makes it difficult to converge to the global optimum.

f_2(\vec{x}) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right], (5.17)

where -15.00 <= x_i <= 15.00.

5.4.1.3 De Jong’s sphere function

f_3(\vec{s}) = \sum_{i=1}^{D} s_i^2, (5.18)

where s_i \in [-15, 15], and whose global minimum f* = 0 is at s* = (0, 0, . . . , 0). The function is unimodal and convex.

5.4.1.4 Rastrigin’s function

Based on the Sphere function, the Rastrigin function adds cosine modulation in order to create many local minima. Because of this feature, the function is multimodal. Its global minimum has the value 0.

f_4(\vec{x}) = 10 n + \sum_{i=1}^{n} \left( x_i^2 - 10 \cos(2\pi x_i) \right), (5.19)

where -15.00 <= x_i <= 15.00.

5.4.1.5 Ackley’s function

The complexity of this function is moderate: an exponential term covers its surface with numerous local minima. Only an algorithm relying on gradient steepest descent will be trapped in the local optima; a search strategy that analyzes a wider area will be able to cross the valleys among the optima and achieve better results.

f_5(\vec{x}) = \sum_{i=1}^{n-1} \left[ 20 + e - 20 e^{-0.2\sqrt{0.5(x_{i+1}^2 + x_i^2)}} - e^{0.5(\cos(2\pi x_{i+1}) + \cos(2\pi x_i))} \right], (5.20)

where -32.00 <= x_i <= 32.00. The global minimum of this function is 0.
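For reference, the five benchmark functions of Eqs. (5.16)-(5.20) can be written in NumPy as follows (a sketch directly under the equations above; x is a one-dimensional array):

import numpy as np

def griewangk(x):                                     # Eq. (5.16)
    i = np.arange(1, x.size + 1)
    return -np.prod(np.cos(x / np.sqrt(i))) + np.sum(x ** 2 / 4000.0) + 1.0

def rosenbrock(x):                                    # Eq. (5.17)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def sphere(x):                                        # Eq. (5.18)
    return np.sum(x ** 2)

def rastrigin(x):                                     # Eq. (5.19)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2 * np.pi * x))

def ackley(x):                                        # Eq. (5.20)
    a, b = x[:-1], x[1:]
    return np.sum(20.0 + np.e
                  - 20.0 * np.exp(-0.2 * np.sqrt(0.5 * (b ** 2 + a ** 2)))
                  - np.exp(0.5 * (np.cos(2 * np.pi * b) + np.cos(2 * np.pi * a))))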

5.4.2 PC configuration

Configuration of the PC on which experiments were executed was as follows:

• HP Pavilion g4,

• processor Intel(R) Core(TM) i5 @ 2.40 GHz,

• memory 8 GB,

• implemented in C++.


5.4.3 Influence of hybridizations

The results of the experiments are illustrated in Table 5.3, which reports the best, worst, mean, and median values, the standard deviation, and the time in seconds required for calculating the solution, obtained by the algorithms BA, HBA, HBARF, FA, DE and ABC when optimizing the functions f1-f5 with dimension D = 10. In this experiment, each algorithm was run with a population size of Np = 10, while MAX_FE = 1000 * D function evaluations was used as the termination condition.

Table 5.3: The results of experiments (D = 10)

Alg.    Value   f1        f2        f3        f4        f5
BA      Best    3.29E+01  1.07E+04  5.33E+01  6.07E+01  1.37E+01
        Worst   1.73E+02  1.58E+06  3.11E+02  5.57E+02  2.00E+01
        Mean    8.30E+01  5.53E+05  1.44E+02  2.27E+02  1.75E+01
        Median  3.91E+01  4.69E+05  6.44E+01  1.06E+02  1.68E+00
        StDev   6.94E+01  4.71E+05  1.48E+02  2.17E+02  1.73E+01
        Time    9.28E-02  8.72E-02  8.28E-02  9.00E-02  8.96E-02
HBA     Best    2.25E-09  6.34E-02  4.83E-09  5.12E+00  6.31E-04
        Worst   3.97E-05  5.10E+02  2.89E-03  2.38E+01  2.00E+01
        Mean    3.18E-06  6.22E+01  1.26E-04  1.55E+01  1.16E+01
        Median  8.66E-06  1.15E+02  5.66E-04  4.46E+00  9.26E+00
        StDev   1.14E-07  7.73E+00  1.66E-07  1.69E+01  1.78E+01
        Time    1.88E-02  1.20E-02  8.80E-03  1.56E-02  1.44E-02
HBARF   Best    1.44E-11  5.00E-05  2.36E-06  3.09E-05  7.21E-04
        Worst   6.35E-04  1.99E+00  5.90E-02  1.02E+01  3.53E-01
        Mean    3.92E-05  2.64E-01  5.92E-03  5.92E-01  3.14E-02
        Median  1.05E-06  1.58E-01  4.50E-04  1.07E-01  1.27E-02
        StDev   1.25E-04  5.44E-01  1.22E-02  2.00E+00  6.76E-02
        Time    1.42E+03  1.43E+03  1.42E+03  1.38E+03  1.42E+03
FA      Best    1.05E+00  4.91E+06  1.43E+01  4.40E+02  2.10E+01
        Worst   1.11E-01  4.91E+06  1.43E+01  4.28E+02  2.09E+01
        Mean    4.66E-01  4.91E+06  1.43E+01  4.38E+02  2.10E+01
        Median  1.25E-01  4.91E+06  1.43E+01  4.39E+02  2.10E+01
        StDev   4.08E-01  2.75E+00  0.00E+00  3.68E+00  5.18E-02
        Time    1.70E-01  1.40E-01  5.00E-02  1.90E-01  1.80E-01
DE      Best    4.46E+00  3.25E+04  8.85E+03  6.81E+01  6.87E+00
        Worst   9.06E+01  3.29E+07  3.71E+05  1.51E+03  1.74E+01
        Mean    2.91E+01  5.47E+06  1.04E+05  4.63E+02  1.19E+01
        Median  2.50E+01  8.96E+06  9.96E+04  3.79E+02  2.81E+00
        StDev   2.15E+01  1.69E+06  6.33E+04  3.67E+02  1.21E+01
        Time    1.70E-01  8.00E-02  8.00E-02  1.30E-01  1.70E-01
ABC     Best    1.67E-08  1.67E-02  1.07E-16  9.95E-14  1.44E-10
        Worst   3.69E-02  6.63E+00  1.34E-15  2.07E-04  1.48E-07
        Mean    1.51E-02  1.68E+00  3.80E-16  8.70E-06  1.39E-08
        Median  9.36E-03  1.87E+00  2.54E-16  4.13E-05  3.03E-08
        StDev   1.48E-02  6.94E-01  3.19E-16  4.45E-11  5.17E-09
        Time    9.00E-02  2.00E-02  2.00E-02  7.00E-02  7.00E-02


It can be seen from Table 5.3 that both hybridized algorithms improved the results of the BA. Both of the mentioned algorithms also outperformed the other algorithms in the test when optimizing the functions f1 and f2, while the ABC outperformed the other algorithms when optimizing the functions f3-f5. In terms of time complexity, the fastest algorithm on all functions was HBA, while HBARF was the most time-consuming, primarily because of the use of the RF regression method.

In order to show that the improvements were also significant, Friedman non-parametric tests were conducted at a significance level of 0.05 on classifiers consisting of the results achieved by each algorithm according to the best, worst, mean, median, and standard deviation values. As a result, six algorithms (classifiers) were compared according to 5 * 5 = 25 variables. The Friedman test [76, 77] compares the average ranks of the algorithms: the closer the rank is to one, the better the algorithm. A null-hypothesis states that two algorithms are equivalent and, therefore, that their ranks should be equal. If the null-hypothesis is rejected, i.e., the performance of the algorithms is statistically different, the Bonferroni-Dunn test [41] is performed, which calculates the critical difference between the average ranks of those two algorithms. When the statistical difference is higher than the critical difference, the algorithms are significantly different. The critical differences are presented in the diagrams as lines: two algorithms are significantly different if their corresponding lines do not overlap. The equation for the calculation of the critical difference can be found in [41].
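The average ranks behind such a diagram can be computed in a few lines; the following sketch uses SciPy with placeholder scores (not the thesis's measurements), where lower scores are better:

import numpy as np
from scipy.stats import rankdata, friedmanchisquare

# rows = variables (function/measure pairs), columns = algorithms
scores = np.array([[3.2, 1.1, 2.5],
                   [4.0, 0.9, 1.8],
                   [2.7, 1.3, 2.2]])

ranks = np.apply_along_axis(rankdata, 1, scores)   # rank within each variable
print(ranks.mean(axis=0))                          # average rank per algorithm
stat, p = friedmanchisquare(*scores.T)             # null: all perform equally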

[Figure 5.3: Friedman's non-parametric test - average ranks (D = 10) of BA, HBA, HBARF, FA, DE and ABC.]

As can be seen from Figure 5.3, ABC, HBA and HBARF significantly improved on the results of the other algorithms in the test, i.e., DE, FA and BA. Indeed, the ABC is also significantly better than the HBA. On the other hand, HBARF improved on HBA. From the comparison of the results obtained by the two hybridized BA algorithms, it can be concluded that HBARF achieved the best results according to the mean and standard deviation values, while the results of HBA are more dispersed around the mean value.


5.4.4 Influence of randomness

The influence of randomness on the results of the original BA algorithm was tested in this experiment. Note that the original BA algorithm uses the Normal distribution; according to our terminology, the original BA algorithm is therefore the NBA variant of RBA. Two experiments were conducted, as follows:

• influence of randomization methods on the results of RBA,

• comparing the results of various RBA variants with other algorithms, like FA, DE,

and ABC.

In the former experiment, the best randomization methods for the BA were determined, while in the latter, the BA algorithms using these randomization methods were compared with the other well-known algorithms in order to show how crucial the selection of a proper randomization method is for the results of the BA algorithm.

5.4.4.1 Influence of randomization methods on the results of RBA

The impact of the various randomized methods on the results of the RBA algorithms was verified in this experiment. Therefore, the six variants of the RBA algorithm, i.e., UBA, NBA, LBA, CBA1, CBA2, and FBA, were applied to the BBOX benchmark suite. The experimental setup was as presented at the beginning of this section. Additionally, each algorithm was run with a population size of Np = 100, while MAX_FE = 1000 * D fitness function evaluations was used as the termination condition. Although the functions with dimensions D = 10, D = 30, and D = 50 were all optimized, only the results of optimizing the functions with dimension D = 30 are presented in Table 5.4, because of the limitation on the thesis's length. Furthermore, although the results were collected according to the best, worst, mean, standard deviation and median measures, only the mean and standard deviation are presented in the table.

As can be seen from Table 5.4, the LBA algorithm, randomized with Levy flights, outperformed the other variants of the RBA algorithm except when optimizing the functions f15, f17, f18, f19, and f21, where the NBA algorithm was better.

Friedman tests were again conducted at a significance level of 0.05. Here, six classifiers were compared according to 24 * 5 = 120 variables, where 24 denotes the number of functions and 5 the number of measures. The results of the Friedman non-parametric test are presented in Fig. 5.4, which is divided into three diagrams showing the ranks and confidence


Table 5.4: Results of RBA according to various randomization methods (D = 30)

Func.  Meas.   UBA       NBA       LBA       CBA1      CBA2      FBA
f1     Mean    7.13E+01  6.94E+00  7.10E-05  7.76E+01  7.68E+01  6.24E+01
       Stdev   2.11E+01  5.49E+00  1.10E-05  2.05E+01  1.99E+01  1.71E+01
f2     Mean    1.47E+06  1.18E+05  3.89E+02  1.56E+06  1.52E+06  1.19E+06
       Stdev   7.08E+05  7.11E+04  2.25E+02  7.47E+05  7.16E+05  5.49E+05
f3     Mean    6.20E+02  4.85E+02  1.80E+02  6.50E+02  6.43E+02  5.79E+02
       Stdev   9.93E+01  1.10E+02  4.99E+01  9.48E+01  8.61E+01  1.19E+02
f4     Mean    8.02E+02  6.41E+02  2.51E+02  8.57E+02  8.67E+02  7.94E+02
       Stdev   1.35E+02  1.84E+02  1.01E+02  1.42E+02  1.26E+02  1.19E+02
f5     Mean    9.04E+01  7.41E+01  3.46E-02  1.17E+02  1.20E+02  8.55E+01
       Stdev   3.72E+01  5.53E+01  3.58E-02  4.41E+01  4.61E+01  5.26E+01
f6     Mean    6.29E+04  3.79E+03  4.04E+01  6.95E+04  6.56E+04  6.10E+04
       Stdev   4.72E+04  7.16E+03  3.79E+01  7.91E+04  8.05E+04  9.62E+04
f7     Mean    4.52E+02  4.06E+02  1.58E+02  4.62E+02  4.56E+02  3.40E+02
       Stdev   1.98E+02  1.44E+02  7.09E+01  2.54E+02  2.69E+02  1.45E+02
f8     Mean    3.69E+04  5.12E+02  5.12E+01  3.89E+04  3.85E+04  1.99E+04
       Stdev   1.92E+04  4.67E+02  4.58E+01  1.88E+04  1.85E+04  1.36E+04
f9     Mean    6.26E+03  6.55E+01  3.85E+01  7.74E+03  7.85E+03  4.64E+03
       Stdev   2.52E+03  4.16E+01  2.63E+01  3.18E+03  3.40E+03  2.00E+03
f10    Mean    1.23E+06  2.15E+05  9.43E+03  9.13E+05  8.62E+05  9.85E+05
       Stdev   8.33E+05  2.18E+05  3.49E+03  3.97E+05  3.95E+05  5.83E+05
f11    Mean    4.12E+02  2.73E+02  1.88E+02  4.16E+02  4.19E+02  4.02E+02
       Stdev   1.11E+02  1.31E+02  8.38E+01  1.13E+02  1.14E+02  1.03E+02
f12    Mean    1.37E+08  1.96E+07  1.72E+03  1.31E+08  1.30E+08  1.04E+08
       Stdev   4.27E+07  1.86E+07  6.19E+03  5.84E+07  5.78E+07  4.65E+07
f13    Mean    1.70E+03  5.59E+02  1.90E+01  1.69E+03  1.68E+03  1.54E+03
       Stdev   1.74E+02  2.72E+02  1.82E+01  2.18E+02  2.16E+02  2.23E+02
f14    Mean    2.52E+01  6.50E+00  2.72E-03  2.68E+01  2.64E+01  2.04E+01
       Stdev   7.11E+00  3.54E+00  2.46E-04  9.30E+00  9.08E+00  5.41E+00
f15    Mean    6.72E+02  4.66E+02  5.71E+02  6.04E+02  6.00E+02  6.15E+02
       Stdev   1.20E+02  1.38E+02  1.74E+02  1.27E+02  1.38E+02  1.04E+02
f16    Mean    3.42E+01  3.06E+01  2.57E+01  3.12E+01  3.00E+01  2.92E+01
       Stdev   4.20E+00  5.03E+00  4.89E+00  5.94E+00  4.69E+00  4.79E+00
f17    Mean    1.00E+01  6.72E+00  7.95E+00  9.36E+00  9.22E+00  1.01E+01
       Stdev   2.09E+00  1.43E+00  1.44E+00  1.79E+00  1.92E+00  2.36E+00
f18    Mean    4.15E+01  2.62E+01  3.05E+01  3.89E+01  3.93E+01  3.81E+01
       Stdev   9.97E+00  5.51E+00  8.01E+00  7.52E+00  7.38E+00  1.27E+01
f19    Mean    7.83E+00  6.74E+00  7.10E+00  7.81E+00  7.87E+00  7.89E+00
       Stdev   1.50E+00  1.15E+00  1.35E+00  1.09E+00  1.07E+00  1.12E+00
f20    Mean    4.70E+03  2.29E+00  1.84E+00  1.01E+04  9.80E+03  2.95E+03
       Stdev   4.69E+03  3.30E-01  1.90E-01  7.85E+03  7.53E+03  1.78E+03
f21    Mean    6.88E+01  1.57E+01  7.71E+00  6.95E+01  6.92E+01  5.84E+01
       Stdev   6.41E+00  9.50E+00  8.30E+00  1.11E+01  1.12E+01  1.33E+01
f22    Mean    7.23E+01  1.58E+01  1.21E+01  7.23E+01  7.16E+01  6.24E+01
       Stdev   5.41E+00  7.83E+00  1.76E+01  5.49E+00  5.24E+00  1.11E+01
f23    Mean    3.63E+00  1.38E+00  1.28E+00  3.80E+00  3.46E+00  3.76E+00
       Stdev   7.96E-01  6.34E-01  4.18E-01  7.07E-01  1.23E+00  9.16E-01
f24    Mean    4.16E+02  2.75E+02  3.90E+02  4.24E+02  4.19E+02  4.00E+02
       Stdev   4.89E+01  4.19E+01  7.88E+01  4.98E+01  5.55E+01  4.30E+01

intervals (critical differences) for the algorithms under consideration. Again, the closer the rank value is to one, the better the algorithm. The diagrams are organized according to the dimensions of the functions.


[Figure 5.4: Results of the Friedman non-parametric test on the different variants of the RBA algorithm - average ranks of UBA, NBA, LBA, CBA1, CBA2 and FBA for D = 10, D = 30 and D = 50.]

The first diagram in Fig. 5.4 shows that the LBA significantly outperforms all the other variants of RBA, except NBA, when optimizing the functions with dimension D = 10. In the second diagram, representing the optimization of the functions with dimension D = 30, it can be seen that two algorithms, i.e., LBA and NBA, significantly outperform the other algorithms in the tests. Finally, when optimizing the functions with dimension D = 50, LBA, NBA and FBA are significantly better than the other algorithms in the tests.

5.4.4.2 Comparative study

In this experiment, the original BA algorithm (NBA) was compared with the other well-known algorithms, i.e., FA, DE, and ABC, as well as with the more promising variants of the RBA algorithm, i.e., LBA and FBA.

The specific FA parameters were set as follows: the randomization parameter alpha = 0.1, the attractiveness beta = 0.2, and the fixed light absorption gamma = 0.9. The DE parameters were set as follows: the amplification factor of the difference vector F = 0.5, and the crossover control parameter CR = 0.9. For the ABC algorithm, the onlooker bees represented 50% of the colony and the employed bees the other 50%, while one scout bee was generated in each generation (i.e., limits = 100 for a population size of Np = 100).

The results of comparing the mentioned algorithms are presented in Table 5.5. Again, although the experiments were conducted on all three observed dimensions, only one data instance, representing the results of optimizing the functions with dimension D = 50, is illustrated in the table. The results are presented according to the mean and standard deviation measures, although they were collected according to all five measures mentioned before.


Table 5.5: Comparing the results of RBA with the results of other algorithms (D = 50)

Func.  Meas.   NBA       LBA       FBA       FA        DE        ABC
f1     Mean    4.66E+00  2.81E-04  1.18E+02  2.15E+02  8.89E-03  2.26E-01
       Stdev   6.31E+00  3.20E-05  2.93E+01  4.06E+01  2.72E-03  2.70E-01
f2     Mean    1.62E+05  4.71E+02  2.54E+06  5.81E+06  1.30E+01  5.33E+02
       Stdev   1.47E+05  2.27E+02  8.21E+05  2.59E+06  5.05E+00  1.04E+03
f3     Mean    1.08E+03  2.14E+02  1.17E+03  1.42E+03  4.11E+02  8.23E+01
       Stdev   1.92E+02  5.33E+01  1.58E+02  1.96E+02  2.52E+01  2.11E+01
f4     Mean    1.25E+03  2.50E+02  1.44E+03  1.43E+03  4.65E+02  1.05E+02
       Stdev   2.27E+02  7.58E+01  2.08E+02  1.38E+02  2.34E+01  2.19E+01
f5     Mean    2.79E+02  2.31E-03  2.57E+02  7.78E+02  5.17E-02  1.09E+01
       Stdev   7.63E+01  2.39E-03  8.11E+01  4.13E+01  2.54E-02  9.91E+00
f6     Mean    3.50E+03  2.71E+01  2.43E+05  7.51E+05  1.66E+02  4.40E+02
       Stdev   5.70E+03  2.72E+01  9.20E+04  2.63E+05  4.90E+01  1.04E+02
f7     Mean    9.52E+02  3.24E+02  7.97E+02  1.36E+03  2.98E+01  3.09E+02
       Stdev   2.28E+02  8.09E+01  1.92E+02  4.11E+02  9.33E+00  3.87E+01
f8     Mean    3.17E+02  9.00E+01  7.06E+04  9.66E+04  5.91E+01  2.35E+02
       Stdev   2.60E+02  3.07E+01  2.35E+04  2.94E+04  4.21E+00  8.68E+01
f9     Mean    5.84E+01  5.59E+01  6.00E+03  1.63E+03  6.06E+01  2.74E+02
       Stdev   1.82E+01  1.90E+01  2.19E+03  3.15E+02  1.02E+01  1.03E+02
f10    Mean    3.66E+05  1.65E+04  1.62E+06  8.89E+06  3.74E+05  3.10E+05
       Stdev   2.98E+05  4.57E+03  8.73E+05  4.65E+06  9.49E+04  7.48E+04
f11    Mean    3.33E+02  2.92E+02  5.46E+02  3.61E+02  4.18E+02  4.35E+02
       Stdev   1.76E+02  1.12E+02  8.76E+01  1.43E+02  7.54E+01  3.33E+01
f12    Mean    1.85E+07  4.58E+02  2.16E+08  3.62E+08  1.69E+04  6.28E+05
       Stdev   1.80E+07  3.58E+02  6.62E+07  1.25E+08  9.99E+03  7.06E+05
f13    Mean    5.49E+02  1.07E+01  2.28E+03  2.96E+03  4.71E+01  6.18E+02
       Stdev   3.44E+02  6.62E+00  2.70E+02  2.99E+02  1.07E+01  8.70E+01
f14    Mean    8.03E+00  5.36E-03  3.47E+01  5.78E+01  8.13E-02  3.45E+00
       Stdev   4.91E+00  6.46E-04  8.38E+00  1.27E+01  1.22E-02  1.74E+00
f15    Mean    1.05E+03  1.23E+03  1.15E+03  1.46E+03  4.42E+02  1.01E+03
       Stdev   2.78E+02  2.81E+02  2.21E+02  2.04E+02  2.79E+01  1.14E+02
f16    Mean    3.79E+01  3.22E+01  3.64E+01  4.90E+01  3.75E+01  2.09E+01
       Stdev   6.40E+00  5.08E+00  3.77E+00  5.12E+00  2.49E+00  2.16E+00
f17    Mean    7.28E+00  9.23E+00  1.05E+01  1.24E+01  6.73E-01  1.19E+01
       Stdev   1.27E+00  1.97E+00  1.90E+00  2.35E+00  1.78E-01  9.68E-01
f18    Mean    3.33E+01  3.74E+01  4.10E+01  4.68E+01  4.55E+00  4.44E+01
       Stdev   8.82E+00  8.31E+00  6.13E+00  9.66E+00  8.98E-01  5.54E+00
f19    Mean    6.97E+00  7.53E+00  8.16E+00  7.47E+00  7.76E+00  1.32E+01
       Stdev   1.10E+00  9.91E-01  8.86E-01  5.63E-01  6.93E-01  1.37E+00
f20    Mean    2.59E+00  1.91E+00  1.08E+04  1.16E+04  3.58E+00  1.36E+00
       Stdev   2.98E-01  2.20E-01  5.68E+03  3.63E+03  2.39E-01  1.57E-01
f21    Mean    9.97E+00  6.91E+00  7.32E+01  8.16E+01  5.47E+00  3.89E+00
       Stdev   1.24E+01  1.22E+01  6.41E+00  5.70E+00  4.23E+00  2.57E+00
f22    Mean    1.49E+01  1.05E+01  7.60E+01  8.54E+01  4.74E+00  4.62E+00
       Stdev   9.16E+00  7.61E+00  5.49E+00  6.07E+00  4.33E+00  3.49E+00
f23    Mean    2.45E+00  2.01E+00  4.72E+00  5.66E+00  4.05E+00  3.01E+00
       Stdev   7.27E-01  1.02E+00  5.11E-01  8.50E-01  4.83E-01  3.88E-01
f24    Mean    4.65E+02  5.95E+02  7.48E+02  6.98E+02  4.83E+02  1.20E+03
       Stdev   8.13E+01  1.26E+02  5.86E+01  3.70E+01  2.46E+01  7.44E+01

As can be seen from Table 5.5, the LBA algorithm outperformed the other algorithms 10 times, DE and ABC six times each, and NBA and FA only once each. Here, too, Friedman tests at a significance level of 0.05 were conducted for all the


observed dimensions of the functions. The results of these tests are presented in Fig. 5.5,

which is divided into three diagrams.

[Figure 5.5: Results of the Friedman non-parametric test on the suite of tested algorithms - average ranks of NBA, LBA, FBA, FA, DE and ABC for D = 10, D = 30 and D = 50.]

The first diagram presents the results of the Friedman test comparing the results obtained by optimizing the functions with dimension D = 10. From this diagram, it can be seen that ABC, DE and LBA significantly outperformed the other algorithms in the test. A similar situation also appears when optimizing the functions with dimension D = 30. Finally, on the functions with dimension D = 50, DE and LBA achieved significantly better results than all the other algorithms except ABC. Note that satisfactory results were also reached by NBA.

5.5 Discussion

In general, our experimental work started from the following three assumptions:

• hybridization improves the results of the original BA algorithm,

• selection of the randomized method has a crucial impact on the results of the

original BA algorithm, and

• the results of the hybridized algorithm are comparable with the results of the other

EA and SI algorithms.

In order to confirm the first assumption, the original BA algorithm was hybridized with the DE strategy 'rand/1/bin' (HBA), on the one hand, and with an ensemble of DE strategies and a random forest that predicts the best solution in the population on the basis of statistical regression (HBARF), on the other. Both hybridizations significantly outperform the results of the original BA algorithm. Therefore, the first assumption can be accepted.

For the acceptance of the second assumption, the original BA algorithm (NBA) was modified using five other randomization methods (yielding UBA, LBA, CBA1, CBA2, and FBA). Each of these six algorithms was applied to the BBOX function benchmark suite in order


to show that the obtained results are significantly different. Experiments on 24 functions confirmed that the selection of the randomized method has a crucial impact on the results of the BA algorithm.

Finally, the BA algorithms using the best randomization methods from the previous experiment were compared with the other well-known algorithms, like FA, DE, and ABC. It turned out that the results of the LBA improved relative to the results of DE and ABC as the dimensionality of the functions increased. This means that the results of LBA for dimension D = 50 are almost equal to the results of DE, which obtained the best results when optimizing the functions of all dimensions. As a result, our last assumption can be partly accepted.


Chapter 6

Conclusion

The goal of this thesis was threefold. Firstly, a comprehensive review of the papers tackling the BA algorithm was performed in order to get a broader view of this domain. Then, some hybridizations of the original BA algorithm were proposed for further improving its results. Finally, an extensive study was performed in which the influence of various randomization methods was analyzed. One intention of this analysis was to show that the selection of a proper randomized method has a crucial impact on the results of the optimization.

The comprehensive review of the BA algorithm showed that this algorithm is in an early phase of development. That is, many features, like self-adaptation, multi-population, etc., have not been incorporated into the algorithm as yet. On the other hand, it has emerged as a solver for more and more real-world problems. Although papers dealing with this topic appear almost every day, the impact factors of the journals in which they are published are lower at the moment. This means that some time might need to elapse in order for this algorithm to mature. If this thesis helps to further popularize the BA algorithm, we have achieved our purpose.

Three conclusions were confirmed by our experimental work:

• hybridization has beneficial effects on the results of the original BA algorithm,

• the selection of randomized methods has a crucial impact on the results of the

original BA algorithm,

• the results of the hybridized BA algorithm with Levy flights are comparable with

the results of the other well-known algorithms, like FA, DE, and ABC.

In general, the message of this thesis is as follows. The original BA algorithm works well on problems of lower dimensions but, because most real-world problems are


of higher dimensionality, using hybridization that can significantly improve the results is inevitable. Therefore, the directions available for the further development of this algorithm are numerous. Let us mention only some of these: better control of exploration and exploitation, adaptation and self-adaptation of the control parameters, working with subpopulations, etc.


Appendix A

List of bio-inspired and swarm

intelligence algorithms

Many bio-inspired algorithms have been developed so far, but there is no complete list of them all. In this thesis, we have therefore also tried to collect a list of the main algorithms, which are presented in the following table.


Swarm intelligence based algorithms

Algorithm                              Author                         Reference
Accelerated PSO                        Yang et al.                    [235, 241]
Ant colony optimization                Dorigo                         [42]
Artificial bee colony                  Karaboga and Basturk           [116]
Bacterial foraging                     Passino                        [165]
Bacterial-GA Foraging                  Chen et al.                    [24]
Bat algorithm                          Yang                           [233]
Bee colony optimization                Teodorovic and Dell'Orco       [209]
Bee system                             Lucic and Teodorovic           [142]
BeeHive                                Wedde et al.                   [219]
Bees algorithms                        Pham et al.                    [168]
Bees swarm optimization                Drias et al.                   [44]
Bumblebees                             Comellas and Martinez          [32]
Cat swarm                              Chu et al.                     [26]
Consultant-guided search               Iordache                       [106]
Cuckoo search                          Yang and Deb                   [237]
Eagle strategy                         Yang and Deb                   [238]
Fast bacterial swarming algorithm      Chu et al.                     [27]
Firefly algorithm                      Yang                           [232]
Fish swarm/school                      Li et al.                      [137]
Good lattice swarm optimization        Su et al.                      [201]
Glowworm swarm optimization            Krishnanand and Ghose          [132, 133]
Hierarchical swarm model               Chen et al.                    [23]
Krill Herd                             Gandomi and Alavi              [79]
Monkey search                          Mucherino and Seref            [152]
Particle swarm algorithm               Kennedy and Eberhart           [121]
Virtual ant algorithm                  Yang                           [246]
Virtual bees                           Yang                           [229]
Weightless Swarm Algorithm             Ting et al.                    [210]
Wolf search                            Tang et al.                    [207]

Bio-inspired (not SI-based) algorithms

Algorithm                              Author                         Reference
Atmosphere clouds model                Yan and Hao                    [228]
Biogeography-based optimization        Simon                          [197]
Brain Storm Optimization               Shi                            [193]
Differential evolution                 Storn and Price                [199]
Dolphin echolocation                   Kaveh and Farhoudi             [119]
Eco-inspired evolutionary algorithm    Parpinelli and Lopes           [164]
Egyptian Vulture                       Sur et al.                     [204]
Fish-school Search                     Lima et al.                    [7, 38]
Flower pollination algorithm           Yang                           [236, 245]
Gene expression                        Ferreira                       [59]
Great salmon run                       Mozaffari                      [151]
Group search optimizer                 He et al.                      [97]
Human-Inspired Algorithm               Zhang et al.                   [254]
Invasive weed optimization             Mehrabian and Lucas            [148]
Japanese tree frogs calling            Hernandez and Blum             [99]
Marriage in honey bees                 Abbass                         [1]
OptBees                                Maia et al.                    [144]
Paddy Field Algorithm                  Premaratne et al.              [172]
Queen-bee evolution                    Jung                           [111]
Roach infestation algorithm            Havens                         [96]
Shuffled frog leaping algorithm        Eusuff and Lansey              [55]
Termite colony optimization            Hedayatzadeh et al.            [98]

Physics and Chemistry based algorithms

Algorithm                              Author                         Reference
Big bang-big Crunch                    Zandi et al.                   [251]
Black hole                             Hatamlou                       [95]
Central force optimization             Formato                        [73]
Charged system search                  Kaveh and Talatahari           [120]
Electro-magnetism optimization         Cuevas et al.                  [35]
Galaxy-based search algorithm          Shah-Hosseini                  [190]
Gravitational search                   Rashedi et al.                 [179]
Harmony search                         Geem et al.                    [85]
Intelligent water drop                 Shah-Hosseini                  [187]
River formation dynamics               Rabanal et al.                 [175]
Self-propelled particles               Vicsek                         [215]
Simulated annealing                    Kirkpatrick et al.             [126]
Spiral optimization                    Tamura and Yasuda              [206]
Stochastic diffusion search            Bishop                         [12]
Water cycle algorithm                  Eskandar et al.                [53]

Other algorithms

Algorithm                              Author                         Reference
Anarchic society optimization          Shayeghi and Dadashpour        [191]
Artificial cooperative search          Civicioglu                     [29]
Backtracking optimization search       Civicioglu                     [30]
Differential search algorithm          Civicioglu                     [28]
Grammatical evolution                  Ryan et al.                    [184]
Imperialist competitive algorithm      Atashpaz-Gargari and Lucas     [4]
League championship algorithm          Kashan                         [118]
Social emotional optimization          Xu et al.                      [227]


Appendix B

Kratek povzetek v slovenskem

jeziku

Swarm intelligence is a modern and very efficient mechanism for solving hard problems in computer science, engineering, mathematics, economics, medicine and optimization. Swarm intelligence is the collective behavior of decentralized and self-organized systems. This research area is a branch of artificial intelligence and, because of their shared characteristics, is very close to evolutionary computation; informally, one could say that swarm intelligence is a sister of evolutionary computation. In the past, many swarm intelligence algorithms have been developed and applied to real-world problems. The main part of this master's thesis is devoted to the bat algorithm, which was developed recently and has shown good results in the continuous optimization of functions of lower dimensions.

The present master's thesis consists of six major chapters. The first chapter briefly presents the theory of evolutionary algorithms, together with a short introduction to swarm intelligence. The next chapter focuses on the biological characteristics of swarm intelligence. The third chapter comprises detailed descriptions and illustrations of the most popular swarm intelligence algorithms. In the fourth chapter, an extensive survey of the entire field of bat algorithms is made. The fifth chapter presents the practical part, in which we show the hybridization of the bat algorithm with differential evolution and with random forests; the third part of this chapter comprises a comparative study of the use of various random distributions in the bat algorithm. We conclude the thesis with a summary of the work performed and indicate the directions of its future development.


Appendix C

Bibliography of candidate

IZTOK FISTER [91936]

Personal bibliography for the period

2010-2013

ARTICLES AND OTHER COMPONENT PARTS

1.01 Original scientific article

1. FISTER, Iztok, FISTER, Dusan, YANG, Xin-She. A hybrid bat algorithm. Elek-

trotehniski vestnik. [English print ed.], 2013, vol. 80, t. 1-2, str. 1-7. [COBISS.SI-ID

17010966]

2. FISTER, Iztok, YANG, Xin-She, BREST, Janez, FISTER, Iztok. Modified firefly

algorithm using quaternion representation. Expert syst. appl.. [Print ed.], Available

online 11 July 2013, str. 1-4, doi: 10.1016/j.eswa.2013.06.070. [COBISS.SI-ID 17010710]

3. FISTER, Iztok, FISTER, Iztok, YANG, Xin-She, BREST, Janez. A comprehensive

review of firefly algorithms. Swarm and evolutionary computation, Available online 24

June 2013, str. 1-4, doi: 10.1016/j.swevo.2013.06.001. [COBISS.SI-ID 17010454]

4. FISTER, Iztok, KOSAR, Tomaz, FISTER, Iztok, MERNIK, Marjan. Easytime++ :

a case study of incremental domain-specific language development. Inf. technol. valdyn.,

2013, vol. 42, no. 1, str. 77-85. [COBISS.SI-ID 16746518]

5. FISTER, Iztok, MERNIK, Marjan, FISTER, Iztok, HRNCIC, Dejan. Implemen-

tation of EasyTime formal semantics using a LISA compiler generator. Comput. Sci.

Inf. Syst., Sep. 2012, vol. 9, no. 3, str. 1019-1044, doi: 10.2298/CSIS111110021F.

[COBISS.SI-ID 16059414]


6. FISTER, Iztok, FISTER, Iztok, BREST, Janez. A hybrid artificial bee colony algo-

rithm for graph 3-coloring. Lect. notes comput. sci., str. 66-74, ilustr. [COBISS.SI-ID

15979286]

7. FISTER, Iztok, FISTER, Iztok, MERNIK, Marjan, BREST, Janez. Design and

implementation of domain-specific language easytime. Comput. syst. struct., Oct. 2011,

vol. 37, iss. 4, str. 151-167, doi: 10.1016/j.cl.2011.04.001. [COBISS.SI-ID 14934550]

8. FISTER, Iztok, FISTER, Iztok. Merjenje casa na sportnih tekmovanjih z domensko

specificnim jezikom EasyTime. Elektrotehniski vestnik. [Slovenska tiskana izd.], 2011,

letn. 78, st. 1/2, str. 37-41. [COBISS.SI-ID 7174707]

9. FISTER, Iztok, FISTER, Iztok. Koncept sistema za odkrivanje voznje v zavetrju na

triatlonih Ironman. Elektrotehniski vestnik. [Slovenska tiskana izd.], 2011, letn. 78, st.

4, str. 217-222. [COBISS.SI-ID 262326784]

10. FISTER, Iztok, FISTER, Iztok. Measuring time in sporting competitions with the

domain-specific language easy time. Elektrotehniski vestnik. [English print ed.], 2011,

vol. 78, no. 1-2, str. 36-41, ilustr. [COBISS.SI-ID 16264982]

11. FISTER, Iztok, FISTER, Iztok. Concept of drafting detection system in Ironmans.

Elektrotehniski vestnik. [English print ed.], 2011, vol. 78, no. 4, str. 218-222, ilustr.

[COBISS.SI-ID 15613974]

1.08 Published scientific conference contribution

12. FISTER, Iztok, FISTER, Iztok, BREST, Janez. Comparing various regression

methods on ensemble strategies in differential evolution. V: 19th International Con-

ference on Soft Computing, June 26-28 Brno, 2013, Czech Republic. Mendel 2013 :

evolutionary computation, genetic programming, swarm intelligence, fuzzy logic, neural

networks, fractals, Bayesian methods, (Mendel ... (Brno, Print)). Brno: Brno Uni-

versity of Technology, Faculty of Mechanical Engineering, Institute of Automation and

Computer Science, 2013, str. 87-92. [COBISS.SI-ID 16987414]

13. FISTER, Iztok, FISTER, Iztok, BREST, Janez, YANG, Xin-She. Memetic firefly

algorithm for combinatorial optimization. V: FILIPIC, Bogdan (ur.), SILC, Jurij (ur.).

Bioinspired optimization methods and their applications : proceedings of the Fifth In-

ternational Conference on Bioinspired Optimization Methods and their Applications -

BIOMA 2012, 24-25 May 2012, Bohinj, Slovenia. Ljubljana: Jozef Stefan Institute,

2012, str. 75-86. [COBISS.SI-ID 16029206]

14. FISTER, Iztok, FISTER, Iztok, BREST, Janez, ZUMER, Viljem. Memetic artificial

bee colony algorithm for large-scale global optimization. V: IEEE World Congress on


Computational Intelligence, Brisbane, Australia. IEEE WCCI 2012. [S. l.: IEEE, 2012],

iEEE CEC, str. 3038-3045. [COBISS.SI-ID 16116246]

15. FISTER, Iztok, KOSAR, Tomaz, MERNIK, Marjan, FISTER, Iztok. Upgrading

EasyTime : from a textual to a visual language. V: ZAJC, Baldomir (ur.), TROST,

Andrej (ur.). Zbornik enaindvajsete mednarodne Elektrotehniske in racunalniske kon-

ference ERK 2012, 17.-19. september 2012, Portoroz, Slovenija, (Zbornik ... Elek-

trotehniske in raunalniske konference ERK ...). Ljubljana: IEEE Region 8, Slovenska

sekcija IEEE, 2012, zv. B, str. 7-10, ilustr. [COBISS.SI-ID 16304406]

16. FISTER, Iztok, MERNIK, Marjan, FISTER, Iztok, HRNCIC, Dejan. Implemen-

tation of the domain-specific language easy time using a LISA compiler generator. V:

GANZHA, Maria (ur.), MACIASZEK, Leszek (ur.), PAPRZYCKI, Marcin (ur.). Fed-

CSIS : proceedings of the Federated Conference on Computer Science and Information

Systems, September 18-21, 2011, Szczecin, Poland. Los Alamitos: IEEE Computer

Society Press, 2011, str. 809-816, ilustr. [COBISS.SI-ID 15388694]

17. FISTER, Iztok, FISTER, Iztok, BREST, Janez, BOSKOVIC, Borko. Odkrivanje

voznje v zavetrju na triatlonskih tekmovanjih : stvarnost ali iluzija = Detecting of

draft in triathlon competitions : reality or illusion. V: ZAJC, Baldomir (ur.), TROST,

Andrej (ur.). Zbornik dvajsete mednarodne Elektrotehniske in racunalniske konference

ERK 2011, 19.-21. september 2011, Portoroz, Slovenija, (Zbornik ... Elektrotehniske in

racunalniske konference ERK ...). Ljubljana: IEEE Region 8, Slovenska sekcija IEEE,

2011, zv. B, str. 111-114. [COBISS.SI-ID 15358486]

1.12 Published scientific conference contribution abstract

18. FISTER, Iztok, FISTER, Iztok. Uporaba domensko specificnega jezika pri mer-

jenju casa na sportnih tekmovanjih. V: ZAJC, Baldomir (ur.), TROST, Andrej (ur.).

Zbornik devetnajste mednarodne Elektrotehniske in racunalniske konference ERK 2010,

Portoroz, Slovenija, 20.-22. september 2010, (Zbornik ... Elektrotehniske in racunalniske

konference ERK ...). Ljubljana: IEEE Region 8, Slovenska sekcija IEEE, 2010, zv. B,

str. 409-410. [COBISS.SI-ID 14437654]

1.16 Independent scientific component part or a chapter in a monograph

19. FISTER, Iztok, YANG, Xin-She, BREST, Janez, FISTER, Iztok. Memetic self-

adaptive firefly algorithm. V: YANG, Xin-She. Swarm intelligence and bio-inspired com-

putation : theory and applications. 1st ed. Amsterdam ... [et al.]: Elsevier, cop. 2013,

part 1, str. 73-102. http://www.sciencedirect.com/science/article/pii/B9780124051638000041.

[COBISS.SI-ID 16921622]

MONOGRAPHS AND OTHER COMPLETED WORKS


2.11 Undergraduate thesis

20. FISTER, Iztok. Domain-specific language for time measuring on sport competitions : B. Sc. Thesis = Domensko specificni jezik za merjenje casa na sportnih tekmovanjih : diplomsko delo univerzitetnega studijskega programa. Maribor: [I. Fister], 2011. XI, 66 f., illustr. http://dkum.uni-mb.si/Dokument.php?id=22659. [COBISS.SI-ID 15211286]


Appendix D

Short biography of candidate

Name and surname: IZTOK FISTER

Date and place of birth: 15 April 1989, Murska Sobota

Education:

1996 - 2004 Primary School Bakovci

2004 - 2008 Grammar School Franc Miklosic

2008 - 2011 University of Maribor, Faculty of Electrical Engineering and Computer Science. BSc in Computer Science and Information Technologies.


Bibliography

[1] H. A. Abbass. MBO: marriage in honey bees optimization - a haplometrosis polygynous swarming approach. In Evolutionary Computation, 2001. Proceedings of the 2001 Congress on, volume 1, pages 207–214. IEEE, 2001.

[2] B. Akay and D. Karaboga. A modified artificial bee colony algorithm for real-parameter

optimization. Information Sciences, 192:120–142, 2012.

[3] S. Akhtar, A. Ahmad, and E. Abdel-Rahman. A metaheuristic bat-inspired algorithm

for full body human pose estimation. In Computer and Robot Vision (CRV), 2012 Ninth

Conference on, pages 369–375. IEEE, 2012.

[4] E. Atashpaz-Gargari and C. Lucas. Imperialist competitive algorithm: an algorithm for

optimization inspired by imperialistic competition. In Evolutionary Computation, 2007.

CEC 2007. IEEE Congress on, pages 4661–4667. IEEE, 2007.

[5] T. Back, D. B. Fogel, and Z. Michalewicz. Handbook of evolutionary computation. IOP

Publishing Ltd., 1997.

[6] M. Barragan, S. Martinez, J. Marchal, M. Bullejos, R. Diaz De La Guardia, and A. Sanchez. Repeated DNA sequences in the microbat species Miniopterus schreibersi (Vespertilionidae; Chiroptera). Hereditas, 137(1):65–71, 2003.

[7] C. J. Bastos Filho, F. B. de Lima Neto, A. J. Lins, A. I. Nascimento, and M. P. Lima. Fish

school search. In R. Chiong, editor, Nature-Inspired Algorithms for Optimisation, volume

193 of Studies in Computational Intelligence, pages 261–277. Springer, 2009.

[8] S. Behnia, A. Akhavan, A. Akhshani, and A. Samsudin. A novel dynamic model of

pseudo random number generator. Journal of Computational and Applied Mathematics,

235(12):34–55, 2011.

[9] J. E. Bell and P. R. McMullen. Ant colony optimization techniques for the vehicle routing

problem. Advanced Engineering Informatics, 18(1):41–48, 2004.

[10] G. Beni and J. Wang. Swarm intelligence in cellular robotic systems. In Robots and

Biological Systems: Towards a New Bionics?, pages 703–712. Springer, 1993.

[11] H.-G. Beyer and H.-P. Schwefel. Evolution strategies–a comprehensive introduction. Nat-

ural computing, 1(1):3–52, 2002.


[12] J. Bishop. Stochastic searching networks. In Artificial Neural Networks, 1989., First IEE

International Conference on (Conf. Publ. No. 313), pages 329–331. IET, 1989.

[13] C. Blum and D. Merkle. Swarm intelligence. Springer, 2008.

[14] C. Blum, J. Puchinger, G. Raidl, and A. Roli. Hybrid metaheuristics in combinatorial

optimization: A survey. Applied Soft Computing, 11(6):4135–4151, 2011.

[15] C. Blum and A. Roli. Metaheuristics in combinatorial optimization: Overview and con-

ceptual comparison. ACM Computing Surveys, 35(3):268–308, 2003.

[16] E. Bonabeau, M. Dorigo, and G. Theraulaz. Swarm intelligence: from natural to artificial

systems. Oxford university press, 1999.

[17] T. C. Bora, L. d. S. Coelho, and L. Lebensztajn. Bat-inspired optimization approach for the brushless DC wheel motor problem. Magnetics, IEEE Transactions on, 48(2):947–950, 2012.

[18] I. Boussaid, J. Lepagnot, and P. Siarry. A survey on optimization metaheuristics. Information Sciences, 2013, Article in press.

[19] L. Breiman. Classification and regression trees. CRC press, 1993.

[20] L. Breiman. Random forests. Machine learning, 45(1):5–32, 2001.

[21] L. Breiman. Using iterated bagging to debias regressions. Machine Learning, 45(3):261–

277, 2001.

[22] J. Brownlee. Clever Algorithms: Nature-Inspired Programming Recipes. Jason Brownlee,

2011.

[23] H. Chen, Y. Zhu, K. Hu, and X. He. Hierarchical swarm model: a new approach to

optimization. Discrete Dynamics in Nature and Society, 2010, Article ID 379649, 2010.

[24] T.-C. Chen, P.-W. Tsai, S.-C. Chu, and J.-S. Pan. A novel optimization approach: bacterial-GA foraging. In Innovative Computing, Information and Control, 2007. ICICIC'07. Second International Conference on, pages 391–391. IEEE, 2007.

[25] W.-Y. Chen, S.-H. Chen, and C.-J. Lin. A speech recognition method based on the se-

quential multi-layer perceptrons. Neural Networks, 9(4):655–669, 1996.

[26] S.-C. Chu, P.-W. Tsai, and J.-S. Pan. Cat swarm optimization. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 4099 LNAI:854–858, 2006.

[27] Y. Chu, H. Mi, H. Liao, Z. Ji, and Q. Wu. A fast bacterial swarming algorithm for high-

dimensional function optimization. In Evolutionary Computation, 2008. CEC 2008.(IEEE

World Congress on Computational Intelligence). IEEE Congress on, pages 3135–3140.

IEEE, 2008.


[28] P. Civicioglu. Transforming geocentric cartesian coordinates to geodetic coordinates by

using differential search algorithm. Computers & Geosciences, 46:229–247, 2012.

[29] P. Civicioglu. Artificial cooperative search algorithm for numerical optimization problems.

Information Sciences, 229:58–76, 2013.

[30] P. Civicioglu. Backtracking search optimization algorithm for numerical optimization prob-

lems. Applied Mathematics and Computation, 219(15):8121–8144, 2013.

[31] M. Clerc. Particle swarm optimization. Wiley-ISTE, 2010.

[32] F. Comellas and J. Martinez-Navarro. Bumblebees: a multiagent combinatorial op-

timization algorithm inspired by social insect behaviour. In Proceedings of the first

ACM/SIGEVO Summit on Genetic and Evolutionary Computation, pages 811–814. ACM,

2009.

[33] M. Crepinsek, S.-H. Liu, and M. Mernik. Exploration and exploitation in evolutionary

algorithms: a survey. ACM Computing Surveys (CSUR), 45(3):35, 2013.

[34] M. Crepinsek, M. Mernik, and S.-H. Liu. Analysis of exploration and exploitation in

evolutionary algorithms by ancestry trees. International Journal of Innovative Computing

and Applications, 3(1):11–19, 2011.

[35] E. Cuevas, D. Oliva, D. Zaldivar, M. Perez-Cisneros, and H. Sossa. Circle detection using

electro-magnetism optimization. Information Sciences, 182(1):40–55, 2012.

[36] C. Darwin. On the origin of species. 1859. Reprinted by Harvard University Press, 1964.

[37] S. Das and P. N. Suganthan. Differential evolution: A survey of the state-of-the-art.

Evolutionary Computation, IEEE Transactions on, 15(1):4–31, 2011.

[38] F. de Lima Neto, A. Lins, A. I. Nascimento, M. P. Lima, et al. A novel search algorithm

based on fish school behavior. In Systems, Man and Cybernetics, 2008. SMC 2008. IEEE

International Conference on, pages 2646–2651. IEEE, 2008.

[39] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. Evolutionary Computation, IEEE Transactions on, 6(2):182–197, 2002.

[40] K. Deep and J. C. Bansal. Mean particle swarm optimisation for function optimisation.

International Journal of Computational Intelligence Studies, 1(1):72–92, 2009.

[41] J. Demsar. Statistical comparisons of classifiers over multiple data sets. Journal of Machine

Learning Research, 7:1–30, December 2006.

[42] M. Dorigo. Optimization, learning and natural algorithms. Ph. D. Thesis, Politecnico di

Milano, Italy, 1992.

[43] K. A. Dowsland and J. M. Thompson. An improved ant colony optimisation heuristic for

graph colouring. Discrete Applied Mathematics, 156(3):313–324, 2008.


[44] H. Drias, S. Sadeg, and S. Yahi. Cooperative bees swarm for solving the maximum weighted

satisfiability problem. In Computational Intelligence and Bioinspired Systems, pages 318–

325. Springer, 2005.

[45] I. Durgun and A. R. Yildiz. Structural design optimization of vehicle components using

cuckoo search algorithm. MP Materials Testing, 54(3):185, 2012.

[46] J. J. Durillo and A. J. Nebro. jMetal: a Java framework for multi-objective optimization. Advances in Engineering Software, 42:760–771, 2011.

[47] J. J. Durillo, A. J. Nebro, F. Luna, B. Dorronsoro, and E. Alba. jMetal: A Java Framework

for Developing Multi-Objective Optimization Metaheuristics. Technical Report ITI-2006-

10, Departamento de Lenguajes y Ciencias de la Computacion, University of Malaga,

E.T.S.I. Informatica, Campus de Teatinos, December 2006.

[48] A. Eiben and J. Smith. Introduction to Evolutionary Computing. Springer-Verlag, Berlin,

2003.

[49] A. E. Eiben and C. Schippers. On evolutionary exploration and exploitation. Fundamenta

Informaticae, 35(1):35–50, 1998.

[50] A. E. Eiben and J. E. Smith. Introduction to evolutionary computing (natural computing

series). Springer, 2008.

[51] B. G. Elmegreen. The initial stellar mass function from random sampling in a turbulent

fractal cloud. Astrophysical Journal, 486(2):944–954, 1997.

[52] A. P. Engelbrecht. Fundamentals of computational swarm intelligence, volume 1. Wiley

Chichester, 2005.

[53] H. Eskandar, A. Sadollah, A. Bahreininejad, and M. Hamdi. Water cycle algorithm–a

novel metaheuristic optimization method for solving constrained engineering optimization

problems. Computers & Structures, 110-111:151–166, 2012.

[54] H. Eskandari, C. D. Geiger, and G. B. Lamont. FastPGA: A dynamic population sizing

approach for solving expensive multiobjective optimization problems. In S. Obayashi,

K. Deb, C. Poloni, T. Hiroyasu, and T. Murata, editors, Evolutionary Multi-Criterion

Optimization. 4th International Conference, EMO 2007, volume 4403 of Lecture Notes in

Computer Science, pages 141–155. Springer, 2007.

[55] M. Eusuff and K. Lansey. Optimization of water distribution network design using the

shuffled frog leaping algorithm. Journal of Water Resources Planning and Management,

129(3):210–225, 2003.

[56] K. Faegri, L. Van der Pijl, et al. The principles of pollination ecology. Pergamon Press,

Oxford, 1966.

[57] D. P. Feldman. Chaos and Fractals: An Elementary Introduction. Oxford University Press,

USA, 2012.


[58] M. B. Fenton and G. P. Bell. Recognition of species of insectivorous bats by their echolo-

cation calls. Journal of Mammalogy, 62(2):233–243, 1981.

[59] C. Ferreira. Gene expression programming: a new adaptive algorithm for solving problems.

Complex Systems, 13:87–129, 2001.

[60] S. Finck, N. Hansen, R. Ros, and A. Auger. Real-Parameter Black-Box Optimization Benchmarking 2010: Presentation of the Noiseless Functions. Technical Report 20, National Institute for Research in Computer Science and Control, Rocquencourt, France, 2009.

[61] I. Fister, I. J. Fister, J. Brest, and V. Zumer. Memetic artificial bee colony algorithm for

large-scale global optimization. In Evolutionary Computation (CEC), 2012 IEEE Congress

on, pages 1–8. IEEE, 2012.

[62] I. Fister, I. J. Fister, X.-S. Yang, and J. Brest. A comprehensive review of firefly algorithms.

Swarm and Evolutionary Computation, 2013.

[63] I. Fister, M. Mernik, and J. Brest. Hybridization of evolutionary algorithms. In E. Kita,

editor, Evolutionary Algorithms. Intech, 2011.

[64] I. Fister, M. Mernik, and B. Filipic. A hybrid self-adaptive evolutionary algorithm for

marker optimization in the clothing industry. Applied Soft Computing, 10(2):409–422,

2010.

[65] I. Fister, X.-S. Yang, J. Brest, and I. J. Fister. Memetic self-adaptive firefly algorithm. In

Y. X.-S. et al. (Eds.), editor, Swarm Intelligence and Bio-Inspired Computation. Elsevier,

2013.

[66] I. Fister, X.-S. Yang, J. Brest, and I. J. Fister. Modified firefly algorithm using quaternion

representation. Expert Systems with Applications, 2013, Article in press.

[67] I. J. Fister, D. Fister, and I. Fister. Differential evolution strategies with random forest regression in the bat algorithm. In Proceedings of the fifteenth annual conference companion on Genetic and evolutionary computation, GECCO '13 Companion, pages 1703–1706, New York, NY, USA, 2013. ACM.

[68] I. J. Fister, D. Fister, and X.-S. Yang. A hybrid bat algorithm. Electrotechnical review,

80(1-2):1–7, 2013.

[69] I. J. Fister, I. Fister, and J. Brest. Comparing various regression methods on ensemble

strategies in differential evolution. In Proceedings of 19th International Conference on Soft

Computing MENDEL, pages 87–92, 2013.

[70] I. J. Fister, X. Yang, I. Fister, and J. Brest. Memetic firefly algorithm for combinatorial

optimization. In B. Filipic and J. Silc, editors, Bioinspired optimization methods and their

applications : proceedings of the Fifth International Conference on Bioinspired Optimiza-

tion Methods and their Applications - BIOMA 2012, pages 75–86. Jozef Stefan Institute,

2012.


[71] I. J. Fister, X.-S. Yang, I. Fister, J. Brest, and D. Fister. A brief review of nature-inspired

algorithms for optimization. Electrotechnical review, 80(3), 2013.

[72] D. B. Fogel. Applying evolutionary programming to selected traveling salesman problems.

Cybernetics and systems, 24(1):27–36, 1993.

[73] R. A. Formato. Central force optimization: A new metaheuristic with applications in

applied electromagnetics. Progress In Electromagnetics Research, 77:425–491, 2007.

[74] J. Friedman. Greedy function approximation: a gradient boosting machine (English summary). Annals of Statistics, 29(5):1189–1232, 2001.

[75] J. Friedman. Stochastic gradient boosting. Computational Statistics & Data Analysis,

38(4):367–378, 2002.

[76] M. Friedman. The use of ranks to avoid the assumption of normality implicit in the analysis

of variance. Journal of the American Statistical Association, 32:675–701, December 1937.

[77] M. Friedman. A comparison of alternative tests of significance for the problem of m

rankings. The Annals of Mathematical Statistics, 11:86–92, March 1940.

[78] M. Galassi et al. GNU Scientific Library: Reference Manual, Edition 1.15. Network Theory Ltd., 2011.

[79] A. H. Gandomi and A. H. Alavi. Krill herd: a new bio-inspired optimization algorithm.

Communications in Nonlinear Science and Numerical Simulation, 17(12):4831–4845, 2012.

[80] A. H. Gandomi, X.-S. Yang, and A. H. Alavi. Mixed variable structural optimization using

firefly algorithm. Computers & Structures, 89(23):2325–2336, 2011.

[81] A. H. Gandomi, X.-S. Yang, and A. H. Alavi. Cuckoo search algorithm: a metaheuristic ap-

proach to solve structural optimization problems. Engineering with Computers, 29(1):17–

35, 2013.

[82] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari. Bat algorithm for constrained

optimization tasks. Neural Computing and Applications, pages 1–17, 2012, Article in press.

[83] M. Garey and D. Johnson. Computers and Intractability: A Guide to the Theory of NP-

Completeness. W.H. Freeman & Co., New York, NY, USA, 1979.

[84] S. Garnier, J. Gautrais, and G. Theraulaz. The biological principles of swarm intelligence.

Swarm Intelligence, 1(1):3–31, 2007.

[85] Z. W. Geem, J. H. Kim, and G. Loganathan. A new heuristic optimization algorithm:

harmony search. Simulation, 76(2):60–68, 2001.

[86] P. Geurts, D. Ernst, and L. Wehenkel. Extremely randomized trees. Machine learning,

63(1):3–42, 2006.

[87] D. E. Goldberg and J. H. Holland. Genetic algorithms and machine learning. Machine

learning, 3(2):95–99, 1988.


[88] V. Grant and K. A. Grant. Flower pollination in the phlox family. Columbia University

Press, 1965.

[89] D. R. Griffin, F. A. Webster, and C. R. Michael. The echolocation of flying insects by

bats. Animal Behaviour, 8(3):141–154, 1960.

[90] C. Grosan, A. Abraham, and H. Ishibuchi. Hybrid evolutionary algorithms. Springer

Publishing Company, 2007.

[91] G. F. Gunnell, B. F. Jacobs, P. S. Herendeen, J. J. Head, E. Kowalski, C. P. Msuya, F. A. Mizambwa, T. Harrison, J. Habersetzer, and G. Storch. Oldest placental mammal from sub-Saharan Africa: Eocene microbat from Tanzania - evidence for early evolution of sophisticated echolocation. Palaeontologia Electronica, 5(10), 2003.

[92] E. Hagen. Ask a biologist: Echolocation. http://askabiologist.asu.edu/echolocation. Accessed 25 July 2013.

[93] N. Hansen, A. Auger, R. Ros, S. Finck, and P. Posik. Black-box optimization benchmarking (BBOB) 2013. http://coco.gforge.inria.fr/doku.php?id=bbob-2013. Accessed 20 July 2013.

[94] J. A. Hartigan. Clustering algorithms. John Wiley & Sons, Inc., 1975.

[95] A. Hatamlou. Black hole: A new heuristic optimization approach for data clustering.

Information Sciences, 222:175–184, 2012.

[96] T. C. Havens, C. J. Spain, N. G. Salmon, and J. M. Keller. Roach infestation optimization.

In Swarm Intelligence Symposium, 2008. SIS 2008. IEEE, pages 1–7. IEEE, 2008.

[97] S. He, Q. Wu, and J. Saunders. Group search optimizer: an optimization algorithm

inspired by animal searching behavior. Evolutionary Computation, IEEE Transactions on,

13(5):973–990, 2009.

[98] R. Hedayatzadeh, F. Akhavan Salmassi, M. Keshtgari, R. Akbari, and K. Ziarati. Termite

colony optimization: A novel approach for optimizing continuous problems. In Electrical

Engineering (ICEE), 2010 18th Iranian Conference on, pages 553–558. IEEE, 2010.

[99] H. Hernandez and C. Blum. Distributed graph coloring: an approach based on the calling

behavior of japanese tree frogs. Swarm Intelligence, 6(2):117–150, 2012.

[100] J. H. Holland. Adaptation in natural and artificial systems: an introductory analysis with

applications to biology, control and artificial intelligence. MIT press, 1992.

[101] O. Holland and C. Melhuish. Stigmergy, self-organization, and sorting in collective robotics.

Artificial Life, 5(2):173–202, 1999.

[102] H. Hoos and T. Stutzle. Stochastic Local Search: Foundations & Applications. Morgan

Kaufmann Publishers, San Francisco, CA, USA, 2004.


[103] C. Hourigan, C. Johnson, and S. K. Robson. The structure of a micro-bat community in

relation to gradients of environmental variation in a tropical urban area. Urban Ecosystems,

9(2):67–82, 2006.

[104] D. Hrncic. Memetski algoritem za sklepanje o kontekstno neodvisnih gramatikah in njegova uporaba pri nacrtovanju domensko specificnega jezika [A memetic algorithm for context-free grammar inference and its application in domain-specific language design]. Ph. D. Thesis, University of Maribor, Slovenia, 2012.

[105] X. Hu and R. Eberhart. Multiobjective optimization using dynamic neighborhood particle

swarm optimization. In Evolutionary Computation, 2002. CEC’02. Proceedings of the 2002

Congress on, volume 2, pages 1677–1681. IEEE, 2002.

[106] S. Iordache. Consultant-guided search: a new metaheuristic for combinatorial optimiza-

tion problems. In Proceedings of the 12th annual conference on Genetic and evolutionary

computation, pages 225–232. ACM, 2010.

[107] A. K. Jain and R. C. Dubes. Algorithms for clustering data. Prentice-Hall, 1988.

[108] A. K. Jain, M. N. Murty, and P. J. Flynn. Data clustering: a review. ACM computing

surveys (CSUR), 31(3):264–323, 1999.

[109] M. Jamil and H.-J. Zepernick. Levy flights and global optimization. In X.-S. Yang, Z. Cui, R. Xiao, A. H. Gandomi, and M. Karamanoglu, editors, Swarm Intelligence and Bio-inspired Computation, pages 49–72. Elsevier, Oxford, 2013.

[110] N. Jin and Y. Rahmat-Samii. Advances in particle swarm optimization for antenna designs:

Real-number, binary, single-objective and multiobjective implementations. Antennas and

Propagation, IEEE Transactions on, 55(3):556–567, 2007.

[111] S. H. Jung. Queen-bee evolution for genetic algorithms. Electronics letters, 39(6):575–576,

2003.

[112] F. Kang, J. Li, and Q. Xu. Structural inverse analysis by hybrid simplex artificial bee

colony algorithms. Computers & Structures, 87(13):861–870, 2009.

[113] D. Karaboga and B. Akay. A comparative study of artificial bee colony algorithm. Applied

Mathematics and Computation, 214(1):108–132, 2009.

[114] D. Karaboga and B. Akay. A survey: algorithms simulating bee swarm intelligence. Arti-

ficial Intelligence Review, 31(1-4):61–85, 2009.

[115] D. Karaboga and B. Basturk. Artificial bee colony (abc) optimization algorithm for solving

constrained optimization problems. In Foundations of Fuzzy Logic and Soft Computing,

pages 789–798. Springer, 2007.

[116] D. Karaboga and B. Basturk. A powerful and efficient algorithm for numerical func-

tion optimization: artificial bee colony (abc) algorithm. Journal of global optimization,

39(3):459–471, 2007.


[117] D. Karaboga and B. Basturk. On the performance of artificial bee colony (abc) algorithm.

Applied soft computing, 8(1):687–697, 2008.

[118] A. H. Kashan. League championship algorithm: a new algorithm for numerical function op-

timization. In Soft Computing and Pattern Recognition, 2009. SOCPAR’09. International

Conference of, pages 43–48. IEEE, 2009.

[119] A. Kaveh and N. Farhoudi. A new optimization method: Dolphin echolocation. Advances

in Engineering Software, 59:53–70, 2013.

[120] A. Kaveh and S. Talatahari. A novel heuristic optimization method: charged system

search. Acta Mechanica, 213(3-4):267–289, 2010.

[121] J. Kennedy and R. Eberhart. Particle swarm optimization. In Neural Networks, 1995.

Proceedings., IEEE International Conference on, volume 4, pages 1942–1948. IEEE, 1995.

[122] J. F. Kennedy and R. C. Eberhart. Swarm intelligence. Morgan Kaufmann, 2001.

[123] K. Khan, A. Nikov, and A. Sahai. A fuzzy bat clustering method for ergonomic screening

of office workplaces. In Third International Conference on Software, Services and Semantic

Technologies S3T 2011, pages 59–66. Springer, 2011.

[124] K. Khan and A. Sahai. A comparison of BA, GA, PSO, BP and LM for training feed forward neural networks in e-learning context. International Journal of Intelligent Systems and Applications (IJISA), 4(7):23, 2012.

[125] E. Kilic and E. Alpaydin. Learning the areas of expertise of classifiers in an ensemble.

Procedia Computer Science, 3:74–82, 2011.

[126] S. Kirkpatrick, C. Gelatt, and M. Vecchi. Optimization by simulated annealing. Science,

220(4598):671–680, 1983.

[127] J. Knowles and D. Corne. The Pareto archived evolution strategy: A new baseline algorithm for multiobjective optimization. In Proceedings of the 1999 Congress on Evolutionary Computation, pages 98–105, Piscataway, NJ, 1999. IEEE Press.

[128] P. Knuth. Handbook of flower pollination, Volume II. Clarendon Press, 1908.

[129] J. R. Koza. Genetic Programming: vol. 1, On the programming of computers by means of

natural selection. MIT press, 1992.

[130] J. R. Koza. Genetic programming as a means for programming computers by natural

selection. Statistics and Computing, 4(2):87–112, 1994.

[131] J. R. Koza, F. H. Bennett III, and O. Stiffelman. Genetic programming as a Darwinian

invention machine. Springer, 1999.

[132] K. Krishnanand and D. Ghose. Detection of multiple source locations using a glowworm

metaphor with applications to collective robotics. In Swarm Intelligence Symposium, 2005.

SIS 2005. Proceedings 2005 IEEE, pages 84–91. IEEE, 2005.


[133] K. Krishnanand and D. Ghose. Glowworm swarm optimisation: a new method for optimis-

ing multi-modal functions. International Journal of Computational Intelligence Studies,

1(1):93–119, 2009.

[134] S. Kukkonen and J. Lampinen. GDE3: The third evolution step of generalized differential evolution. In IEEE Congress on Evolutionary Computation (CEC'2005), pages 443–450, 2005.

[135] D. Lewis. Convention - A Philosophical Study. Harvard University Press, Cambridge,

1969.

[136] H. Li and Q. Zhang. Multiobjective Optimization Problems With Complicated Pareto Sets,

MOEA/D and NSGA-II. Evolutionary Computation, IEEE Transactions on, 13(2):229–

242, 2009.

[137] X.-L. Li, Z.-J. Shao, and J.-X. Qian. Optimizing method based on autonomous animats:

Fish-swarm algorithm. Xitong Gongcheng Lilun yu Shijian/System Engineering Theory

and Practice, 22(11):32, 2002.

[138] J. J. Liang, A. Qin, P. N. Suganthan, and S. Baskar. Comprehensive learning particle

swarm optimizer for global optimization of multimodal functions. Evolutionary Computa-

tion, IEEE Transactions on, 10(3):281–295, 2006.

[139] A. Liaw and M. Wiener. Classification and regression by randomForest. R News, 2(3):18–22, 2002.

[140] G. Liu, H. Huang, S. Wang, and Z. Chen. An improved bat algorithm with doppler effect

for stochastic optimization. International Journal of Digital Content Technology and its

Applications, 6(21):326–336, 2012.

[141] S.-H. Liu, M. Mernik, and B. R. Bryant. To explore or to exploit: An entropy-driven

approach for evolutionary algorithms. International journal of knowledge-based and intel-

ligent engineering systems, 13(3):185–206, 2009.

[142] P. Lucic and D. Teodorovic. Bee system: modeling combinatorial optimization trans-

portation engineering problems by swarm intelligence. In Preprints of the TRISTAN IV

triennial symposium on transportation analysis, pages 441–445, 2001.

[143] S. Lukasik and S. Zak. Firefly algorithm for continuous constrained optimization tasks.

In Computational Collective Intelligence. Semantic Web, Social Networks and Multiagent

Systems, pages 97–106. Springer, 2009.

[144] R. D. Maia, L. N. de Castro, and W. M. Caminhas. Bee colonies as model for multimodal

continuous optimization: The optbees algorithm. In Evolutionary Computation (CEC),

2012 IEEE Congress on, pages 1–8. IEEE, 2012.

[145] R. Mallipeddi, P. Suganthan, Q. Pan, and M. Tasgetiren. Differential evolution algorithm

with ensemble of parameters and mutation strategies. Applied Soft Computing, 11(2):1679–

1696, 2011.


[146] M. Marichelvam and T. Prabaharan. A bat algorithm for realistic hybrid flowshop scheduling problems to minimize makespan and mean flow time. ICTACT Journal on Soft Computing, 3(1), 2012.

[147] J. Mataric and M. Mataric. Behavior-based coordination in multi-robot systems. In S. S. Ge and F. L. Lewis, editors, Swarm Intelligence and Bio-Inspired Computation. Marcel Dekker, 2005.

[148] A. R. Mehrabian and C. Lucas. A novel numerical optimization algorithm inspired from

weed colonization. Ecological Informatics, 1(4):355–366, 2006.

[149] Z. Michalewicz. Evolution strategies and other methods. In Genetic Algorithms + Data Structures = Evolution Programs, pages 159–177. Springer, 1996.

[150] S. Mishra, K. Shaw, and D. Mishra. A new meta-heuristic bat inspired classification

approach for microarray data. Procedia Technology, 4:802–806, 2012.

[151] A. Mozaffari, A. Fathi, and S. Behzadipour. The great salmon run: a novel bio–inspired

algorithm for artificial system design and optimisation. International Journal of Bio-

Inspired Computation, 4(5):286–301, 2012.

[152] A. Mucherino and O. Seref. Monkey search: a novel metaheuristic search for global op-

timization. In Data Mining, Systems Analysis and Optimization in Biomedicine, volume

953, pages 162–173, 2007.

[153] P. Musikapun and P. Pongcharoen. Solving multi-stage multi-machine multi-product

scheduling problem using bat algorithm. In 2nd International Conference on Manage-

ment and Artificial Intelligence, volume 35, 2012.

[154] R. Nakamura, L. Pereira, K. Costa, D. Rodrigues, J. Papa, and X.-S. Yang. BBA: A binary bat algorithm for feature selection. In Graphics, Patterns and Images (SIBGRAPI), 2012 25th SIBGRAPI Conference on, pages 291–297. IEEE, 2012.

[155] A. Natarajan, S. Subramanian, and K. Premalatha. A comparative study of cuckoo search

and bat algorithm for bloom filter optimisation in spam filtering. International Journal of

Bio-Inspired Computation, 4(2):89–99, 2012.

[156] A. J. Nebro, E. Alba, G. Molina, F. Chicano, F. Luna, and J. J. Durillo. Optimal antenna

placement using a new multi-objective chc algorithm. In GECCO ’07: Proceedings of

the 9th annual conference on Genetic and evolutionary computation, pages 876–883, New

York, NY, USA, 2007. ACM Press.

[157] A. J. Nebro, F. Luna, E. Alba, B. Dorronsoro, J. J. Durillo, and A. Beham. AbYSS: Adapt-

ing Scatter Search to Multiobjective Optimization. Evolutionary Computation, IEEE

Transactions on, 12(4), 2008.

[158] A. Nickabadi, M. M. Ebadzadeh, and R. Safabakhsh. A novel particle swarm optimization

algorithm with adaptive inertia weight. Applied Soft Computing, 11(4):3658–3670, 2011.


[159] U. M. Norberg and J. M. Rayner. Ecological morphology and flight in bats (mammalia;

chiroptera): wing adaptations, flight performance, foraging strategy and echolocation.

Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences,

pages 335–427, 1987.

[160] Y. S. Ong and A. J. Keane. Meta-Lamarckian learning in memetic algorithms. Evolutionary Computation, IEEE Transactions on, 8(2):99–110, 2004.

[161] Y.-S. Ong, M. Lim, and X. Chen. Memetic computation - past, present & future [research

frontier]. Computational Intelligence Magazine, IEEE, 5(2):24–31, 2010.

[162] Y.-S. Ong, M.-H. Lim, N. Zhu, and K.-W. Wong. Classification of adaptive memetic

algorithms: a comparative study. Systems, Man, and Cybernetics, Part B: Cybernetics,

IEEE Transactions on, 36(1):141–152, 2006.

[163] Q.-K. Pan, M. Fatih Tasgetiren, P. N. Suganthan, and T. J. Chua. A discrete artificial

bee colony algorithm for the lot-streaming flow shop scheduling problem. Information

Sciences, 181(12):2455–2468, 2011.

[164] R. Parpinelli and H. Lopes. An eco-inspired evolutionary algorithm applied to numerical

optimization. In Nature and Biologically Inspired Computing (NaBIC), 2011 Third World

Congress on, pages 466–471. IEEE, 2011.

[165] K. M. Passino. Biomimicry of bacterial foraging for distributed optimization and control.

Control Systems, IEEE, 22(3):52–67, 2002.

[166] R. B. Payne and M. D. Sorensen. The cuckoos, volume 15. OUP Oxford, 2005.

[167] F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, et al. Scikit-learn: Machine learning in Python. The Journal of Machine Learning Research, 12:2825–2830, 2011.

[168] D. Pham, A. Ghanbarzadeh, E. Koc, S. Otri, S. Rahim, and M. Zaidi. The bees algorithm-a

novel tool for complex optimisation problems. In Proceedings of the 2nd Virtual Interna-

tional Conference on Intelligent Production Machines and Systems (IPROMS 2006), pages

454–459, 2006.

[169] R. Poli, J. Kennedy, and T. Blackwell. Particle swarm optimization. Swarm intelligence,

1(1):33–57, 2007.

[170] R. Polikar. Ensemble learning. In C. Zhang and Y. Ma, editors, Ensemble Machine

Learning, pages 1–34. Springer-Verlag, Berlin, 2012.

[171] L. Prechelt et al. Proben1: A set of neural network benchmark problems and benchmarking rules. Technical Report 21/94, Fakultat fur Informatik, Univ. Karlsruhe, Karlsruhe, Germany, 1994.

[172] U. Premaratne, J. Samarabandu, and T. Sidhu. A new biologically inspired optimization

algorithm. In Industrial and Information Systems (ICIIS), 2009 International Conference

on, pages 279–284. IEEE, 2009.


[173] K. V. Price. An introduction to differential evolution. In New ideas in optimization, pages

79–108. McGraw-Hill Ltd., UK, 1999.

[174] K. V. Price, R. M. Storn, and J. A. Lampinen. Differential evolution a practical approach

to global optimization. Springer-Verlag, 2005.

[175] P. Rabanal, I. Rodriguez, and F. Rubio. Using river formation dynamics to design heuristic algorithms. In Unconventional Computation, pages 163–177. Springer, 2007.

[176] P. Rabanal, I. Rodriguez, and F. Rubio. Finding minimum spanning/distances trees by using river formation dynamics. In Ant Colony Optimization and Swarm Intelligence, pages 60–71. Springer, 2008.

[177] B. Ramesh, V. C. J. Mohan, and V. V. Reddy. Application of bat algorithm for combined

economic load and emission dispatch. International Journal of Electrical and Electronic

Engineering & Applications, 2(1), 2013.

[178] S. Rao. Engineering optimization : theory and practice. John Wiley & Sons, Inc., New

York, NY, USA, 2009.

[179] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi. GSA: a gravitational search algorithm. Information Sciences, 179(13):2232–2248, 2009.

[180] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi. BGSA: binary gravitational search algorithm. Natural Computing, 9(3):727–745, 2010.

[181] M. V. U. Reddy and A. Manoj. Optimal capacitor placement for loss reduction in distri-

bution systems using bat algorithm. IOSR Journal of Engineering, 2(10), 2012.

[182] T. Robic and B. Filipic. Demo: Differential evolution for multiobjective optimization. In

Evolutionary Multi-Criterion Optimization, pages 520–533. Springer, 2005.

[183] J. Robinson and Y. Rahmat-Samii. Particle swarm optimization in electromagnetics. An-

tennas and Propagation, IEEE Transactions on, 52(2):397–407, 2004.

[184] C. Ryan, J. Collins, and M. O'Neill. Grammatical evolution: Evolving programs for an arbitrary language. In Genetic Programming, pages 83–96. Springer, 1998.

[185] S. Samanta and S. Chakraborty. Parametric optimization of some non-traditional machin-

ing processes using artificial bee colony algorithm. Engineering Applications of Artificial

Intelligence, 24(6):946–957, 2011.

[186] J. Senthilnath, S. Omkar, and V. Mani. Clustering using firefly algorithm: Performance

study. Swarm and Evolutionary Computation, 1(3):164–171, 2011.

[187] H. Shah-Hosseini. Problem solving by intelligent water drops. In Evolutionary Computa-

tion, 2007. CEC 2007. IEEE Congress on, pages 3226–3231. IEEE, 2007.

[188] H. Shah-Hosseini. Intelligent water drops algorithm: A new optimization method for

solving the multiple knapsack problem. International Journal of Intelligent Computing

and Cybernetics, 1(2):193–212, 2008.


[189] H. Shah-Hosseini. The intelligent water drops algorithm: a nature-inspired swarm-based

optimization algorithm. International Journal of Bio-Inspired Computation, 1(1):71–79,

2009.

[190] H. Shah-Hosseini. Principal components analysis by the galaxy-based search algorithm: a

novel metaheuristic for continuous optimisation. International Journal of Computational

Science and Engineering, 6(1):132–140, 2011.

[191] H. Shayeghi and J. Dadashpour. Anarchic society optimization based PID control of an automatic voltage regulator (AVR) system. Electrical and Electronic Engineering, 2(4):199–207, 2012.

[192] P. Shelokar, V. K. Jayaraman, and B. D. Kulkarni. An ant colony approach for clustering.

Analytica Chimica Acta, 509(2):187–195, 2004.

[193] Y. Shi. An optimization algorithm based on brainstorming process. International Journal

of Swarm Intelligence Research (IJSIR), 2(4):35–62, 2011.

[194] Y. Shi and R. Eberhart. Parameter selection in particle swarm optimization. In Evolu-

tionary Programming VII, pages 591–600. Springer, 1998.

[195] Y. Shi and R. Eberhart. Particle swarm optimization: developments, applications and

resources. In Evolutionary Computation, 2001. Proceedings of the 2001 Congress on, vol-

ume 1, pages 81–86. IEEE, 2001.

[196] Y. Shi and R. C. Eberhart. Empirical study of particle swarm optimization. In Evolutionary

Computation, 1999. CEC 99. Proceedings of the 1999 Congress on, volume 3. IEEE, 1999.

[197] D. Simon. Biogeography-based optimization. Evolutionary Computation, IEEE Transac-

tions on, 12(6):702–713, 2008.

[198] A. Singh. An artificial bee colony algorithm for the leaf-constrained minimum spanning

tree problem. Applied Soft Computing, 9(2):625–631, 2009.

[199] R. Storn and K. Price. Differential evolution–a simple and efficient heuristic for global

optimization over continuous spaces. Journal of global optimization, 11(4):341–359, 1997.

[200] K. Strickler. What is pollination? http://www.pollinatorparadise.com/what_is_pollination.htm. Accessed 25 July 2013.

[201] S. Su, J. Wang, W. Fan, and X. Yin. Good lattice swarm algorithm for constrained

engineering design optimization. In Wireless Communications, Networking and Mobile

Computing, 2007. WiCom 2007. International Conference on, pages 6421–6424. IEEE,

2007.

[202] M. Sudhakaran and S. Slochanal. Integrating genetic algorithms and tabu search for

emission and economic dispatch problems. Journal-Institution of Engineers India Part

Electrical Engineering, 86:22, 2005.


[203] P. Suganthan, N. Hansen, J. Liang, K. Deb, Y.-P. Chen, A. Auger, and S. Tiwari. Special session on real-parameter optimization at CEC-05. http://www.ntu.edu.sg/home/epnsugan/index_files/CEC-05/CEC05.htm. Accessed 20 July 2013.

[204] C. Sur, S. Sharma, and A. Shukla. Egyptian vulture optimization algorithm - a new nature inspired meta-heuristics for knapsack problem. In The 9th International Conference on Computing and Information Technology (IC2IT2013), pages 227–237. Springer, 2013.

[205] A. Tamiru and F. Hashim. Application of bat algorithm and fuzzy systems to model

exergy changes in a gas turbine. In Artificial Intelligence, Evolutionary Computing and

Metaheuristics, pages 685–719. Springer, 2013.

[206] K. Tamura and K. Yasuda. Spiral dynamics inspired optimization. Journal of Advanced

Computational Intelligence and Intelligent Informatics, 15(8):1116–1122, 2011.

[207] R. Tang, S. Fong, X.-S. Yang, and S. Deb. Wolf search algorithm with ephemeral memory.

In Digital Information Management (ICDIM), 2012 Seventh International Conference on,

pages 165–172, 2012.

[208] E. C. Teeling, O. Madsen, R. A. Van Den Bussche, W. W. De Jong, M. J. Stanhope,

and M. S. Springer. Microbat paraphyly and the convergent evolution of a key innovation

in old world rhinolophoid microbats. Proceedings of the National Academy of Sciences,

99(3):1431–1436, 2002.

[209] D. Teodorovic and M. Dell'Orco. Bee colony optimization - a cooperative learning approach to complex transportation problems. In Advanced OR and AI Methods in Transportation: Proceedings of 16th Mini-EURO Conference and 10th Meeting of EWGT (13-16 September 2005), pages 51–60. Poznan: Publishing House of the Polish Operational and System Research, 2005.

[210] T. Ting, K. L. Man, S.-U. Guan, M. Nayel, and K. Wan. Weightless swarm algorithm (WSA) for dynamic optimization problems. In Network and Parallel Computing, pages 508–515. Springer, 2012.

[211] M. Tuba, M. Subotic, and N. Stanarevic. Modified cuckoo search algorithm for uncon-

strained optimization problems. In Proceedings of the 5th European conference on Euro-

pean computing conference, pages 263–268. World Scientific and Engineering Academy and

Society (WSEAS), 2011.

[212] D. Van der Merwe and A. Engelbrecht. Data clustering using particle swarm optimization.

In Evolutionary Computation, 2003. CEC’03. The 2003 Congress on, volume 1, pages

215–220. IEEE, 2003.

[213] D. A. Van Veldhuizen and G. B. Lamont. Multiobjective evolutionary algorithms: Analyzing the state-of-the-art. Evolutionary Computation, 8(2):125–147, 2000.

[214] J. Vesterstrom and R. Thomsen. A comparative study of differential evolution, particle

swarm optimization, and evolutionary algorithms on numerical benchmark problems. In

Evolutionary Computation, 2004. CEC2004. Congress on, volume 2, pages 1980–1987.

IEEE, 2004.


[215] T. Vicsek, A. Czirok, E. Ben-Jacob, I. Cohen, and O. Shochet. Novel type of phase

transition in a system of self-driven particles. Physical Review Letters, 75(6):1226–1229,

1995.

[216] S. Walton, O. Hassan, K. Morgan, and M. Brown. Modified cuckoo search: A new gradient

free optimisation algorithm. Chaos, Solitons & Fractals, 44(9):710–718, 2011.

[217] G. Wang and L. Guo. A novel hybrid bat algorithm with harmony search for global

numerical optimization. Journal of Applied Mathematics, 2013, Article ID 696491, 2013.

[218] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang. A bat algorithm with mutation for ucav

path planning. The Scientific World Journal, 2012, 2012.

[219] H. Wedde, M. Farooq, and Y. Zhang. BeeHive: An efficient fault-tolerant routing algorithm inspired by honey bee behavior. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3172 LNCS:83–94, 2004.

[220] Wikipedia. Ant colony optimization algorithms. http://en.wikipedia.org/wiki/Ant_colony_optimization_algorithms, 2013.

[221] D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. Evolutionary

Computation, IEEE Transactions on, 1(1):67–82, 1997.

[222] M. Wooldridge. An Introduction to MultiAgent Systems. John Wiley & Sons, Chichester, UK, 2009.

[223] J. Xie, Y. Zhou, and H. Chen. A novel bat algorithm based on differential operator

and levy-flights trajectory. Computational Intelligence and Neuroscience, 2013, Article ID

453812, 2013.

[224] L. Xie, Y. Tan, J. Zeng, and Z. Cui. Artificial physics optimisation: a brief survey.

International Journal of Bio-Inspired Computation, 2(5):291–302, 2010.

[225] L. Xie, J. Zeng, and Z. Cui. General framework of artificial physics optimization algorithm.

In Nature & Biologically Inspired Computing, 2009. NaBIC 2009. World Congress on,

pages 1321–1326. IEEE, 2009.

[226] L. Xie, J. Zeng, and R. A. Formato. Convergence analysis and performance of the ex-

tended artificial physics optimization algorithm. Applied Mathematics and Computation,

218(8):4000–4011, 2011.

[227] Y. Xu, Z. Cui, and J. Zeng. Social emotional optimization algorithm for nonlinear con-

strained optimization problems. In Swarm, Evolutionary, and Memetic Computing, pages

583–590. Springer, 2010.

[228] G.-W. Yan and Z.-J. Hao. A novel optimization algorithm based on atmosphere clouds

model. International Journal of Computational Intelligence and Applications, 12(01), 2013.


[229] X.-S. Yang. Engineering optimizations via nature-inspired virtual bee algorithms. In

Artificial Intelligence and Knowledge Engineering Applications: A Bioinspired Approach,

pages 317–323. Springer, 2005.

[230] X.-S. Yang. Firefly algorithms for multimodal optimization. In Stochastic algorithms:

foundations and applications, pages 169–178. Springer, 2009.

[231] X.-S. Yang. Appendix a: Test problems in optimization. In X.-S. Yang, editor, Engineering

Optimization, pages 261–266. John Wiley & Sons, Inc., Hoboken, NJ, USA, 2010.

[232] X.-S. Yang. Firefly algorithm, stochastic test functions and design optimisation. Interna-

tional Journal of Bio-Inspired Computation, 2(2):78–84, 2010.

[233] X.-S. Yang. A new metaheuristic bat-inspired algorithm. Nature Inspired Cooperative

Strategies for Optimization (NICSO 2010), pages 65–74, 2010.

[234] X.-S. Yang. Bat algorithm for multiobjective optimisation. International Journal of Bio-

Inspired Computation, 3(5):267–274, 2011.

[235] X.-S. Yang. Nature-inspired metaheuristic algorithms. Luniver Press, 2011.

[236] X.-S. Yang. Flower pollination algorithm for global optimization. Unconventional Com-

putation and Natural Computation, pages 240–249, 2012.

[237] X.-S. Yang and S. Deb. Cuckoo search via levy flights. In Nature & Biologically Inspired

Computing, 2009. NaBIC 2009. World Congress on, pages 210–214. IEEE, 2009.

[238] X.-S. Yang and S. Deb. Eagle strategy using levy walk and firefly algorithms for stochastic

optimization. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010),

pages 101–111. Springer, 2010.

[239] X.-S. Yang and S. Deb. Engineering optimisation by cuckoo search. International Journal

of Mathematical Modelling and Numerical Optimisation, 1(4):330–343, 2010.

[240] X.-S. Yang and S. Deb. Multiobjective cuckoo search for design optimization. Computers

& Operations Research, 40:1616–1624, 2013.

[241] X.-S. Yang, S. Deb, and S. Fong. Accelerated particle swarm optimization and support vec-

tor machine for business optimization and applications. In Networked digital technologies,

pages 53–66. Springer, 2011.

[242] X.-S. Yang and A. H. Gandomi. Bat algorithm: a novel approach for global engineering

optimization. Engineering Computations, 29(5):464–483, 2012.

[243] X.-S. Yang and X. He. Bat algorithm: literature review and applications. International

Journal of Bio-Inspired Computation, 5(3):141–149, 2013.

[244] X.-S. Yang, M. Karamanoglu, and S. Fong. Bat algorithm for topology optimization in

microelectronic applications. In Future Generation Communication Technology (FGCT),

2012 International Conference on, pages 150–155. IEEE, 2012.


[245] X.-S. Yang, M. Karamanoglu, and X. He. Multi-objective flower algorithm for optimization.

Procedia Computer Science, 18:861–868, 2013.

[246] X.-S. Yang, J. M. Lees, and C. T. Morley. Application of virtual ant algorithms in the

optimization of cfrp shear strengthened precracked structures. In Computational Science–

ICCS 2006, pages 834–837. Springer, 2006.

[247] X.-S. Yang, S. S. Sadat Hosseini, and A. H. Gandomi. Firefly algorithm for solving non-

convex economic dispatch problems with valve loading effect. Applied Soft Computing,

12(3):1180–1186, 2012.

[248] X. Yao, Y. Liu, and G. Lin. Evolutionary programming made faster. Evolutionary Com-

putation, IEEE Transactions on, 3(2):82–102, 1999.

[249] S. Yilmaz and E. U. Kucuksille. Improved bat algorithm (iba) on continuous optimization

problems. Lecture Notes on Software Engineering, 1(3):279–283, 2013.

[250] K.-C. Ying and C.-J. Liao. An ant colony system for permutation flow-shop sequencing.

Computers & Operations Research, 31(5):791–801, 2004.

[251] Z. Zandi, E. Afjei, and M. Sedighizadeh. Reactive power dispatch using big bang-big

crunch optimization algorithm for voltage stability enhancement. In Power and Energy

(PECon), 2012 IEEE International Conference on, pages 239–244. IEEE, 2012.

[252] C. Zhang, D. Ouyang, and J. Ning. An artificial bee colony approach for clustering. Expert

Systems with Applications, 37(7):4761–4767, 2010.

[253] J. W. Zhang and G. G. Wang. Image matching using a bat algorithm with mutation.

Applied Mechanics and Materials, 203:88–93, 2012.

[254] L. M. Zhang, C. Dahlmann, and Y. Zhang. Human-inspired algorithms for continuous

function optimization. In Intelligent Computing and Intelligent Systems, 2009. ICIS 2009.

IEEE International Conference on, volume 1, pages 318–321. IEEE, 2009.

[255] Y. Zhang and L. Wu. Artificial bee colony for two dimensional protein folding. Advances

in Electrical Engineering Systems, 1(1):19–23, 2012.

[256] S. Zhao, J. Liang, P. Suganthan, and M. Tasgetiren. Dynamic multi-swarm particle swarm

optimizer with local search for large scale global optimization. In Evolutionary Compu-

tation, 2008. CEC 2008.(IEEE World Congress on Computational Intelligence). IEEE

Congress on, pages 3845–3852. IEEE, 2008.

[257] G. Zhu and S. Kwong. Gbest-guided artificial bee colony algorithm for numerical function

optimization. Applied Mathematics and Computation, 217(7):3166–3173, 2010.

[258] E. Zitzler, M. Laumanns, and L. Thiele. SPEA2: Improving the strength pareto evolu-

tionary algorithm. Technical Report 103, Computer Engineering and Networks Laboratory

(TIK), Swiss Federal Institute of Technology (ETH), Zurich, Switzerland, 2001.
