Genetic Algo 1
Transcript of Genetic Algo 1 (8/17/2019)
Nontraditional Optimization Algorithms

This chapter describes two nontraditional search and optimization methods which have become popular in engineering optimization problems in the recent past. These algorithms are included in this book not because they are new but because they are found to be potential search and optimization algorithms for complex engineering optimization problems. Genetic algorithms (GAs) mimic the principles of natural genetics and natural selection to constitute search and optimization procedures. Simulated annealing mimics the cooling phenomenon of molten metals to constitute a search procedure. Since both these algorithms are abstractions from a …
… 1975; Michalewicz, 1992) and through a number of international conference proceedings (Belew and Booker, 1991; Forrest, 1993; Grefenstette, 1985, 1987; Rawlins, 1991; Schaffer, 1989; Whitley, 1993). An extensive list of GA-related papers is referenced elsewhere (Goldberg et al., 1992). GAs are fundamentally different from the classical optimization algorithms we have discussed in Chapters 2 through 5. We begin the discussion of GAs by first outlining the working principles of GAs and then highlighting the differences GAs have with the traditional search methods. Thereafter, we show a computer simulation to illustrate the working of GAs.
6.1.1 Working principles

To illustrate the working principles of GAs, we first consider an unconstrained optimization problem. Later, we shall discuss how GAs can be used to solve a constrained optimization problem. Let us consider the following maximization problem:

    Maximize f(x),    x_i^(L) <= x_i <= x_i^(U),    i = 1, 2, ..., N.
Optimization for Engineering Design: Algorithms and Examples
to a fixed mapping rule. Usually, the following linear mapping rule is used:

    x_i = x_i^(L) + [(x_i^(U) - x_i^(L)) / (2^{l_i} - 1)] * (decoded value of s_i).    (6.1)

In the above equation, the variable x_i is coded in a substring s_i of length l_i. The decoded value of a binary substring s_i is calculated as sum_{j=0}^{l-1} 2^j s_j, where s_j is either 0 or 1 and the string s is represented as (s_{l-1} s_{l-2} ... s_2 s_1 s_0).
For example, a four-bit string (0111) has a decoded value equal to ((1)2^0 + (1)2^1 + (1)2^2 + (0)2^3) or 7. It is worthwhile to mention here that with four bits to code each variable, there are only 2^4 or 16 distinct substrings possible, because each bit-position can take a value of either 0 or 1. The accuracy that can be obtained with a four-bit coding is only approximately 1/16th of the search space. But as the string length is increased by one, the obtainable accuracy increases exponentially to 1/32nd of the search space.
It is not necessary to code all variables in equal substring lengths. The length of a substring representing a variable depends on the desired accuracy in that variable. Generalizing this concept, we may say that with an l_i-bit coding for a variable, the obtainable accuracy in that variable is approximately (x_i^(U) - x_i^(L)) / 2^{l_i}. Once the coding of the variables has been done, the corresponding point x = (x_1, x_2, ..., x_N)^T can be found using Equation (6.1). Thereafter, the function value at the point x can also be calculated by substituting x in the given objective function f(x).
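The decoding and linear mapping just described can be sketched as follows (the four-bit string from the text and the bounds 0 to 6 are example choices, not fixed by the method):

```python
def decode(substring):
    """Decoded value of a binary substring (s_{l-1} ... s_1 s_0),
    given as a list of bits with the most significant bit first."""
    value = 0
    for bit in substring:
        value = 2 * value + bit
    return value

def map_to_variable(substring, x_low, x_high):
    """Linear mapping rule of Equation (6.1)."""
    l = len(substring)
    return x_low + (x_high - x_low) / (2**l - 1) * decode(substring)

# The four-bit string (0111) from the text has decoded value 7.
bits = [0, 1, 1, 1]
print(decode(bits))                      # 7
print(map_to_variable(bits, 0.0, 6.0))   # 0 + (6/15) * 7 = 2.8
```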
Fitness function

As pointed out earlier, GAs mimic the survival-of-the-fittest principle of nature to make a search process. Therefore, GAs are naturally suitable for solving maximization problems. Minimization problems are usually transformed into maximization problems by some suitable transformation. In general, a fitness function F(x) is derived from the objective function and used in successive genetic operations. Certain genetic operators require that the fitness function be nonnegative, although certain operators do not have this requirement. For maximization problems, the fitness function can be considered to be the same as the objective function, or F(x) = f(x). For minimization problems, the fitness function is an equivalent maximization problem chosen such that the optimum point remains unchanged. A number of such transformations are possible. The following fitness function is often used:

    F(x) = 1 / (1 + f(x)).    (6.2)
This transformation does not alter the location of the minimum, but converts a minimization problem to an equivalent maximization problem. The fitness function value of a string is known as the string's fitness.
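As a small sketch of this transformation, the fitness of a point in a minimization problem can be computed as follows (the objective function here is an arbitrary placeholder, not one from the book):

```python
def fitness(f, x):
    """Fitness of a point for a minimization problem, using the
    transformation F(x) = 1 / (1 + f(x)) of Equation (6.2).
    Assumes f(x) >= 0, so F lies in (0, 1]."""
    return 1.0 / (1.0 + f(x))

# Placeholder objective: f(x) = x^2, minimized at x = 0,
# where the fitness reaches its maximum value F = 1.
f = lambda x: x * x
print(fitness(f, 0.0))  # 1.0
print(fitness(f, 3.0))  # 0.1
```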
The operation of GAs begins with a population of random strings representing design or decision variables. Thereafter, each string is evaluated to find the fitness value. The population is then operated by three main operators (reproduction, crossover, and mutation) to create a new population of points. The new population is further evaluated and tested for termination. If the termination criterion is not met, the population is iteratively operated by the above three operators and evaluated. This procedure is continued until the termination criterion is met. One cycle of these operations and the subsequent evaluation procedure is known as a generation in GA's terminology. The operators are described next.
GA operators

Reproduction is usually the first operator applied on a population. Reproduction selects good strings in a population and forms a mating pool. That is why the reproduction operator is sometimes known as the selection operator. There exist a number of reproduction operators in the GA literature, but the essential idea in all of them is that the above-average strings are picked from the current population and their multiple copies are inserted in the mating pool in a probabilistic manner.
The commonly-used reproduction operator is the proportionate reproduction operator, where a string is selected for the mating pool with a probability proportional to its fitness. Thus, the i-th string in the population is selected with a probability proportional to F_i. Since the population size is usually kept fixed in a simple GA, the sum of the probabilities of each string being selected for the mating pool must be one. Therefore, the probability for selecting the i-th string is

    p_i = F_i / (sum_{j=1}^{n} F_j),

where n is the population size. One way to implement this selection scheme is to imagine a roulette-wheel with its circumference marked for each string proportionate to the string's fitness. The roulette-wheel is spun n times, each time selecting an instance of the string chosen by the roulette-wheel pointer. Since the circumference of the wheel is marked according to a string's fitness, this roulette-wheel
mechanism is expected to make F_i / F̄ copies of the i-th string in the mating pool, where F̄ is the average fitness of the population, calculated as

    F̄ = (sum_{i=1}^{n} F_i) / n.
Figure 6.1 shows a roulette-wheel for five individuals having different fitness values:

    Point   Fitness
    1        25.0
    2         5.0
    3        40.0
    4        10.0
    5        20.0

Figure 6.1 A roulette-wheel marked for five individuals according to their fitness values. The third individual has a higher probability of selection than any other.

Since the third individual has a higher fitness value than any other, it is expected that the roulette-wheel selection will choose the third individual more often than any other individual. This roulette-wheel selection scheme can be simulated easily.
Using the fitness value F_i of all strings, the probability of selecting a string p_i can be calculated. Thereafter, the cumulative probability P_i of each string being copied can be calculated by adding the individual probabilities from the top of the list. Thus, the bottom-most string in the population should have a cumulative probability (P_n) equal to 1. The roulette-wheel concept can be simulated by realizing that the i-th string in the population represents the cumulative probability values from P_{i-1} to P_i. The first string represents the cumulative values from zero to P_1. Thus, the cumulative probability of any string lies between 0 and 1. In order to choose n strings, n random numbers between zero and one are created at random. A string that represents the chosen random number in the cumulative probability range (calculated from the fitness values) for the string is copied to the mating pool. This way, a string with a higher fitness value represents a larger range in the cumulative probability values and therefore has a higher probability of being copied into the mating pool. On the other hand, a string with a smaller fitness value represents a smaller range in the cumulative probability values and has a smaller probability of being copied into the mating pool. We illustrate the working of this roulette-wheel simulation later through a computer simulation of GAs.
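This simulation can be sketched directly from the description above. The fitness values are those of Figure 6.1; the number of spins (1000) and the random seed are arbitrary choices for the example:

```python
import random

def roulette_wheel_selection(fitnesses, n, rng):
    """Select n strings by proportionate reproduction: index i owns
    the cumulative-probability range (P_{i-1}, P_i]."""
    total = sum(fitnesses)
    probs = [f / total for f in fitnesses]
    # Cumulative probabilities P_1, ..., P_n (P_n equals 1).
    cumulative, running = [], 0.0
    for p in probs:
        running += p
        cumulative.append(running)
    mating_pool = []
    for _ in range(n):
        r = rng.random()  # a random number between 0 and 1
        for i, P in enumerate(cumulative):
            if r <= P:
                mating_pool.append(i)
                break
    return mating_pool

# Fitness values of the five individuals in Figure 6.1.
fitnesses = [25.0, 5.0, 40.0, 10.0, 20.0]
pool = roulette_wheel_selection(fitnesses, 1000, random.Random(0))
# Individual 3 (index 2) holds 40% of the wheel, so it is copied
# into the mating pool most often.
counts = [pool.count(i) for i in range(5)]
print(counts)
```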
In reproduction, good strings in a population are probabilistically assigned a larger number of copies and a mating pool is formed. It is important to note that no new strings are formed in the reproduction phase. In the crossover operator, new strings are created by exchanging information among strings of the mating pool. Many crossover operators exist in the GA literature. In most crossover operators, two strings are picked from the mating pool at random and some portions of the strings are exchanged between the strings. A single-point crossover operator is performed by randomly choosing a crossing site along the string and by exchanging all bits on the right side of the crossing site as shown:

    0 0 | 0 0 0          0 0 | 1 1 1
                   =>
    1 1 | 1 1 1          1 1 | 0 0 0

The two strings participating in the crossover operation are known as parent strings and the resulting strings are known as children strings. It is intuitive from this construction that good substrings from parent strings can be combined to form a better child string, if an appropriate site is chosen. Since the knowledge of an appropriate site is usually not known beforehand, a random site is often chosen. With a random site, the children strings produced may or may not have a combination of good substrings from parent strings, depending on whether or not the crossing site falls in the appropriate place. But we do not worry about this too much, because if good strings are created by crossover, there will be more copies of them in the next mating pool generated by the reproduction operator. But if good strings are not created by crossover, they will not survive too long, because reproduction will select against those strings in subsequent generations.
It is clear from this discussion that the effect of crossover may be detrimental or beneficial. Thus, in order to preserve some of the good strings that are already present in the mating pool, not all strings in the mating pool are used in crossover. When a crossover probability of p_c is used, only 100p_c per cent of the strings in the population are used in the crossover operation and 100(1 - p_c) per cent of the population
remains as it is in the current population [1]. A crossover operator is mainly responsible for the search of new strings, even though a mutation operator is also used for this purpose sparingly.
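Single-point crossover as described above can be sketched as follows. The coin-flip with probability p_c follows the scheme of footnote 2; the seed and the all-0/all-1 parents (as in the text's diagram) are example choices:

```python
import random

def single_point_crossover(parent1, parent2, pc, rng):
    """With probability pc, exchange all bits to the right of a
    randomly chosen crossing site; otherwise return parent copies."""
    if rng.random() >= pc:
        return parent1[:], parent2[:]
    site = rng.randrange(1, len(parent1))  # crossing site, 1 .. l-1
    return (parent1[:site] + parent2[site:],
            parent2[:site] + parent1[site:])

# Crossing the parents 00000 and 11111, as in the text's sketch.
rng = random.Random(1)
c1, c2 = single_point_crossover([0] * 5, [1] * 5, 1.0, rng)
print(c1, c2)  # a block of 0s followed by 1s, and its complement
```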
The mutation operator changes a 1 to a 0 and vice versa with a small mutation probability p_m. The bit-wise mutation is performed bit by bit by flipping a coin [2] with a probability p_m; if at any bit the outcome is true, the bit is altered; otherwise it is kept unchanged. The need for mutation is to create a point in the neighbourhood of the current point, thereby achieving a local search around the current solution. Mutation is also used to maintain diversity in the population. For example, consider the following population having four eight-bit strings:
0110 1011
0011 1101
0001 0110
0111 1100
Notice that all four strings have a 0 in the left-most bit position. If the true optimum solution requires a 1 in that position, then neither the reproduction nor the crossover operator described above will be able to create a 1 in that position. The inclusion of mutation introduces some probability (N p_m) of turning that 0 into a 1.
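Bit-wise mutation consistent with this description can be sketched as follows (the probability value and the seed below are arbitrary example choices):

```python
import random

def mutate(string, pm, rng):
    """Flip each bit independently with mutation probability pm."""
    return [1 - bit if rng.random() < pm else bit for bit in string]

rng = random.Random(0)
original = [0, 1, 1, 0, 1, 0, 1, 1]
mutated = mutate(original, 0.05, rng)
# With a small pm most bits survive unchanged; an occasional flip is
# what reintroduces a bit value that every string in the population lost.
print(mutated)
```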
These three operators are simple and straightforward. The reproduction operator selects good strings, and the crossover operator recombines good substrings from good strings together to hopefully create a better substring; the mutation operator alters a string locally to hopefully create a better string. Even though none of these claims are guaranteed and/or tested while creating a new population of strings, it is expected that if bad strings are created they will be eliminated by the reproduction operator in the next generation, and if good strings are created, they will be increasingly emphasized. Interested readers may refer to Goldberg (1989) and other GA literature given in the references for further insight and some mathematical foundations of genetic algorithms.
Here, we outline some differences and similarities of GAs with traditional optimization methods.
[1] Even though the best (1 - p_c)·100% of the current population can be copied deterministically to the new population, this is usually performed at random.

[2] Flipping of a coin with a probability p is simulated as follows: a number between 0 and 1 is chosen at random; if the random number is smaller than p, the outcome of the coin-flipping is true, otherwise the outcome is false.
6.1.2 Differences between GAs and traditional methods

As seen from the above description of the working principles of GAs, they are radically different from most of the traditional optimization methods described in Chapters 2 to 4. The fundamental differences are described in the following paragraphs.
GAs work with a string-coding of variables instead of the variables themselves. The advantage of working with a coding of variables is that the coding discretizes the search space, even though the function may be continuous. On the other hand, since GAs require only function values at various discrete points, a discrete or discontinuous function can be handled with no extra cost. This allows GAs to be applied to a wide variety of problems. Another advantage is that the GA operators exploit the similarities in string-structures to make an effective search. Let us discuss this important aspect of GAs in somewhat more detail. A schema (pl. schemata) represents a number of strings with similarities at certain string positions. For example, in a five-bit problem, the schema (101**) (a * denotes either a 0 or a 1) represents four strings: (10100), (10101), (10110), and (10111).
In the decoded parameter space, a schema represents a continuous or discontinuous region in the search space. Figure 6.2 shows that the above schema represents one-eighth of the search space. Since in an l-bit schema every position can take either 0, 1,
    [xmin]  000** | 001** | 010** | 011** | 100** | 101** | 110** | 111**  [xmax]

Figure 6.2 A schema with three fixed positions divides the search space into eight regions. The schema (101**) is highlighted.
or a *, there are a total of 3^l schemata possible. A finite population of size n contains only n strings, but contains many schemata. Goldberg (1989) has shown that due to the action of GA operators, the number of strings m(H, t) representing a schema H at any generation t grows to a number m(H, t+1) in the next generation as follows:

    m(H, t+1) >= m(H, t) (F(H) / F̄) [1 - p_c δ(H)/(l - 1) - p_m o(H)],    (6.3)

where the factor multiplying m(H, t) on the right-hand side is the growth factor P, F(H) is the fitness of the schema H, calculated by averaging
the fitness of all strings representing the schema, δ(H) is the defining length of the schema H, calculated as the difference between the outermost defined positions, and o(H) is the order of the schema H, calculated as the number of fixed positions in the schema. For example, the schema H = (101**) has a defining length δ(H) = 3 - 1 = 2 and an order o(H) = 3.
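The order and defining length used in Equation (6.3) can be computed directly from a schema written as a string of 0s, 1s, and *s; a small sketch:

```python
def order(schema):
    """o(H): the number of fixed (non-*) positions in the schema."""
    return sum(1 for c in schema if c != '*')

def defining_length(schema):
    """delta(H): the difference between the outermost fixed
    positions (positions counted from 1 at the left)."""
    fixed = [i + 1 for i, c in enumerate(schema) if c != '*']
    return fixed[-1] - fixed[0]

print(order('101**'), defining_length('101**'))  # 3 2
```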
The growth factor P defined in the above equation can be greater than, less than, or equal to 1, depending on the schema H and the chosen GA parameters. If for a schema the growth factor P >= 1, the number of strings representing that schema grows with generation; otherwise the representative strings of the schema reduce with generation.
The above inequality suggests that for a schema having a small defining length (small δ(H)), a few fixed positions (small o(H)), and an above-average fitness (F(H) > F̄), the growth factor P is likely to be greater than 1. Schemata for which the growth factor is greater than 1 grow exponentially with generation. These schemata usually represent a large, good region (a region with many high-fitness points) in the search space. These schemata are known as building blocks in GA parlance. These building blocks, representing different good regions in the search space, get exponentially more copies and get combined with each other by the action of GA operators and finally form the optimum or a near-optimum solution. Even though this is the basic understanding of how GAs work, there exists some mathematical rigour to this hypothesis (Davis and Principe, 1991; Vose and Liepins, 1991).
Holland (1975) has shown that even though n population members are modified in a generation, about n^3 schemata get processed in a generation. This leverage comes without any extra book-keeping (Goldberg, 1989) and provides an implicit parallelism in the working of genetic algorithms. Even though there are a number of advantages of using a coding of variables, there are also some disadvantages. One of the drawbacks of using a coding representation is that a meaningful and appropriate coding of the problem needs to be used, otherwise GAs may not converge to the right solution (Goldberg et al., 1989; Kargupta et al., 1992). However, a general guideline would be to use a coding that does not make the problem harder than the original problem.
The most striking difference between GAs and many traditional optimization methods is that GAs work with a population of points instead of a single point. Because more than one string is being processed simultaneously, it is very likely that the expected GA solution may be a global solution. Even though some traditional algorithms are population-based, like Box's evolutionary optimization and complex search methods, those methods do not use previously obtained information efficiently. In GAs, previously found good information is emphasized using the reproduction operator and propagated adaptively through the crossover and mutation operators. Another advantage with a population-based search algorithm is that multiple optimal solutions can be captured in the population easily, thereby reducing the effort to use the same algorithm many times. Some extensions of GAs along these directions, multimodal function optimization (Deb, 1989; Goldberg and Richardson, 1987) and multiobjective function optimization (Horn and Nafpliotis, 1993; Schaffer, 1984; Srinivas and Deb, 1995), have been researched and are outlined in Section 6.1.7.
In discussing GA operators or their working principles in the previous section, nothing has been mentioned about the gradient or any other auxiliary problem information. In fact, GAs do not require any auxiliary information except the objective function values. Although the direct search methods used in traditional optimization methods do not explicitly require the gradient information, some of those methods use search directions that are similar in concept to the gradient of the function. Moreover, some direct search methods work under the assumption that the function to be optimized is unimodal and continuous. In GAs, no such assumption is necessary.
One other difference in the operation of GAs is the use of probabilities in their operators. None of the genetic operators work deterministically. In the reproduction operator, even though a string is expected to have F_i / F̄ copies in the mating pool, a simulation of the roulette-wheel selection scheme is used to assign the true number of copies. In the crossover operator, even though good strings (obtained from the mating pool) are crossed, strings to be crossed are chosen at random and cross-sites are created at random. In the mutation operator, a random bit is suddenly altered. The action of these operators may appear to be naive, but careful studies may provide some interesting insights about this type of search.
The basic problem with most of the traditional methods is that they use fixed transition rules to move from one point to another. For instance, in the steepest descent method, the search direction is always calculated as the negative of the gradient at any point, because in that direction the reduction in the function value is maximum. In trying to solve a multimodal problem with many local optimum points (interestingly, many real-world engineering optimization problems are likely to be multimodal), search procedures may easily get trapped in one of the local optimum points. Consider the bimodal function shown in Figure 6.3. The objective function has one local minimum and one global minimum. If the initial point is chosen to be a point
Figure 6.3 An objective function with one local optimum and one global optimum. The point x(t) is in the local basin.
in the local basin (point x(t) in the figure), the steepest descent algorithm will eventually find the local optimum point. Since the transition rules are rigid, there is no escape from these local optima. The only way to solve the above problem to global optimality is to have a starting point in the global basin. Since this information is usually not known in any problem, the steepest-descent method (and, for that matter, most traditional methods) fails to locate the global optimum.
We show simulation results depicting the inability of the steepest descent method to find the global optimum point on a multimodal problem in Section 6.3. However, these traditional methods can be best applied to a special class of problems suitable for those methods. For example, the gradient search methods will outperform almost any algorithm in solving continuous, unimodal problems, but they are not suitable for multimodal problems. Thus, in general, traditional methods are not robust. A robust algorithm can be designed in such a way that it uses the steepest descent direction most of the time, but also uses the steepest ascent direction (or any other direction) with some probability. Such a mixed strategy may require a larger number of function evaluations to solve continuous, unimodal problems, because of the extra computations involved in trying non-descent directions. But this strategy may be able to solve complex, multimodal problems to global optimality.
In the multimodal problem shown in the above figure, the mixed strategy may take the point x(t) into the global basin (when tried with non-descent directions) and finally find the global optimum point. GAs use similar search strategies by using probability in all their operators. Since an initial random population is used to start with, the search can proceed in any direction and no major decisions are made in the beginning. Later on, when the population begins to converge in some bit positions, the search direction narrows and a near-optimal solution is achieved. This nature of narrowing the search space as the search progresses is adaptive and is a unique characteristic of genetic algorithms.
6.1.3 Similarities between GAs and traditional methods

Even though GAs are different from most traditional search algorithms, there are some similarities. In traditional search methods, where a search direction is used to find a new point, at least two points are either implicitly or explicitly used to define the search direction. In the Hooke-Jeeves pattern search method, a pattern move is created using two points. In gradient-based methods, the search direction requires derivative information which is usually calculated using function values at two neighbouring points. In the crossover operator (which is mainly responsible for the GA search), two points are also used to create two new points. Thus, the crossover operation is simil…
along directions (c1 and c2) shown in the figure (either along the solid arrows or along the dashed arrows). The exact locations of the children points along these directions depend on the relative distance between the parents (Deb and Agrawal, 1994). The points y1 and y2 are the two typical children points obtained after crossing the parent points p1 and p2. Thus, it may be envisaged that the point p1 has moved in the direction d1 up to the point y1 and, similarly, the point p2 has moved to the point y2.
Since the two points used in the crossover operator are chosen at random, many such search directions are possible. Among them, some directions may lead to the global basin and some directions may not. The reproduction operator has an indirect effect of filtering the good search directions and helps guide the search. The purpose of the mutation operator is to create a point in the vicinity of the current point. The search in the mutation operator is similar to a local search method such as the exploratory search used in the Hooke-Jeeves method. With the discussion of the differences and similarities of GAs with traditional methods, we are now ready to present the algorithm in a step-by-step format.
Algorithm

Step 1  Choose a coding to represent problem parameters, a selection operator, a crossover operator, and a mutation operator. Choose population size n, crossover probability p_c, and mutation probability p_m. Initialize a random population of strings of size l. Choose a maximum allowable generation number t_max. Set t = 0.

Step 2  Evaluate each string in the population.

Step 3  If t > t_max or another termination criterion is satisfied, Terminate.

Step 4  Perform reproduction on the population.

Step 5  Perform crossover on random pairs of strings.

Step 6  Perform mutation on every string.

Step 7  Evaluate strings in the new population. Set t = t + 1 and go to Step 3.
The algorithm is straightforward, with repeated application of the three operators (Steps 4 to 7) to a population of points. We show the working of this algorithm on the unconstrained Himmelblau function used in Chapter 3.
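The steps above can be sketched as a minimal simple-GA driver for the Himmelblau function of Chapter 3. The operator implementations, parameter values (bits per variable, population size, p_c, p_m, generation count), and the random seed are illustrative assumptions, not the book's exact hand simulation:

```python
import random

def decode(bits):
    """Decoded value of a binary substring, most significant bit first."""
    value = 0
    for b in bits:
        value = 2 * value + b
    return value

def to_variable(bits, lo, hi):
    """Linear mapping rule of Equation (6.1)."""
    return lo + (hi - lo) / (2**len(bits) - 1) * decode(bits)

def himmelblau(x1, x2):
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

L = 10                # bits per variable (assumed)
N = 40                # population size (assumed, even)
PC, PM = 0.8, 0.05    # crossover and mutation probabilities (assumed)
rng = random.Random(0)

def fitness(s):
    x1 = to_variable(s[:L], 0.0, 6.0)
    x2 = to_variable(s[L:], 0.0, 6.0)
    return 1.0 / (1.0 + himmelblau(x1, x2))  # Equation (6.2)

def reproduce(pop, fits):
    """Proportionate reproduction (roulette-wheel selection)."""
    total = sum(fits)
    pool = []
    for _ in range(len(pop)):
        r, acc = rng.random() * total, 0.0
        for s, f in zip(pop, fits):
            acc += f
            if r <= acc:
                pool.append(s[:])
                break
    return pool

def crossover(a, b):
    """Single-point crossover with probability PC."""
    if rng.random() < PC:
        site = rng.randrange(1, len(a))
        return a[:site] + b[site:], b[:site] + a[site:]
    return a[:], b[:]

def mutate(s):
    """Bit-wise mutation with probability PM per bit."""
    return [1 - b if rng.random() < PM else b for b in s]

pop = [[rng.randrange(2) for _ in range(2 * L)] for _ in range(N)]
best = max(fitness(s) for s in pop)
for t in range(50):
    pool = reproduce(pop, [fitness(s) for s in pop])
    pop = []
    for i in range(0, N, 2):
        c1, c2 = crossover(pool[i], pool[i + 1])
        pop += [mutate(c1), mutate(c2)]
    best = max(best, max(fitness(s) for s in pop))
print(best)  # a fitness of 1.0 would correspond to the optimum (3, 2)
```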
EXERCISE 6.1.1

The objective is to minimize the function

    f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2

in the interval 0 <= x1, x2 <= 6. Recall that the true solution to this problem is (3, 2)^T, having a function value equal to zero.
Step 1  In order to solve this problem using genetic algorithms, we choose binary coding to represent the variables x1 and x2. In the calculati…
i\ t, th
lH
Hop, wo tlolcct H,J'iugH iu th o
popul a.t i
Figure 6.5 The initial population (marked with empty circles) and the mating pool (marked with boxes) on a contour plot of the objective function. The best point in the population has a function value 39.849 and the average function value of the initial population is 360.540.
pool contains strings at random, we pick pairs of strings from the top of the list. Thus, strings 3 and 10 participate in the first crossover operation. When two strings are chosen for crossover, first a coin is flipped with a probability pc = 0.8 to check whether a crossover is desired or not. If the outcome of the coin-flipping is true, the crossing over is performed; otherwise the strings are directly placed in an intermediate population for subsequent genetic operation. It turns out that the outcome of the first coin-flipping is true, meaning that a crossover is required to be performed. The next step is to find a cross-site at random. We choose a site by creating a random number in the range (0, l-1) or (0, 19). It turns out that the obtained random number is 11. Thus, we cross the strings at the site 11 and create two new strings. After crossover, the children strings are placed in the intermediate population. Then, strings 14 and 2 (selected at random) are used in the crossover operation. This time the coin-flipping comes true again and we perform the crossover at the site 8, found at random. The new children strings are put into the intermediate population. Figure 6.6 shows how points cross over and form new points. The points marked with a small box are the points in the mating pool and the points marked with a small circle are children points created after the crossover operation. Notice that not
Figure 6.6 The population after the crossover operation. Two points are crossed over to form two new points. Of ten pairs of strings, seven pairs are crossed.
all 10 pairs of points in the mating pool cross with each other. With the flipping of a coin with a probability pc = 0.8, it turns out that the fourth, seventh, and tenth crossovers come out to be false. Thus, in these cases, the strings are copied directly into the intermediate population. The complete population at the end of the crossover operation is shown in Table 6.2. It is interesting to note that with pc = 0.8, the expected number of crossovers in a population of size 20 is 0.8 x 20/2 or 8. In this exercise problem, we performed seven crossovers and in three cases we simply copied the strings to the intermediate population. Figure 6.6 shows that some good points and some not-so-good points are created after crossover. In some cases, points far away from the parent points are created and in some cases points close to the parent points are created.
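The coin-flip-and-cross procedure described above can be written as a short routine. The following is a minimal Python sketch (not from the book); the function name, random seed, and the two 20-bit parent strings are our own illustrative choices, not the strings of the exercise:

```python
import random

def single_point_crossover(parent1, parent2, pc=0.8, rng=random):
    # Flip a biased coin; with probability pc the pair is crossed,
    # otherwise both parents are copied unchanged.
    if rng.random() >= pc:
        return parent1, parent2
    # Choose a cross-site at random in the range (0, l-1), here (0, 19).
    site = rng.randint(0, len(parent1) - 1)
    # Exchange the tails of the two strings beyond the cross-site.
    child1 = parent1[:site] + parent2[site:]
    child2 = parent2[:site] + parent1[site:]
    return child1, child2

rng = random.Random(0)
p1 = "11010110101100101010"  # hypothetical 20-bit parents
p2 = "00101001011010101001"
c1, c2 = single_point_crossover(p1, p2, pc=0.8, rng=rng)
```

Whatever the coin-flip outcome, each child position carries a bit from one of the two parents, and the pooled bits of the children are exactly the pooled bits of the parents.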
Step 6 The next step is to perform mutation on strings in the intermediate population. For bit-wise mutation, we flip a coin with a probability pm = 0.05 for every bit. If the outcome is true, we alter the bit to 1 or 0 depending on the bit value. With a probability of 0.05, a population size of 20, and a string length of 20, we can expect to mutate a total of about 0.05 x 20 x 20 or 20 bits in the population. Table 6.2 shows the mutated bits in bold characters in the table.
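Bit-wise mutation can be sketched in the same style. This is a minimal illustration under our own assumptions (the population below is a random stand-in, not the intermediate population of the exercise):

```python
import random

def bitwise_mutation(string, pm=0.05, rng=random):
    # For every bit, flip a biased coin; if it comes out true,
    # toggle the bit (1 -> 0, 0 -> 1).
    out = []
    for bit in string:
        if rng.random() < pm:
            out.append('0' if bit == '1' else '1')
        else:
            out.append(bit)
    return ''.join(out)

rng = random.Random(1)
# A hypothetical intermediate population: 20 strings of length 20.
population = ["".join(rng.choice("01") for _ in range(20))
              for _ in range(20)]
mutated = [bitwise_mutation(s, pm=0.05, rng=rng) for s in population]
# Expected number of flipped bits: pm * 20 * 20 = 20 over the population.
flipped = sum(a != b for s, t in zip(population, mutated)
              for a, b in zip(s, t))
```

The actual number of flipped bits varies from run to run; only its expectation equals pm times the total bit count.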
[Table 6.2, showing the intermediate population after crossover and mutation, is garbled in the scan and not recoverable.]