Optimisation Techniques


Transcript of Optimisation Techniques

Page 1: Optimisation Techniques

Presented by Sarbjeet Singh, ME-ECE (regular), NITTTR-Chandigarh

Under the Guidance of

Dr. Swapna Devi, Associate Professor, ECE Dept., Sector-26, NITTTR, Chandigarh

Page 2: Optimisation Techniques

• GA (Genetic Algorithm)

• PSO (Particle Swarm Optimization)

• ACO (Ant Colony Optimization)

• BFO (Bacterial Foraging Optimization)

• BBO (Biogeography-Based Optimization), etc.

Optimization Methods

Page 3: Optimisation Techniques

Conceptual Algorithm

Page 4: Optimisation Techniques

Genetic Algorithm

• Problems are solved by an evolutionary process resulting in a best (fittest) solution (survivor).

• Evolutionary Computing – 1960s by I. Rechenberg

• Genetic Algorithms
– Invented by John Holland, 1975
– Made popular by John Koza, 1992

Page 5: Optimisation Techniques

How GAs Differ from Traditional Search Methods

• GAs work with a coding of the parameter set, not the parameters themselves.

• GAs search from a population of points, not a single point.

• GAs use payoff information, not derivatives or auxiliary knowledge.

• GAs use probabilistic transition rules, not deterministic rules.

Page 6: Optimisation Techniques

Vocabulary

• Gene – A single encoding of part of the solution space.

• Chromosome – A string of “Genes” that represents a solution.

• Population – The set of “Chromosomes” available to test.

Page 7: Optimisation Techniques

Simple Example
• f(x) = { MAX(x²) : 0 ≤ x ≤ 31 }
• Encode solution: just use 5 bits (1 or 0).
• Generate initial population.

• Evaluate each solution against objective.

Sol.  String  Fitness  % of Total
A     01101     169      14.4
B     11000     576      49.2
C     01000      64       5.5
D     10011     361      30.9
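The evaluation step behind the table above fits in a few lines; a minimal sketch (the variable names are mine, not from the slides):

```python
# Decode each 5-bit string and evaluate f(x) = x^2, as in the table above.
population = {"A": "01101", "B": "11000", "C": "01000", "D": "10011"}

fitness = {k: int(bits, 2) ** 2 for k, bits in population.items()}
total = sum(fitness.values())  # 1170
share = {k: round(100 * f / total, 1) for k, f in fitness.items()}

print(fitness)  # {'A': 169, 'B': 576, 'C': 64, 'D': 361}
print(share)    # {'A': 14.4, 'B': 49.2, 'C': 5.5, 'D': 30.9}
```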

Page 8: Optimisation Techniques

Simple Example (cont.)

• Create next generation of solutions
– Probability of “being a parent” depends on the fitness.

• Ways for parents to create the next generation
– Reproduction: use a string again unmodified.
– Crossover: cut and paste portions of one string to another.
– Mutation: randomly flip a bit.
– A COMBINATION of all of the above.

Page 9: Optimisation Techniques

The Basic Genetic Algorithm
1. [Start] Generate a random population of n chromosomes (suitable solutions for the problem).
2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population.
3. [New population] Create a new population by repeating the following steps until the new population is complete:
   1. [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance to be selected).
   2. [Crossover] With a crossover probability, cross over the parents to form new offspring (children). If no crossover is performed, the offspring are exact copies of the parents.
   3. [Mutation] With a mutation probability, mutate the new offspring at each locus (position in the chromosome).
   4. [Accepting] Place the new offspring in the new population.
4. [Replace] Use the newly generated population for a further run of the algorithm.
5. [Test] If the end condition is satisfied, stop and return the best solution in the current population.
6. [Loop] Go to step 2.
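The numbered steps above can be sketched as a short program; this is a minimal, illustrative version (the parameter values, roulette-wheel selection details, and names are my assumptions, not from the slides):

```python
import random

def genetic_algorithm(fitness, n_bits=5, pop_size=4, p_cross=0.7,
                      p_mut=0.05, generations=50, seed=1):
    """Sketch of the basic GA loop above, maximizing `fitness`."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def select():
        # Step 3.1: fitness-proportionate (roulette-wheel) selection.
        scores = [fitness(c) + 1e-9 for c in pop]  # epsilon guards all-zero fitness
        return rng.choices(pop, weights=scores, k=1)[0]

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            # Step 3.2: crossover with probability p_cross, else copy parents.
            if rng.random() < p_cross:
                cut = rng.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            # Step 3.3: mutation, flipping each bit with probability p_mut.
            for child in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1
                new_pop.append(child)      # step 3.4: accept offspring
        pop = new_pop[:pop_size]           # step 4: replace population
    return max(pop, key=fitness)           # step 5: best of final population

# Maximize f(x) = x^2 over 5-bit strings, as in the earlier example.
best = genetic_algorithm(lambda c: int("".join(map(str, c)), 2) ** 2)
```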

Page 10: Optimisation Techniques

Genetic Algorithm

• Based on Darwinian Paradigm

• Intrinsically a robust search and optimization mechanism

(Diagram: Reproduction, Competition, Selection, Survival.)

Page 11: Optimisation Techniques

Particle Swarm Optimization-PSO

Page 12: Optimisation Techniques

Particle Swarm Optimization (PSO)

Page 13: Optimisation Techniques

• Particle Swarm Optimization (PSO) applies the concept of social interaction to problem solving.

• It was developed in 1995 by James Kennedy and Russ Eberhart [Kennedy, J. and Eberhart, R. (1995). “Particle Swarm Optimization”, Proceedings of the 1995 IEEE International Conference on Neural Networks, pp. 1942-1948, IEEE Press.]

• It has been applied successfully to a wide variety of search and optimization problems.

• In PSO, a swarm of n individuals communicate, either directly or indirectly, with one another about search directions.

• PSO is a simple but powerful search technique.

Particle Swarm Optimization-PSO

Page 14: Optimisation Techniques


• Rule 1: Avoid Collision with neighboring birds

Collision Avoidance

Page 15: Optimisation Techniques

• Rule 2: Match the velocity of neighboring birds

Velocity Matching

Page 16: Optimisation Techniques

• Rule 3: Stay near neighboring birds

Flock Centering

Page 17: Optimisation Techniques

• Simple rules for each individual

• No central control - Decentralized and hence robust

• Emergent and Performs complex functions

Particle Swarming - Characteristics

Page 18: Optimisation Techniques

• Bird flocking is one of the best examples of PSO in nature.

• One of the motives for the development of PSO was to model social behavior.

Page 19: Optimisation Techniques

Particle Swarm Optimization: The Anatomy of a Particle

• A particle (individual) is composed of:
– Three vectors:
• The x-vector records the current position (location) of the particle in the search space,
• The p-vector records the location of the best solution found so far by the particle, and
• The v-vector contains the direction in which the particle will travel if undisturbed.
– Two fitness values:
• The x-fitness records the fitness of the x-vector, and
• The p-fitness records the fitness of the p-vector.

Particle I_k:
X = <x_k0, x_k1, …, x_k(n-1)>
P = <p_k0, p_k1, …, p_k(n-1)>
V = <v_k0, v_k1, …, v_k(n-1)>
x_fitness = ?
p_fitness = ?

Page 20: Optimisation Techniques

PSO: Swarm Search

• Particles are agents that fly through the search space and record (and communicate) the best solution that they have discovered.

• A particle moves from one location in the search space to another by adding the v-vector to the x-vector to get a new x-vector (Xi = Xi + Vi).

• Once the particle computes the new Xi, it evaluates its new location. If x-fitness is better than p-fitness, then Pi = Xi and p-fitness = x-fitness.

Page 21: Optimisation Techniques

PSO:Swarm Search

• The v-vector is updated before adding it to the x-vector, as follows:

– v_id = v_id + c1*rnd()*(p_id − x_id) + c2*rnd()*(p_gd − x_id)
– x_id = x_id + v_id

• c1, c2 are learning rates / acceleration constants governing the cognitive and social components.

• i is the index of the particle; g is the index of the particle with the best p-fitness.

• p is the personal best; d is the dimension.

Page 22: Optimisation Techniques

v_id = v_id                       (inertia)
     + c1*r()*(p_id − x_id)       (cognitive learning)
     + c2*r()*(p_gd − x_id)       (social learning)

v_id ∈ [−Vmax, Vmax]

x_id = x_id + v_id                (update position)

Update of Velocity & Position

Page 23: Optimisation Techniques

PSO: Swarm Search

• Initially, the values of the velocity vectors are randomly generated in the range [−Vmax, Vmax], where Vmax is the maximum value that can be assigned to any v_id.

Page 24: Optimisation Techniques

PSO: Swarm Types

• In his paper [Kennedy, J. (1997), “The Particle Swarm: Social Adaptation of Knowledge”, Proceedings of the 1997 International Conference on Evolutionary Computation, pp. 303-308, IEEE Press.],

• Kennedy identifies 4 types of PSO based on c1 and c2.

• Given:
v_id = v_id + c1*rnd()*(p_id − x_id) + c2*rnd()*(p_gd − x_id)
x_id = x_id + v_id

– Full Model (c1, c2 > 0)
– Cognition Only (c1 > 0 and c2 = 0)
– Social Only (c1 = 0 and c2 > 0)
– Selfless (c1 = 0, c2 > 0, and g ≠ i)

Page 25: Optimisation Techniques

Parameters

v_id : velocity of the ith particle

p_id : pBest position of the ith particle

p_gd : gBest position of the particles

x_id : current position of the ith particle

c1, c2 : acceleration constants

r() : random function in the range [0, 1]

w : inertia weight

Page 26: Optimisation Techniques

PSO Algorithm

Start
→ Initialize particles with random position and zero velocity
→ Evaluate fitness value
→ Compare & update fitness value with pbest and gbest
→ Meet stopping criterion? NO: update velocity and position, then re-evaluate fitness. YES: End.

pbest = the best solution (fitness) a particle has achieved so far.
gbest = the global best solution of all particles.
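The flowchart above can be sketched as a runnable loop; this is an illustrative minimization version, using the velocity update given on the earlier slides, a fixed iteration budget as the stopping criterion, and parameter values of my own choosing:

```python
import random

def pso(f, dim=2, n_particles=10, iters=100, c1=2.0, c2=2.0,
        vmax=1.0, bounds=(-5.0, 5.0), seed=0):
    """Sketch of the PSO loop above (minimization), with
    v_id = v_id + c1*r()*(p_id - x_id) + c2*r()*(p_gd - x_id)
    and velocities clamped to [-Vmax, Vmax]."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize particles with random position and zero velocity.
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]
    pbest_f = [f(xi) for xi in x]
    g = min(range(n_particles), key=lambda i: pbest_f[i])  # gbest index

    for _ in range(iters):  # stopping criterion: fixed iteration budget
        for i in range(n_particles):
            for d in range(dim):
                v[i][d] += (c1 * rng.random() * (pbest[i][d] - x[i][d])
                            + c2 * rng.random() * (pbest[g][d] - x[i][d]))
                v[i][d] = max(-vmax, min(vmax, v[i][d]))  # clamp to [-Vmax, Vmax]
                x[i][d] += v[i][d]
            fx = f(x[i])
            if fx < pbest_f[i]:            # compare & update with pbest ...
                pbest[i], pbest_f[i] = x[i][:], fx
                if fx < pbest_f[g]:        # ... and with gbest
                    g = i
    return pbest[g], pbest_f[g]

# Minimize the sphere function f(x) = sum of squares.
best_x, best_f = pso(lambda p: sum(t * t for t in p))
```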

Page 27: Optimisation Techniques

• In PSO, two basic topologies are used– Ring Topology (neighborhood of 3)– Star Topology (global neighborhood)

(Diagrams: ring topology and star topology over particles I0–I4.)

Particle Swarm Optimization: Swarm Topology

Page 28: Optimisation Techniques

First Iteration (minimization problem)

1. Initialize position
2. Create velocity (vector)

(Figure: best particle and other particles, labeled 1–3.)

Page 29: Optimisation Techniques

Second Iteration (minimization problem)

1. Update new position
2. Create velocity (vector)

(Figure: best particle and other particles, labeled 1–3.)

Page 30: Optimisation Techniques

Learning Structures

“Inertia Term”

Page 31: Optimisation Techniques

“Personal Best (Pbest)”

Current Position (X)

Personal Best (Pbest)

Learning Structures

Page 32: Optimisation Techniques

Current Position (X)

Personal Best (Pbest)

Global Best (Gbest)

Learning Structures

“Global Best (Gbest)”

Page 33: Optimisation Techniques

Properties of Particles

1. Ability to exchange information with its neighbors

2. Ability to memorize a previous position

3. Ability to use information to make a decision.

Page 34: Optimisation Techniques

Inertia Term: this term forces the particle to keep moving in the same direction, following its own way using the old velocity.

Velocity Updating

3 terms that update the velocity:

1. Inertia Term

2. Cognitive Term

3. Social Learning Term

Page 35: Optimisation Techniques

Cognitive Term (Personal Best): this term pulls the particle back toward its own previous best position; a conservative tendency.

3 terms that update the velocity:

1. Inertia Term

2. Cognitive Term

3. Social Learning Term

Velocity Updating

Page 36: Optimisation Techniques

Social Learning Term (Global Best): this term pulls the particle toward the global best position; a group-following tendency.

3 terms that update the velocity:

1. Inertia Term

2. Cognitive Term

3. Social Learning Term

Velocity Updating

Page 37: Optimisation Techniques

PSO: Basic Idea: Cognitive Behavior

An individual remembers its past knowledge.

(Figure: a bird deciding where to move among remembered food sources of value 100, 80, and 50.)

Page 38: Optimisation Techniques

An individual gains knowledge from other population members.

(Figure: Bird 1 (food 150) deciding where to move, given Bird 2 (food 100), Bird 3 (food 100), and Bird 4 (food 400).)

Particle Swarm Optimization ~ Basic Idea: Social Behavior ~

Page 39: Optimisation Techniques

Pitfalls of PSO

• Particles tend to cluster, i.e., converge too fast, and may get stuck at a local optimum.

• The movement of a particle may carry it into an infeasible region.

• Inappropriate mapping of the particle space into the solution space.

Page 40: Optimisation Techniques

Problem: particles tend to cluster in the same area. This reduces the movement of the swarm, as the particles are trapped in a small local area.

Page 41: Optimisation Techniques

To solve this problem, some particles can be re-initialized to new positions, which may lie in a better area. Other particles will then be pulled toward the new area.

Page 42: Optimisation Techniques

Modified PSO algorithm

Start
→ Initialize particles with random position and zero velocity
→ Evaluate fitness value
→ Meet local search criterion? YES: perform local search.
→ Compare/update fitness value with pbest and gbest
→ Meet stopping criterion? YES: End.
→ Meet re-initialization criterion? YES: re-initialize particles.
→ Update velocity and position, then return to the fitness evaluation step.

Page 43: Optimisation Techniques

PSO: Related Issues

• There are a number of related issues concerning PSO:
– Controlling velocities (determining the best value for Vmax),
– Swarm size,
– Neighborhood size,
– Updating the X and velocity vectors,
– Robust settings for (c1 and c2).

• Carlisle, A. and Dozier, G. (2001). “An Off-The-Shelf PSO”, Proceedings of the 2001 Workshop on Particle Swarm Optimization, pp. 1-6, Indianapolis, IN. (http://antho.huntingdon.edu/publications/Off-The-Shelf_PSO.pdf)

Page 44: Optimisation Techniques

Particle Swarm: Controlling Velocities

• When using PSO, it is possible for the magnitude of the velocities to become very large.

• Performance can suffer if Vmax is inappropriately set.

• Two methods were developed for controlling the growth of velocities:
– a dynamically adjusted inertia factor, and
– a constriction coefficient.

Page 45: Optimisation Techniques

PSO: The Inertia Factor

• When the inertia factor is used, the equation for updating velocities is changed to:

– v_id = w*v_id + c1*rnd()*(p_id − x_id) + c2*rnd()*(p_gd − x_id)

• where w is initialized to 1.0 and is gradually reduced over time (measured in cycles through the algorithm).

Page 46: Optimisation Techniques

The following weighting function is usually utilized in the velocity update equation (1):

w = wMax − [(wMax − wMin) × iter] / maxIter    (2)

where wMax = initial weight,
wMin = final weight,
maxIter = maximum iteration number,
iter = current iteration number.

s_i^(k+1) = s_i^k + V_i^(k+1)    (3)

PSO: The Inertia Factor

Page 47: Optimisation Techniques

Comments on the inertia weight factor:
• A large inertia weight (w) facilitates a global search, while a small inertia weight facilitates a local search.

• Linearly decreasing the inertia weight from a relatively large value to a small value over the course of the PSO run gives the best PSO performance, compared with fixed inertia weight settings.

Larger w → greater global search ability.

Smaller w → greater local search ability.

PSO
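Equation (2) above is straightforward to implement; a sketch, where the defaults wMax = 0.9 and wMin = 0.4 are commonly used illustrative values, not values given on the slide:

```python
def inertia_weight(iter_, max_iter, w_max=0.9, w_min=0.4):
    """Linearly decreasing inertia weight:
    w = wMax - [(wMax - wMin) * iter] / maxIter."""
    return w_max - (w_max - w_min) * iter_ / max_iter

# Early iterations: large w favors global search; late iterations: small w favors local search.
print(inertia_weight(0, 100))    # 0.9
print(inertia_weight(100, 100))  # 0.4
```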

Page 48: Optimisation Techniques

PSO: The Constriction Coefficient

• In 1999, Maurice Clerc developed a constriction coefficient for PSO:

– v_id = K[v_id + c1*rnd()*(p_id − x_id) + c2*rnd()*(p_gd − x_id)]

– where K = 2 / |2 − φ − sqrt(φ² − 4φ)|, φ = c1 + c2, and φ > 4.
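The coefficient K is a one-line computation; a sketch, where the defaults c1 = c2 = 2.05 are the usual choice from the literature (not stated on the slide):

```python
import math

def constriction_coefficient(c1=2.05, c2=2.05):
    """Clerc's constriction coefficient:
    K = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, with phi = c1 + c2 > 4."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

K = constriction_coefficient()  # roughly 0.73 for c1 = c2 = 2.05
```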

Page 49: Optimisation Techniques

PSO: Swarm and Neighborhood Size

• Concerning the swarm size for PSO, as with other ECs there is a trade-off between solution quality and computational cost (in terms of function evaluations).

• Global neighborhoods seem to be better in terms of computational costs. The performance is similar to the ring topology (or neighborhoods greater than 3).

• There is scope for research on the effects of swarm topology on the search behavior of PSO.

Page 50: Optimisation Techniques

PSO: Particle Update Methods

• There are two ways that particles can be updated:
– Synchronously
– Asynchronously

• Asynchronous update allows newly discovered solutions to be used more quickly.

• The asynchronous update method is similar to ____.


Page 51: Optimisation Techniques

• Genetic Algorithm (GA)
– Begins with a population of random chromosomes
– 3 operators: selection, crossover, mutation
– High computational cost
– High memory requirement
– Works in parallel
– Global search

• Particle Swarm Optimization (PSO)
– Social sharing of information among individuals of a population
– No operators; pbest (or local best) and global best; velocity & displacement
– Low computational cost
– Low memory requirement
– Moves quickly towards the best
– Tendency to lock into local optima

Comparisons of GA & PSO

Page 52: Optimisation Techniques

GA:
• Population consists of solutions or chromosomes
• Terminates with the best chromosomes; if they do not satisfy, the GA must be restarted
• Randomly selected population
• More time to determine the results
• More complex mechanism

PSO:
• Population consists of solutions or particles
• No question of restarting PSO for the best particles
• Randomly selected initial particle velocity & displacement
• Less time to determine the results
• Simple mechanism

Page 53: Optimisation Techniques

• The combination vector created by pbest, gbest, lbest, and nbest pulls each particle in a better direction than previous versions.

(Diagram: Standard PSO combines pbest and gbest; GLN-PSO combines pbest, gbest, lbest, and nbest.)

More good information sources, better performance.

GLN-PSO

Page 54: Optimisation Techniques

• Tested on 4 multi-modal or complex functions:
– Ackley
– Griewank
– Rastrigin
– Rosenbrock

Experiments

Page 55: Optimisation Techniques

Results 1

Page 56: Optimisation Techniques

Results 2

Page 57: Optimisation Techniques

• Based on two simple modifications of the real-valued PSO:
• The position x(t) is a bit-string of length n.
• The velocity v(t) is still a vector in real-valued n-dimensional space.
• The velocity update formula is left unchanged; the bits in x(t) are treated as reals.
• However, the position update formula is modified!

The Binary PSO

Page 58: Optimisation Techniques

x_i(t+1) = 1 if ρ < s(v_i(t+1)), and 0 otherwise

where ρ is a random number between 0 and 1, and s is the sigmoid function:

s(x) = 1 / (1 + e^(−x))

The position update formula
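The binary position rule above can be sketched directly; the function names are mine, and ρ is passed explicitly here so the example is deterministic:

```python
import math
import random

def sigmoid(x):
    """s(x) = 1 / (1 + e^-x), mapping a velocity to a probability."""
    return 1.0 / (1.0 + math.exp(-x))

def update_bit(v, rho=None):
    """Binary PSO position rule: x = 1 if rho < s(v), else 0."""
    if rho is None:
        rho = random.random()  # random number in [0, 1)
    return 1 if rho < sigmoid(v) else 0

# A strongly positive velocity makes the bit very likely to be 1:
update_bit(2.4, rho=0.5)   # 1, since s(2.4) is about 0.917 > 0.5
update_bit(-2.4, rho=0.5)  # 0, since s(-2.4) is about 0.083 < 0.5
```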

Page 59: Optimisation Techniques

• Now, v_i(t) is a probability for x_i(t) being “set” (equal to 1).
• v_i(t) changes when x_i differs from p_i or g_i.

• An example:
– x_i(t) = 1, v_i(t) = 2.4, p_i(t) = 1, g_i(t) = 0
– What will x_i(t+1) be?
– v_i(t+1) = v_i(t) + c1·(1 − 1) + c2·(0 − 1) = v_i(t) − c2

• Control of the maximum velocity is important:
– We can use Vmax to limit the velocity.

Meaning of the velocity

Page 60: Optimisation Techniques

• Evaluating the influence of the jth particle on the ith particle (along the dth dimension):

FDR(j, i, d) = [Fitness(P_j) − Fitness(X_i)] / |P_jd − X_id|

where P_j is the previous best position visited by the jth particle, and

X_i is the position of the particle under consideration.

Fitness-Distance Ratio

Page 61: Optimisation Techniques

• Each particle is influenced by:

• its own previous best (pbest),
• the global best particle (gbest), and
• the particle that maximizes FDR (nbest).

• Velocity update equation:

V_id^(t+1) = V_id^t + ψ1*rand()*(P_id − X_id) + ψ2*rand()*(P_gd − X_id) + ψ3*rand()*(P_nd − X_id)

• Position update equation:

X_id^(t+1) = X_id^t + V_id^(t+1)

FDR-PSO Algorithm

Page 62: Optimisation Techniques

Algorithm FDR-PSO:
For t = 1 to the max. bound on the number of generations,
    For i = 1 to the population size,
        Find gbest;
        For d = 1 to the problem dimensionality,
            Find nbest, which maximizes the FDR;
            Apply the velocity update equation;
            Update position;
        End-for-d;
        Compute fitness of the particle;
        If needed, update historical information regarding Pi and Pg;
    End-for-i;
    Terminate if Pg meets problem requirements;
End-for-t;
End algorithm.

FDR-PSO Algorithm
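The inner "find nbest" step can be sketched with the FDR formula from the earlier slide; the function name and the toy data are mine, and this assumes a maximization problem (higher fitness is better):

```python
def find_nbest(i, d, x, pbest, pbest_fit, fit_x_i):
    """Return the index j != i maximizing
    FDR(j, i, d) = (Fitness(P_j) - Fitness(X_i)) / |P_jd - X_id|."""
    best_j, best_fdr = None, float("-inf")
    for j in range(len(x)):
        if j == i:
            continue
        dist = abs(pbest[j][d] - x[i][d])
        if dist == 0.0:
            continue  # skip coincident coordinates to avoid division by zero
        fdr = (pbest_fit[j] - fit_x_i) / dist
        if fdr > best_fdr:
            best_j, best_fdr = j, fdr
    return best_j

# Toy data: particle 1 is fitter and nearby; particle 2 is fit but far away in dimension 0.
x = [[0.0], [1.0], [2.0]]
pbest = [[0.0], [1.0], [3.0]]
pbest_fit = [1.0, 5.0, 4.0]
nbest = find_nbest(0, 0, x, pbest, pbest_fit, fit_x_i=1.0)  # picks particle 1
```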

Page 63: Optimisation Techniques

• GA-PSO

• ACO-PSO

• PSO-BFG

• PSO-ANN

• PSO-Fuzzy

• FDTD-PSO

• etc.

Hybrid Particle Swarm Optimizations

Page 64: Optimisation Techniques

Signal Processing, Artificial Neural Networks, Bio-Medical Applications, Control Engineering, Image Processing, Mechanical Engineering, Instrumentation, Computer Engineering, Communication Engineering

Applications
