PSO and its variants Swarm Intelligence Group Peking University.

Post on 02-Jan-2016


PSO and its variants

Swarm Intelligence Group Peking University

Outline

Classical and standard PSO

PSO on benchmark functions

Analysis of PSO: state of the art

Analysis of PSO: our idea

Variants of PSO: state of the art

Our variants of PSO

Applications of PSO

Classical and standard PSO

The swarm is better than the individual

Classical and standard PSO

Russ Eberhart and James Kennedy

Classical

V_id = w * V_id + c1 * Rand() * (p_id - x_id) + c2 * Rand() * (g_d - x_id)    (1)

x_id = x_id + V_id                                                            (2)

V_id : velocity of particle i in dimension d
i : particle index    D : number of dimensions
w : inertia weight
c1, c2 : acceleration constants
Rand() : uniform random number in [0, 1]
p_id : best position found so far by particle i (pbest)
g_d : best position found so far by the swarm (gbest)
x_id : current position of particle i in dimension d
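As a minimal sketch, one application of update rules (1) and (2) for a single particle in a single dimension looks like this in Python (all numeric values are illustrative assumptions, not prescribed by the slides):

```python
import random

# One application of update rules (1) and (2) for one particle in one
# dimension. Parameter values below are common choices, not mandated here.
w, c1, c2 = 0.7298, 1.4962, 1.4962
v, x = 0.5, 1.0           # current velocity V_id and position x_id
pbest, gbest = 0.2, -0.3  # p_id and g_d (assumed values)

v = w * v + c1 * random.random() * (pbest - x) + c2 * random.random() * (gbest - x)  # (1)
x = x + v                                                                            # (2)
print(x, v)
```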

Classical and standard PSO

V_id = w * V_id + c1 * Rand() * (p_id - x_id) + c2 * Rand() * (g_d - x_id)    (1)

x_id = x_id + V_id                                                            (2)

[Figure: geometric illustration of the update in the x-y plane, showing x_id(t), p_id(t), g_d(t), V_id(t), and the resulting V_id(t+1) and x_id(t+1).]

Flow chart depicting the General PSO Algorithm:
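The flow chart can be sketched as a complete loop. This is a minimal gbest PSO in Python; the sphere objective and all parameter values are illustrative assumptions, and boundary handling is omitted for brevity:

```python
import random

def sphere(x):
    """Illustrative objective to minimize (not from the slides)."""
    return sum(xi * xi for xi in x)

def pso(fitness, dim=2, n_particles=20, iters=200,
        w=0.7298, c1=1.4962, c2=1.4962, lo=-5.0, hi=5.0):
    # Initialize positions randomly and velocities to zero
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]                 # personal best positions
    pbest_f = [fitness(x) for x in xs]
    g = pbest[min(range(n_particles), key=lambda i: pbest_f[i])][:]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * random.random() * (pbest[i][d] - xs[i][d])
                            + c2 * random.random() * (g[d] - xs[i][d]))   # rule (1)
                xs[i][d] += vs[i][d]                                      # rule (2)
            f = fitness(xs[i])
            if f < pbest_f[i]:                 # update personal best
                pbest_f[i], pbest[i] = f, xs[i][:]
                if f < fitness(g):             # update swarm best
                    g = xs[i][:]
    return g, fitness(g)

random.seed(0)
best, best_f = pso(sphere)
print(best_f)
```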

Simulations 1-8

[Figures: eight snapshots of the swarm moving over a 2-D search space (axes x and y); colour encodes fitness from min to max.]

Schwefel's function

f(x) = sum_{i=1}^{n} x_i * sin(sqrt(|x_i|)),  -500 <= x_i <= 500

Global optimum: f(x*) = 418.9829 * n at x_i* = 420.9687, i = 1, ..., n (the swarm maximizes f here).
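The fitness used on these slides can be sketched in Python, with a quick check at the reported optimum x_i = 420.9687:

```python
import math

def schwefel(x):
    """Schwefel fitness as used on the slide (maximized):
    global optimum f(x*) = 418.9829 * n at x_i = 420.9687."""
    return sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)

n = 2
f_star = schwefel([420.9687] * n)
print(f_star)  # close to 837.9658, the "Global" row of the result table
```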

Evolution - initialization

Evolution - 5 iterations

Evolution - 10 iterations

Evolution - 15 iterations

Evolution - 20 iterations

Evolution - 25 iterations

Evolution - 100 iterations

Evolution - 500 iterations

Search result

Iteration    Swarm best
0            416.245599
5            515.748796
10           759.404006
15           793.732019
20           834.813763
100          837.911535
5000         837.965771
Global       837.9658

Standard benchmark functions

1) Sphere function

f(x) = sum_{i=1}^{n} x_i^2,  x_i in [-5, 5]

2) Rosenbrock function

f(x) = sum_{i=1}^{n-1} [ 100 * (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ],  x_i in [-10, 10]

3) Rastrigin function

f(x) = sum_{i=1}^{D} [ x_i^2 - 10 * cos(2 * pi * x_i) + 10 ]

4) Ackley function

f(x) = -20 * exp(-0.2 * sqrt((1/n) * sum_{i=1}^{n} x_i^2)) - exp((1/n) * sum_{i=1}^{n} cos(2 * pi * x_i)) + 20 + e,  x_i in [-32, 32]

Composition Function
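The four classical benchmarks can be sketched directly in their usual minimization form (search ranges follow the slides; the zero at each known optimum is a quick sanity check):

```python
import math

def sphere(x):                      # x_i in [-5, 5], min f(0) = 0
    return sum(xi * xi for xi in x)

def rosenbrock(x):                  # x_i in [-10, 10], min f(1, ..., 1) = 0
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):                   # min f(0) = 0
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def ackley(x):                      # x_i in [-32, 32], min f(0) = 0
    n = len(x)
    return (-20.0 * math.exp(-0.2 * math.sqrt(sum(xi * xi for xi in x) / n))
            - math.exp(sum(math.cos(2.0 * math.pi * xi) for xi in x) / n)
            + 20.0 + math.e)

print(sphere([0.0, 0.0]), rosenbrock([1.0, 1.0, 1.0]),
      rastrigin([0.0, 0.0]), ackley([0.0, 0.0]))
```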

Analysis of PSO: state of the art. Stagnation and convergence

Clerc 2002: "The particle swarm - explosion, stability, and convergence in a multidimensional complex space" (2002)

Kennedy 2005: "Dynamic-Probabilistic Particle Swarms" (2005)

Poli 2007:
"Exact Analysis of the Sampling Distribution for the Canonical Particle Swarm Optimiser and its Convergence during Stagnation" (2007)
"On the Moments of the Sampling Distribution of Particle Swarm Optimisers" (2007)
"Markov Chain Models of Bare-Bones Particle Swarm Optimizers" (2007)

Standard PSO: "Defining a Standard for Particle Swarm Optimization" (2007)

Analysis of PSO: state of the art. Standard PSO: constriction factor and convergence

Update formulas.

Inertia-weight form:

V_id = w * V_id + c1 * Rand() * (p_id - x_id) + c2 * Rand() * (g_d - x_id)
x_id = x_id + V_id

Constriction form:

V_id = chi * ( V_id + c1 * Rand() * (p_id - x_id) + c2 * Rand() * (g_d - x_id) )
x_id = x_id + V_id

The two forms are equivalent for suitable parameter settings.
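The Clerc-Kennedy constriction coefficient makes the equivalence concrete; a sketch using the common literature choice c1 = c2 = 2.05 (these values are standard in the literature, not taken from the slides):

```python
import math

# Clerc-Kennedy constriction coefficient (standard result):
# chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|, with phi = c1 + c2 > 4.
c1 = c2 = 2.05
phi = c1 + c2                                        # 4.1
chi = 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

# The constricted update is algebraically the inertia-weight update
# with w = chi and acceleration constants scaled by chi:
w_equiv = chi
c_equiv = chi * c1
print(chi, c_equiv)
```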

Analysis of PSO: state of the art. Standard PSO

50 particles

Non-uniform initialization

No evaluation when a particle is out of the boundary

Analysis of PSO: state of the art. Standard PSO

A local ring topology
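In a local ring topology, each particle follows the best of its own neighbourhood (lbest) rather than a single swarm-wide gbest. A minimal sketch, with illustrative function and variable names:

```python
# Each particle's neighbourhood best is taken over itself and its two
# ring neighbours. pbest_f holds personal-best fitness (minimization);
# pbest holds the matching positions.

def ring_lbest(pbest, pbest_f, i):
    n = len(pbest)
    neighbours = [(i - 1) % n, i, (i + 1) % n]   # wrap around the ring
    best = min(neighbours, key=lambda j: pbest_f[j])
    return pbest[best]

pbest = [[0.0], [1.0], [2.0], [3.0]]
pbest_f = [4.0, 1.0, 3.0, 2.0]
print(ring_lbest(pbest, pbest_f, 3))  # neighbours 2, 3, 0 -> particle 3 wins
```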

Analysis of PSO: state of the art. How does PSO work?

Stagnation versus objective function

Classical PSO versus standard PSO

Search strategy versus performance

Classical PSO. Main idea ("Particle swarm optimization", 1995)

Exploit the current best positions (pbest, gbest)

Explore the unknown space

Classical PSO

Implementation

V_id = w * V_id + c1 * Rand() * (p_id - x_id) + c2 * Rand() * (g_d - x_id)    (1)

x_id = x_id + V_id                                                            (2)

[Figure: the three components of the velocity update - the inertia part (wV), the attraction toward pbest, and the attraction toward gbest.]

Analysis of PSO: our idea. Search strategy of PSO: exploitation and exploration

Analysis of PSO: our idea. Hybrid uniform distribution

[Figure: the sampling region split into an exploitation part (around pbest and gbest) and an exploration part (driven by the inertia term wV).]

Analysis of PSO: our idea

x(t+1) = x(t) + w * V(t) + Z

The sampling probability density of x(t+1) is computable.
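In classical PSO the random part is Z = c1 * U1 * (p - x) + c2 * U2 * (g - x) with U1, U2 uniform on [0, 1], so Z is a sum of two uniform terms and its density is a convolution, hence computable in closed form. A Monte-Carlo sketch checking the first moment (all numeric values are illustrative assumptions):

```python
import random

# Monte-Carlo sketch of the one-step sampling distribution
# x(t+1) = x(t) + w*V(t) + Z, with Z a sum of two uniformly
# weighted attraction terms (classical PSO).
random.seed(1)
w, c1, c2 = 0.7298, 1.4962, 1.4962
x, v, p, g = 1.0, 0.5, 0.2, -0.3   # assumed state

samples = [x + w * v
           + c1 * random.random() * (p - x)
           + c2 * random.random() * (g - x)
           for _ in range(200000)]
mean = sum(samples) / len(samples)

# E[x(t+1)] = x + w*v + (c1*(p - x) + c2*(g - x)) / 2, since E[U] = 1/2
expected = x + w * v + (c1 * (p - x) + c2 * (g - x)) / 2.0
print(mean, expected)
```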

Analysis of PSO: our idea

[Figures: sampling probability density of the update with the inertia part (wV) included versus omitted.]

Analysis of PSO: our idea. Differences among variants of PSO

[Figure: how the sampling probability balances exploitation against exploration.]

Analysis of PSO: our idea. What are the properties of the iteration?

Analysis of PSO: our idea. Is the search strategy the same, or is PSO adaptive, under:

the same parameters (during the convergence process)
different parameters
different dimensions
different numbers of particles
different topologies
different objective functions
different search phases (slow or sharp slope, stagnation, etc.)

What is the change pattern of the search strategy?

Analysis of PSO: our idea. What makes a better PSO in terms of search strategy?

Simpler implementation

One parameter as a tuning knob, instead of two as in standard PSO

Provably equivalent to standard PSO for certain parameter values

Effective on most objective functions

Adaptive

Analysis of PSO: our idea. Markov chain

State transition matrix
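The slides only name the tool; as a toy illustration (an assumption for exposition, not the group's actual model), one can treat a particle's search mode as a two-state Markov chain over {exploitation, exploration} with an assumed transition matrix and iterate a distribution to its stationary point:

```python
# Toy two-state Markov chain (illustrative, not the group's model).
# P[i][j] = Pr(next state j | current state i);
# state 0 = exploitation, state 1 = exploration.
P = [[0.9, 0.1],
     [0.4, 0.6]]

dist = [0.5, 0.5]
for _ in range(100):
    # One step of dist <- dist * P
    dist = [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]
print(dist)  # converges to the stationary distribution [0.8, 0.2]
```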

Analysis of PSO: our idea. Random processes

Gaussian process: kernel mapping, covariance matrix, kernel function, mapping ability

Is the search strategy effective on the objective problem?

Analysis of PSO: our idea. The object of our analysis

Search strategy of PSO under:
different parameter sets
different dimensions
different numbers of particles
different objective functions
fitness evaluations
different topologies
Markov or Gauss process and kernel function

Direction: toward a "knob" PSO

Analysis of PSO: our idea

[Figure: the probability of exploitation, P(Exploitation), at sample points x_1, ..., x_7, as a function of the inertia weight (w), acceleration constants (c), dimension (dim), number of particles (Num), objective function (Fun), topology (Top), and fitness evaluations (FEs).]

Current results Variance with convergence

func_num=1; fes_num=5000; run_num=10; particles_num=50; dims_num=30;

Current results Variance with dimensions

func_num=1; fes_num=3000; run_num=10; particles_num=50;

Current results Variance with number of particles

func_num=1; fes_num=3000; run_num=10; dims_num=30;

Current results

Variance with topology

Current results

Variance with inertia weight

Current results 1. Shifted Sphere Function 2. Shifted Schwefel's Problem 1.2

[Figures: 3-D surface plots of functions 1 and 2 over [-100, 100] x [-100, 100]; function values reach roughly 5 x 10^4 and 8 x 10^4 respectively.]

PSO on Benchmark Function 3. Shifted Rotated High Conditioned Elliptic Function 4. Shifted Schwefel's Problem 1.2 with Noise in Fitness

[Figures: 3-D surface plots of functions 3 and 4 over [-100, 100] x [-100, 100]; function values reach roughly 4 x 10^10 and 5 x 10^4 respectively.]

Current results Variance with objective functions

Unimodal Functions Multimodal Functions Expanded Multimodal Functions Hybrid Composition Functions

Current results Variance with objective functions

func_num=1,2,3,4; fes_num=3000; run_num=5; particles_num=50; dims_num=30;

Variants of PSO: state of the art

Traditional strategies: simulated annealing, tabu search, gradient methods

Adopted from other fields: clonal operation, mutation operation

Heuristic methods: advance and retreat

Structure topology: full connection, ring topology

Our variants of PSO

CPSO AR-CPSO MPSO RBH-PSO FPSO

Our variants of PSO

CPSO

After n iterations, clone the n saved global best particles.

Mutate all cloned particles by random perturbation.

Perform selection using a concentration-based diversity-preservation strategy.

Save the global best particle of each generation as the parent for the clone operator in the second step.

Our variants of PSO

MPSO

Our variants of PSO

AR-CPSO

Our variants of PSO

FPSO

Applications of PSO
