
Auto-Tuning Fuzzy Granulation for Evolutionary Optimization


Mohsen Davarynejad, Ferdowsi University of Mashhad, davarynejad@kiaeee.org

M.-R. Akbarzadeh-T., Ferdowsi University of Mashhad, akbarzadeh@ieee.org

Carlos A. Coello Coello, CINVESTAV-IPN, ccoello@cs.cinvestav.mx

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

Brief Introduction to Evolutionary Algorithms

• Generic structure of evolutionary algorithms
– Problem representation (encoding)
– Selection, recombination and mutation
– Fitness evaluations

• Evolutionary algorithms
– Genetic algorithms
– Evolution strategies
– Genetic programming
– …

• Pros and cons
– Stochastic, global search
– No requirement for derivative information
– Needs a large number of fitness evaluations
– Not well-suited for on-line optimization

Motivations (When is fitness approximation necessary?)

• No explicit fitness function exists: to define fitness quantitatively

• Fitness evaluation is highly time-consuming: to reduce computation time

• Fitness is noisy: to cancel out noise

• Search for robust solutions: to avoid additional expensive fitness evaluations

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

Fitness Approximation Methods

• Problem approximation
– To replace experiments with simulations
– To replace full simulations/models with reduced simulations/models

• Ad hoc methods
– Fitness inheritance (from parents)
– Fitness imitation (from brothers and sisters)

• Data-driven functional approximation (meta-models)
– Polynomials (response surface methodology)
– Neural networks, e.g., multilayer perceptrons (MLPs), RBFNs
– Support vector machines (SVMs)
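The ad hoc methods above are simple enough to sketch directly. For instance, fitness inheritance estimates a child's fitness from its parents' fitness instead of performing an exact evaluation; a minimal sketch (the function name and the weight `w` are illustrative, not from the slides):

```python
def inherited_fitness(f_parent1, f_parent2, w=0.5):
    """Fitness inheritance (ad hoc method): estimate a child's fitness as a
    weighted average of its parents' fitness, avoiding an exact evaluation."""
    return w * f_parent1 + (1 - w) * f_parent2
```

With equal weights this is just the parents' mean fitness; biased weights can favor the fitter parent.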


Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

Meta-Model in Fitness Evaluations*

• Use Meta-Models Only: Risk of False Convergence

• So a combination of the meta-model with the original fitness function is necessary
– Generation-Based Evolution Control
– Individual-Based Evolution Control (random strategy or best strategy)

*Yaochu Jin. “A comprehensive survey of fitness approximation in evolutionary computation”. Soft Computing, 9(1), 3-12, 2005.

Generation-Based Evolution Control

(Diagram: generations t through t+4; whole generations alternate between exact and meta-model fitness evaluation.)

Individual-Based Evolution Control

(Diagram: generations t through t+4; within each generation, some individuals are evaluated exactly and the rest by the meta-model.)
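The two evolution-control schemes can be sketched as follows. Everything here is an illustrative stand-in, not from the slides: `exact_fitness` plays the expensive simulation (a simple sphere function), `surrogate_fitness` the meta-model, and the control-cycle and count parameters are arbitrary choices.

```python
import random

def exact_fitness(x):
    # Stand-in for an expensive exact evaluation: a simple sphere function.
    return sum(v * v for v in x)

def surrogate_fitness(x):
    # Stand-in for a cheap meta-model: a noisy copy of the exact function.
    return exact_fitness(x) + random.gauss(0.0, 0.1)

def evaluate_generation_based(population, gen, control_cycle=3):
    """Generation-based control: every `control_cycle`-th generation is
    evaluated with the exact fitness function, the rest with the meta-model."""
    f = exact_fitness if gen % control_cycle == 0 else surrogate_fitness
    return [f(ind) for ind in population]

def evaluate_individual_based(population, n_exact=2):
    """Individual-based control (best strategy): the n_exact individuals the
    meta-model ranks best (lowest, assuming minimization) are re-evaluated
    with the exact fitness function."""
    scores = [surrogate_fitness(ind) for ind in population]
    best = sorted(range(len(population)), key=lambda i: scores[i])[:n_exact]
    for i in best:
        scores[i] = exact_fitness(population[i])
    return scores
```

The random strategy mentioned on the slide would simply pick the `n_exact` individuals at random instead of by predicted rank.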

Fitness Approximation Methods

• Problem approximation
• Meta-Model
– Generation-Based
– Individual-Based
• Ad hoc methods

ADAPTIVE FUZZY FITNESS GRANULATION is an Individual-Based Meta-Model Fitness Approximation Method

GRADUATION AND GRANULATION

• The basic concepts of graduation and granulation form the core of FL and are the principal distinguishing features of fuzzy logic.

• More specifically, in fuzzy logic everything is or is allowed to be graduated, or equivalently, fuzzy.

• Furthermore, in fuzzy logic everything is or is allowed to be granulated, with a granule being a clump of attribute-values drawn together by indistinguishability, similarity, proximity or functionality.

• Graduated granulation, or equivalently fuzzy granulation, is a unique feature of fuzzy logic.

• Graduated granulation is inspired by the way in which humans deal with complexity and imprecision.*

*L. A. Zadeh, www.fuzzieee2007.org/ZadehFUZZ-IEEE2007London.pdf

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

ADAPTIVE FUZZY FITNESS GRANULATION (AFFG)

Flowchart of the Proposed AFFG Algorithm:

1. Gen := 0; create an initial random population.
2. Select a solution and, by fuzzy similarity analysis against the granulation pool (in phenospace), ask: is the solution similar to an existing granule?
3. If yes: look up its fitness from the queue of granules (FF association).
4. If no: evaluate the fitness of the solution exactly (FF evaluation), add it to the queue as a new granule, and update the granulation table.
5. Update the table life function and granule radius.
6. When fitness evaluation of the whole population is finished: perform reproduction, then crossover and mutation; Gen := Gen + 1.
7. If the termination criterion is satisfied, report the designed results and end; otherwise return to step 2.
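The fitness-assignment step of the flowchart can be sketched as below. The Gaussian similarity follows the AFFG slides; the parameter values (`theta`, `sigma`, life reward) and all names are illustrative assumptions, not the tuned values from the presentation.

```python
import math

class Granule:
    def __init__(self, center, fitness, sigma, life):
        self.center = center      # C_k: a previously evaluated solution
        self.fitness = fitness    # f(C_k): its exactly computed fitness
        self.sigma = sigma        # granule width (controls similarity spread)
        self.life = life          # life index L_k used for pruning

def similarity(x, g):
    # Average Gaussian membership of x in granule g over all m design variables.
    m = len(x)
    return sum(math.exp(-((xi - ci) ** 2) / (g.sigma ** 2))
               for xi, ci in zip(x, g.center)) / m

def affg_fitness(x, granules, exact_f, theta=0.9, sigma=0.5,
                 life_init=5, life_reward=5):
    """Return the fitness of x: association with the most similar granule if
    its average similarity exceeds theta, otherwise an exact evaluation that
    also spawns a new granule. theta/sigma/life values are illustrative."""
    if granules:
        k = max(range(len(granules)), key=lambda i: similarity(x, granules[i]))
        if similarity(x, granules[k]) > theta:
            granules[k].life += life_reward   # reward the winning granule
            return granules[k].fitness        # fitness by association
    fx = exact_f(x)                           # exact (expensive) evaluation
    granules.append(Granule(list(x), fx, sigma, life_init))
    return fx
```

Solutions close to an existing granule are served from the pool for free; only genuinely novel solutions pay for an exact evaluation.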

AFFG (III)

• A random parent population is initially created:
$P_0 = \{X_1^1, X_2^1, \ldots, X_j^1, \ldots, X_t^1\}$,
where each individual is a vector of $m$ design variables,
$X_j^i = (x_{j,1}^i, x_{j,2}^i, \ldots, x_{j,r}^i, \ldots, x_{j,m}^i)$.

• $G$ is a set of fuzzy granules,
$G = \{(C_k, \sigma_k, L_k) \mid C_k \in \mathbb{R}^m,\ k = 1, \ldots, l\}$,
with center $C_k$, width $\sigma_k$, and life index $L_k$.

AFFG (IV)

• The average similarity of a new solution $X_j^i$ to each granule $G_k$ is
$$\mu_{k,j} = \frac{\sum_{r=1}^{m} \mu_{k,r}(x_{j,r}^i)}{m}.$$

• Fitness is then assigned by
$$f(X_j^i) = \begin{cases} f(C_K) & \text{if } \mu_{K,j} = \max_{k=1,2,\ldots,l} \mu_{k,j} \text{ is sufficiently high} \\ \text{computed by fitness function} & \text{otherwise.} \end{cases}$$

AFFG (V)

• The membership of each design variable in a granule is Gaussian:
$$\mu_{k,r}(x_{j,r}^i) = \exp\!\left(-\frac{(x_{j,r}^i - C_{k,r})^2}{\sigma_{k,r}^2}\right).$$

• The granule width $\sigma_k$ is adapted to the fitness of its center:
$$\sigma_k = \gamma \cdot \exp\!\left(-\frac{f(C_k)}{\max\{f(X_1^1), f(X_2^1), \ldots, f(X_t^1)\}}\right).$$

• Since members of the initial populations generally have less fitness, $\sigma_k$ is larger, and fitness is assumed more often initially by estimation/association to the granules.

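A sketch of the two formulas above. The Gaussian membership matches the slide; the width-adaptation form is a reconstruction from the garbled slide, assumed here to be `gamma * exp(-f(C_k) / f_best)` so that low-fitness granules are wider.

```python
import math

def membership(x_r, c_r, sigma_r):
    # Gaussian membership of one design variable in one granule dimension:
    # mu = exp(-(x - c)^2 / sigma^2)
    return math.exp(-((x_r - c_r) ** 2) / sigma_r ** 2)

def granule_width(f_center, f_best, gamma=1.0):
    """Assumed form of the adaptive width: sigma_k shrinks as the granule
    center's fitness f(C_k) approaches the best fitness f_best, so low-fitness
    granules are wider and absorb more associations early in the run."""
    return gamma * math.exp(-f_center / f_best)
```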

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

AFFG - FS

• In order to avoid tuning parameters, a fuzzy supervisor is employed as an auto-tuning algorithm with three inputs:

• Number of Design Variables (NDV)

• Maximum Range of Design Variables (MRDV)

• Percentage of Completed Generations (PCG)

• The knowledge base of the above architecture has a large number of rules, and extracting these rules is very difficult. Consequently, a new architecture is proposed in which the controller is separated into two controllers to diminish the complexity of the rules.

AFFG – FS

• Granules compete for their survival through a life index $L_k$.

• $L_k$ is initially set at $N$ and subsequently updated as below:
$$L_k \leftarrow \begin{cases} L_k + M & \text{if } k = K \\ L_k & \text{otherwise} \end{cases}$$

• Where $M$ is the life reward of the granule and $K$ is the index of the winning granule for each individual in generation $i$. Here we set $M = 5$.

• At each table update, only the $N_G$ granules with the highest life index are kept; the others are discarded.
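The life-index bookkeeping can be sketched as follows (function names and the example values are illustrative; the reward `M = 5` follows the slide):

```python
def update_life(life, winner, M=5):
    """Life-index update: the winning granule K gains the reward M;
    all other granules keep their current life index."""
    return [L + M if k == winner else L for k, L in enumerate(life)]

def prune(granules, life, n_keep):
    """Table update: keep only the n_keep granules with the highest
    life index; the others are discarded."""
    order = sorted(range(len(life)), key=lambda k: life[k], reverse=True)[:n_keep]
    keep = sorted(order)  # preserve the original ordering of survivors
    return [granules[k] for k in keep], [life[k] for k in keep]
```

Bounding the table at `n_keep` entries keeps both the memory footprint and the per-individual similarity search cost constant.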

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

List of Test Benchmark Functions

Griewangk:
$$f(x) = 1 + \sum_{i=1}^{n} \frac{x_i^2}{4000} - \prod_{i=1}^{n} \cos\!\left(\frac{x_i}{\sqrt{i}}\right), \quad -600 \le x_i \le 600,\ i = 1{:}n$$

Rastrigin:
$$f(x) = 10n + \sum_{i=1}^{n} \left(x_i^2 - 10\cos(2\pi x_i)\right), \quad -5.12 \le x_i \le 5.12,\ i = 1{:}n$$

Ackley:
$$f(x) = 20 + e - 20\exp\!\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\frac{1}{n}\sum_{i=1}^{n} \cos(2\pi x_i)\right), \quad -32.768 \le x_i \le 32.768,\ i = 1{:}n$$

Parameters used for AFFG

Function | Parameter values
Griewangk | 0.00012 | 190
Rastrigin | 0.004 | 0.15
Ackley | 0.02 | 0.25

A further parameter is considered constant and is equal to 0.9 for all simulation results.
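For reference, the three benchmark functions can be implemented directly from the formulas above; each has its global minimum of 0 at the origin.

```python
import math

def griewangk(x):
    # f(x) = 1 + sum(x_i^2)/4000 - prod(cos(x_i / sqrt(i)))
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return 1.0 + s - p

def rastrigin(x):
    # f(x) = 10n + sum(x_i^2 - 10 cos(2 pi x_i))
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ackley(x):
    # f(x) = 20 + e - 20 exp(-0.2 sqrt(mean(x_i^2))) - exp(mean(cos(2 pi x_i)))
    n = len(x)
    s1 = math.sqrt(sum(v * v for v in x) / n)
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return 20 + math.e - 20 * math.exp(-0.2 * s1) - math.exp(s2)
```

All three are standard multimodal minimization benchmarks, which is why the convergence plots below track the (log) fitness value as it decreases.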

(Plot: Griewangk, 5-D; log(Fitness Value) vs. Exact Function Evaluation Count; curves: GA, AFFG, AFFG-FS, FES-0.5, FES-0.7, FES-0.9.)

(Plot: Griewangk, 10-D; same axes and curves.)

(Plot: Griewangk, 20-D; same axes and curves.)

(Plot: Griewangk, 30-D; same axes and curves.)

(Plot: Rastrigin, 5-D; same axes and curves.)

(Plot: Rastrigin, 10-D; same axes and curves.)

(Plot: Rastrigin, 20-D; same axes and curves.)

(Plot: Rastrigin, 30-D; same axes and curves.)

(Plot: Ackley, 5-D; same axes and curves.)

(Plot: Ackley, 10-D; same axes and curves.)

(Plot: Ackley, 20-D; same axes and curves.)

(Plot: Ackley, 30-D; same axes and curves.)

Effect of $N_G$ on the convergence behavior

• At each table update, only the $N_G$ granules with the highest life index $L_k$ are kept; the others are discarded.

• The effect of varying the number of granules $N_G$ on the convergence behavior of AFFG and AFFG-FS is studied.

(Plot: Griewangk, 10-D; log(Fitness Value) vs. Exact Function Evaluation Count; curves: AFFG and AFFG-FS with $N_G$ = 20, 50, 100.)

(Plot: Rastrigin, 10-D; same axes and curves.)

(Plot: Ackley, 10-D; same axes and curves.)

Outline of the presentation

• Brief introduction to evolutionary algorithms
• Fitness Approximation Methods
• Meta-Models in Fitness Evaluations
• Adaptive Fuzzy Fitness Granulation (AFFG)
• AFFG with fuzzy supervisory
• Numerical Results
• Summary and Contributions

Summary

• Fitness approximation replaces an accurate but costly function evaluation with approximate but cheap function evaluations.

• Meta-modeling and other fitness approximation techniques have found a wide range of applications.

• Proper control of meta-models plays a critical role in their successful use.

• A small number of individuals in the granule pool can still produce good results.

Contribution

• Why AFFG-FS?
– Avoids initial training.
– Uses guided association to speed up the search process.
– Gradually sets up a model independent of the initial training data, to compensate for the lack of sufficient training data and to reach a model with sufficient approximation accuracy.
– Avoids using the model in design-variable regions unrepresented in the training set.
– In order to avoid tuning of parameters, a fuzzy supervisor is employed as an auto-tuning algorithm.

• Features of AFFG-FS:
– Exploits the design variable space by means of Fuzzy Similarity Analysis (FSA) to avoid premature convergence.
– Dynamically updates the association model with minimal overhead cost.