Memetic Algorithms
Transcript of Memetic Algorithms
Ben Gurion University of the Negev – June 23rd to July 5th 2009, Distinguished Scientist Visitor Program – Beer Sheva, Israel
Memetic Algorithms Dr. N. Krasnogor
Interdisciplinary Optimisation Laboratory Automated Scheduling, Optimisation and Planning Research Group
School of Computer Science & Information Technology University of Nottingham
www.cs.nott.ac.uk/~nxk
Based on:
M. Tabacman, J. Bacardit, I. Loiseau, and N. Krasnogor. Learning classifier systems in optimisation problems: a case study on fractal travelling salesman problems. In Proceedings of the International Workshop on Learning Classifier Systems, volume (to appear) of Lecture Notes in Computer Science. Springer, 2008.
W.E. Hart, N. Krasnogor, and J.E. Smith, editors. Recent Advances in Memetic Algorithms, volume 166 of Studies in Fuzziness and Soft Computing. Springer Berlin Heidelberg New York, 2004. ISBN 3-540-22904-3.
N. Krasnogor. Memetic Algorithms. Chapter in Handbook of Natural Computation, Natural Computing series, page (to appear). Springer Berlin / Heidelberg, 2009.
N. Krasnogor and J.E. Smith. A tutorial for competent memetic algorithms: model, taxonomy and design issues. IEEE Transactions on Evolutionary Computation, 9(5):474-488, 2005.
J. Bacardit and N. Krasnogor. Performance and efficiency of memetic Pittsburgh learning classifier systems. Evolutionary Computation, 17(3):(to appear), 2009.
Q.H. Quang, Y.S. Ong, M.H. Lim, and N. Krasnogor. Adaptive cellular memetic algorithm. Evolutionary Computation, 17(3):(to appear), 2009.
N. Krasnogor and J.E. Smith. Memetic algorithms: the polynomial local search complexity theory perspective. Journal of Mathematical Modelling and Algorithms, 7:3-24, 2008.
All material available at www.cs.nott.ac.uk/~nxk/publications.html
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Evolutionary Computation: Most Important Metaphors

Evolution ↔ Problem Solving
Environment ↔ Problem
Individual ↔ Candidate/Feasible Solution
Fitness ↔ Solution quality, i.e. objective value
Natural Selection ↔ Simulated pruning of bad solutions
Objective value → chances of generating related (but not necessarily identical) solutions
Fitness → survival and reproduction likelihood
Simulated Evolution Consists of:
• The population contains a diverse set of individuals (i.e. solutions to an optimisation problem)
• Those features that make solutions good under a specific objective function tend to proliferate
• Variation is introduced into the population by means of mutation, crossover and, for MAs, local search
• Selection culls the population, throwing out bad solutions and keeping the most promising ones
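The cycle above can be sketched as a minimal generational EA (illustrative Python on bit strings; the OneMax fitness and the particular operator choices are assumptions for demonstration, not part of the lecture):

```python
import random

def evolve(fitness, genome_len=20, pop_size=30, generations=50):
    """Minimal generational EA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1 = max(random.sample(pop, 2), key=fitness)   # binary tournament
            p2 = max(random.sample(pop, 2), key=fitness)
            cut = random.randrange(1, genome_len)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if random.random() < 1.0 / genome_len else b
                     for b in child]                        # bit-flip mutation
            offspring.append(child)
        pop = offspring                                     # generational replacement
    return max(pop, key=fitness)

best = evolve(sum)  # OneMax: fitness = number of 1s
```

Selection makes good features proliferate; mutation and crossover supply the variation the bullets describe.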
The Metaphor of “Adaptive landscape” (Wright, 1932) (1)
• Solutions (i.e. individuals) are represented by n properties plus their quality value.
• One can imagine the space of all solutions and their quality values represented as a graph (i.e. a landscape) in n+1 dimensions.
• In this metaphor, a specific individual is seen as a point in the landscape.
• Each point in the landscape has neighbouring points: those solutions that are related to it under some neighbourhood relation.
• It is called an “adaptive” landscape because the quality function, F(), depends on the features and properties of the individual; the better those are, the higher (for maximisation) or lower (for minimisation) the value.
An example: Maximisation Problem with HillClimber (2)
[Figure: a hill-climber on a landscape whose axes are Prop 1, Prop 2 and Objective Value — solutions have 2 features]
An example: Maximisation Problem with EA (3)
[Figure: an EA population on the same Prop 1 / Prop 2 / Objective Value landscape]
Evolutionary Algorithms in Context
• There are several opinions about the use of metaheuristics in optimisation.
• For the majority of problems a problem-specific algorithm could:
– work better than a generic algorithm on a large set of instances,
– but be very limited on a different domain,
– and not work well on some instances.
• One important research challenge is to build frameworks that are robust across a variety of problems and instances, delivering good enough/cheap enough/soon enough solutions.
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
EAs as problem solvers: The pre-90s view
[Figure: method performance across the scale of all problems — a problem-specific method peaks sharply at problem P (“the Rolls-Royce for problem P”), a metaheuristic method performs moderately everywhere (“the Ford T for problem solving”), and random search performs uniformly poorly]
Evolutionary Algorithms and domain knowledge
• Fashionable after the 90s: adding problem-specific information into EAs by means of specialised crossover, mutation, representations and local search.
• Result: the performance curve deforms and
– makes EAs better on some problems,
– worse on other problems,
– depending on how much problem-specific knowledge is added.
Michalewicz’s Interpretation
[Figure: method performance across the scale of all problems — several EAs (EA 1 to EA 4) with different amounts of domain knowledge peak at different regions around problem P]
So What Are Memetic Algorithms? Adding Domain Knowledge to EAs
Memetic Algorithms (MAs) were originally inspired by:
• Models of adaptation in natural systems that combine evolutionary adaptation of population of individuals (GAs)
WITH
• Individual learning (LS) within a lifetime (others consider the LS as development stage). Learning took the form of (problem specific) local search
PLUS
• Richard Dawkins’ concept of a meme, which represents a unit of cultural evolution that can exhibit refinement; hence the local search can be adaptive.
BUT WHY?
An example: Maximisation Problem with EA (3)
[Figure: the EA population on the Prop 1 / Prop 2 / Objective Value landscape, revisited]
The Canonical MA
At design time lots of issues arise.
[Figure: the canonical MA cycle, from Eiben & Smith, “Introduction to Evolutionary Computing”]
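A minimal sketch of such a canonical MA — an EA whose offspring undergo local search before joining the population (illustrative Python; the bit-string encoding, the operators and the hill-climber are placeholder choices, not the book’s specification):

```python
import random

def local_search(sol, fitness, steps=10):
    """First-improvement bit-flip hill climber (Lamarckian: returns the improved genotype)."""
    for _ in range(steps):
        neighbour = sol[:]
        neighbour[random.randrange(len(sol))] ^= 1
        if fitness(neighbour) > fitness(sol):
            sol = neighbour
    return sol

def memetic_algorithm(fitness, n=20, pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    pop = [local_search(s, fitness) for s in pop]          # improve the initial population
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1 = max(random.sample(pop, 2), key=fitness)   # tournament selection
            p2 = max(random.sample(pop, 2), key=fitness)
            cut = random.randrange(1, n)
            child = p1[:cut] + p2[cut:]                    # one-point crossover
            if random.random() < 0.2:                      # mutation
                child[random.randrange(n)] ^= 1
            offspring.append(local_search(child, fitness)) # lifetime learning (LS)
        pop = offspring
    return max(pop, key=fitness)
```

The only structural difference from a plain EA is the `local_search` call applied to each individual.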
Memetic Algorithms: the issues involved Motivation
There are several reasons why it is worthwhile hybridizing:
• Complex problems can be partially decomposable, and different subproblems may be better solved by different methods:
– the EA can be used as a pre/post-processor,
– subproblem-specific information can be placed into variation operators or into local searchers,
– in some cases there are exact/approximate methods for subproblems.
• Well-established theoretical results: generally good black-box optimisers do not exist. This is why successful EAs are usually found in “hybridized form” with domain knowledge added in.
• EAs are good at exploring the search space but find it difficult to zoom in on good solutions.
• Problems have constraints associated with solutions, and heuristics/local search are used to repair solutions found by the EA.
• If heuristic/local search strategies in MAs are “first-class citizens” then one can raise the level of generality of the algorithm without sacrificing performance, by letting the MA choose which local search to use.
A conservation of competence principle applies: the better an algorithm is at solving one specific instance (class), the worse it is at solving a different instance (class) [Wolpert et al.].
It cannot be expected that a black-box metaheuristic will suit all problem classes and instances all the time; that is, it is theoretically impossible to have ready-made, off-the-shelf solvers that are both general and good for all problems.
MAs and hyperheuristics are good algorithmic templates that aid in the balancing act of successfully & cheaply combining general, off-the-shelf, reusable solvers (EAs) with add-on instance (class) specific features.
What happens inside an MA?
[Figure: a population of agents. On the Evolutionary Algorithm side, agents exchange solutions (“This is my solution to your problem”, “I used strategy X”); on the Memetics side, offspring must decide whose problem-solving strategy to use (“When I grow up I’ll need to decide whose problem solving strategy to use”)]
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Memetic Algorithms: the issues involved Baldwinism VS Lamarckianism
• Lamarckian
• traits acquired by an individual during its lifetime can be transmitted to its offspring
• e.g. replace the individual with its fitter neighbour
• Baldwinian
• traits acquired by an individual cannot be transmitted to its offspring
• e.g. the individual receives the fitness (but not the genotype) of the fitter neighbour
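The two schemes differ only in what the local search writes back to the individual (a minimal sketch in Python; the bit-flip hill climber and the bit-string representation are illustrative assumptions):

```python
import random

def hill_climb(sol, fitness, steps=20):
    """Simple first-improvement bit-flip hill climber."""
    for _ in range(steps):
        neighbour = sol[:]
        neighbour[random.randrange(len(sol))] ^= 1
        if fitness(neighbour) > fitness(sol):
            sol = neighbour
    return sol

def evaluate_lamarckian(sol, fitness):
    improved = hill_climb(sol, fitness)
    return improved, fitness(improved)   # genotype AND fitness are replaced

def evaluate_baldwinian(sol, fitness):
    improved = hill_climb(sol, fitness)
    return sol, fitness(improved)        # fitness of the fitter neighbour, genotype kept
```

Under Baldwinian evaluation the acquired trait influences selection but is never inherited; under Lamarckian evaluation it is.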
[Figure: the raw fitness landscape vs. the landscape as seen through Baldwin’s “filter”]
Memetic Algorithms: the issues involved Diversity
The loss of diversity is especially problematic in MAs, as the LS tends to focus excessively on a few good solutions.
If the MA runs LS all the way to local optima then it becomes important to constantly identify new local optima.
If the MA uses partial LS you could still be navigating around the basins of attraction of a few solutions.
Memetic Algorithms: the issues involved Diversity
There are various ways to improve diversity (assuming that’s what one wants!):
• if the population is seeded, only do so partially
• instead of applying LS to every individual, choose whom to apply it to
• use variation operators that ensure diversity (e.g. assortative mating)
• include a diversity weight in the local search strategy
• modify the selection operator to prevent duplicates
• use archives
• modify the acceptance criteria in the local search:
Memetic Algorithms: the issues involved Diversity
The following modified MC (Monte Carlo/Metropolis acceptance) exploits solutions (zooms in) when the population is diverse; if the population has converged it explores (zooms out).
The temperature T of the MC is defined for each generation as: [equation not reproduced in this transcript]
A new solution is accepted when: [equation not reproduced in this transcript]
When the population is diverse, T → 0 and only improvements are accepted.
When the population has converged, T → ∞ and both better and worse solutions are accepted (explores).
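Since the equations on the original slide are not reproduced in this transcript, the following is only one plausible instantiation of such a diversity-driven acceptance rule (the temperature definition based on the best-minus-average fitness spread is an illustrative assumption, not the slide’s formula):

```python
import math
import random

def temperature(pop_fitnesses):
    """Illustrative choice: a diverse population (large best-average spread)
    yields a small T; a converged population (spread near 0) yields a huge T."""
    f_max = max(pop_fitnesses)
    f_avg = sum(pop_fitnesses) / len(pop_fitnesses)
    spread = f_max - f_avg            # diversity proxy
    return 1.0 / (spread + 1e-9)      # diverse -> T near 0, converged -> T very large

def accept(f_new, f_old, T):
    """Boltzmann/Metropolis acceptance for a maximisation problem."""
    if f_new >= f_old:
        return True
    return random.random() < math.exp((f_new - f_old) / T)
```

With a tiny T the exponential kills worse moves (pure exploitation); with a huge T the acceptance probability approaches 1 (exploration), matching the slide’s two regimes.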
Memetic Algorithms: the issues involved Operators Choice
The choice of LS/heuristic is one of the most important steps in the design of an MA:
1. Local searchers induce search landscapes, and there have been various attempts to characterise these. Kallel et al. and Merz et al. have shown that the choice of LS can have a dramatic impact on the efficiency and effectiveness of the MA.
2. Krasnogor formally proved that, to reduce the worst-case run time of MAs, LS move operators must induce search graphs complementary (or disjoint) to those of the crossover and mutation.
3. Krasnogor and Smith have also shown that the optimal choice of LS operator is not only problem and instance dependent but also dependent on the state of the overall search carried out by the underlying EA.
The obvious way to implement 2 & 3 is to use multiple local searchers within an MA (multimeme algorithms), and we will see that the obvious way of including feedback like that suggested by 1 is to use self-generated multiple local searchers (self-generating MAs, aka co-evolving MAs).
Memetic Algorithms: the issues involved Fitness Landscapes
[Figure: fitness landscape illustrations — thanks to P. Merz!]
Multiple Local Searchers
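A multimeme MA tags each individual with a meme naming which local searcher to apply (a minimal sketch; the two move operators and the meme encoding are illustrative assumptions, not a specific published design):

```python
import random

def flip_one(sol):
    """Move operator 1: flip a single bit."""
    n = sol[:]
    n[random.randrange(len(n))] ^= 1
    return n

def swap_two(sol):
    """Move operator 2: swap two positions."""
    n = sol[:]
    i, j = random.sample(range(len(n)), 2)
    n[i], n[j] = n[j], n[i]
    return n

MEMES = {"flip": flip_one, "swap": swap_two}

def apply_meme(individual, fitness, steps=10):
    """Hill-climb using the move operator named by the individual's meme."""
    sol, meme = individual
    move = MEMES[meme]
    for _ in range(steps):
        cand = move(sol)
        if fitness(cand) > fitness(sol):
            sol = cand
    return (sol, meme)
```

Because the meme travels with the genotype, selection acting on solutions implicitly selects among local searchers too.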
Memetic Algorithms: the issues involved Use of Knowledge
The use of knowledge is essential for the success of a search method.
There are essentially two stages when knowledge is used:
• At design time: e.g. in the form of local searchers/heuristics, specific variation operators, initialisation biases, etc.
• At run time:
• using tabu-like mechanisms to avoid revisiting points (explicit)
• using adaptive operators that bias search towards unseen/promising regions of the search space (implicit)
• creating new operators on-the-fly, e.g. self-generating or co-evolving MAs (implicit)
With appropriate data-mining techniques we can turn implicit knowledge into explicit knowledge and feed it back into the design process!
Memetic Algorithms: the issues involved Specific Considerations for Continuous Domains
There are several factors which make CD optimisation difficult:
• Different scales might be required for local/global searches
• It is not always possible to determine when a solution is locally optimal
• Long local searches might be needed to ensure convergence to good optima
• Several local searchers exist, but they are general methods, so they violate the conservation of competence principle
Memetic Algorithms: the issues involved Specific Considerations for Continuous Domains
The design of CD MAs can be different from that needed for DD.
As there is a need both to run long local searches and to balance them against global search:
• LS is truncated after a number of fitness evaluations
• LS is applied sporadically
• But these strategies make it difficult to guarantee convergence
To the best of my knowledge the only MA for CD that has guaranteed convergence to local optima is Hart’s Memetic Evolutionary Pattern Search.
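The two budget-control strategies above can be sketched as follows (illustrative Python for a continuous minimisation problem; the step size, the evaluation budget and the application probability are placeholder values):

```python
import random

def truncated_local_search(x, f, max_evals=30, step=0.1):
    """Random coordinate perturbation, stopped after a fixed evaluation budget
    rather than at a (hard to detect) local optimum. Minimisation."""
    evals = 0
    while evals < max_evals:
        cand = [xi + random.uniform(-step, step) for xi in x]
        evals += 1
        if f(cand) < f(x):
            x = cand
    return x

def maybe_local_search(x, f, p_ls=0.2):
    """Sporadic application: only a fraction of individuals pay the LS cost."""
    return truncated_local_search(x, f) if random.random() < p_ls else x

sphere = lambda v: sum(xi * xi for xi in v)   # simple test objective
```

Both devices cap the LS cost, which is exactly why convergence guarantees become hard to state.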
Memetic Algorithms: the issues involved Initialisation
Intelligent initialisation of the MA is one of the obvious ways of reusing knowledge:
• One does not reinvent the wheel, because existing solutions are reused.
• Bias the search mechanism towards more suitable regions of the search space.
• Given a CPU budget allocation it might pay to spend some part of the budget in smart initialisations rather than in a pure EA.
[Figure: fitness vs. time for random vs. intelligent initialisation]
F: the fitness reached immediately by a smart initialisation.
T_rand: the time needed by an EA with random initialisation to reach F.
T_smart: the time needed by the intelligent initialisation itself.
If T_smart < T_rand then it is worth initialising; if T_rand < T_smart then it is not worth doing it.
…but remember diversity!
Memetic Algorithms: the issues involved Other Hybridisations
EA + LS have been used in various other hybridisation schemes:
• during the genotype to phenotype mapping prior to evaluation, e.g. in timetabling, scheduling and VRP.
• during the mutation or crossover stages: e.g., DPX is a good example of intelligent crossover, and Unger & Moult used a try-best approach for protein folding. Note however that these differ from crossover hill-climbing in that the latter does not use problem/instance-specific knowledge.
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Showcase Applications The Maximum Diversity Problem
Katayama & Narihisa solve the MDP by means of a sophisticated MA. The MDP:
The problem consists in selecting, out of a set of N elements, a subset of M elements which maximises a certain pairwise diversity measure D_ij.
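As a concrete reading of the objective (a sketch; the symmetric distance matrix and the greedy constructor are illustrative, not Katayama & Narihisa’s algorithm):

```python
import itertools

def mdp_value(selected, D):
    """Objective: sum of pairwise diversities D[i][j] over the chosen subset.
    D is assumed to be a full symmetric matrix."""
    return sum(D[i][j] for i, j in itertools.combinations(selected, 2))

def greedy_mdp(D, M):
    """Greedy construction: start from the element with the largest total
    diversity, then repeatedly add the element contributing the most."""
    N = len(D)
    selected = [max(range(N), key=lambda i: sum(D[i]))]
    while len(selected) < M:
        best = max((i for i in range(N) if i not in selected),
                   key=lambda i: sum(D[i][j] for j in selected))
        selected.append(best)
    return set(selected)
```

A construction like this could seed an MA’s population, which then refines the subsets via crossover, mutation and local search.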
Showcase Applications The Maximum Diversity Problem
This problem is at the core of various important real-world applications:
• Immigration and admission policies
• Committee formation
• Curriculum design
• Portfolio selection
• Combinatorial chemical libraries
• etc
Showcase Applications Protein Structure Prediction
[Figure: a protein’s primary, secondary and tertiary structure]
Showcase Applications Protein Structure Prediction
Krasnogor, Krasnogor & Smith, Krasnogor & Pelta, and Smith have used MAs to study the fundamentals of the algorithmics behind PSP in simplified models.
Showcase Applications Protein Structure Prediction
Standard MA template, except that multiple memes, which promote diversity by means of fuzzy rules, are used.
Showcase Applications Protein Structure Prediction
[Figure: membership functions for “acceptable” solutions — two distinct “acceptability” concepts, one promoting diversity and one promoting improvements]
Showcase Applications Protein Structure Prediction
New optimal solutions
Showcase Applications Optimal Engine Calibration
The OEC problem is paradigmatic of many industrial problems. In this problem many combinatorial optimisation problems occur:
1. Optimal Design of Experiments
2. Optimal Test Bed Schedule
3. Look-up Table Calculation
Showcase Applications Optimal Engine Calibration
[Figure: the engine calibration problem, by P. Merz]
Showcase Applications Optimal Engine Calibration
Standard MA template
Showcase Applications Circuit Partitioning
CP is the task of dividing a circuit into smaller parts. It’s an important component of the VLSI layout problem:
1. the division permits the fabrication of circuits in physically distinct components
2. by dividing we conquer: the resulting circuits can fit fabrication norms, and complexity is reduced
3. it can reduce heat dissipation, energy consumption, etc.
[Figure: the CP formulation — one term is a constraint, the other a minimisation objective]
Showcase Applications Circuit Partitioning
From S. Areibi’s chapter: [Figure: a graphical example of circuit partitioning]
Showcase Applications The Maximum Diversity Problem
Various features: distinct repair & LS operators, GRASP for initialisation, a diversification phase, and an accelerated LS.
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Related Methodologies Teams of Heuristics
Variable Neighbourhood Search: under this approach a number of different neighbourhood structures are systematically explored; the search tries to improve the current solution while avoiding poor local optima.
A-Teams of Heuristics: in A-Teams, a set of constructive, improvement and destructive heuristics are used asynchronously to improve solutions.
Hyperheuristics: the main concept behind a hyperheuristic is that of managing the application of other heuristics adaptively, with the purpose of improving solutions.
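A minimal sketch of the VNS idea (illustrative Python; the nested-radius neighbourhoods and the simple descent scheme are generic textbook choices, not a specific author’s implementation):

```python
import random

def vns(x, f, neighbourhoods, iters=20):
    """Basic VNS for minimisation: shake in neighbourhood k, descend,
    recentre and reset k on improvement, otherwise widen the neighbourhood."""
    for _ in range(iters):
        k = 0
        while k < len(neighbourhoods):
            cand = neighbourhoods[k](x)              # shaking: random point in N_k(x)
            for _ in range(20):                      # simple descent from the shaken point
                nxt = neighbourhoods[0](cand)
                if f(nxt) < f(cand):
                    cand = nxt
            if f(cand) < f(x):
                x, k = cand, 0                       # improvement: recentre, restart from N_0
            else:
                k += 1                               # no improvement: larger neighbourhood
    return x

# Illustrative neighbourhoods for a real-valued vector: perturbations of growing radius
def make_nbhd(radius):
    return lambda v: [vi + random.uniform(-radius, radius) for vi in v]
```

Escalating to larger neighbourhoods only when the smaller ones fail is what lets VNS escape poor local optima.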
Methodologies Cooperative Local Search (Landa Silva & Burke)
[Figure: the search cycle of each individual in the population — the cycle begins, the individual finds something to do, gets stuck, and gets unstuck via a cooperation mechanism (sharing moves, sharing solution parts, centralised control, etc.)]
Note that this differs from teams of heuristics in that here the cooperation is made explicit.
Methodologies On-the-fly operators discovery
All the previous methodologies clearly benefit the end user, as they have been shown to provide improvements in robustness, quality, etc.
But what do we do if we do not have, or do not know, good heuristics which could be used by, e.g., A-Teams, VNS or CLS?
Also, why don’t we use the information the algorithm produces to better understand the search landscape and make new knowledge explicit, capturing this knowledge in new operators?
Methodologies On-the-fly operators discovery
Two alternatives:
1. Off-line: Whitley and Watson did it successfully for TS, and Kallel et al. for other methods.
2. In-line: Krasnogor, Krasnogor & Gustafson, J.E. Smith and others for MAs.
Methodologies On-the-fly operators discovery
[Figure: the canonical MA cycle; adapted from Durham, W.: Coevolution: Genes, Culture and Human Diversity. Stanford University Press (1991)]
Methodologies On-the-fly operators discovery
[Figure: Self-Generating/Co-evolving MAs; adapted from Durham, W.: Coevolution: Genes, Culture and Human Diversity. Stanford University Press (1991)]
Methodologies On-the-fly operators discovery
There are various processes that guide the agents’ cultural evolution of local search strategies:
• Inheritance: an agent inherits the meme of the most successful of its parents
• Imitation: an agent imitates a successful non-genetically-related individual
• Innovation: an agent blindly (i.e. randomly) changes its meme
• Mental Simulation: an agent purposely improves its meme (e.g. by hill-climbing on it)
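These four transmission mechanisms can be sketched as operations on a meme encoded as a parameter vector (illustrative Python; the meme encoding, the `meme_quality` function and all rates are assumptions for demonstration):

```python
import random

def inheritance(parents):
    """Child takes the meme of its fitter parent. parents: [(meme, fitness), ...]"""
    return max(parents, key=lambda p: p[1])[0]

def imitation(agent_meme, population, p_imitate=0.3):
    """With some probability, copy the meme of the best other agent."""
    if random.random() < p_imitate:
        return max(population, key=lambda p: p[1])[0]
    return agent_meme

def innovation(meme, sigma=0.1):
    """Blind (random) change of the meme's parameters."""
    return [m + random.gauss(0.0, sigma) for m in meme]

def mental_simulation(meme, meme_quality, tries=5):
    """Purposeful improvement: hill-climb on the meme itself."""
    for _ in range(tries):
        cand = innovation(meme)
        if meme_quality(cand) > meme_quality(meme):
            meme = cand
    return meme
```

In a self-generating MA these rules evolve the local searchers alongside the solutions they refine.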
From Krasnogor & Gustafson chapter
From Smith chapter
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
So What Are Memetic Algorithms?
o MAs are a carefully orchestrated interplay between (stochastic) global search and (stochastic) local search algorithms
A Pattern Language for Memetic Algorithms
N. Krasnogor. Memetic Algorithms. Chapter in Handbook of Natural Computation, Natural Computing series. Springer Berlin / Heidelberg, 2009.
www.cs.nott.ac.uk/~nxk/publications.html
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Conclusions (I)
• There is much more to MAs than meets the eye. It’s not a simple matter of putting LS somewhere in the EA cycle in an ad-hoc fashion.
• Only a small part of the architectural space of MAs has been explored, and we don’t know yet why a given architecture performs well/badly on a specific problem (see my PhD thesis).
• People usually use one “silver bullet” LS. That’s fine if that SB exists. However, when it does not exist, use multimeme algorithms or other heuristic teams/cooperative algorithms, as lots of simple heuristics can synergistically do the trick.
Conclusions (II)
• ADAPT: the search process is dynamic and your method should detect and adapt to changing circumstances. Adaptation is not too expensive or complex to code!
• Carefully consider how your variation operators interact with LS.
• Consider whether a Baldwinian or a Lamarckian scheme is better.
• Understand that the fitness landscape explored by your MA is not a one-operator landscape but the result of the superposition, with interference, of various landscapes.
Conclusions (III)
• Use more expressive acceptance criteria in your local search, e.g., fuzzy criteria.
• If you don’t know what operators to apply, let the MA find them for you by some self-generating mechanism, e.g., co-evolution.
• Self-generating mechanisms are a great niche for GPers!
FINALLY:
Check out the literature; almost surely you will find MAs among the best success stories in applications to real-world problems!