2005MEE Software Engineering Lecture 11 – Optimisation Techniques.


Transcript of 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Page 1: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

2005MEE Software Engineering

Lecture 11 – Optimisation Techniques

Page 2: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Topics

Optimisation
– what is it?
– finite and infinite search spaces
– deterministic and stochastic approaches

Optimisation techniques
– exhaustive searches
– gradient ascent
– Monte-Carlo (random) searches
– genetic algorithms

Page 3: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

What is Optimisation?

The process of finding an optimal or sub-optimal set of parameters to solve a problem.

For example:
– finding values of m and c so that the line y = mx + c best fits a given set of points (see the sketch below)
– determining the best setup for a racing car (springs, dampers, wings, ride height, etc.)
– maximising profits from mining or agricultural activities
– the travelling salesman problem
– data mining applications
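As a concrete illustration of the line-fitting example, the following minimal Python sketch (not taken from the lecture; the data points and the function name are illustrative assumptions) defines an objective that any of the later techniques could maximise:

# Hypothetical data points for fitting the line y = m*x + c.
points = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2), (3.0, 6.8)]

def fitness(m, c):
    """Negative sum of squared vertical errors of the line y = m*x + c."""
    return -sum((y - (m * x + c)) ** 2 for x, y in points)

# A larger (less negative) fitness means a better-fitting line; the
# optimisation techniques in this lecture search for the (m, c) pair
# that maximises it.
print(fitness(2.0, 1.0))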

Page 4: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Types of Optimisation

Mathematical optimisation
– optimal parameters can be calculated mathematically in fixed time
– e.g., finding the maximum of a parabola (see the sketch below)
– generally trivial problems, almost never applicable to real-world situations

Numerical optimisation
– using algorithms to find a near-optimal solution
– the quality of the solution can depend upon the algorithm used and chance
– the result is rarely guaranteed to be the global maximum
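As a small illustration of the mathematical case, the maximum of a downward-opening parabola f(x) = ax² + bx + c (with a < 0) can be computed directly in fixed time. The Python sketch below is illustrative, not from the lecture:

def parabola_maximum(a, b, c):
    """Closed-form maximum of f(x) = a*x**2 + b*x + c, for a < 0."""
    x = -b / (2 * a)                    # vertex location, where f'(x) = 0
    return x, a * x ** 2 + b * x + c

print(parabola_maximum(-1.0, 6.0, 4.0))   # maximum of 13 at x = 3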

Page 5: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Parameter ConstraintsParameter Constraints

It is usually necessary to place limits on It is usually necessary to place limits on parameter valuesparameter values– to reduce the search space (make it finite)to reduce the search space (make it finite)– to reflect physical propertiesto reflect physical properties

This knowledge of the problem is often called This knowledge of the problem is often called a prioria priori information information

Parameters without limits create an infinite Parameters without limits create an infinite search spacesearch space– solution is possible only if function behaviour is well solution is possible only if function behaviour is well

defined at the extrema – strictly monotonic, etc.defined at the extrema – strictly monotonic, etc. Common sense is usually sufficient to Common sense is usually sufficient to

constrain parameters in ‘real world’ constrain parameters in ‘real world’ applicationsapplications

Page 6: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Goal of Optimisation

To find the ‘best’ solution to the problem
– how is ‘the best’ defined?

Evaluation criteria
– simple for mathematical functions – the highest value is best (or the lowest, for minimisation)
– very difficult for many real-world problems

Often, near enough is good enough
– finding ‘the best’ may be too difficult or take far too long
– a solution which is near optimal may be far simpler and faster to compute

Page 7: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Optimisation Example

[Figure: plot of f(x) versus x, marking both the calculated maximum value and the actual (global) maximum value.]

In this example, the optimisation technique used has not found the global maximum, but rather a local maximum which is nearly as good.

Page 8: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Optimisation Approaches

Deterministic optimisation:
– the algorithm is not random in any way
– given the same search space and start conditions, the result will always be the same

Stochastic optimisation:
– the algorithm is partially or totally random
– each run of the algorithm may give different results, even with the same input

Stochastic methods are generally superior as they are less likely to be stuck in a local maximum.

Page 9: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Optimisation Algorithms

General approach:
– pick starting point(s) within the search space
– evaluate the function at this point
– refine the parameter estimate(s)
– continue until criteria are met

Known as ‘iterative refinement’ – most optimisation algorithms use some version of this (a generic skeleton is sketched below).

Requires a method of evaluation
– can be mathematical or practical
– often relies on a problem model
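A minimal Python sketch of this generic loop, assuming a one-parameter problem; the names iterative_refinement, evaluate and refine are illustrative, not from the lecture:

import random

def iterative_refinement(evaluate, refine, start, max_iters=1000):
    """Generic iterative refinement: propose, evaluate, keep the best."""
    best = start
    best_score = evaluate(best)
    for _ in range(max_iters):
        candidate = refine(best)        # refine the parameter estimate
        score = evaluate(candidate)     # evaluate the function at this point
        if score > best_score:          # keep the candidate only if it improved
            best, best_score = candidate, score
    return best, best_score

# Example use: maximise f(x) = -(x - 3)^2 with small random steps.
f = lambda x: -(x - 3) ** 2
print(iterative_refinement(f, lambda x: x + random.uniform(-0.1, 0.1), start=0.0))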

Page 10: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Optimisation Algorithms

Exhaustive search:
– every possible combination of parameter values is evaluated
– only possible for finite spaces
– generally infeasible for most problems
– accuracy is determined by granularity

Monte-Carlo algorithm:
– random points are evaluated
– the best after a specified time is chosen as the optimum
– more time produces better results

Both approaches are sketched in the example below.
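A minimal Python sketch of both approaches for a single bounded parameter; the function names and the test function are illustrative assumptions:

import random

def exhaustive_search(f, lo, hi, steps=1000):
    """Evaluate every grid point in [lo, hi]; accuracy depends on granularity."""
    best_x, best_val = lo, f(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        if f(x) > best_val:
            best_x, best_val = x, f(x)
    return best_x, best_val

def monte_carlo_search(f, lo, hi, samples=1000):
    """Evaluate random points; more samples (time) gives better results."""
    best_x, best_val = None, float("-inf")
    for _ in range(samples):
        x = random.uniform(lo, hi)
        if f(x) > best_val:
            best_x, best_val = x, f(x)
    return best_x, best_val

f = lambda x: -(x - 3) ** 2 + 5
print(exhaustive_search(f, 0.0, 10.0))
print(monte_carlo_search(f, 0.0, 10.0))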

Page 11: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Gradient Ascent

Estimates are refined by determining the gradient and following it upwards (see the sketch below)
– starting points are still required (random?)
– high probability of finding local maxima
– can find good solutions in a short time
– the best result is taken as the optimum parameters

Parameters required:
– distance to move at each step
– starting locations
– number of points
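A minimal Python sketch of gradient ascent for a one-parameter function, with the gradient estimated numerically; the step size, number of starting points and iteration count are illustrative choices:

import random

def gradient_ascent(f, lo, hi, step=0.01, n_points=5, iters=500, eps=1e-6):
    best_x, best_val = None, float("-inf")
    for _ in range(n_points):
        x = random.uniform(lo, hi)                        # random starting location
        for _ in range(iters):
            grad = (f(x + eps) - f(x - eps)) / (2 * eps)  # numeric gradient estimate
            x = min(max(x + step * grad, lo), hi)         # follow the gradient upwards
        if f(x) > best_val:
            best_x, best_val = x, f(x)                    # best run taken as optimum
    return best_x, best_val

f = lambda x: -(x - 3) ** 2 + 5
print(gradient_ascent(f, 0.0, 10.0))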

Page 12: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Gradient Ascent

[Figure: plot of f(x) versus x, highlighting the range of starting points which will lead gradient ascent to the global maximum.]

Page 13: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Gradient Ascent

Only a small range of starting values will give the global maximum
– other starting points will give local maxima only
– unsuitable for many functions and problems

‘Smoothing’ the function may lead to better results (see the sketch below)
– requires knowledge of the problem – how much smoothing should be performed?
– small peaks are removed, leaving only large peaks
– the range of ‘good’ starting values is increased
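A minimal Python sketch of smoothing, assuming the function has been sampled on a grid; a simple moving average stands in for whatever smoothing method is actually chosen, and the window width is the ‘how much smoothing?’ decision:

import math

def smooth(values, window=5):
    """Moving average: small peaks are averaged away, large peaks remain."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Sample a bumpy function, smooth the samples, then search the smoothed version.
xs = [i * 0.01 for i in range(1001)]
ys = [math.sin(x) + 0.2 * math.sin(20 * x) for x in xs]
smoothed = smooth(ys, window=51)
best_i = max(range(len(xs)), key=lambda i: smoothed[i])
print("approximate peak near x =", xs[best_i])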

Page 14: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Smoothing Function

[Figure: plot of f(x) versus x, comparing the original starting range that reaches the global maximum with the wider starting range of the smoothed function.]

Page 15: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Genetic Algorithms

A relatively new approach to optimisation
– a biological evolution model is used
– can lead to a much better solution
– able to ‘escape’ local maxima

A set of points is used at each iteration
– the next set is created using combinations of the previous set
– the chance of a point being used is dependent upon its fitness
– a ‘survival of the fittest’ algorithm

Page 16: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Genetic Algorithms

The algorithm (a sketch in code follows below):
– choose a starting set of points
– repeat:
  • evaluate the fitness of each point
  • choose two parents based on fitness
  • combine the parents using a combination function
  • possibly add a random mutation
  • repeat until the new set is created
– until the iteration limit is reached or a suitable solution is found
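A minimal Python sketch of the loop above for a single real-valued parameter; the population size, the simple averaging combination function and the mutation range are illustrative assumptions:

import random

def genetic_algorithm(fitness, lo, hi, pop_size=20, generations=100,
                      mutation_rate=0.1):
    population = [random.uniform(lo, hi) for _ in range(pop_size)]  # starting set
    for _ in range(generations):
        scores = [fitness(p) for p in population]            # evaluate fitness
        low = min(scores)
        weights = [s - low + 1e-9 for s in scores]            # fitness-proportional weights

        def select_parent():
            return random.choices(population, weights=weights, k=1)[0]

        new_population = []
        while len(new_population) < pop_size:                 # build the new set
            p1, p2 = select_parent(), select_parent()         # two parents by fitness
            child = (p1 + p2) / 2                             # simple combination function
            if random.random() < mutation_rate:               # possibly add a mutation
                child += random.uniform(-0.5, 0.5)
            new_population.append(min(max(child, lo), hi))
        population = new_population
    return max(population, key=fitness)

f = lambda x: -(x - 3) ** 2 + 5
print(genetic_algorithm(f, 0.0, 10.0))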

Page 17: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Combination Functions

Parameters are treated as chromosomes
– they must be merged in such a way as to incorporate features of both parents
– the choice of merging function is critical to success

Implementations (both are sketched below):
– binary: each parameter is treated as a binary string, and bits are chosen randomly from each parent to form the new bit pattern
  • leads to problems, as the bits are not equal in value!
– parameter based: entire parameters from each parent are randomly used
  • does not allow parameter values to change
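A minimal Python sketch of the two schemes, assuming integer parameters of a fixed bit width for the binary case; the function names are illustrative:

import random

def binary_combine(a, b, bits=8):
    """Choose each bit randomly from one of the two parents."""
    child = 0
    for i in range(bits):
        parent = a if random.random() < 0.5 else b
        child |= parent & (1 << i)      # copy bit i from the chosen parent
    return child

def parameter_combine(parent1, parent2):
    """Choose each whole parameter from one parent or the other."""
    return [a if random.random() < 0.5 else b for a, b in zip(parent1, parent2)]

print(binary_combine(0b10101010, 0b01010101))
print(parameter_combine([3, 7, 1], [4, 2, 9]))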

Page 18: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Crossover Combination

Based on a biological model.

Each chromosome (parameter, or subset of a parameter) pair is randomly split (see the sketch below)
– one section of parent 1 is joined with the other section of parent 2
– repeated for each chromosome
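A minimal Python sketch of single-point crossover on two equal-length bit strings, represented here as Python strings; the representation is an illustrative assumption:

import random

def crossover(parent1, parent2):
    point = random.randint(1, len(parent1) - 1)   # random split point
    # join the first section of parent 1 with the remaining section of parent 2
    return parent1[:point] + parent2[point:]

print(crossover("10101010", "01010101"))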

Page 19: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Mutation

Mutations are random changes to parameters (both styles are sketched below)
– generally applied with a pre-defined probability
– can be large or small changes
  • flip bits randomly
  • small adjustments to parameter values

They allow new solutions to be discovered that may not otherwise have been found
– allow escape from local maxima
– can also cause a good solution to become worse!
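A minimal Python sketch of the two mutation styles, with illustrative probabilities and adjustment sizes:

import random

def mutate_bits(value, bits=8, rate=0.05):
    """Flip each bit of an integer parameter with probability `rate`."""
    for i in range(bits):
        if random.random() < rate:
            value ^= 1 << i             # flip bit i
    return value

def mutate_value(value, rate=0.1, scale=0.1):
    """Make a small random adjustment to a real-valued parameter."""
    if random.random() < rate:
        value += random.uniform(-scale, scale)
    return value

print(mutate_bits(0b10101010))
print(mutate_value(3.0))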

Page 20: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Choosing Parents

Use the fitness score to determine the probability of each point being a parent
– can lead to many points being completely ignored
– an entire generation could be spawned by only two parents

Alternative approach (both schemes are sketched below):
– rank each parent, and choose based on rank
– allows even very unfit parents a small chance to reproduce
– can help avoid stagnant gene pools

Page 21: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Elitism

An alternative form of reproduction where the ‘best’ parents move directly into the next generation (see the sketch below)
– helps prevent ‘bad’ generations
– ensures the solution never becomes worse in successive generations
– can lead to inbreeding (local maxima)

Requires a careful choice of parental selection method.
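A minimal Python sketch of elitism: the top few points are copied unchanged into the next generation before the rest is filled by normal reproduction. The function and parameter names are illustrative:

def apply_elitism(population, fitness, n_elite=2):
    """Return the `n_elite` fittest points, to be carried over unchanged."""
    ranked = sorted(population, key=fitness, reverse=True)
    return ranked[:n_elite]

f = lambda x: -(x - 3) ** 2 + 5
print(apply_elitism([0.5, 2.9, 7.0, 3.2], f))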

Page 22: 2005MEE Software Engineering Lecture 11 – Optimisation Techniques.

Examples

http://math.hws.edu/xJava/GA/
http://www.rennard.org/alife/english/gavgb.html