
Model Parameter Estimation


Shan He

School for Computational Science, University of Birmingham

Module 06-23836: Computational Modelling with MATLAB

Outline of Topics

Concepts about model parameter estimation

Parameter estimation using Nelder-Mead Simplex Method

Parameter estimation using Particle Swarm Optimiser

Parameter estimation for Agent-based models

Assignments

Concepts about model parameter estimation

What is model parameter estimation?

- After building the model, we need to determine the model parameters.
- Usually only a fraction of the parameters can be measured by experiments.
- Many parameters are hard, expensive, time-consuming, or even impossible to measure.
- Model parameter estimation: indirectly determine unknown parameters from measurements of experimental data.
- Challenging because:
  - Experimental data are noisy and sparse, e.g., only a few time points.
  - Many models are not mathematically well defined, e.g., agent-based models.


Methods for parameter estimation

For equation-based models, we have:
- Derivative approximation methods: essentially approximate the derivatives by finite differences, e.g., Euler's method, and then fit the parameters by linear regression.
  - Pros: the computation time is typically very fast.
  - Cons: the derivative approximations lead to inaccurate parameter estimates.
- Bayesian methods: essentially use Bayesian inference to estimate the parameters from data.
  - Pros: can handle noisy or uncertain data; can also infer the whole probability distribution of the parameters rather than a point estimate.
  - Cons: computation is very slow because high-dimensional integration problems must be solved, e.g., by Markov Chain Monte Carlo.
- Optimisation methods.


Parameter estimation as an optimisation problem

- Suppose we have a system of ODEs

  dy/dt = f(t, y; p),   y ∈ R^n, f ∈ R^n,

  where p ∈ R^m is the vector of parameters, and a collection of k measurements of experimental data: (t_1, y_1), (t_2, y_2), ..., (t_k, y_k).
- We aim to minimise the following objective function, which is the mean square error between the model output and the experimental data (a MATLAB sketch is given below):

  obj(p) = ∑_{j=1}^{k} |y(t_j; p) − y_j|^2,

  where | · | denotes the standard Euclidean vector norm.
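A minimal MATLAB sketch of this objective is shown below. The function and variable names (odefun, y0, tdata, ydata) are illustrative assumptions, not names defined in the lecture.

```matlab
% objfun.m -- least-squares objective obj(p) for an ODE model (sketch).
% Assumed inputs (not from the slides):
%   odefun(t, y, p) : model right-hand side, returns dy/dt as a column vector
%   y0              : initial state at tdata(1)
%   tdata, ydata    : k measurement times and a k-by-n matrix of measurements
function err = objfun(p, odefun, y0, tdata, ydata)
    % Simulate the model at exactly the measurement times (k > 2 assumed)
    [~, ymodel] = ode45(@(t, y) odefun(t, y, p), tdata, y0);
    % Sum of squared Euclidean distances between model output and data
    err = sum(sum((ymodel - ydata).^2));
end
```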


Optimisation algorithms

To solve the above optimisation problem, a variety of optimisation algorithms can be chosen:
- Gradient-based algorithms, e.g., Levenberg-Marquardt.
  - Requirements: the objective must be mathematically well defined, smooth, and differentiable.
- Derivative-free optimisation algorithms:
  - Direct search algorithms, e.g., pattern search and the Nelder-Mead method.
  - Metaheuristic algorithms, e.g., Evolutionary Computation and the Particle Swarm Optimiser.


Example 1: A simple model

- Observation: [plot of two measured time series, data1 and data2, over t = 0 to 6]

- Suppose we have constructed a simple ODE model:

  dx/dt = −ay + x + t^2 + 6t + b
  dy/dt = bx + ay − (a + b)(1 − t^2)

- Task: estimate the parameters a and b from the observations (a simulation sketch follows below).
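To make the model concrete, here is one way the system could be coded and simulated in MATLAB. The initial state and the trial parameter values are placeholders, not values given in the lecture.

```matlab
% Example 1 model as an anonymous function; p = [a; b], z = [x; y].
simpleRHS = @(t, z, p) [ -p(1)*z(2) + z(1) + t^2 + 6*t + p(2);
                          p(2)*z(1) + p(1)*z(2) - (p(1) + p(2))*(1 - t^2) ];

pGuess = [1; 1];                 % trial values for a and b (placeholders)
z0     = [0; 0];                 % placeholder initial state
[t, z] = ode45(@(t, z) simpleRHS(t, z, pGuess), [0 6], z0);
plot(t, z(:,1), t, z(:,2));      % compare against the observed data1/data2
```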

Parameter estimation using Nelder-Mead Simplex Method

Nelder-Mead Simplex Method

- A well-established direct search algorithm
- A heuristic search method: there is no guarantee that it finds the optimal solution
- Based on the concept of a simplex, a special polytope of N + 1 vertices in N dimensions
- Derivative-free: does not use numerical or analytic gradients
- Implemented in MATLAB as the function [x,fval] = fminsearch(fun,x0), where fun is the objective function to be minimised and x0 is the starting point (see the sketch below)
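Putting the pieces together, a sketch of estimating a and b for Example 1 with fminsearch, reusing the objfun and simpleRHS sketches above; tdata and ydata are assumed to hold the plotted observations and are not provided in the slides.

```matlab
% Nelder-Mead estimation of [a; b] for the Example 1 model.
% tdata : k-by-1 measurement times, ydata : k-by-2 measurements (data1, data2)
z0  = ydata(1, :)';                          % first measurement as the initial state
obj = @(p) objfun(p, simpleRHS, z0, tdata, ydata);

p0 = [0.5; 0.5];                             % starting point for the simplex search
[pBest, fval] = fminsearch(obj, p0);         % pBest approximates [a; b]
```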


Pros and Cons of Nelder-Mead Simplex Method

- Pros: very fast
- Cons: only suitable for low-dimensional and unimodal problems; performance is sensitive to the starting point x0


Example 2: Parameter estimation of LV model

- The fur data was collected by the Hudson Bay Company more than 100 years ago
- Theoretical model for predator-prey interaction: the Lotka-Volterra (LV) model


Example: Lotka-Volterra model

dx/dt = x(α − βy)
dy/dt = −y(γ − δx)

(a MATLAB form of this right-hand side is sketched after the parameter list below)

- α: prey population growth rate
- β: prey population decline rate
- γ: predator population decline rate
- δ: predator population growth rate
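One possible MATLAB encoding of the LV right-hand side, with the parameters packed into a single vector so that it plugs into the earlier objfun sketch; the packing order is a choice made here, not one prescribed in the slides.

```matlab
% Lotka-Volterra right-hand side; p = [alpha; beta; gamma; delta],
% z = [x; y] with x = prey and y = predator.
lvRHS = @(t, z, p) [  z(1) * (p(1) - p(2)*z(2));     % dx/dt = x(alpha - beta*y)
                     -z(2) * (p(3) - p(4)*z(1)) ];   % dy/dt = -y(gamma - delta*x)
```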


Example: Parameter estimation of LV model

- Objective: estimate the parameters of the LV model from the Hudson Bay Company fur data
- We select the period between 1908 and 1935
- We estimate the parameters by metaheuristic algorithms (a sketch of the fitting setup is given below)
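A sketch of the fitting setup, assuming the fur data for 1908-1935 have already been loaded into column vectors years, hares and lynx; these names and the starting guess are placeholders. For brevity the sketch uses fminsearch; a metaheuristic such as the PSO sketch given later can be substituted for the final line.

```matlab
% Fit the LV model to the fur data, reusing objfun and lvRHS above.
tdata = years - years(1);                    % time in years since the first record
ydata = [hares, lynx];                       % one column per species
z0    = ydata(1, :)';                        % initial populations from the first record

lvObj = @(p) objfun(p, lvRHS, z0, tdata, ydata);
p0    = [0.5; 0.02; 0.5; 0.02];              % rough starting guess for [alpha beta gamma delta]
[pBest, fval] = fminsearch(lvObj, p0);
```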

Parameter estimation using Particle Swarm Optimiser

Evolutionary Computation (EC)

- Umbrella term for algorithms inspired by evolutionary systems
- The algorithms are known as evolutionary algorithms (EAs): Genetic Algorithms (GAs), Evolutionary Programming (EP) and Evolution Strategy (ES), etc.
- Also includes closely related nature-inspired approaches: the Particle Swarm Optimiser (PSO) and the Ant Colony Optimiser (ACO)


Evolution is amazing!

[Photos of insect camouflage: spiny rainforest katydid, sand grasshopper, walking stick insect, walking leaf insect]

http://www.environmentalgraffiti.com/featured/amazing-insect-camouflage/14128


Evolutionary Computation (EC)

Two main properties:

- Algorithms are usually population-based: they maintain a set of potential solutions at any one time;
- Algorithms are stochastic (non-deterministic): random elements help obtain good solutions.


General design

In order to apply an EC algorithm to a problem, you need:
- A suitable representation of solutions to the problem;
- A way to evaluate solutions;
- A way to explore the space of solutions (variation operators);
- A way to select better solutions to guide the search (exploitation).


EC terminology

- Population: the set of candidate solutions
- Individual: a member of the population
- Parents: the individuals selected for reproduction
- Fitness function: a function that summarises how close an individual (solution) is to achieving the set aims


Standard EC procedure

- Generate the initial population P(0) at random, and set t = 1
- repeat
  - Evaluate the fitness of each individual in P(t).
  - Select parents from P(t) based on their fitness.
  - Obtain population P(t + 1) by applying variation operators to the parents.
  - Set t = t + 1.
- until the termination criterion is satisfied


Particle Swarm Optimiser (PSO)

- Invented by Kennedy and Eberhart in 1995
- Inspired by bird flocking and fish schooling, more precisely by the Boids flocking model
- Simple rules for searching for global optima
- Primarily for real-valued optimisation problems
- Simpler than, and sometimes better than, GAs


PSO: detailed algorithm

- PSO can be seen as a swarm of particles flying in the search space to find the optimal solution.
- The variation operator consists of only two equations (a MATLAB sketch follows below):

  V_i^{k+1} = ω V_i^k + c_1 r_1 (P_i^k − X_i^k) + c_2 r_2 (P_g^k − X_i^k)

  X_i^{k+1} = X_i^k + V_i^{k+1}

  where X_i^k and V_i^k are the current position and velocity of the i-th particle, respectively; P_i is the best previous position of the i-th particle; P_g is the global best position of the swarm; ω is the inertia weight, typically in the range (0, 1]; c_1 and c_2 are constants, the so-called learning factors; and r_1 and r_2 are random numbers in the range (0, 1).
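Below is a minimal, self-contained MATLAB sketch of these two update equations. The parameter values (ω = 0.7, c1 = c2 = 1.5), the bound handling and the fixed iteration budget are common choices made here for illustration, not settings given in the lecture.

```matlab
function [gBest, gBestVal] = psoSketch(obj, lb, ub, nParticles, maxIter)
% Minimal PSO sketch. obj maps a 1-by-dim row vector to a scalar cost;
% lb and ub are 1-by-dim lower and upper bounds of the search space.
    dim = numel(lb);
    w = 0.7;  c1 = 1.5;  c2 = 1.5;                 % inertia weight and learning factors
    LB = repmat(lb(:)', nParticles, 1);
    UB = repmat(ub(:)', nParticles, 1);

    X = LB + rand(nParticles, dim) .* (UB - LB);   % random initial positions
    V = zeros(nParticles, dim);                    % zero initial velocities

    pBest    = X;                                  % personal best positions
    pBestVal = inf(nParticles, 1);
    for i = 1:nParticles
        pBestVal(i) = obj(X(i, :));
    end
    [gBestVal, idx] = min(pBestVal);
    gBest = pBest(idx, :);                         % global best position of the swarm

    for k = 1:maxIter
        r1 = rand(nParticles, dim);
        r2 = rand(nParticles, dim);
        % The two variation equations: velocity update, then position update
        V = w*V + c1*r1.*(pBest - X) + c2*r2.*(repmat(gBest, nParticles, 1) - X);
        X = min(max(X + V, LB), UB);               % move and clamp to the bounds

        for i = 1:nParticles
            f = obj(X(i, :));
            if f < pBestVal(i)                     % update the personal best
                pBestVal(i) = f;
                pBest(i, :) = X(i, :);
                if f < gBestVal                    % update the global best
                    gBestVal = f;
                    gBest    = X(i, :);
                end
            end
        end
    end
end
```

For example, the LV fit above could be run as [pBest, err] = psoSketch(lvObj, [0 0 0 0], [2 1 2 1], 30, 200), where the search bounds are again only illustrative.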


PSO: algorithm illustration

- The search direction of PSO is determined by:
  - The autobiographical memory, which remembers the best previous position P_i of each individual in the swarm
  - The publicised knowledge, which is the best solution P_g currently found by the population


PSO: practical issues

- One key problem faced by EC researchers is how to choose the algorithm parameters
- PSO is close to a plug-and-play optimisation algorithm
- Only 3 parameters, none of them very sensitive
- ω can be given a linearly decreasing value, which improves local search towards the end of the run (see the short sketch below)
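For illustration, one common linearly decreasing schedule for the inertia weight; the 0.9 to 0.4 range and the iteration count are typical choices from the PSO literature, not values from the slides.

```matlab
maxIter = 200;                    % total number of iterations (placeholder)
wMax = 0.9;  wMin = 0.4;          % typical start and end values for omega
w = @(k) wMax - (wMax - wMin) * (k - 1) / (maxIter - 1);   % w(1) = 0.9, w(maxIter) = 0.4
```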


EC: further readings and MATLAB toolboxes

- My tutorial slides on Evolutionary Computation
- Evolutionary Computation Online tutorial
- MATLAB Global Optimization Toolbox
- Genetic Algorithm Optimization Toolbox (GAOT)

Parameter estimation for Agent-based models

- Estimate the parameters by minimising the mean square error between the model output and the experimental data:

  obj(p) = ∑_{j=1}^{k} |y(t_j; p) − y_j|^2,

  where y denotes the output of the agent-based model.
- Because of the stochastic nature of agent-based models, we need to run the model m times and use the average output:

  obj(p) = ∑_{j=1}^{k} | (1/m) ∑_{i=1}^{m} y_i(t_j; p) − y_j |^2

- Problem: computationally very demanding!
- Solution: use parallel computing or GPUs to speed up the simulation (a sketch follows below).
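A sketch of the averaged objective, using a parfor loop from the Parallel Computing Toolbox to distribute the repeated runs; runABM is a hypothetical placeholder for the agent-based simulator and is not a function from the lecture.

```matlab
% Averaged least-squares objective for a stochastic agent-based model.
%   runABM(p, tdata) : hypothetical simulator returning one k-by-n stochastic
%                      realisation of the model output at the times tdata
%   ydata            : k-by-n matrix of measurements, m : number of runs
function err = abmObj(p, tdata, ydata, m)
    ysum = zeros(size(ydata));
    parfor i = 1:m                        % run the m simulations in parallel if a pool is available
        ysum = ysum + runABM(p, tdata);   % accumulate the stochastic outputs
    end
    ymean = ysum / m;                     % average model output over the m runs
    err = sum(sum((ymean - ydata).^2));   % squared error against the data
end
```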


Future direction: automated model construction from data

- Genetic Programming (GP) is a powerful tool for machine creativity, e.g., Genetic Programming for reinventing patented inventions.
- GP is useful for constructing models (with parameters) from experimental data:
  - Distilling Free-Form Natural Laws from Experimental Data.
  - Automated reverse engineering of nonlinear dynamical systems.
  - Dialogue for Reverse Engineering Assessments and Methods (DREAM).

Assignments

Download my MATLAB code and data here, please:

1. Use the GAOT toolbox to estimate the parameters of the LV model using the Hudson Bay Company fur data from 1860 to 1880;
2. Try PSO to find the starting point for the Nelder-Mead Simplex Method.