Approximate Bayesian computation and machine learning (BigMC 2014)

Approximate Bayesian computation and machine learning. Pierre Pudlo, Université Montpellier 2, Institut de Mathématiques et Modélisation de Montpellier (I3M), Institut de Biologie Computationnelle, Labex NUMEV. BigMC: 19 juin 2014. Pierre Pudlo (UM2) ABC learning BigMC 19/06/2014 1 / 40

description

The talk I was not able to give because strike action by rail unions had disrupted rail traffic between Paris and Montpellier. Christian P. Robert gave it in my place.

Transcript of Approximate Bayesian computation and machine learning (BigMC 2014)

Page 1: Approximate Bayesian computation and machine learning (BigMC 2014)

Approximate Bayesian computation and machine learning

Pierre Pudlo

Université Montpellier 2, Institut de Mathématiques et Modélisation de Montpellier (I3M)

Institut de Biologie Computationnelle, Labex NUMEV

BigMC: 19 juin 2014


Page 2: Approximate Bayesian computation and machine learning (BigMC 2014)

Table of Contents

1 Approximate Bayesian computation

2 ABC model choice

3 Population genetics example on SNP data

4 Conclusion


Page 3: Approximate Bayesian computation and machine learning (BigMC 2014)

Joint work with

Jean-Marie Cornuet, Arnaud Estoup, Mathieu Gautier and Renaud Vitalis

Jean-Michel Marin, Christian Robert, Mohammed Sedki & Julien Stoehr


Page 4: Approximate Bayesian computation and machine learning (BigMC 2014)

Table of Contents

1 Approximate Bayesian computation

2 ABC model choice

3 Population genetics example on SNP data

4 Conclusion


Page 5: Approximate Bayesian computation and machine learning (BigMC 2014)

Motivation

Issue. When the likelihood function f(yobs|θ) is expensive or impossible to calculate, it is extremely difficult to sample from the posterior distribution π(θ|yobs).

Two typical situations:

- Latent vector u of high dimension, including combinatorial structures (trees, graphs):

f(y|θ) = ∫ f(y, u|θ) du

- Unknown normalizing constant:

f(y|θ) = g(y, θ) / Z(θ)

(e.g. Markov random fields, exponential random graphs, ...)

Example. Population genetics: the likelihood of the current genetic data depends on a latent genealogy (a tree with random branch lengths).

ABC is a technique that only requires being able to sample from the likelihood f(·|θ). This technique stemmed from population genetics models about 15 years ago, and population geneticists still contribute significantly to the methodological development of ABC.

MCMC algorithms (Felsenstein, Hey, ...) get stuck on a tree of high probability when the scenario is complex (a multimodal distribution on a combinatorial space).


Page 6: Approximate Bayesian computation and machine learning (BigMC 2014)

Population genetics

The typical questions of Arnaud Estoup:

- How did the Asian ladybird arrive in Europe?
- Why do they swarm right now?
- What are the routes of invasion?
- What was the size of the colony that founded a new population?

Answer using molecular data, Kingman's coalescent process, and ABC inference procedures!

Under neutrality, the frequencies of the various alleles evolve according to a random process which depends on the demography.


Coalescent tree and mutations. Under the neutrality hypothesis for the genetic markers under study, the mutations are independent of the genealogy, i.e. the genealogy depends only on the demographic processes. One therefore builds the genealogy according to the demographic parameters (e.g. N), then adds the mutations a posteriori along the various branches, from the MRCA to the leaves of the tree. One thus obtains polymorphism data under the demographic and mutational models under consideration.

The coalescent puts a random distribution on the genealogy of a sample of genes, backward in time, until the most recent common ancestor.

Then add the mutation process along the branches of the genealogy.


Page 7: Approximate Bayesian computation and machine learning (BigMC 2014)

Population genetics (cont’)


[Figure 2.2: Example genealogy of five individuals from a single closed population at equilibrium. The sampled individuals are the leaves of the dendrogram, the inter-coalescence times T2, ..., T5 are independent, and Tk follows an exponential distribution with parameter k(k − 1)/2.]

[Figure 2.3: Graphical representations of the three types of between-population events in a demographic scenario. There are two families of between-population events. The first family is simple: instantaneous events, namely divergence and admixture. (a) Two populations evolve and then merge in a divergence. (b) Three populations evolve in parallel in an admixture; each tube represents (one can imagine it carries inside) the genealogy of a population evolving independently of the others under a Kingman coalescent. The second family corresponds to migration. (c) This situation is slightly more complicated than the previous one because of gene flow (no more independence): a single evolutionary process governs the two populations taken together, and migration between Pop1 and Pop2 moves lineages from one population to the other, so coalescence and migration events compete.]

Between populations: three types of events, backward in time:
- a divergence is the fusion of two populations,
- an admixture is the split of a population into two parts,
- a migration moves some lineages from one population to another.

The goal is to discriminate between different population scenarios from a dataset of polymorphism (a DNA sample) yobs observed at the present time.

A complex scenario:


[Figure 2.1: Example of a complex evolutionary scenario composed of between-population events. This scenario involves four sampled populations Pop1, ..., Pop4 and two unobserved populations Pop5 and Pop6. The branches of the diagram are "tubes", and the demographic scenario constrains the genealogy to stay inside these tubes. Migration between Pop3 and Pop4 over the period [0, t3] is parameterized by the migration rates m and m0. The two admixture events are parameterized by the dates t1 and t3 and the respective admixture rates r and s. The three remaining events are divergences, at t2, t4 and t5 respectively. The event at t04 corresponds to a change of effective size in population Pop4.]


Page 8: Approximate Bayesian computation and machine learning (BigMC 2014)

ABC recap

Likelihood-free rejection sampling
Rubin (1984, The Annals of Statistics); Tavaré et al. (1997, Genetics); Pritchard et al. (1999, Mol. Biol. Evol.)

1) Set i = 1,
2) Generate θ′ from the prior distribution π(·),
3) Generate z′ from the likelihood f(·|θ′),
4) If ρ(η(z′), η(yobs)) ≤ ε, set (θi, zi) = (θ′, z′) and i = i + 1,
5) If i ≤ N, return to 2).

We keep the θ values such that the distance between the corresponding simulated dataset and the observed dataset is small enough.

Tuning parameters
- ε > 0: tolerance level,
- η(z): function that summarizes datasets,
- ρ(η, η′): distance between vectors of summary statistics,
- N: size of the output.

ABC target:
πε(θ, z | yobs) = π(θ, z | ρ(η(z), η(yobs)) ≤ ε) ∝ π(θ) f(z|θ) 1{ρ(η(z), η(yobs)) ≤ ε}
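The rejection sampler above is short enough to sketch in code. The following is a minimal illustration on a hypothetical toy model (iid normal observations with unknown mean θ, the sample mean as summary η, absolute difference as ρ) — none of this toy setup comes from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (not from the talk): y_i ~ Normal(theta, 1),
# prior theta ~ Normal(0, 10), summary eta = sample mean, rho = |.|
y_obs = rng.normal(2.0, 1.0, size=50)
eta_obs = y_obs.mean()

def abc_rejection(n_keep, eps):
    """Likelihood-free rejection sampling: keep theta' whenever the summary
    of a dataset simulated under theta' lands within eps of eta_obs."""
    kept = []
    while len(kept) < n_keep:                    # 5) loop until N draws kept
        theta = rng.normal(0.0, 10.0)            # 2) draw theta' from the prior
        z = rng.normal(theta, 1.0, size=50)      # 3) draw z' from the likelihood
        if abs(z.mean() - eta_obs) <= eps:       # 4) accept if close enough
            kept.append(theta)
    return np.array(kept)

sample = abc_rejection(n_keep=200, eps=0.1)
print(sample.mean())
```

With a wide prior and a small tolerance, the accepted θ's concentrate near the observed summary, as the ABC target above predicts.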


Page 9: Approximate Bayesian computation and machine learning (BigMC 2014)

Another point of view

Aim. Compute an approximation of π(θ | η(yobs)).

Likelihood-free rejection sampling
1) Set i = 1,
2) Generate θ′ from the prior distribution π(·),
3) Generate z from the likelihood f(·|θ′),
4) If ρ(η(z), η(yobs)) ≤ ε, set θi = θ′ and i = i + 1,
5) If i ≤ N, return to 2).

The general ABC algorithm

A) Generate a large set of (θ, z) from the Bayesian model, π(θ) f(z|θ).
B) Use nonparametric methods to infer the conditional distribution π(θ | η(yobs)).

The set of (θ, z) computed in A) is often called the reference table.

If B) is a kernel density estimate of the conditional density, then ε is the bandwidth. Likewise, if B) is a k-nearest neighbor estimate of π(θ | η(yobs)), then ε is the largest distance among the accepted particles. See Biau, Cérou, and Guyader (2014).
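The two-stage view (simulate a reference table once, then post-process) can be sketched with the k-nearest-neighbor estimate, where ε falls out as the largest accepted distance. The toy normal model below is again an illustrative assumption, not the talk's example:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy model: eta | theta ~ Normal(theta, 1/sqrt(50)), i.e. the
# sample mean of 50 iid Normal(theta, 1) draws; prior theta ~ Normal(0, 10).
eta_obs = 2.0

# Stage A) build the reference table from the Bayesian model pi(theta) f(z|theta)
n_ref = 100_000
thetas = rng.normal(0.0, 10.0, size=n_ref)
etas = rng.normal(thetas, 1.0 / np.sqrt(50))

# Stage B) k-nearest-neighbor estimate of pi(theta | eta_obs):
# keep the k particles whose summaries are closest to the observed one
k = 500
dist = np.abs(etas - eta_obs)
idx = np.argsort(dist)[:k]
posterior_sample = thetas[idx]
eps = dist[idx].max()   # here epsilon = the largest accepted distance
print(posterior_sample.mean(), eps)
```

Note that ε is a by-product of k and of the reference table, which is why it cannot be known before stage A).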


Page 10: Approximate Bayesian computation and machine learning (BigMC 2014)

Another point of view (cont’)

The general ABC algorithm

A) Generate a large set of (θ, z) from the Bayesian model, π(θ) f(z|θ).
B) Use nonparametric methods to infer the conditional distribution π(θ | η(yobs)).

As we can see, calibrating ε is a challenging question. It is either
- a bandwidth, or
- the number of neighbors.

Cannot be known before stage A)

Other samplers do not send θ everywhere (hence, less time consuming):

- ABC-MCMC (Marjoram et al., 2003)
- Sequential methods: ABC-PMC (Beaumont et al., 2009), ABC-SMC (Del Moral et al., 2009)
- Adaptive sequential methods ...

Drawbacks.
- How to calibrate ε?
- With an informative prior, the amount of time such algorithms can save is not large.
⇒ I will not talk about those clever algorithms.


Page 11: Approximate Bayesian computation and machine learning (BigMC 2014)

New light on Beaumont’s postprocessing

Beaumont’s ABC algorithm

A) Generate a large set of (θ, z) from the Bayesian model, π(θ) f(z|θ).
B) Keep the particles (θ, z) such that ρ(η(yobs), η(z)) ≤ ε.
C) Fit (in β and θ0) a weighted, local linear model explaining θ by z on the kept particles:

θ = β(η(z) − η(yobs)) + N(θ0, Σ²)

Replace the θ's by the residuals, centered around θ0.

How to explain the algorithm from a learning point of view?

Assume we want to approximate the expected value of the posterior distribution π(θ|η(yobs)), i.e.,

E(θ | η(z) = η(yobs))

This is a regression problem. We can use some weighted, local linear model.

Beaumont's postprocessing method (stage C)) is a local linear model. See also Blum and François (2010).
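Stage C) amounts to a weighted least-squares fit followed by a residual adjustment. Here is a minimal sketch on a toy normal model with a one-dimensional summary and Epanechnikov kernel weights — all illustrative assumptions, not the DIYABC implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setting (an assumption): theta ~ Normal(0, 10),
# one-dimensional summary eta | theta ~ Normal(theta, 0.14).
eta_obs = 2.0
thetas = rng.normal(0.0, 10.0, size=100_000)
etas = rng.normal(thetas, 0.14)

# B) keep the particles whose summary is within eps of the observed one
eps = 0.5
keep = np.abs(etas - eta_obs) <= eps
th, et = thetas[keep], etas[keep]

# C) weighted local-linear fit theta = theta0 + beta * (eta - eta_obs) + noise,
# with Epanechnikov kernel weights, via sqrt-weighted least squares
u = (et - eta_obs) / eps
w = 1.0 - u**2
X = np.column_stack([np.ones_like(et), et - eta_obs])
coef, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * th, rcond=None)
theta0, beta = coef

# replace each kept theta by its regression-adjusted value (residual + theta0)
theta_adj = th - beta * (et - eta_obs)
print(theta0, theta_adj.mean())
```

The adjusted sample concentrates around the fitted value at η(z) = η(yobs), which is the point of the post-processing: it corrects for the mismatch between accepted summaries and the observed one.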


Page 12: Approximate Bayesian computation and machine learning (BigMC 2014)

Other issues I will skip

How to choose the set of summary statistics?
- Joyce and Marjoram (2008, SAGMB)
- Nunes and Balding (2010, SAGMB)
- Fearnhead and Prangle (2012, JRSS B)
- Ratmann et al. (2012, PLOS Comput. Biol.)
- Blum et al. (2013, Statistical Science)

In the context of ABC model choice?
- Barnes et al. (2012, Stat. and Comp.)
- Fearnhead et al. (2014, SAGMB)
- Estoup, Lombaert, Marin, Guillemaud, Pudlo, Robert, Cornuet (2012, Molecular Ecology): Estimation of demo-genetic model probabilities with Approximate Bayesian Computation using linear discriminant analysis on summary statistics

EP-ABC Barthelme & Chopin (2013, JASA), . . .


Page 13: Approximate Bayesian computation and machine learning (BigMC 2014)

Table of Contents

1 Approximate Bayesian computation

2 ABC model choice

3 Population genetics example on SNP data

4 Conclusion


Page 14: Approximate Bayesian computation and machine learning (BigMC 2014)

Model choice in population genetics

Included in the software DIYABC: Cornuet J-M, Pudlo P, Veyssier J, Dehne-Garcia A, Gautier M, Leblois R, Marin J-M, Estoup A (2014, Bioinformatics).

Example. Do pygmies share a common ancestral population?



Page 15: Approximate Bayesian computation and machine learning (BigMC 2014)

Model choice in population genetics

Included in the software DIYABC: Cornuet J-M, Pudlo P, Veyssier J, Dehne-Garcia A, Gautier M, Leblois R, Marin J-M, Estoup A (2014, Bioinformatics).

Example. Do pygmies share a common ancestral population?


Different possible scenarios; scenario choice by ABC.

Verdu et al. 2009


Page 16: Approximate Bayesian computation and machine learning (BigMC 2014)

Another example: Worldwide routes of invasion of Harmonia axyridis

For each outbreak, the arrow indicates the most likely invasion pathway and the associated posterior probability value (P), with 95% credible intervals in brackets.

The harlequin ladybird was introduced to fight against another deleterious insect, the hop aphid (since 1916 in North America, since 1982 in Europe).


Page 17: Approximate Bayesian computation and machine learning (BigMC 2014)

Harmonia axyridis cont’

From a scenario on the map to a tractable model

An evolutionary scenario

The model

Kingman's coalescent is constrained to live in the above "pipes".


Page 18: Approximate Bayesian computation and machine learning (BigMC 2014)

Population genetics (cont’)

For one population, the joint likelihood of the genealogy and of the polymorphism observed at the present time is available.

However, the genealogical tree is a nuisance parameter, and calculation of the likelihood for the parameters involves integrating out the unknown tree. That is not possible.

This modelling clearly differs from the phylogenetic perspective, where the tree is the parameter of interest.

Choosing a population scenario is amodel choice problem.

We consider a collection of M models: for m ∈ {1, ..., M}, fm(yobs|θm) and πm(θm).
Prior on the model index: π(1) = P(M = 1), ..., π(M) = P(M = M).

The Bayesian model choice is driven by

P(M = m|yobs) ∝ π(m) ∫Θm fm(yobs|θm) πm(θm) dθm


Page 19: Approximate Bayesian computation and machine learning (BigMC 2014)

ABC model choice

ABC model choice

A) Generate a large set of (m, θ, z) from the Bayesian model, π(m) πm(θ) fm(z|θ).
B) Keep the particles (m, θ, z) such that ρ(η(yobs), η(z)) ≤ ε.
C) For each m, return pm = the proportion of m among the kept particles.

If ε is tuned so that the number of kept particles is k, then pm is a k-nearest neighbor estimate of

E(1{M = m} | η(yobs))

Actually, approximating the posterior probabilities of each model is a regression problem where
- the response is 1{M = m},
- the covariables are the summary statistics η(z),
- the loss is L2 (conditional expectation).

The preferred method to approximate the posterior probabilities in DIYABC is a local polytomous logistic regression. It is less sensitive to k, but offers no clear gain in quadratic loss.
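The proportion estimate in C) can be sketched in a few lines. The two candidate models below (normals with different fixed scales, so no free θ within each model, and the sample variance as summary) are purely illustrative simplifications, not the DIYABC setting:

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs = 100

# Two hypothetical candidate models, equal prior weight pi(m) = 1/2:
#   m = 0: y_i ~ Normal(0, sigma = 1)     m = 1: y_i ~ Normal(0, sigma = 2)
# Summary statistic: the sample variance.
def simulate(m):
    return rng.normal(0.0, 1.0 + m, size=n_obs)

def eta(y):
    return y.var()

y_obs = rng.normal(0.0, 2.0, size=n_obs)   # pretend the data come from m = 1
eta_obs = eta(y_obs)

# A) reference table of (m, eta(z)) from pi(m) pi_m(theta) f_m(z|theta)
n_ref = 50_000
ms = rng.integers(0, 2, size=n_ref)
summaries = np.array([eta(simulate(m)) for m in ms])

# B)-C) the estimate of P(M = m | eta(y_obs)) is the proportion of m
# among the k particles whose summaries are nearest to eta_obs
k = 1000
idx = np.argsort(np.abs(summaries - eta_obs))[:k]
p_hat = np.bincount(ms[idx], minlength=2) / k
print(p_hat)
```

Here the summary cleanly separates the two models, so the k-NN proportions put almost all the mass on the data-generating model; with weakly informative summaries the proportions are far noisier, which motivates the regression view above.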


Page 20: Approximate Bayesian computation and machine learning (BigMC 2014)

Machine learning to analyse machine simulated data

ABC model choice

A) Generate a large set of (m, θ, z) from the Bayesian model, π(m) πm(θ) fm(z|θ).
B) Use a machine learning method to estimate anything about m | η(yobs).

In this machine learning perspective:
- the (iid) "data set" is the reference table simulated during stage A),
- the actual observed yobs becomes a new data point.

Note that:
- predicting m is a classification problem ⇐⇒ selecting the best model with a maximum a posteriori rule,
- computing π(m|η(yobs)) is a regression problem ⇐⇒ quantifying the confidence in each model.

It is well known that classification is much simpler than regression (because of the dimension of the object we try to learn).


Page 21: Approximate Bayesian computation and machine learning (BigMC 2014)

Machine learning (cont’)

ABC model choice

A) Generate a large set of (m, θ, z) from the Bayesian model, π(m) πm(θ) fm(z|θ).
B) Use a machine learning method to estimate anything about m | η(yobs).

It is well known that classification is much simpler than regression (because of the dimension of the object we try to learn).

Hence we renounce approximatingthe posterior probabilities.

An error in the classification problem ⇒ an error in the approximation of π(m | η(yobs)) which swaps the order of the posterior probabilities.

A revealing example: MA(1) vs MA(2).
Time series (yt) of length T = 100:

MA(1): yt = εt − θ1 εt−1
MA(2): yt = εt − θ1 εt−1 − θ2 εt−2

where the εt are iid Gaussian.

Prior from Marin et al. (2011, Stat & Comp).

Auto-covariances of order 1, 2, ... as summary statistics.
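To make the running example concrete, one can simulate MA(1) and MA(2) series of length T = 100 and compute the first auto-covariances as summary statistics. The parameter values below are illustrative, not draws from the priors of Marin et al. (2011):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 100

def simulate_ma(theta):
    """Simulate y_t = eps_t - theta_1 eps_{t-1} - ... - theta_q eps_{t-q}."""
    q = len(theta)
    eps = rng.normal(size=T + q)
    y = eps[q:].copy()
    for j, th in enumerate(theta, start=1):
        y -= th * eps[q - j : T + q - j]
    return y

def autocov(y, max_lag=3):
    """First auto-covariances, used as summary statistics."""
    y = y - y.mean()
    return np.array([np.dot(y[:-k], y[k:]) / len(y) for k in range(1, max_lag + 1)])

y1 = simulate_ma([0.6])        # MA(1) with theta1 = 0.6 (illustrative value)
y2 = simulate_ma([0.6, 0.3])   # MA(2) with theta = (0.6, 0.3)
print(autocov(y1), autocov(y2))
```

For MA(1) the lag-2 auto-covariance is zero in expectation while for MA(2) it is not, which is exactly what makes these summaries informative for the model choice problem.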


Page 22: Approximate Bayesian computation and machine learning (BigMC 2014)

Another issue I will skip

The information we lose when using non-sufficient summary statistics.

Fundamental discrepancy between the genuine Bayes factors/posterior probabilities and the Bayes factors based on summary statistics. See
- Didelot et al. (2011, Bayesian Analysis)
- Robert, Cornuet, Marin and Pillai (2011, PNAS)
- Marin, Pillai, Robert, Rousseau (2014, JRSS B)
- ...

Running example: MA(1) vs. MA(2)

+

+

+++

+

+

+

+

+

+

++

+

+

+

+++

++

+

++

+

+

+

+

+

++

+ +

+

+

+++

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

++

+

+ ++

+

+

+

+

+

+

++

+

+

+

+

++++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+

+++ ++

+

+

+

+

+

++

+

+

+

+

+

++

+

+

+

++ +

+

++

+

+

+

+++

+

+

+

++

+

+

++

++

+ +

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+

++

++

+

+

+

+

+

+

+

+

++

++

+

+

+

++

+

++

+

++

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

++

+++

+

+

++ +

+

+++

++

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+

++

+++

++

+

+

+

+

+

+

+ +

+

+++

+

++

+

+

+

+++

+

+

+

++

+ +

+

+

+

+ +++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++++

++

++

+

++

++

++

+ +

++ +

++

+

++

+

+

+

+

+

+

++

+

+

+

+

++

+ +

+

++

++

+

+

+

+

+

+

+

+

+

++

+

+

+

++

++

+

+

++

++

+

+

+

+

+

+

+

+

+

+++

++

+ ++

+

++

+

+

+

+

+

+

+

++

+

+

++

++

++

+

+

++

+

+

+ ++

+++

+

+

+

++

+

+

+

++

+

+++

+

+

+

++

+ +

+

+

++++

+++

+

+

+

+

+

++

+

++

+

+

+

+

++++

+

+

+

+

+

++

+

+

++

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

++ +

+

+

+

++

+

+

+

+

+

+

+

+

++

+

+

+

++

+

+

+

+

+++

+

++

+

+

++

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+ +

+

++

+

+

+

+++

+

+

+

+

++

++

++

+

+

+

+

+

+++

+

+

+

+

+ +

+

+

+

+

++

+

++

+

+

+

+

++

+

+

+

++

+

+

+

+

+

++

+

+

+++

+

+

+

+

+

+

+

++

+

+

+

+

+

+ ++

+

+

+

+

+++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+++

++

+

+

+

+

+

++ +

+

+

++++

+++

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+++

+

+

+

+

+

+

+

+

+

++

+ +

+

+ ++

+++

+

+

+ +

+

+

+

+

+

+

+

+

+

++

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+ +

+

++

+

+

+

+

+ +++

+

++

+

++

+

+

+

++

+

+

+

++

+

+

+

+

+

++

++

+

+

+

+

+ +

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+ +

+

++

+

+

+

++

+

++

++

+

+

+

+

+

+

+

+

+

++

+++

++

+

+

+

+

+

++++

+

+

+

++

+

+

+

+

++

+

+++

+

++

+

++

++

+

+

+

+++

+

+

+

+

+++

+

+

++

++

+

++

+

+

+

++

+

++

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+

+++

+

++

+

+

++

+

+

+

+ +

+

++

+

+

+ +++ +++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

++

+

+

++

+ +

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+

++

+

+

+

++

+

+

+

+

++

++

++

++

++

+

+

+

+

++

+

+

+

++

+

+

++

++

+

+

+

+

++

+

++

+

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+++

+

++++

+

+

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+ ++

+ +

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

++

+

+

+++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+

+++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+ ++

+

++

+

+

+

+

++

++

+

+

+

++

+

+

+

+

+

+

+

+

+

+

++

+

+++

++

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

+ ++

+

+

+

+

+

++

+

+

+

+++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+++

+ +

+

+

+

+

+

+

+

+

++

+

+

+

+

++

++

+

+ ++

+

++

+

++

+

+

+

+

++

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

++

+

+

+ +

+

+

+

++

+

++

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+++

+

++

+++

+

++

+

+

+

++

+

+

+

++

++

++

+

++

+

+

+

+

+

++ + +

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

++

+

+ ++

+ +

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

++

+

+

++

+

+

+

+

+

+

++++

++

+

+

+

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

+

++

++

++

++

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+++

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+++

+

++

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++ +

+

+

++

+

+

+

+

+

+++

+

+++

+

+

+

+

+

++ ++

+

++

+

+

+

+ +

+

+

+

+

+ +

+

++

+

+

+

+

+

+

++

+

+

++

+

+

++

+

++

+

+

+++

+

+

+

+

+

++

+

+ ++

++

+++

+

+

+

+

+

+

+

+

++

+

+

+ ++

+

++

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+ ++

++

+

+ ++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

++

+++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+ +

++

+

++

+

+

+

++

+

+

+

+++

+

+ ++++

+

++

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+

+++

+

+

++

+

+

++++

+

+

+

+

+

++

+

++

+

+

+

++

+

++

+

+

+

++

+

+ +

+

+++

+

+

+

+

+

+

+

+

+

+

++

+

+

++++

+

+

+

+

+

+

+

+

++

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

++++

+

+

+

+

+

++

+

+

++ +

+

+

+++

+

+

+

++

+

+

+

+

+

++

+

++

+

+

+

+

+++

+

+

+

+

++

+

+

+++

+

+

+ ++

+

+

+

+

+

++

+ +

+

+

+

+++

+

++ +

+

+

+

+

++

+

+

+

++

+

+

+

+

+

++

++++

+++

++

+

+

+

+

+

+

+

++ +

+

+

+

+

+

++++

++

+

+

+

+

+

+

++

+

++

+ +

+

+

++

+

+

+

+

+

+

+

+

++

+

+

+

+

+

++

+++

+

+

+

++

+ +

+

++

+

+

+

+

++

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+

++

+

+++

+

++ +

+

+

+

+

++ +

+ +

+

+

+

+

+

+

+

+

+

++

+

+

++

+

+

++

+

+

++

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

++

+

+++

+

+

+

+

++

+

+

++

+

+

++++

+

+

+

+

+

+

+++

+

+

+

+

++

++

++

++

+

+ ++

+

++

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

++++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+ + +

+

+

+

+

++ +

+

++

+

+

+

+ +

+

+

+

+

++

+

+

+

+

+

++

+

+

+

++

+

+

+

+

+

+

+

++

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+++

+

+

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+

++

+

++

+

+

+

+

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

++

+

+

+

+

+

++

++

+

++

+

+

++

++

++

+

+

++

+

+ ++ +

+

+

+

++

+

+

+

+

++

+++

+

+

+

+

+

+

+

+

+

+

+ +

+

++

+

++ +

+

++

+

+

+

+ ++

+++

+

+

+

+

++

+

+

+

++

+

+

+

++

++++

+

+

+

+

++

+

+

+

+

+

+ ++

+

++

+

++

+

+

+

+

+

+

+++

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

++

+

+

+

+ +

+

+

++

++

+

+

+

+

+

+

+

+

++

++ + +

++ +

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+++++

+

+

+

+

+

+

+

+

+

+ +

+

+

+ +

++

+

+

+

+

++

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+

+

++

+

+ +

+

+

+

+ +

+

+

+

+

+

+

++

++

+

+

+

++

+

+

+

+

++

++

+ ++++

++

+

+

++

++

+

+

++

+ ++

+

+ ++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

+

+

+ +

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+++

+

++

+

+

+

++

+

+ ++

+

+ ++ +

++

+

+

+

+

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

++++

+

+

+

+

+++

+

+

+

+

++

+

+

+

++

+

+

+

+ +

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

++

+

++

+

++

++ +

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+ + +

+

+

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+ +

+

+

+

+

++

+

+

+

+

++

+

+

++

++++

+

+

++

+

+

+

++

+

++

+

++

+

+

+

+

+

+

++

+

+

+

++

+

+

++++ +

+

+

+

+

+

++ +

+

+

+

+ +

+

++

+

+

+

+

+

+

++

+

+

+

++++

+

+

++

+

+ ++

++

+

+++

+

+

+

+++

++

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+++

+

+++ + ++

+

++

++

++

+

+

+

+++++

+

+

+

+

+

+

+

+

++

++ +++

+

+

+

++

+

+

+

+

+

+

+

+

++

++

+

+++

+

+ +

+

+

+

+

+

+

+++

+

+

+

+

+

++

++

++

+

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

++

++

++

++ +

+

+

+

+

++

+

+

+

+

+

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

++ ++

+

+

+

+

++

+

++

+

++

+

+

+

+

+

++

+

+

+

+

+++

+

++

+

+++

+

+

+

+

+

+

+

++

+

+

+

+

+

++

+

+

+++

+

+

+

++

+

+

+

++

+

+

+

+

+ +

+

+

+

+

++

+

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

++

++

++

+

+

+++

+

++

+

+

+

+

+

+

+++

+

+

+

++

+

+

++

+

+

++

+

+++

+

+

++++

+

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+ +

+

+ +

+

+

+

+

++

+

+

+ +

+

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+

++

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

+

+

++

++

+

+

+

+

+

++++

+

+

+

+

+

+

++

+

++++++

+

+

+

+

+

++

+

+

+

+

+

+

+

+ + +

+

++

+

+

+

+

+++

+ +

+

+

+

+

+

+

++

+

+ +

+

+

+

+

+

++ +

+

+

++

+

+

++

+

++

+ ++

+

+

+

+

+

+

+

+

+++

+

+

+

+ +

+

+

+

+ +++++

+

+

+

+++

+

+

+

+

++

+

+

+

+

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+ ++

+

+

+

+++

++

+

+

+

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

++

+ +

+

+++

+

+

++

++

++

+

+++

+

++

+

+ +

++

+

+

+

+

+

+

+

+++

+

++

+

++

+

+

+

++

+ +

+

+

+

+

+

++

+

+

++

+

+

+

+

+

++

+++

++

++

+

+

+

+

+

+ +

+

+

+

+

+

+

+

++

++

+

+

+

+++

+

+

+

+

++

+

+

++

++

+

+

+

+

+

+

+

+

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+ +

+

+

+

+

+

+

+

+

+

++

+

+

++

+

+

++

+

+

+

+

+

+

++

++

+

+

+

+

+

+

+

++

+

+++

+

+

+

+

+

+

++

++

+

+

+

+

+

++++

+

+

+

++

++

++ + +

+

+

+

+

+

+ ++

+

+

+

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+ +

+

+

+

+

++ +

+

+

+

+ ++

++

+

+

+

++

+

+

++

++

++

+

+

+

+

+

+

++

+

++ +

+

+

+

+

+

+

+ +

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

++ +

+

+

++

+

+

+

+

+++

++

+

+

+

++

+

+

+

++

+

++

+

+

+

+

+

+

+++ +

++

++

+

+

+

+++

+

+

+

+

+

++

+

+++

++

++

+

+

++

++

+

+

+

+

++

+

+

++

+

+

+

+

+

++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

++

+

++

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

++

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

++

++

+

+

+

+

++

+ +

+

+++

++

+

+

+

+

+

+

+

+

+

+++

+

++

++

+

+

+ +++

+

++

+

+

+

++

+

+

++++

+

+

+

+

++

+

+

+

+

+

+

+

+

++

+++

++

++

++

++

+

+

+

+

+

+

++

+

+

++

+

+

+

+++

++

++

+

+

+

+

+

+

+++

+

+

++ +

++

+ +

+

+

+

+

+

+

++

+

+

+

++

+

+

+

+

+

+

++

++

+

+++

+

+

+

++

++

++

+

+

++

+

+

+

++

++

+

+

+

+

++

+

+

+

+

++

+


[Scatter plot: p(MA(2) | η(yobs)) vs. p(MA(2) | yobs); blue = simulated from MA(1), orange = simulated from MA(2)]

Pierre Pudlo (UM2) ABC learning BigMC 19/06/2014 22 / 40

Page 23: Approximate Bayesian computation and machine learning (BigMC 2014)

MA(1) vs MA(2)

The reference table of MA(1) vs MA(2).
x-axis: auto-covariance of order 1; y-axis: auto-covariance of order 2.

blue = simulated from MA(1); orange = simulated from MA(2)

Which classification method?
- linear discriminant analysis,
- local logistic regression,
- k-nearest neighbors,
- ...


+

+

+

++

++

++

+

+ +

++

++

+

+

+

++

++

+

+

+

+++

+

+

+

+

+

+

++

+++ +

+

+

+

++

++

+

+

+

+

+

+

+

++

++

++

+

++

+

+

+

+

+

+

+

++

+

+

+

++

+ +

+

+

++

+ +

+

+

+

+

+++

+

+++

+

+

+

+

++

+

++

+

++ ++

+

+

++

+

++

+

+

+

+

+

+

+

++

+

+

++

+ +

+

+

+

+

++

+

+

+

+

+

++

++

+

+

+

+

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+++

++

+

+

+

+

+

+

+

+

+

+

+ + +

++

++

+

+

+

+

+

+

++

++

+

+

+++

+

+

+

+

+

+

+

+

+

+

++

+

+++

+

++

+

+

+

++

+

+

+

+

++

++

+

+

+

++

++

+

+

+

++

++

+

++

+

+

+

+

+

+

+

+

+

+ +

+

+

+

+

+++ +

+

++

+

+ ++++ +

+

+

++

++

+

+

+

++

+

++

+

+

++

+

+

++

+

+

+

+

+

+

++

+

+

+

+

+

+++

+

+++

++

+ ++

+

+

+

+

+++ +

+

+

+

+

+

+

+

++

+

+

+

+

++

+++ ++

+

+

+

+

+

++

+

+

++++

+

+

+

+

+

++

+++

+

+

++

++

+

+

+

+

+

++

+

++

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+++

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+++ +

+

+

+

+

+

++

+

+

+

+

+++

+

+

++ +

+

+

++

+

+

+

+

+

++ + ++

+

++++ +

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+ +

+

+

+

+ ++

++ +

+

+

++

+

+ +

+

+ +++

+

+

+

+

+

+

+

++

+

+

+

+

+

+++

+

+

+

+

+

++

+

++

+

++

++

+

+ +

+ ++

+

+

++

+

++

+

++

+

+

++

+

+

+

++

+ +

++

+

+

++

+

++

+

+

+

+

++

++

+

+

++ +

++

++

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++ +

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

+ +

+++

+

+ ++

+

+

+

+

+

+++

+

+

+

++

+

++

+

+

+

+

+

+

+

+

++

+

++

+++

+

+

+

+++ +

+

+

+

+++ +

+

+

+

+

+

+

+

+++

++

++

+

+

++

+

+

+

+

+

++

+

+

+

+

++ +

+ ++

+

+

+ +

+

+ ++

++

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+ ++

++

+

+

++

+++

+

+ + + +

+

++ ++

+

+

+

+

+

+

+

+

+

+++

+

+

+

+

+++

+

+

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+++

+

+

+

+

+

+

+

++

+

+++

++

+ +

+

+

+

+

+

+ ++

+

+

+

+

+ +

+

+

++

+

+

+

++

+

+

+

+

+

+

++

+

++

+

+

+

++

++

+

+

+

+

+

+++

+

+

+

+

+

+

++

+

++

+

+

+

+

++

+

++

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

+

++

+

+

+

++

+

++

++

+

+

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

++

++

+

+

+

+ ++

+

+

+

+

+

+

+++

++

++

+

+

+

+

+

+

+

++

+

+

+

+

++

+

+

+

+ +

+

+

+

++

++++

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+

+

+ +

+

+

+

+

+

++

+

+

+

+

+

+

+

+

+

+

++

+

++

+

+

+

++

+

+ + +

+

+

+

++

+

+

+

+++

+

+

+

++

+

+

+

++

+

+

++

+

+

+

+

++

+

+

+

+++

+++ +

++

+

+ +

++

+++

+

+

+

+

+

+

+

+

+

+

+

+

+

++

++

+

++

+++

++

++

+

+ +

+

+

+

+

+

+

+

++

++

+

+

+

+

+

++

+

+

++ + ++ +

+

+

+

++

+

+++

+

+ +

+

+

++

++

+

+

+

+

+

+

+

+

+

+

+

−6 −4 −2 0 2 4 6

−5

05

10

lag−1 autocovariance

lag−

2 au

toco

varia

nce
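The reference table behind such a plot is cheap to simulate. Below is a minimal sketch, assuming a uniform prior on the model index and uniform priors on the MA coefficients (the talk's actual priors are not given in this transcript); the summaries are the lag-1 and lag-2 autocovariances used as axes above.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ma(thetas, n=100):
    """Simulate n observations from an MA(q) model with coefficients thetas."""
    q = len(thetas)
    eps = rng.standard_normal(n + q)
    y = eps[q:].copy()
    for i, th in enumerate(thetas, start=1):
        y += th * eps[q - i:n + q - i]
    return y

def autocov(y, lag):
    """Empirical lag-k autocovariance (the summary statistics of the talk)."""
    yc = y - y.mean()
    return np.dot(yc[lag:], yc[:-lag]) / len(y)

def reference_table(n_sim=1000):
    """Draw (model, summaries) pairs: model 1 = MA(1), model 2 = MA(2)."""
    models, summaries = [], []
    for _ in range(n_sim):
        m = rng.integers(1, 3)                  # uniform prior on {1, 2}
        theta = rng.uniform(-1.0, 1.0, size=m)  # assumed prior on coefficients
        y = simulate_ma(theta)
        models.append(m)
        summaries.append([autocov(y, 1), autocov(y, 2)])
    return np.array(models), np.array(summaries)

models, summaries = reference_table()
```

Each row of the table is then a (model index, summary vector) pair, and the scatter plot above is the cloud of summary vectors coloured by model.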


Page 24: Approximate Bayesian computation and machine learning (BigMC 2014)

MA(1) vs MA(2): Prior error rates

Classification technique                                    2 sum stat   7 sum stat
LDA                                                           27.43        26.57
logistic regression                                           28.34        27.40
"naïve" Bayes (with Gaussian marginals)                       19.52        24.40
"naïve" Bayes (with non-parametric marginal estimates)        18.25        21.92
k-NN with k = 100 neighbors                                   17.23        18.37
random forest                                                 17.04        16.15
k-NN with k = 50 neighbors                                    16.97        17.35
local logistic regression with k = 1000 neighbors             16.82        15.44
kernel discriminant analysis (KDA)                            16.95        NA
decision based on π(m|yobs), with no summary                  12.36

2 sum stat: lag-1 and lag-2 autocovariances
7 sum stat: lag-1, ..., lag-7 autocovariances

The error rate with a 0-1 loss is estimated via cross-validation on the reference table.
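That cross-validation estimate can be sketched as follows; the 1-nearest-neighbour classifier at the end is only a hypothetical stand-in for the methods in the table above.

```python
import numpy as np

def prior_error_rate(classify, X, y, folds=5):
    """0-1 prior error rate of a classifier, estimated by k-fold
    cross-validation on the reference table (X = summaries, y = model index)."""
    n = len(y)
    idx = np.random.default_rng(4).permutation(n)
    errors = []
    for f in range(folds):
        test = idx[f::folds]                    # every folds-th point held out
        train = np.setdiff1d(idx, test)
        pred = classify(X[train], y[train], X[test])
        errors.append(np.mean(pred != y[test]))
    return float(np.mean(errors))

def one_nn(X_train, y_train, X_test):
    """Toy stand-in classifier: 1-nearest-neighbour on the summaries."""
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    return y_train[d.argmin(axis=1)]
```

Any of the classifiers in the table can be plugged in through the `classify` argument; the held-out folds play the role of the test reference table.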


Page 25: Approximate Bayesian computation and machine learning (BigMC 2014)

A few words on Random Forest (RF)

Breiman and Cutler suggested to create a "forest" of (CART) decision trees, fitted on randomised subsets of the training database. The many classifiers from the forest are aggregated using a majority vote.

Assume the reference table contains
- n data points (= number of simulations in ABC)
- p covariates

The overall performance of RF increases when the decision trees are more independent.

To introduce some independence,
- each node of a decision tree in the forest is optimized against a subset of the covariates of size √p
- the training set of each tree is randomized

Subsampling or bootstrap? Breiman and Cutler proposed to randomize the training set using a bootstrap sample from the training database. When n is large, random subsampling is a better idea; see the consistency result for RF of Erwan Scornet et al. (arXiv, 2014).
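The two randomisation devices above can be illustrated with a hand-rolled miniature forest; this is only a sketch (depth-1 stumps instead of full CART trees, and random subsampling rather than bootstrap), not the implementation used in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_stump(X, y):
    """Best single-split (depth-1) classifier by misclassification error."""
    best = (None, None, None, None, np.inf)  # feature, threshold, left, right, err
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            mask = X[:, j] <= t
            if mask.all() or not mask.any():
                continue
            left = np.bincount(y[mask]).argmax()    # majority label on each side
            right = np.bincount(y[~mask]).argmax()
            err = np.mean(np.where(mask, left, right) != y)
            if err < best[4]:
                best = (j, t, left, right, err)
    return best[:4]

def fit_forest(X, y, n_trees=50, subsample=0.5):
    """Forest of stumps: each fitted on a random subsample of the rows,
    each allowed only a random subset of sqrt(p) covariates."""
    n, p = X.shape
    k = max(1, int(np.sqrt(p)))
    trees = []
    for _ in range(n_trees):
        rows = rng.choice(n, size=int(subsample * n), replace=False)
        cols = rng.choice(p, size=k, replace=False)
        j, t, left, right = fit_stump(X[np.ix_(rows, cols)], y[rows])
        trees.append((cols[j], t, left, right))  # map back to global feature index
    return trees

def predict(trees, X):
    """Aggregate the trees by majority vote."""
    votes = np.array([[l if x[j] <= t else r for (j, t, l, r) in trees] for x in X])
    return np.array([np.bincount(v).argmax() for v in votes])
```

In an ABC reference table, the rows are the simulations and the p covariates are the summary statistics.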


Page 26: Approximate Bayesian computation and machine learning (BigMC 2014)

Where is Bayes?
(when we renounce approximating the posterior probabilities)

• in the prior distribution used to generate the reference table
• in the 0-1 loss: the MAP is the best classifier. Hence a classifier with a small prior error rate ≈ the MAP of the true posterior probabilities π(m | η(yobs))

A distinct advantage of posterior probabilities: they evaluate the confidence in the model choice.

The prior error rate gives a first clue, but it does not depend on yobs!

⇒ how to evaluate the error on datasets that resemble yobs?


Page 27: Approximate Bayesian computation and machine learning (BigMC 2014)

Posterior predictive expected loss

We suggest replacing an unstable approximation of π(m | η(yobs)) by an average of the selection error across all models given the observed data η(yobs), i.e.,

P( M(z) ≠ m | yobs ),

where
- the pair (m, z) is generated from the predictive ∫ f(z | θm) πm(θm | yobs) dθm,
- M(z) denotes the predictor of the machine learning method.

Datasets that resemble yobs ⇐⇒ datasets simulated from the predictive.

The above predictive loss can be computed with ABC methods.


Page 28: Approximate Bayesian computation and machine learning (BigMC 2014)

Computing the posterior predictive expected loss

Posterior predictive error rate:

P( M(z) ≠ m | yobs ),

where
- the pair (m, z) is generated from the predictive ∫ f(z | θm) πm(θm | yobs) dθm,
- M(z) denotes the predictor of the machine learning method.

ABC predictive error rate:
1) Simulate the reference table according to the Bayesian model.
2) Fix a machine learning predictor M (tuned to minimize the prior error rate via cross-validation or a test reference table).
3) Get an approximation of the posterior in (m, θm) by selecting the k = 100 nearest neighbors in terms of ρ(η(yobs), η(z)).
4) For each kept pair (m, θm), simulate M = 10 new datasets.
5) Compute and return the average error of the predictor M on the k × M new datasets.
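Steps 3) to 5) can be sketched as follows, given a reference table and an already fitted predictor; `simulate`, `predictor`, and the Euclidean choice for ρ are placeholders for the problem at hand.

```python
import numpy as np

# rng is used by the user-supplied simulator in the examples below
rng = np.random.default_rng(2)

def abc_predictive_error(predictor, simulate, eta_obs, ref_models, ref_params,
                         ref_summaries, k=100, M=10):
    """Keep the k nearest (m, theta) pairs of the reference table,
    simulate M fresh datasets from each, and average the predictor's 0-1 loss."""
    # 3) ABC approximation of the posterior on (m, theta):
    #    k nearest neighbours of eta_obs in summary-statistic space
    dist = np.linalg.norm(ref_summaries - eta_obs, axis=1)
    keep = np.argsort(dist)[:k]
    # 4-5) simulate M new datasets per kept pair and average the error
    errors = []
    for i in keep:
        m, theta = ref_models[i], ref_params[i]
        for _ in range(M):
            z = simulate(m, theta)
            errors.append(predictor(z) != m)
    return float(np.mean(errors))
```

A value close to 0 means the predictor is reliable on datasets that resemble yobs, matching the red/green triangle cases later in the talk.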


Page 29: Approximate Bayesian computation and machine learning (BigMC 2014)

Is this posterior predictive error rate relevant?
Machine learning method: random forest

[Scatter plot in the (lag-1 autocorrelation, lag-2 autocorrelation) plane; point colour gives the predictive error (0.1 to 0.9) and point shape indicates whether the dataset was simulated from MA(1) or MA(2).]


Page 30: Approximate Bayesian computation and machine learning (BigMC 2014)

Is this posterior predictive error rate relevant? (cont')
Machine learning method: k-nearest neighbors

[Scatter plot in the (lag-1 autocorrelation, lag-2 autocorrelation) plane; point colour gives the predictive error (0.1 to 0.9) and point shape indicates whether the dataset was simulated from MA(1) or MA(2).]


Page 31: Approximate Bayesian computation and machine learning (BigMC 2014)

Is this posterior predictive error rate relevant? (cont')

[Scatter plot comparing the two methods; x-axis: posterior predictive expected loss of k-NN, y-axis: posterior predictive expected loss of the random forest, both ranging from 0.0 to 1.0.]


Page 32: Approximate Bayesian computation and machine learning (BigMC 2014)

Table of Contents

1 Approximate Bayesian computation

2 ABC model choice

3 Population genetics example on SNP data

4 Conclusion


Page 33: Approximate Bayesian computation and machine learning (BigMC 2014)

The model choice problem

Asking the question: does population 3 come from population 1, from population 2, or from an admixture of both populations?

Figure: the three population scenarios: left, model 1; center, model 2; right, model 3.

Dataset: 1,000 autosomal diploid SNP loci for 75 individuals (25 per population) ⇐⇒ 1,000 iid replicates of the coalescent with a random mutation on each genealogy.


Page 34: Approximate Bayesian computation and machine learning (BigMC 2014)

Using good summary statistics reduces the prior error rate

[Plot of the ABC prior error rate against the number of neighbors k; x-axis: k from 0 to 2,000, y-axis: prior error rate from 0.22 to 0.30. Black line: using the 49 summary statistics provided by DIYABC; red line: using the two LDA axes instead of the 49 initial summary statistics.]

(Prior error rates are computed on a test sample of size 10,000.)
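Projecting the many summaries onto the LDA axes before running k-NN can be sketched with plain NumPy; this hand-rolled Fisher LDA is an illustrative sketch, not the DIYABC implementation.

```python
import numpy as np

def lda_axes(X, y, n_axes=2):
    """Fisher discriminant axes: leading eigenvectors of Sw^{-1} Sb
    (within- vs between-class scatter), used to compress the summaries."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    p = X.shape[1]
    Sw = np.zeros((p, p))
    Sb = np.zeros((p, p))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        d = (mc - mu)[:, None]
        Sb += len(Xc) * (d @ d.T)
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_axes]]

def knn_on_lda(X_ref, y_ref, x_obs, k=100, n_axes=2):
    """Project the reference table and the observation on the LDA axes,
    then select a model by majority vote among the k nearest neighbours."""
    W = lda_axes(X_ref, y_ref, n_axes)
    Z = X_ref @ W
    z = x_obs @ W
    keep = np.argsort(np.linalg.norm(Z - z, axis=1))[:k]
    vals, counts = np.unique(y_ref[keep], return_counts=True)
    return vals[counts.argmax()]
```

With three models, at most two discriminant axes carry information, which is why two axes suffice here.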


Page 35: Approximate Bayesian computation and machine learning (BigMC 2014)

Which machine learning technique?

Estimated prior error rates for some classification methods

Classification technique                                              Prior error rate
"naïve" Bayes (with Gaussian marginals)                                   33.31
Linear Discriminant Analysis                                              21.89
"optimal" k-NN using the initial 48 summaries                             25.46
"optimal" k-NN using only the two LDA axes                                21.65
local logistic regression on the two LDA axes with k = 5,000              21.61
random forest using the 48 initial summaries                              20.84
random forest using only the two LDA axes                                 22.50
random forest using both the 48 initial summaries and the LDA axes        19.20

The best classifier: random forest using both the 48 initial summaries and the LDA axes.


Page 36: Approximate Bayesian computation and machine learning (BigMC 2014)

Plotting the reference table

[Scatter plot: projection of the reference table on the two LDA axes (LD1 from −6 to 4, LD2 from −5 to 10).]

The colors correspond to the model indexes:
- black: model 1
- blue: model 2
- orange: model 3


Page 37: Approximate Bayesian computation and machine learning (BigMC 2014)

Posterior predictive error rate

[Projection of the reference table on the two LDA axes (LD1 from −6 to 4, LD2 from −5 to 10), with the two observed datasets marked as triangles.]

M = random forest using both the 48 initial summaries and the LDA axes

Case 1 (red triangle): ABC posterior predictive error rate ≈ 0.089
Case 2 (green triangle): ABC posterior predictive error rate ≈ 0.0


Page 38: Approximate Bayesian computation and machine learning (BigMC 2014)

Table of Contents

1 Approximate Bayesian computation

2 ABC model choice

3 Population genetics example on SNP data

4 Conclusion


Page 39: Approximate Bayesian computation and machine learning (BigMC 2014)

ABC model choice

Key ideas
- π(m | η(yobs)) ≠ π(m | yobs)
- Instead of trying to approximate π(m | η(yobs)), focus on selecting the best model (classification vs regression)
- Use the best machine learning technique to learn how to select a model from the many simulations of ABC: minimise the 0-1 loss to mimic the MAP
- Assess confidence in the model selection with the posterior predictive expected loss

Consequences on population genetics & ABC
- Often, RF ≫ k-NN (because it is less sensitive to high correlation in the covariates, i.e. the summary statistics)
- RF requires many fewer simulations from the Bayesian model
- RF selects the relevant summary statistics automatically
- Hence we can handle much more complex models


Page 40: Approximate Bayesian computation and machine learning (BigMC 2014)

Questions?
