Stochastic particle methods in Bayesian statistical learning


  • Stochastic particle methods in Bayesian statistical learning

    P. Del Moral (INRIA team ALEA)

    INRIA & Bordeaux Mathematical Institute & X CMAP

    INRIA Workshop on Statistical Learning IHP Paris, Dec. 5-6, 2011

    Some hyper-refs

    I Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications, Springer (2004)

    I Sequential Monte Carlo Samplers, JRSS B (2006) (joint work with Doucet & Jasra)

    I On the concentration of interacting processes, HAL-INRIA (2011) (joint work with Hu & Wu) [+ Refs]

    I More references on the website http://www.math.u-bordeaux1.fr/~delmoral/index.html [+ Links]

    Links:
    http://alea.bordeaux.inria.fr/
    http://www.math.u-bordeaux1.fr/~delmoral/gips.html
    http://www.math.u-bordeaux1.fr/~delmoral/smc_samplers_try.pdf
    http://hal.inria.fr/docs/00/60/79/65/PDF/RR-7677.pdf
    http://www.math.u-bordeaux1.fr/~delmoral/index.html

  • Stochastic particle sampling methods
      Interacting jumps models
      Genetic type interacting particle models
      Particle Feynman-Kac models
      The 4 particle estimates
      Island particle models (⊂ Parallel Computing)

    Bayesian statistical learning
      Nonlinear filtering models
      Fixed parameter estimation in HMM models
      Particle stochastic gradient models
      Approximate Bayesian Computation
      Interacting Kalman-Filters
      Uncertainty propagations in numerical codes

    Concentration inequalities
      Current population models
      Particle free energy
      Genealogical tree models
      Backward particle models

  • Introduction

    Stochastic particle methods =

    Universal adaptive sampling technique

    2 types of stochastic interacting particle models:

    I Diffusive particle models with mean field drifts [McKean-Vlasov style]

    I Interacting jump particle models [Boltzmann & Feynman-Kac style]

  • Lectures ⊂ Interacting jumps models

    I Interacting jumps = Recycling transitions

    I Discrete time models (⇔ geometric rejection/jump times)

  • Genetic type interacting particle models

    I Mutation-Proposals w.r.t. Markov transitions Xn−1 ⇝ Xn ∈ En.

    I Selection-Rejection-Recycling w.r.t. potential/fitness function Gn.
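    A minimal Python sketch of one such mutation-selection step (the helper names mutate, G and rng are illustrative stand-ins for the Markov proposal Mn, the potential Gn and a random generator; they are not from the slides):

    import numpy as np

    def mutation_selection_step(particles, mutate, G, rng):
        """One genetic-type step: mutation w.r.t. a Markov proposal,
        then selection/recycling proportional to the potential."""
        # Mutation-proposal: move every particle with the transition X_{n-1} -> X_n
        proposed = np.array([mutate(x, rng) for x in particles])
        # Selection-recycling: resample indices with probability proportional to G_n
        weights = np.array([G(x) for x in proposed], dtype=float)
        idx = rng.choice(len(proposed), size=len(proposed), p=weights / weights.sum())
        return proposed[idx]

    Iterating this step with, for instance, a random-walk mutate gives a basic version of the genetic-type interacting particle model described above.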

  • Equivalent particle algorithms

    Sequential Monte Carlo     Sampling               Resampling
    Particle Filters           Prediction             Updating
    Genetic Algorithms         Mutation               Selection
    Evolutionary Population    Exploration            Branching-selection
    Diffusion Monte Carlo      Free evolutions        Absorption
    Quantum Monte Carlo        Walkers motions        Reconfiguration
    Sampling Algorithms        Transition proposals   Accept-reject-recycle

    More botanical names: bootstrapping, spawning, cloning, pruning, replenish, multi-level splitting, enrichment, go with the winner, . . .

    1950 ≤ Meta-Heuristic style stochastic algorithms ≤ 1996

  • A single stochastic model

    Particle interpretation of Feynman-Kac path integrals

  • Genealogical tree evolution: (Population size, Time horizon) = (N, n) = (3, 3)

    [Diagram: genealogical tree of N = 3 particles over n = 3 time steps, linking each current individual to its ancestral line]

    Meta-heuristics "Meta-Theorem": Ancestral lines ≃ i.i.d. samples w.r.t. the Feynman-Kac measure

    Qn(d(x0, . . . , xn)) := 1/Zn { ∏0≤p<n Gp(xp) } Pn(d(x0, . . . , xn))

    with Pn the distribution of the free Markov chain paths (X0, . . . , Xn) and Zn the normalizing constant.
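    To make the ancestral lines concrete, here is an illustrative Python sketch (not taken from the slides) that traces each current particle back through recorded parent indices; history and ancestors are hypothetical names for the stored populations and the parent indices chosen at each selection step:

    import numpy as np

    def ancestral_lines(history, ancestors):
        """Recover the ancestral line of every current particle.

        history[p]   : (N,) array of particle states at generation p
        ancestors[p] : (N,) array; ancestors[p][i] is the index of the parent
                       (at generation p-1) of particle i at generation p
        Returns an (N, n+1) array whose i-th row is the ancestral line
        (xi_{0,n}, ..., xi_{n,n}) of the i-th current individual.
        """
        n = len(history) - 1
        N = len(history[-1])
        lines = np.empty((N, n + 1), dtype=np.asarray(history[-1]).dtype)
        idx = np.arange(N)
        for p in range(n, -1, -1):                    # walk backward through the tree
            lines[:, p] = np.asarray(history[p])[idx]
            if p > 0:
                idx = np.asarray(ancestors[p])[idx]   # jump to the parents
        return lines

    According to the "Meta-Theorem" above, the empirical measure of these N rows approximates the Feynman-Kac path measure Qn for large N.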

  • Particle estimates. More formally:

    (ξi0,n, ξi1,n, . . . , ξin,n) := ancestral line of the i-th current individual ξin

    ⇓

    1/N ∑1≤i≤N δ(ξi0,n,ξi1,n,...,ξin,n) −→N→∞ Qn

    ⊕ Current population models

    ηNn := 1/N ∑1≤i≤N δξin −→N→∞ ηn = n-th time marginal of Qn

    ⊕ Unbiased particle approximation

    ZNn := ∏0≤p<n ηNp(Gp) −→N→∞ Zn
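    A hedged Python sketch of the current-population estimate ηNn(f) and of the unbiased normalizing-constant estimate ZNn, assuming the populations are stored before each selection step (the names particle_history, G_list and test_f are illustrative, not from the slides):

    import numpy as np

    def population_and_normalizing_estimates(particle_history, G_list, test_f):
        """particle_history[p] : (N,) array of particles at time p (before selection)
        G_list[p]           : potential function G_p
        test_f              : test function integrated against eta_n^N"""
        # Current population estimate: eta_n^N(f) = (1/N) sum_i f(xi_n^i)
        eta_f = np.mean([test_f(x) for x in particle_history[-1]])
        # Unbiased normalizing constant: Z_n^N = prod_{0 <= p < n} eta_p^N(G_p)
        Z = 1.0
        for p in range(len(particle_history) - 1):
            Z *= np.mean([G_list[p](x) for x in particle_history[p]])
        return eta_f, Z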

  • Graphical illustration: ηNn := 1/N ∑1≤i≤N δξin ≃ ηn

    [Sequence of plots showing the evolving particle cloud ηNn approximating the flow of measures ηn]

  • Complete ancestral tree when Mn(x, dy) = Hn(x, y) λ(dy)

    Backward Markov chain model:

    QNn(d(x0, . . . , xn)) := ηNn(dxn) Mn,ηNn−1(xn, dxn−1) · · · M1,ηN0(x1, dx0)

    with the random particle matrices

    Mn+1,ηNn(xn+1, dxn) ∝ ηNn(dxn) Gn(xn) Hn+1(xn, xn+1)

    Example: normalized additive functionals

    fn(x0, . . . , xn) = 1/(n+1) ∑0≤p≤n fp(xp)

    QNn(fn) := 1/(n+1) ∑0≤p≤n ηNn Mn,ηNn−1 · · · Mp+1,ηNp(fp)   [matrix operations]
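    To illustrate the "matrix operations" above, here is a hedged Python sketch of the backward estimate of a normalized additive functional; particles, G_list, H_list and f_list are illustrative containers for the stored populations, the potentials Gp, the mutation densities Hp and the coordinate functions fp (names not from the slides):

    import numpy as np

    def backward_additive_estimate(particles, G_list, H_list, f_list):
        """Estimate Q_n^N of the normalized additive functional (1/(n+1)) sum_p f_p(x_p).

        particles[q] : (N,) array of particle locations at time q
        G_list[q]    : potential function G_q
        H_list[q]    : mutation density H_q(x_{q-1}, x_q)   (H_list[0] is unused)
        f_list[q]    : coordinate test function f_q
        """
        n = len(particles) - 1
        # A[j] = partial sum f_0 + ... + f_q expected along the backward chain
        #        started at particle j of generation q (recursion runs forward in q)
        A = np.array([f_list[0](x) for x in particles[0]], dtype=float)
        for q in range(1, n + 1):
            # Random particle matrix M_{q, eta^N_{q-1}}(i, j), proportional to
            # G_{q-1}(xi^j_{q-1}) H_q(xi^j_{q-1}, xi^i_q), normalized over j
            W = np.array([[G_list[q - 1](xj) * H_list[q](xj, xi)
                           for xj in particles[q - 1]]
                          for xi in particles[q]], dtype=float)
            M = W / W.sum(axis=1, keepdims=True)
            A = np.array([f_list[q](x) for x in particles[q]], dtype=float) + M @ A
        # Average over the current population and normalize by the horizon length n+1
        return A.mean() / (n + 1)

    Each pass over q costs O(N^2) because of the N × N particle matrix.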
