
    Particle Filtering for Partially Observed Gaussian State Space Models

Author(s): Christophe Andrieu and Arnaud Doucet
Source: Journal of the Royal Statistical Society. Series B (Statistical Methodology), Vol. 64, No. 4 (2002), pp. 827-836
Published by: Blackwell Publishing for the Royal Statistical Society
Stable URL: http://www.jstor.org/stable/3088816

J. R. Statist. Soc. B (2002) 64, Part 4, pp. 827-836

Particle filtering for partially observed Gaussian state space models

Christophe Andrieu, University of Bristol, UK

and Arnaud Doucet, University of Cambridge, UK

[Received October 2000. Revised March 2002]

Summary. Solving Bayesian estimation problems where the posterior distribution evolves over time through the accumulation of data has many applications for dynamic models. A large number of algorithms based on particle filtering methods, also known as sequential Monte Carlo algorithms, have recently been proposed to solve these problems. We propose a special particle filtering method which uses random mixtures of normal distributions to represent the posterior distributions of partially observed Gaussian state space models. This algorithm is based on a marginalization idea for improving efficiency and can lead to substantial gains over standard algorithms. It differs from previous algorithms which were only applicable to conditionally linear Gaussian state space models. Computer simulations are carried out to evaluate the performance of the proposed algorithm for dynamic tobit and probit models.

Keywords: Bayesian estimation; Filtering; Generalized linear time series; Importance sampling; Sequential Monte Carlo sampling; State space model

1. Introduction

1.1. Background

Many data analysis tasks involve estimating the state of a dynamic model when only partial or inaccurate observations are available (West and Harrison, 1997). Except in a few special cases, including linear Gaussian state space models, on-line state estimation is a problem that does not admit a closed form solution. As most real world models are non-linear and non-Gaussian, it is of great interest to develop efficient computational methods to solve this so-called Bayesian filtering problem numerically.

Many approximation schemes, such as the extended Kalman filter, have been proposed to surmount this problem. However, in many realistic scenarios, these approximating methods are unreliable and faults are difficult to diagnose on line. Recently there has been a surge of interest in sequential Monte Carlo (SMC) methods for non-linear or non-Gaussian time series analysis (Doucet et al., 2001). These methods, initiated in Gordon et al. (1993), utilize a random-sample- (or particle-) based representation of the posterior probability distributions.

1.2. General problem

For any sequence $l_t$, we define

Address for correspondence: Arnaud Doucet, Department of Engineering, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ, UK. E-mail: [email protected]

© 2002 Royal Statistical Society 1369-7412/02/64827

$$l_{i:j} \triangleq (l_i, l_{i+1}, \ldots, l_j).$$

In this paper, we shall concentrate on the following class of state space models. Let $t = 1, 2, \ldots$ denote discrete time; then

$$x_t = A_t x_{t-1} + B_t v_t + F_t u_t, \qquad x_0 \sim \mathcal{N}(\hat{x}_0, P_0), \qquad (1)$$

$$y_t = C_t x_t + D_t \epsilon_t + G_t u_t, \qquad (2)$$
$$z_t \sim p(z_t \mid y_t), \qquad (3)$$

where $u_t \in \mathbb{R}^{n_u}$ is an exogenous process and $x_t \in \mathbb{R}^{n_x}$ and $y_t \in \mathbb{R}^{n_y}$ are unobserved processes. The sequences

$$v_t \overset{\text{IID}}{\sim} \mathcal{N}(0, I_{n_v}) \in \mathbb{R}^{n_v}$$

and

$$\epsilon_t \overset{\text{IID}}{\sim} \mathcal{N}(0, I_{n_\epsilon}) \in \mathbb{R}^{n_\epsilon}$$

are independent identically distributed (IID) Gaussian. We assume that $P_0 > 0$; $x_0$, $v_t$ and $\epsilon_t$ are mutually independent for all $t$; and the model parameters

$$\lambda \triangleq (\hat{x}_0, P_0, A_t, B_t, C_t, D_t, F_t, G_t;\ t = 1, 2, \ldots)$$

are known. The processes $(x_t)$ and $(y_t)$ define a standard linear Gaussian state space model. We do not observe $(y_t)$ in our case, but $(z_t)$. The observations $(z_t)$ are conditionally independent given the processes $(x_t)$ and $(y_t)$ and marginally distributed according to $p(z_t \mid y_t)$; it is assumed that $p(z_t \mid y_t)$ can be evaluated pointwise up to a normalizing constant. Typically $p(z_t \mid y_t)$ belongs to the exponential family. Alternatively $z_t$ may be a censored or quantized version of $y_t$. This class of partially observed Gaussian state space models has numerous applications; many examples are discussed for instance in de Jong (1997), Manrique and Shephard (1998) and West and Harrison (1997).

We want to estimate sequentially in time some characteristics of the posterior distribution $p(x_{0:t} \mid z_{1:t})$. Typically, we are interested in computing $E(x_t \mid z_{1:t})$ (filtering), $E(x_{t+L} \mid z_{1:t})$ (prediction) and $E(x_{t-L} \mid z_{1:t})$ (fixed lag smoothing), where $L$ is a positive integer. These estimates do not in general admit analytical expressions and we must resort to numerical methods.
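As an illustration (not part of the original paper), the following is a minimal sketch of simulating a scalar instance of model (1)-(3), with $z_t = \mathbb{1}_{[0,\infty)}(y_t)$ as one example of a quantized observation $p(z_t \mid y_t)$. All function and variable names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(T, A, B, C, D, F=0.0, G=0.0, u=None, x0_mean=0.0, P0=1.0,
             sample_z=lambda y, rng: float(y > 0.0)):
    """Simulate a scalar instance of model (1)-(3):
    x_t = A x_{t-1} + B v_t + F u_t,  x_0 ~ N(x0_mean, P0),
    y_t = C x_t     + D e_t + G u_t,  z_t ~ p(z_t | y_t)."""
    u = np.zeros(T + 1) if u is None else u
    x = np.empty(T + 1); y = np.empty(T + 1); z = np.empty(T + 1)
    x[0] = x0_mean + np.sqrt(P0) * rng.standard_normal()
    for t in range(1, T + 1):
        x[t] = A * x[t - 1] + B * rng.standard_normal() + F * u[t]
        y[t] = C * x[t] + D * rng.standard_normal() + G * u[t]
        z[t] = sample_z(y[t], rng)  # censored/quantized view of y_t
    return x[1:], y[1:], z[1:]

# An AR(1) state observed only through the sign of y_t (probit-type z_t).
xs, ys, zs = simulate(200, A=0.99, B=np.sqrt(0.05), C=1.0, D=np.sqrt(0.30))
```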

1.3. Resolution

SMC methods are, loosely speaking, a combination of importance sampling and resampling methods that allow us to propagate efficiently over time a large set of samples or particles distributed approximately according to $p(x_{0:t} \mid z_{1:t})$. We could apply standard SMC methods such as the bootstrap filter (Gordon et al., 1993) to estimate $p(x_{0:t}, y_{1:t} \mid z_{1:t})$ and consequently $p(x_{0:t} \mid z_{1:t})$. However, in its standard form, this algorithm does not use all the salient structure of the model. Our algorithm is based on a marginalization technique, often called the Rao-Blackwellization method (Gelfand and Smith, 1990), that improves the efficiency of the procedure. It focuses on the estimation of $p(y_{1:t} \mid z_{1:t})$ rather than on the joint density $p(x_{0:t}, y_{1:t} \mid z_{1:t})$. The process $(x_t)$ is integrated out analytically. Once $p(y_{1:t} \mid z_{1:t})$ has been estimated, we can obtain estimates of $E(x_t \mid z_{1:t})$, $E(x_{t+L} \mid z_{1:t})$ and $E(x_{t-L} \mid z_{1:t})$ through the Kalman filter as discussed further. In a Markov chain Monte Carlo framework, de Jong (1997) has proposed the so-called scan sampler to sample from $p(y_{1:t} \mid z_{1:t})$ in a similar class of state space models; see Manrique and Shephard (1998) for some applications.

1.4. Plan

The rest of the paper is organized as follows. Section 2 shows how it is possible to restrict ourselves to estimating $p(y_{1:t} \mid z_{1:t})$ instead of $p(x_{0:t}, y_{1:t} \mid z_{1:t})$, leading to an improvement in the Monte Carlo efficiency. The particle filtering algorithm is then described in detail. Section 3 demonstrates the performance of the proposed algorithm via computer simulations for dynamic tobit and probit models.

2. Rao-Blackwellized particle filtering

2.1. Marginalization

Consider the state space model defined by equations (1)-(3). We have

$$p(x_{0:t} \mid z_{1:t}) = \int p(x_{0:t} \mid y_{1:t})\, p(y_{1:t} \mid z_{1:t})\, dy_{1:t}.$$

Thus if we obtain (through an SMC method described further) an approximation of the probability distribution that is associated with the density $p(y_{1:t} \mid z_{1:t})$ of the form

$$\hat{p}_N(dy_{1:t} \mid z_{1:t}) = \sum_{i=1}^{N} w_t^{(i)}\, \delta_{y_{1:t}^{(i)}}(dy_{1:t}), \qquad w_t^{(i)} \geq 0, \quad \sum_{i=1}^{N} w_t^{(i)} = 1,$$

then $p(x_{0:t} \mid z_{1:t})$ can be approximated by using

$$\hat{p}_N(x_{0:t} \mid z_{1:t}) = \sum_{i=1}^{N} w_t^{(i)}\, p(x_{0:t} \mid y_{1:t}^{(i)}),$$

i.e. a mixture of Gaussian densities. From such an approximation, we can estimate $E(x_t \mid z_{1:t})$ and $E(x_{t-L} \mid z_{1:t})$. For example an estimate of $E(x_t \mid z_{1:t})$ is given by

$$\hat{E}_N(x_t \mid z_{1:t}) = \int x_t\, \hat{p}_N(x_{0:t} \mid z_{1:t})\, dx_{0:t} = \sum_{i=1}^{N} w_t^{(i)}\, E(x_t \mid y_{1:t}^{(i)}),$$

where $E(x_t \mid y_{1:t})$ is computed through the Kalman filter associated with the linear Gaussian state space model defined by equations (1) and (2). Using the variance decomposition formula, it is clear that for any function $h(\cdot)$

$$\mathrm{var}\{h(x_t) \mid z_{1:t}\} \geq \mathrm{var}[E\{h(x_t) \mid y_{1:t}, z_{1:t}\} \mid z_{1:t}],$$

which shows that estimating $p(y_{1:t} \mid z_{1:t})$ only is more efficient.

To obtain an SMC approximation of the marginal posterior density $p(y_{1:t} \mid z_{1:t})$, we need to be able to estimate this 'target' density pointwise up to a normalizing constant. We have

$$p(y_{1:t} \mid z_{1:t}) \propto \prod_{k=1}^{t} p(z_k \mid y_k)\, p(y_k \mid y_{1:k-1}), \qquad (4)$$

where

$$p(y_1 \mid y_{1:0}) \triangleq p(y_1).$$

As $p(z_k \mid y_k)$ is assumed known up to a normalizing constant, it is only necessary to estimate $p(y_k \mid y_{1:k-1})$ up to a normalizing constant. This predictive density can be computed by using the Kalman filter.

The Kalman filter equations are the following. Set $x_{0|0} = \hat{x}_0$ and $P_{0|0} = P_0$; then for $t = 1, 2, \ldots$ compute

$$
\begin{aligned}
x_{t|t-1} &= A_t x_{t-1|t-1} + F_t u_t, \\
P_{t|t-1} &= A_t P_{t-1|t-1} A_t^{\mathrm{T}} + B_t B_t^{\mathrm{T}}, \\
y_{t|t-1} &= C_t x_{t|t-1} + G_t u_t, \\
S_t &= C_t P_{t|t-1} C_t^{\mathrm{T}} + D_t D_t^{\mathrm{T}}, \qquad (5) \\
x_{t|t} &= x_{t|t-1} + P_{t|t-1} C_t^{\mathrm{T}} S_t^{-1} (y_t - y_{t|t-1}), \\
P_{t|t} &= P_{t|t-1} - P_{t|t-1} C_t^{\mathrm{T}} S_t^{-1} C_t P_{t|t-1},
\end{aligned}
$$

where

$$
\begin{aligned}
x_{t|t-1} &\triangleq E(x_t \mid y_{1:t-1}), & x_{t|t} &\triangleq E(x_t \mid y_{1:t}), & y_{t|t-1} &\triangleq E(y_t \mid y_{1:t-1}), \\
P_{t|t-1} &\triangleq \mathrm{cov}(x_t \mid y_{1:t-1}), & P_{t|t} &\triangleq \mathrm{cov}(x_t \mid y_{1:t}), & S_t &\triangleq \mathrm{cov}(y_t \mid y_{1:t-1}).
\end{aligned}
$$

We obtain $p(y_k \mid y_{1:k-1}) = \mathcal{N}(y_k;\, y_{k|k-1}, S_k)$, where $\mathcal{N}(y_k;\, y_{k|k-1}, S_k)$ is a Gaussian distribution of argument $y_k$, mean $y_{k|k-1}$ and covariance $S_k$.
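A minimal sketch of one step of recursion (5), returning the filtered moments together with the predictive density $p(y_t \mid y_{1:t-1}) = \mathcal{N}(y_t;\, y_{t|t-1}, S_t)$ needed in equation (4). Names and the packaging of the exogenous terms (`Fu` $= F_t u_t$, `Gu` $= G_t u_t$) are ours.

```python
import numpy as np

def kalman_step(x_prev, P_prev, y_t, A, B, C, D, Fu=0.0, Gu=0.0):
    """One step of recursion (5) for model (1)-(2)."""
    x_pred = A @ x_prev + Fu                      # x_{t|t-1}
    P_pred = A @ P_prev @ A.T + B @ B.T           # P_{t|t-1}
    y_pred = C @ x_pred + Gu                      # y_{t|t-1}
    S = C @ P_pred @ C.T + D @ D.T                # S_t
    K = P_pred @ C.T @ np.linalg.inv(S)           # Kalman gain P C' S^{-1}
    x_filt = x_pred + K @ (y_t - y_pred)          # x_{t|t}
    P_filt = P_pred - K @ C @ P_pred              # P_{t|t}
    # Predictive density p(y_t | y_{1:t-1}) = N(y_t; y_{t|t-1}, S_t).
    r = y_t - y_pred
    pred_lik = np.exp(-0.5 * r @ np.linalg.solve(S, r)) \
        / np.sqrt((2 * np.pi) ** len(y_t) * np.linalg.det(S))
    return x_filt, P_filt, x_pred, P_pred, y_pred, S, pred_lik
```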

2.2. Particle filtering

2.2.1. Sequential importance sampling and resampling

We describe briefly here how to apply the sequential importance sampling-resampling (SISR) method to sample approximately from $p(y_{1:t} \mid z_{1:t})$; see Doucet et al. (2001) for further details. At time $t - 1$, assume that we have, say, $N$ particles $\{y_{1:t-1}^{(i)}\}_{i=1}^{N}$ approximately distributed according to $p(y_{1:t-1} \mid z_{1:t-1})$ and we want to obtain $N$ particles $\{y_{1:t}^{(i)}\}_{i=1}^{N}$ distributed according to $p(y_{1:t} \mid z_{1:t})$. At time $t$, we 'extend' each particle $y_{1:t-1}^{(i)}$ by sampling $y_t^{(i)}$ according to a conditional density $q_t(y_t \mid y_{1:t-1}^{(i)}, z_{1:t})$. Thus, each particle $y_{1:t}^{(i)}$ is distributed according to $p(y_{1:t-1} \mid z_{1:t-1})\, q_t(y_t \mid y_{1:t-1}, z_{1:t})$. To correct for the discrepancy between $p(y_{1:t-1} \mid z_{1:t-1})\, q_t(y_t \mid y_{1:t-1}, z_{1:t})$ and $p(y_{1:t} \mid z_{1:t})$, we use importance sampling so that the distribution associated with the density $p(y_{1:t} \mid z_{1:t})$ is approximated by

$$\hat{p}_N(dy_{1:t} \mid z_{1:t}) = \sum_{i=1}^{N} \frac{w(y_{1:t}^{(i)})}{\sum_{j=1}^{N} w(y_{1:t}^{(j)})}\, \delta_{y_{1:t}^{(i)}}(dy_{1:t}) = \sum_{i=1}^{N} w_t^{(i)}\, \delta_{y_{1:t}^{(i)}}(dy_{1:t}), \qquad (6)$$

where, using equation (4), we have for the importance weight

$$w(y_{1:t}) \propto \frac{p(y_{1:t} \mid z_{1:t})}{p(y_{1:t-1} \mid z_{1:t-1})\, q_t(y_t \mid y_{1:t-1}, z_{1:t})} \propto \frac{p(z_t \mid y_t)\, p(y_t \mid y_{1:t-1})}{q_t(y_t \mid y_{1:t-1}, z_{1:t})}.$$
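In practice the normalization in equation (6) is usually done in log-space for numerical stability. This is a standard device, not something specified in the paper; a one-function sketch:

```python
import numpy as np

def normalize_log_weights(logw):
    """Normalized weights w_t^(i) of equation (6) from unnormalized
    log-weights; subtracting the max avoids overflow in exp."""
    w = np.exp(logw - np.max(logw))
    return w / np.sum(w)
```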

The performance of the algorithm depends on the importance density $q_t(y_t \mid y_{1:t-1}, z_{1:t})$. We can select $p(y_t \mid y_{1:t-1})$ since it is a Gaussian density. In this case the associated importance weight is equal to $w(y_{1:t}) \propto p(z_t \mid y_t)$. Note that the 'optimal' importance density, i.e. the density minimizing the conditional variance of the weight conditional on $y_{1:t-1}$ (Doucet et al., 2000), is

$$p(y_t \mid y_{1:t-1}, z_{1:t}) \propto p(z_t \mid y_t)\, p(y_t \mid y_{1:t-1}),$$

and the associated importance weight is

$$w(y_{1:t}) \propto p(z_t \mid y_{1:t-1}) = \int p(z_t \mid y_t)\, p(y_t \mid y_{1:t-1})\, dy_t. \qquad (7)$$

It might be possible to compute this weight or not, depending on $p(z_t \mid y_t)$.

Finally, we obtain $N$ particles $\{y_{1:t}^{(i)}\}_{i=1}^{N}$ approximately distributed according to $p(y_{1:t} \mid z_{1:t})$ by resampling from the weighted empirical distribution given in equation (6). Several resampling procedures are available in the literature. We adopt here the stratified sampling scheme described in Kitagawa (1996), sketched in code below.
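A sketch of the stratified scheme of Kitagawa (1996): one uniform is drawn per stratum $[(i-1)/N, i/N)$ and the empirical distribution of the weights is inverted. The function name is ours.

```python
import numpy as np

def stratified_resample(weights, rng):
    """Stratified resampling: return N ancestor indices drawn from the
    normalized weights, one per stratum [(i-1)/N, i/N)."""
    N = len(weights)
    positions = (np.arange(N) + rng.uniform(size=N)) / N
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                   # guard against round-off
    return np.searchsorted(cumulative, positions)
```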

Alternative SMC methods can be applied to estimate $p(y_{1:t} \mid z_{1:t})$. In particular, the auxiliary particle filtering (APF) technique of Pitt and Shephard (1999) could be used. The idea behind APF is to extend the existing particles $y_{1:t-1}^{(i)}$ that are the most promising, in the sense that the predictive likelihood $p(z_t \mid y_{1:t-1}^{(i)})$ is large. When $p(z_t \mid y_{1:t-1}^{(i)})$ cannot be computed analytically, APF proposes an analytical approximation. In this case, APF and SISR differ significantly. However, when $p(z_t \mid y_{1:t-1}^{(i)})$ can be computed analytically, then APF uses the optimal importance density. This is referred to as 'perfect adaptation' (Pitt and Shephard, 1999). In this particular case, APF and SISR are essentially similar except that APF reverses the order of the sampling and resampling steps; this is possible as the importance weight is independent of $y_t$ and this is obviously more efficient.

2.2.2. Algorithm

We limit our presentation to standard choices of importance densities where $q_t(y_t \mid y_{1:t-1}, z_{1:t})$ depends on $(y_{1:t-1}, z_{1:t})$ only via $z_t$ and the set of low-dimensional sufficient statistics $x_{t|t-1}$ and $P_{t|t-1}$. We shall write

$$q_t(y_t \mid x_{t|t-1}, P_{t|t-1}, z_t) \triangleq q_t(y_t \mid y_{1:t-1}, z_{1:t}).$$

This class of densities includes

$$p(y_t \mid x_{t|t-1}, P_{t|t-1}) \triangleq p(y_t \mid y_{1:t-1}) = \mathcal{N}(y_t;\, y_{t|t-1}, S_t),$$

where $y_{t|t-1}$ and $S_t$ are deterministic functions of $x_{t|t-1}$ and $P_{t|t-1}$. As one typically focuses on features of the marginal $p(y_t \mid z_{1:t})$, it is only necessary to store in memory $\{y_t^{(i)}, x_{t|t-1}^{(i)}\}_{i=1}^{N}$ and $P_{t|t-1}$ instead of $\{y_{1:t}^{(i)}\}_{i=1}^{N}$. Contrary to algorithms presented in Chen and Liu (2000) and Doucet et al. (2000), we point out that we do not have to compute $N$ 'full' Kalman filter recursions as most of the calculations need to be done only once. More precisely, we note that $P_{t|t-1}^{(i)} = P_{t|t-1}$ and $S_t^{(i)} = S_t$ for any $i \in \{1, \ldots, N\}$.

Given $N$ particles $\{y_{t-1}^{(i)}\}_{i=1}^{N}$ at time $t - 1$ distributed approximately according to $p(y_{t-1} \mid z_{1:t-1})$ and the associated sufficient statistics $\{x_{t-1|t-1}^{(i)}\}_{i=1}^{N}$ and $P_{t|t-1}$, the particle filter proceeds as follows at time $t$.

In the sequential importance sampling step:

(a) for $i = 1, \ldots, N$, set

$$x_{t|t-1}^{(i)} = A_t x_{t-1|t-1}^{(i)} + F_t u_t$$

and sample

$$y_t^{(i)} \sim q_t(y_t \mid x_{t|t-1}^{(i)}, P_{t|t-1}, z_t);$$

(b) for $i = 1, \ldots, N$, evaluate and normalize the importance weights

$$w_t^{(i)} \propto \frac{p(z_t \mid y_t^{(i)})\, p(y_t^{(i)} \mid x_{t|t-1}^{(i)}, P_{t|t-1})}{q_t(y_t^{(i)} \mid x_{t|t-1}^{(i)}, P_{t|t-1}, z_t)}, \qquad \sum_{i=1}^{N} w_t^{(i)} = 1. \qquad (8)$$

In the resampling step: multiply or discard particles $\{y_t^{(i)}, x_{t|t-1}^{(i)}\}$ with respect to high or low importance weights $w_t^{(i)}$ to obtain $N$ particles $\{y_t^{(i)}, x_{t|t-1}^{(i)}\}_{i=1}^{N}$.

In the updating step:

(a) compute $P_{t+1|t}$ given $P_{t|t-1}$ using one step of the Kalman recursion (5);
(b) for $i = 1, \ldots, N$, use one step of the Kalman recursion (5) to compute $x_{t|t}^{(i)}$ given $y_t^{(i)}$, $x_{t|t-1}^{(i)}$ and $P_{t|t-1}$.

The computational complexity of this algorithm at each time step is $O(N)$. If the (unnormalized) importance weights given by equation (8) are upper bounded, then asymptotic convergence ($N \to \infty$) of Monte Carlo estimates towards their true values can be ensured (Crisan, 2001).
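For illustration only, here is a sketch of the time-$t$ recursion above for a scalar model, using the prior importance density $p(y_t \mid y_{1:t-1})$ so that the weight (8) reduces to $p(z_t \mid y_t^{(i)})$. The function and argument names are ours, and `log_p_z_given_y` is assumed to evaluate $\log p(z_t \mid y_t)$ vectorized over the particles.

```python
import numpy as np

def rbpf_step(x_filt, P_filt, z_t, log_p_z_given_y, A, B, C, D, rng):
    """One time step of the Rao-Blackwellized particle filter sketch.

    x_filt: (N,) array of x_{t-1|t-1}^(i); P_filt: scalar P_{t-1|t-1},
    shared by all particles as noted in Section 2.2.2."""
    N = x_filt.shape[0]
    # Prediction; covariances are computed once, not per particle.
    x_pred = A * x_filt                    # x_{t|t-1}^(i)
    P_pred = A * P_filt * A + B * B        # P_{t|t-1}
    y_pred = C * x_pred                    # y_{t|t-1}^(i)
    S = C * P_pred * C + D * D             # S_t
    # Sequential importance sampling: y_t^(i) ~ N(y_{t|t-1}^(i), S_t).
    y = y_pred + np.sqrt(S) * rng.standard_normal(N)
    logw = log_p_z_given_y(z_t, y)         # weights (8), in log form
    w = np.exp(logw - logw.max()); w /= w.sum()
    # Resampling (multinomial for brevity; the paper adopts the
    # stratified scheme sketched in Section 2.2.1).
    idx = rng.choice(N, size=N, p=w)
    y, x_pred = y[idx], x_pred[idx]
    # Updating: one covariance update, one mean update per particle.
    K = P_pred * C / S
    x_filt_new = x_pred + K * (y - C * x_pred)   # x_{t|t}^(i)
    P_filt_new = P_pred - K * C * P_pred         # P_{t|t}
    return x_filt_new, P_filt_new   # E(x_t | z_{1:t}) ~ x_filt_new.mean()
```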

2.3. Extensions

There are many potential extensions to both the model and the algorithm.

2.3.1. Model

For the linear Gaussian model (1)-(2), we can readily consider the case where $v_t$ and $\epsilon_t$ are correlated and/or add a non-linear term $\varphi(y_{1:t-1})$ to the right-hand side of equation (2). It is also possible to apply the marginalization method presented above to integrate out analytically $(x_t)$ when the model (1)-(2) is not linear Gaussian but conditionally linear Gaussian as described in Shephard (1994); this extension allows us to consider finite or continuous mixtures of Gaussian distributions.

Another interesting extension consists of partially observed hidden Markov models: $(x_t)$ is modelled as a finite state space Markov chain and

$$p(y_{1:t}, z_{1:t} \mid x_{1:t}) = \prod_{k=1}^{t} p(y_k \mid x_k, y_{1:k-1})\, p(z_k \mid y_k).$$

We can integrate out $(x_t)$ and compute $p(y_k \mid y_{1:k-1})$ by using the hidden Markov model filter instead of the Kalman filter as part of the method developed in the previous section.

2.3.2. Algorithm

When the distribution of the importance weights $w_t^{(i)}$ is skewed, the particles $\{y_t^{(i)}, x_{t|t-1}^{(i)}\}_{i=1}^{N}$ having high importance weights are selected many times; this results in a 'depletion' of samples as numerous particles $y_t^{(i)}$ and $y_t^{(j)}$ are in fact equal for $i \neq j$. To perform sample 'regeneration', it is possible to use a recently proposed approach based on Markov chain Monte Carlo steps (Gilks and Berzuini, 2001). It consists of applying to each particle $y_{1:t}^{(i)}$ a (possibly non-ergodic) transition kernel $K_t(y_{1:t} \mid y_{1:t}^{(i)})$ of invariant density $p(y_{1:t} \mid z_{1:t})$. There are an infinite number of possible choices for this kernel. One possibility consists of updating at time $t$ the values $y_{t-M+1:t}$ ($M > 0$) by using the efficient scan sampler (de Jong, 1997), whose computational complexity is $O(M)$. Though this step is not necessary to ensure theoretical convergence of the algorithm, it can improve results.
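The hidden-Markov-model variant mentioned in Section 2.3.1 simply replaces the Kalman recursion with the discrete filter. A minimal sketch of one such step, assuming the emission term $p(y_k \mid x_k, y_{1:k-1})$ has already been evaluated at the observed $y_k$; names are ours.

```python
import numpy as np

def hmm_filter_step(pi_prev, trans, emis_lik):
    """One step of the hidden Markov model filter for a finite chain.

    pi_prev  : p(x_{k-1} = i | y_{1:k-1}), shape (S,)
    trans    : trans[i, j] = Pr(x_k = j | x_{k-1} = i)
    emis_lik : p(y_k | x_k = j, y_{1:k-1}) at the observed y_k, shape (S,)

    Returns p(x_k | y_{1:k}) and the predictive density
    p(y_k | y_{1:k-1}) needed in equation (4)."""
    pred = pi_prev @ trans                 # p(x_k | y_{1:k-1})
    joint = pred * emis_lik                # p(x_k, y_k | y_{1:k-1})
    pred_lik = joint.sum()                 # p(y_k | y_{1:k-1})
    return joint / pred_lik, pred_lik
```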

3. Simulations

3.1. Dynamic tobit model

Let us consider the following tobit model (Manrique and Shephard, 1998):

$$x_{t+1} = \phi x_t + \sigma_v v_{t+1}, \qquad x_0 \sim \mathcal{N}\!\left(0, \frac{\sigma_v^2}{1 - \phi^2}\right), \qquad v_t \overset{\text{IID}}{\sim} \mathcal{N}(0, 1),$$
$$y_t = x_t + \sigma_\epsilon \epsilon_t, \qquad \epsilon_t \overset{\text{IID}}{\sim} \mathcal{N}(0, 1),$$

$$z_t = \max(y_t, 0).$$

It is clear that this model is of the form (1)-(3). We chose as importance density the 'optimal' density $p(y_t \mid y_{1:t-1}, z_t)$. (We could not use $p(y_t \mid y_{1:t-1})$ for the importance density when $z_t > 0$: in this case, the importance weight does not exist.) If $z_t > 0$, then $y_t = z_t$ and, if $z_t = 0$, then

$$p(y_t \mid y_{1:t-1}, z_t = 0) \propto p(y_t \mid y_{1:t-1})\, \mathbb{1}_{(-\infty, 0]}(y_t).$$

For the importance weight, we obtain using equation (7)

$$w(y_{1:t}) \propto \begin{cases} \Phi(-y_{t|t-1} / \sqrt{S_t}) & \text{if } z_t = 0, \\ \mathcal{N}(z_t;\, y_{t|t-1}, S_t) & \text{if } z_t > 0, \end{cases}$$

where $\Phi(\cdot)$ is the cumulative distribution function of the standard normal distribution.

We simulated $T = 200$ observations with the known hyperparameters $\phi = 0.99$, $\sigma_v^2 = 0.05$ and $\sigma_\epsilon^2 = 0.30$. For various numbers of particles $N$, we generated $K = 100$ different realizations of our Rao-Blackwellized (RB) filter and of the standard algorithm that estimates $p(x_{0:t}, y_{1:t} \mid z_{1:t})$ using the importance density $p(x_t, y_t \mid x_{t-1}, y_{t-1}, z_t)$, as $p(x_t, y_t \mid x_{t-1}, y_{t-1})$ cannot be used. Our comparison is in terms of the mean and variance of the square error SE computed as follows:

$$\text{SE}(l) = \sum_{t=1}^{T} \{x_t - \hat{E}_{N,l}(x_t \mid z_{1:t})\}^2,$$
$$m(\text{SE}) = \frac{1}{K} \sum_{l=1}^{K} \text{SE}(l),$$

$$\sigma^2(\text{SE}) = \frac{1}{K} \sum_{l=1}^{K} \{\text{SE}(l) - m(\text{SE})\}^2,$$

where $\hat{E}_{N,l}(x_t \mid z_{1:t})$ is computed using the $l$th realization of the particle filter. We present in Tables 1 and 2 the performance of the standard and RB filters. For a fixed number of particles, the RB filter is not more computationally intensive than the standard filter and it performs significantly better.
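A sketch of the per-particle sampling and weighting implied by the tobit formulas above: when $z_t > 0$ the observation reveals $y_t$ exactly, and when $z_t = 0$ we draw from the Gaussian prior truncated to $(-\infty, 0]$ by inverting its CDF. The function name and inverse-CDF device are ours.

```python
import numpy as np
from scipy.stats import norm

def tobit_sample_and_weight(y_pred, S, z_t, rng):
    """Draw y_t from the 'optimal' density p(y_t | y_{1:t-1}, z_t) and
    return the weight of equation (7), for one particle.
    y_pred = y_{t|t-1} and S = S_t come from the Kalman recursion (5)."""
    sd = np.sqrt(S)
    if z_t > 0:
        # z_t > 0 reveals y_t exactly; weight N(z_t; y_{t|t-1}, S_t).
        return z_t, norm.pdf(z_t, loc=y_pred, scale=sd)
    # z_t = 0: y_t ~ N(y_{t|t-1}, S_t) truncated to (-inf, 0], drawn by
    # inverting the Gaussian CDF; weight Phi(-y_{t|t-1} / sqrt(S_t)).
    w = norm.cdf(-y_pred / sd)
    y_t = y_pred + sd * norm.ppf(rng.uniform() * w)
    return y_t, w
```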

3.2. Dynamic probit model

We analyse here a non-stationary binary time series and more specifically the (aggregated) Tokyo rainfall data set (Knorr-Held, 1999). It consists of $T = 366$ observations with $z_t = 1$ indicating that it rained on the $t$th day of the year and $z_t = 0$ otherwise. We model $z_t$ by using a dynamic probit model, i.e.

$$\Pr(z_t = 1 \mid \alpha_t) = \Phi(\alpha_t),$$

where $(\alpha_t)$ is modelled by using a second-order random walk

$$\alpha_t = 2\alpha_{t-1} - \alpha_{t-2} + \sigma_v v_t, \qquad v_t \overset{\text{IID}}{\sim} \mathcal{N}(0, 1). \qquad (9)$$

Table 1. $m(\text{SE})$ for the standard filter and RB filter

Algorithm       | N = 100 | 250   | 500   | 1000  | 2500  | 5000  | 10000 | 25000
Standard filter | 33.70   | 33.64 | 33.90 | 33.41 | 33.45 | 33.55 | 33.54 | 33.52
RB filter       | 33.52   | 33.49 | 33.51 | 33.50 | 33.50 | 33.49 | 33.51 | 33.50

Let us introduce an artificial latent process $(y_t)$ such that

$$y_t = \alpha_t + \epsilon_t, \qquad \epsilon_t \overset{\text{IID}}{\sim} \mathcal{N}(0, 1), \qquad (10)$$
$$z_t = \mathbb{1}_{[0,\infty)}(y_t). \qquad (11)$$

It is easy to check that we have

$$\Pr(z_t = 1 \mid \alpha_t) = \Pr(y_t \geq 0 \mid \alpha_t) = \Pr(\epsilon_t \geq -\alpha_t) = \Phi(\alpha_t).$$

We can easily rewrite equations (9)-(11) in a state space model of the form (1)-(3) by defining

$$x_t \triangleq (\alpha_t, \alpha_{t-1})^{\mathrm{T}}.$$

In this case, as the introduction of $(y_t)$ is artificial, it is necessary to compare our procedure with a particle filtering method applied to the estimation of $p(\alpha_{0:t} \mid z_{1:t})$ and not with $p(x_{0:t}, y_{1:t} \mid z_{1:t})$. The motivation for introducing $(y_t)$ comes from the fact that it is possible to use the optimal density as importance density

$$p(y_t \mid y_{1:t-1}, z_t) \propto \begin{cases} p(y_t \mid y_{1:t-1})\, \mathbb{1}_{[0,\infty)}(y_t) & \text{if } z_t = 1, \\ p(y_t \mid y_{1:t-1})\, \mathbb{1}_{(-\infty,0)}(y_t) & \text{if } z_t = 0, \end{cases}$$

which is a truncated Gaussian distribution, and, using equation (7), we obtain

$$w(y_{1:t}) \propto p(z_t \mid y_{1:t-1}) = \Phi\!\left(\frac{y_{t|t-1}}{\sqrt{S_t}}\right)^{z_t} \Phi\!\left(\frac{-y_{t|t-1}}{\sqrt{S_t}}\right)^{1-z_t}.$$

If we consider $p(x_{0:t} \mid z_{1:t})$, the optimal density cannot be used since the associated importance weight $p(z_t \mid x_{t-1})$ does not admit an analytical expression. Note that the introduction of $(y_t)$

Table 2. $10\,\sigma(\text{SE})$ for the standard filter and RB filter

Algorithm       | N = 100 | 250  | 500  | 1000 | 2500 | 5000 | 10000 | 25000
Standard filter | 3.76    | 2.19 | 1.61 | 1.20 | 0.83 | 0.62 | 0.37  | 0.26
RB filter       | 2.50    | 1.30 | 1.21 | 0.99 | 0.51 | 0.38 | 0.29  | 0.14


[Figure 1: two panels on axes 0-350 (day of year) by 0-1; graphics not recoverable.]

Fig. 1. (a) Binary observations $(z_t)$ and (b) estimates of $E\{\Phi(\alpha_t) \mid z_{1:t}\}$ [remainder of caption truncated in the source]

References

Chen, R. and Liu, J. S. (2000) Mixture Kalman filters. J. R. Statist. Soc. B, 62, 493-508.
Crisan, D. (2001) Particle filters - a theoretical perspective. In Sequential Monte Carlo Methods in Practice (eds A. Doucet, J. F. G. de Freitas and N. J. Gordon), pp. 17-41. New York: Springer.
Doucet, A., de Freitas, J. F. G. and Gordon, N. J. (eds) (2001) Sequential Monte Carlo Methods in Practice. New York: Springer.
Doucet, A., Godsill, S. J. and Andrieu, C. (2000) On sequential Monte Carlo sampling methods for Bayesian filtering. Statist. Comput., 10, 197-208.
Gelfand, A. and Smith, A. F. M. (1990) Sampling based approaches to calculating marginal densities. J. Am. Statist. Ass., 85, 398-409.
Gilks, W. R. and Berzuini, C. (2001) Following a moving target - Monte Carlo inference for dynamic Bayesian models. J. R. Statist. Soc. B, 63, 127-146.
Gordon, N. J., Salmond, D. J. and Smith, A. F. M. (1993) Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proc. F, 140, 107-113.
de Jong, P. (1997) The scan sampler. Biometrika, 84, 929-937.
Kitagawa, G. (1996) Monte Carlo filter and smoother for non-Gaussian nonlinear state space models. J. Comput. Graph. Statist., 5, 1-25.
Knorr-Held, L. (1999) Conditional prior proposals in dynamic models. Scand. J. Statist., 26, 129-144.
Liu, J. and West, M. (2001) Combined parameter and state estimation in simulation-based filtering. In Sequential Monte Carlo Methods in Practice (eds A. Doucet, J. F. G. de Freitas and N. J. Gordon), pp. 197-223. New York: Springer.
Manrique, A. and Shephard, N. (1998) Simulation-based likelihood inference for limited dependent processes. Econometr. J., 1, 174-202.
Pitt, M. K. and Shephard, N. (1999) Filtering via simulation: auxiliary particle filters. J. Am. Statist. Ass., 94, 590-599.
Shephard, N. (1994) Partial non-Gaussian time series models. Biometrika, 81, 115-131.
West, M. and Harrison, P. J. (1997) Bayesian Forecasting and Dynamic Models, 2nd edn. New York: Springer.