Non-Gaussian Data Assimilation using Mixture and Kernel
Piyush Tagade, Hansjorg Seybold, Sai Ravela
Earth Systems and Signals Group, Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology


Mixture Ensemble Filter and Smoother


Environmental DDDAS: Mapping localized atmospheric phenomena (caos.mit.edu)

Diagram (we explore): Model, UQ, Targeting, DA, Planning, Sensing; MEnF and MuIF

Key Areas
Theory: Generalized Coherent Features (GCF); pattern theory for coherent fluids; statistical inference for GCF

Methodology: data-driven nowcasting; DBFE; information-theoretic UQ, targeting, estimation, planning, and control
Application: mapping localized phenomena

Field Alignment -- DA; Coalescence -- UQ; PAG modes -- Model Reduction

These are some of the key areas the group has been working on, and some of the important results are shown in the figures. On the theoretical side, we are exploring generalized coherent structures for pattern-field decomposition of fluids.

Dynamically Biorthogonal Field Equations: KL expansion; biorthogonal expansion

Dynamic orthogonality: closed-form solutions for the mean, eigenvalues, and expansion coefficients; faster than gPC with comparable accuracy

Tagade et al., ASME IDETC 2012, AIAA-NDA 2013
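For orientation, a hedged sketch of the Karhunen-Loeve / dynamically orthogonal form underlying such expansions; the exact biorthogonality conditions used in the DBFE may differ from this generic statement.

```latex
% Generic KL / dynamically orthogonal expansion of a random field (sketch):
u(x,t;\omega) \;=\; \bar{u}(x,t) \;+\; \sum_{i=1}^{N} Y_i(t;\omega)\, u_i(x,t),
\qquad
\Big\langle \tfrac{\partial u_i}{\partial t},\, u_j \Big\rangle = 0
\quad \text{(dynamic orthogonality condition)}.
```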

Inference Challenges: Nonlinearity and Dimensionality
System dynamics; measurements
Bayesian inference: smoothing and filtering
Chart (dimensionality vs. nonlinearity): KF, EKF, EnKF, PF, ??
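For reference, the standard Bayesian filtering and smoothing recursions referred to above (generic statements, not specific to any one filter):

```latex
% Filtering: prediction and update
p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1},
\qquad
p(x_k \mid y_{1:k}) \propto p(y_k \mid x_k)\, p(x_k \mid y_{1:k-1}).
% Smoothing: fixed-interval backward recursion
p(x_k \mid y_{1:T}) = p(x_k \mid y_{1:k})
\int \frac{p(x_{k+1} \mid x_k)\, p(x_{k+1} \mid y_{1:T})}{p(x_{k+1} \mid y_{1:k})}\, dx_{k+1}.
```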

Two Approaches
Mutual Information Filter (MuIF): non-Gaussian prior and likelihood; distributions-oriented; quadratic non-Gaussian estimation via an optimization approach; tractable mutual information measures.
Mixture Ensemble Filter (MEnF): GMM prior and Gaussian likelihood; joint state-parameter estimation; direct ensemble update via a compact ensemble transform; fast smoother forms at marginal additional cost. Note: not for UQ.

Approach 1: Mutual Information Filter
Mutual Information Filter (MuIF): non-Gaussian prior and likelihood; maximization of Kapur's quadratic mutual information; equivalence with RKHS.

Shannon entropy, mutual information. Inference: maximization of mutual information. Tractable?
We use a generalization of Shannon entropy: Renyi entropy and its information potential (others: Havrda-Charvat, Behara-Chawla, Sharma-Mittal). Kapur establishes the equivalence.
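For reference, the order-2 (quadratic) Renyi entropy and the associated information potential, in their standard definitions:

```latex
H_2(X) = -\log \int p_X(x)^2\, dx,
\qquad
V(X) \equiv \int p_X(x)^2\, dx,
\qquad
H_2(X) = -\log V(X).
```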

Quadratic Mutual Information
Of special interest: Kapur's result; also the Cauchy-Schwarz form. Mutual information via information potentials; nonparametric density estimation (kernel density estimate); reinterpretation using RKHS.
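A minimal sketch of how the quadratic mutual information terms can be estimated from samples with Gaussian kernel density estimates. The Gaussian-sum identities below are standard in information-theoretic learning, but the scalar variables, bandwidth choice, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gauss(d, sigma):
    """1-D Gaussian kernel evaluated at pairwise differences d."""
    return np.exp(-d**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def quadratic_mi(x, y, sigma=0.5):
    """Sample-based quadratic MI estimates between scalar samples x and y.

    Integrals of products of Gaussian KDEs reduce to double sums of
    Gaussians with bandwidth sigma*sqrt(2).
    Returns (I_ED, I_CS): Euclidean-distance and Cauchy-Schwarz QMI.
    """
    s2 = np.sqrt(2.0) * sigma
    Gx = gauss(x[:, None] - x[None, :], s2)   # N x N kernel matrix in x
    Gy = gauss(y[:, None] - y[None, :], s2)   # N x N kernel matrix in y

    V_J = np.mean(Gx * Gy)                              # estimates  iint p(x,y)^2
    V_M = np.mean(Gx) * np.mean(Gy)                     # estimates  int p(x)^2 int p(y)^2
    V_C = np.mean(Gx.mean(axis=0) * Gy.mean(axis=0))    # estimates  iint p(x,y) p(x) p(y)

    I_ED = V_J - 2.0 * V_C + V_M                        # Euclidean-distance QMI
    I_CS = np.log(V_J * V_M / V_C**2)                   # Cauchy-Schwarz QMI
    return I_ED, I_CS

# Illustrative usage: correlated samples give larger QMI than independent ones.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.8 * x + 0.2 * rng.normal(size=500)
print(quadratic_mi(x, y))
```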

Inference

Filter parameterization. Maximization of MI: steepest gradient ascent.

Kapur 1963; Torkkola et al. 2003; Principe et al.; Ravela et al. 2008; Tagade and Ravela, ACC 2014
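A minimal sketch of the steepest-ascent update implied above, with theta denoting the filter parameters, I the quadratic mutual information, and eta a step size; the specific filter parameterization is an assumption not spelled out here.

```latex
\theta^{(k+1)} = \theta^{(k)} + \eta\, \nabla_{\theta}\, I\!\left(\theta^{(k)}\right).
```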

Application Results: Lorenz 95 Model

MuIF Outperforms EnKF

Effect of Kernel Bandwidth
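For reproducibility, a minimal sketch of the Lorenz 95 (Lorenz 96) dynamics typically used in such experiments; the forcing F = 8 and the simple RK4 step are conventional choices, not necessarily the exact configuration behind these results.

```python
import numpy as np

def lorenz95(x, F=8.0):
    """Lorenz 95/96 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05, F=8.0):
    """One fourth-order Runge-Kutta step of the Lorenz 95 model."""
    k1 = lorenz95(x, F)
    k2 = lorenz95(x + 0.5 * dt * k1, F)
    k3 = lorenz95(x + 0.5 * dt * k2, F)
    k4 = lorenz95(x + dt * k3, F)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

# Illustrative usage: 40-variable system, small perturbation of the fixed point.
x = 8.0 * np.ones(40)
x[19] += 0.01
for _ in range(1000):
    x = rk4_step(x)
```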

Approach 2: Mixture Ensemble Filter
State of the art:
Means and covariances of GMMs explicitly propagated and updated (Alspach and Sorenson, 1972)
Ensemble used to propagate uncertainty; mixture moments explicitly calculated (Sondergaard and Lermusieux, 2013)
Ensemble members used with balance and membership rules (Frei and Kunsch, 2013)

Key contributions:
Ensemble update equation via Jensen's inequality
Update reduces to a compact ensemble transform
Direct model-state adjustment; resampling not compulsory
EM for mixture parameter estimation, also in reduced form
Familiar message-passing framework for smoothing: O(1) fixed-lag, O(N) fixed-interval, in time

Speaker note: mention that EM replaces ad-hoc approaches to updating the mixture weights and simplifies the problem of propagating them; this is a standard machine learning technique.
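A hedged sketch of the standard EM iteration for mixture weights over ensemble members, as referenced in the note above; the fixed Gaussian components and the function names are illustrative assumptions, not the MEnF's exact reduced form.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_mixture_weights(ensemble, means, covs, weights, n_iter=20):
    """Estimate GMM weights (and responsibilities) from ensemble members by EM.

    ensemble: (N, d) array of members; means/covs/weights: K components.
    Component means and covariances are held fixed here for simplicity.
    """
    K = len(means)
    for _ in range(n_iter):
        # E-step: responsibilities r[n, k] proportional to w_k N(x_n | mu_k, C_k)
        like = np.column_stack([
            weights[k] * multivariate_normal.pdf(ensemble, means[k], covs[k])
            for k in range(K)
        ])
        r = like / like.sum(axis=1, keepdims=True)
        # M-step (weights only): w_k is the mean responsibility of component k
        weights = r.mean(axis=0)
    return weights, r
```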


Mixture Ensemble Filter: joint state and parameter estimation

Ensemble update equation via Jensen's inequality

Ensemble transform; smoother form

Ravela and McLaughlin, OD 2007; Tagade et al., under review
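Not the authors' Jensen-inequality transform, but a generic mixture-of-Kalman ensemble analysis step in the same spirit, included only for orientation: each member is updated with the gain of its assigned component, and mixture weights are reweighted by the component likelihoods. The observation operator H, error covariance R, perturbed observations, and hard component assignment are all illustrative assumptions.

```python
import numpy as np

def mixture_ensemble_analysis(E, labels, weights, y, H, R, rng=np.random.default_rng(0)):
    """Generic GMM ensemble analysis step (illustration only, not the MEnF transform).

    E: (N, d) ensemble; labels: (N,) component index per member;
    weights: (K,) prior mixture weights; y: (m,) observation;
    H: (m, d) observation operator; R: (m, m) observation error covariance.
    """
    K = len(weights)
    E_a = E.copy()
    logw = np.log(weights).copy()
    for k in range(K):
        idx = np.where(labels == k)[0]
        if len(idx) == 0:
            continue
        Xk = E[idx]
        mu = Xk.mean(axis=0)
        A = Xk - mu                                   # anomalies
        P = A.T @ A / max(len(idx) - 1, 1)            # sample covariance of component k
        S = H @ P @ H.T + R                           # innovation covariance
        Kg = P @ H.T @ np.linalg.solve(S, np.eye(len(y)))   # Kalman gain
        pert = rng.multivariate_normal(np.zeros(len(y)), R, size=len(idx))
        innov = (y + pert) - Xk @ H.T                 # perturbed-obs innovations
        E_a[idx] = Xk + innov @ Kg.T                  # component-wise member update
        # Reweight the component by its marginal likelihood of the observation.
        d = y - H @ mu
        logw[k] += -0.5 * (d @ np.linalg.solve(S, d) + np.log(np.linalg.det(S)))
    w = np.exp(logw - logw.max())
    return E_a, w / w.sum()
```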

MEnF Performance: 90% confidence bound comparison of MEnF with EnKF and MCMC

Double Well; Lorenz-63
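For reference, the standard Lorenz-63 system and a common double-well drift; the exact double-well formulation and noise level used in these experiments are not specified here, so the second line is only a representative choice.

```latex
\dot{x} = \sigma (y - x), \quad
\dot{y} = x(\rho - z) - y, \quad
\dot{z} = xy - \beta z
\quad (\sigma = 10,\ \rho = 28,\ \beta = 8/3);
\qquad
dx = x(1 - x^2)\,dt + \kappa\, dW_t \ \ \text{(double well, representative form)}.
```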

MEnF and Coherent Structures

Figure legend: MEnF, EnKF-FS, EnKF-PS
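The coherent-structure test case described on the next line is the Korteweg-de Vries (KdV) equation; for reference, its standard form and single-soliton solution are:

```latex
u_t + 6\, u\, u_x + u_{xxx} = 0,
\qquad
u(x,t) = \frac{c}{2}\,\operatorname{sech}^2\!\left(\frac{\sqrt{c}}{2}\,(x - c\,t)\right).
```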

KdV (see the equation above): idealizes the coherent structure of solitary waves, non-linear superposition, chaotic dynamics.

Conclusions and Future Work
Nonlinearity and dimensionality challenges in inference for large-scale fluids
Optimization approach to non-Gaussian estimation: Mixture Ensemble Filter and Mutual Information Filter
Complexity of the Mutual Information Filter: Fast Gauss Transform, and variations from RKHS
Smoother forms
Convergence rates of MEnF
Combine inference with UQ and planning for clouds and volcanoes