Page 1:

Bayesian inference and mathematical imaging. Part I: Bayesian analysis and decision theory.

Dr. Marcelo Pereyra
http://www.macs.hw.ac.uk/~mp71/

Maxwell Institute for Mathematical Sciences, Heriot-Watt University

January 2019, CIRM, Marseille.

Page 2:

Outline

1 Imaging inverse problems

2 Bayesian modelling

3 Bayesian inference

4 Conclusion

Page 3:

Imaging inverse problems

We are interested in an unknown image x ∈ Rd .

We measure y , related to x by some mathematical model.

For example, in many imaging problems

y = Ax +w ,

for some linear operator A, and additive noise w .

Page 4:

Regularisation

If A⊺A is ill-conditioned or rank deficient, then y alone does not allow reliable recovery of x without additional information.

In words, there is significant uncertainty about x given y .

To reduce uncertainty and deliver meaningful results we need to regularise the estimation problem y → x to make it well-posed.

Page 5:

Regularisation

For example, given some penalty function h ∶ Rd → [0,∞) promoting expected properties of x, we could construct the estimator

$$\hat{x} = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; \|y - Ax\|_2^2 + \theta h(x), \qquad (1)$$

which combines the data fidelity term ∥y − Ax∥₂² with the penalty h, where the “regularisation” parameter θ > 0 controls the balance between them.

When h is convex and lower semicontinuous (l.s.c.), x̂ can be computed efficiently by (proximal) convex optimisation (Chambolle and Pock, 2016).

Other data fidelity terms could be considered too (e.g., ∥y −Ax∥1).
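
As a concrete illustration, estimator (1) can be computed by proximal gradient descent when h is convex. Below is a minimal sketch in Python, assuming h(x) = ∥x∥₁ so that the proximal step is soft-thresholding; the operator A, data y, and parameter θ are illustrative placeholders, not taken from the lecture.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, theta, n_iter=500):
    # Minimise ||y - A x||_2^2 + theta*||x||_1 by proximal gradient (ISTA).
    L = 2.0 * np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - y)       # gradient of the data fidelity term
        x = soft_threshold(x - grad / L, theta / L)
    return x

# Toy problem: recover a sparse signal from noisy random projections.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
y = A @ x_true + 0.05 * rng.standard_normal(50)
x_hat = ista(A, y, theta=1.0)
```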

Page 6:

Illustrative example: astronomical image reconstruction

Recover x ∈ Rd from a low-dimensional observation y = MFx + w, where F is the continuous Fourier transform and M ∈ Cm×d is a mask. We use the estimator

$$\hat{x} = \operatorname*{argmin}_{x \in \mathbb{R}^d_+} \; \|y - MFx\|_2^2 + \theta \|\Psi x\|_1, \qquad (2)$$

where Ψ is a specialised wavelet dictionary and θ > 0 is some parameter.

Figure: Radio-interferometric image reconstruction of the W28 supernova (panels: observed data ∣y∣ and estimate x̂).

Page 7:

Illustrative example: astronomical image reconstruction

Modern convex optimisation can compute x very efficiently...

With parallelised and distributed algorithms...

With theoretical convergence guarantees...

And GPU implementations...

Also non-convex extensions...

Also, if we had abundant training data (we do not) we could learn a neural network to recover x from y, or alternatively learn the mapping y → x directly for efficiency.

So the problem is essentially solved, right?

Page 8:

Not really...

Page 9:

Elephant 1: what is the uncertainty about x?

Figure: observed data ∣y∣ and MAP estimate xMAP.

How confident are we about all these structures in the image?

What is the error in their intensity, position, spectral properties?

If we use xMAP to derive physical quantities, what error bars should we report?

Page 10:

Illustrative example: magnetic resonance imaging

We use very similar techniques to produce magnetic resonance images...

Figure: Magnetic resonance imaging of a brain lesion (full image x and zoom).

How can we quantify our uncertainty about the brain lesion in the image?

Page 11:

Illustrative example: magnetic resonance imaging

What about this other solution to the problem, with no lesion?

Figure: Magnetic resonance imaging, alternative solution x′ with no lesion (full image and zoom).

Do we have any arguments to reject this solution?

Page 12:

Elephant 1: what is the uncertainty about x?

Another example related to sparse super-resolution in live-cell microscopy

Figure: Live-cell microscopy data (Zhu et al., 2012): observation y, MAP estimate xMAP, and zoom.

The image is sharpened to enhance molecule position measurements, but what is the precision of the procedure?

Page 13:

Elephant 2: multiple and partially unknown models

We usually have several alternative models/cost functions to recover x

$$\hat{x}_1 = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; \|y - A_1 x\|_2^2 + \theta_1 h_1(x),$$
$$\hat{x}_2 = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; \|y - A_2 x\|_2^2 + \theta_2 h_2(x). \qquad (3)$$

How can we compare them without ground truth available?

Can we use several models simultaneously?

Page 14:

Elephant 2: multiple and partially unknown models

Some of the model parameters might also be unknown; e.g., θ ∈ R+ in

$$\hat{x}_1 = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; \|y - A_1 x\|_2^2 + \theta h_1(x). \qquad (4)$$

Then θ parametrises a class of models for y → x .

How can we select θ without using ground truth?

Could we use all models simultaneously?

Page 15:

Other elephants in the neighbourhood...

Why do we formulate solutions to imaging problems as penalised data-fitting problems? Are there other relevant formulations?

Suppose we had a specific meaningful way of measuring estimation error, reflecting specific aspects of our problem or the application considered. What would be the optimal estimator then?

Page 16:

Other elephants in the neighbourhood...

Should solutions to imaging problems always be images, even when their purpose is to inform decision making and scientific conclusions?

What other mathematical objects could we construct to represent solutions to imaging problems? For example, could curves (videos) be interesting solutions?

Page 17:

Outline

1 Imaging inverse problems

2 Bayesian modelling

3 Bayesian inference

4 Conclusion

Page 18:

Introduction

Bayesian statistics is a mathematical framework for deriving inferences about x from observed data y and available prior knowledge.

Adopting a subjective view of probability, we represent x as a random quantity and model our knowledge about x by using probability distributions.

To derive inferences about x from y we postulate a joint statistical model p(x, y), typically specified via the decomposition p(x, y) = p(y∣x)p(x).

Page 19:

Introduction

The decomposition p(x , y) = p(y ∣x)p(x) has two key ingredients:

The likelihood function: the conditional distribution p(y∣x) that models the data observation process (forward model).

The prior distribution: the marginal distribution p(x) = ∫ p(x, y)dy that models our knowledge about x “before observing y”.

For example, for y = Ax + w with w ∼ N(0, σ²I), we have

$$y \mid x \sim \mathcal{N}(Ax, \sigma^2 I),$$

or equivalently

$$p(y \mid x) \propto \exp\left(-\|y - Ax\|_2^2 / 2\sigma^2\right).$$
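
To make the link with the data fidelity term explicit, here is a small sketch (with illustrative sizes and noise level, not from the lecture) showing that −log p(y∣x) for this Gaussian model is ∥y − Ax∥₂²/2σ² up to an additive constant.

```python
import numpy as np

rng = np.random.default_rng(1)
m, d, sigma = 32, 64, 0.1                     # illustrative dimensions and noise level
A = rng.standard_normal((m, d)) / np.sqrt(m)
x = rng.standard_normal(d)
y = A @ x + sigma * rng.standard_normal(m)    # simulate y = A x + w, w ~ N(0, sigma^2 I)

def neg_log_likelihood(x, y, A, sigma):
    # -log p(y|x) up to an additive constant: exactly the data fidelity term.
    return np.sum((y - A @ x) ** 2) / (2.0 * sigma ** 2)
```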

Page 20:

Prior distribution

The prior distribution is often of the form

$$p(x) \propto e^{-\theta^\top h(x)}\, \mathbb{1}_\Omega(x),$$

for some h ∶ Rd → Rm, θ ∈ Rm, and constraint set Ω.

This prior essentially specifies the expectation

$$E\{h(x)\} = \int_\Omega h(x)\, p(x)\, dx = -\nabla_\theta \log C(\theta),$$

where C(θ) = ∫Ω e−θ⊤h(x)dx is the normalising constant.

Page 21:

Prior distribution

For example, priors of the form

$$p(x) \propto e^{-\theta \|\Psi x\|_\dagger},$$

for some dictionary Ψ ∈ Rd×p and norm ∥ ⋅ ∥†, encode the expectation

$$E(\|\Psi x\|_\dagger) = \frac{d}{\theta}.$$

See Pereyra et al. (2015) for more details and other examples.
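
A quick Monte Carlo sanity check of this identity, in the simplest case Ψ = I and ∥⋅∥† = ∥⋅∥₁ (an assumption made purely for illustration): the coordinates of x are then i.i.d. Laplace with scale 1/θ, so the empirical mean of ∥x∥₁ should be close to d/θ.

```python
import numpy as np

rng = np.random.default_rng(2)
d, theta = 100, 4.0
# p(x) ∝ exp(-theta*||x||_1) factorises into i.i.d. Laplace(0, 1/theta) coordinates.
x = rng.laplace(loc=0.0, scale=1.0 / theta, size=(100_000, d))
print(np.abs(x).sum(axis=1).mean())   # ~25.0, matching d/theta = 100/4
```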

Page 22:

Posterior distribution

Given an observation y, we naturally base our inferences on the posterior distribution p(x∣y).

We derive p(x ∣y) from the likelihood p(y ∣x) and the prior p(x) by using

$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)},$$

where p(y) = ∫ p(y∣x)p(x)dx measures model-fit-to-data.

The conditional p(x ∣y) models our knowledge about x after observing y .

Observe that p(x) may significantly regularise the estimation problem and address identifiability issues in p(y∣x) (e.g., when A is rank deficient).

Page 23:

Maximum-a-posteriori (MAP) estimation

The predominant Bayesian approach in imaging to extract a solution from p(x∣y) is MAP estimation

$$x_{\text{MAP}} = \operatorname*{argmax}_{x \in \mathbb{R}^d} p(x \mid y) = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; -\log p(y \mid x) - \log p(x) + \log p(y), \qquad (5)$$

where the last term log p(y) does not depend on x and can be dropped from the optimisation.

When p(x∣y) is log-concave, computing xMAP is a convex optimisation problem that can be solved efficiently (Chambolle and Pock, 2016).
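
In the special case A = Id (denoising) with a Laplace prior p(x) ∝ e^{−θ∥x∥₁}, problem (5) even has a closed-form solution, namely coordinate-wise soft-thresholding. A minimal sketch under these assumptions:

```python
import numpy as np

def map_denoise_l1(y, sigma, theta):
    # x_MAP = argmin_x ||y - x||_2^2/(2 sigma^2) + theta*||x||_1, which separates
    # across coordinates; each is solved by soft-thresholding at theta*sigma^2.
    return np.sign(y) * np.maximum(np.abs(y) - theta * sigma ** 2, 0.0)
```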

Page 24:

Illustrative example: astronomical image reconstruction

Recover x ∈ Rd from a low-dimensional degraded observation y = MFx + w, where F is the continuous Fourier transform, M ∈ Cm×d is a measurement operator and w is Gaussian noise. We use the model

$$p(x \mid y) \propto \exp\left(-\|y - MFx\|_2^2 / 2\sigma^2 - \theta \|\Psi x\|_1\right) \mathbb{1}_{\mathbb{R}^d_+}(x). \qquad (6)$$

Figure: Radio-interferometric image reconstruction of the W28 supernova (observation y and MAP estimate xMAP).

Page 25:

Outline

1 Imaging inverse problems

2 Bayesian modelling

3 Bayesian inference

4 Conclusion

Page 26:

Bayesian decision theory

Given the following elements defining a decision problem:

1 Decision space ∆

2 Loss function L(δ, x) ∶ ∆ × Rd → R quantifying the loss (or profit) related to taking action δ ∈ ∆ when the truth is x ∈ Rd.

3 A model p(x) representing probabilities for x .

What is the optimal decision δ∗ ∈ ∆ when x is unknown?

Page 27:

Bayesian decision theory

Given the following elements defining a decision problem:

1 Decision space ∆

2 Loss function L(δ, x) ∶ ∆ × Rd → R quantifying the loss (or profit) related to taking action δ ∈ ∆ when the truth is x ∈ Rd.

3 A probability model p(x) representing knowledge about x .

According to Bayesian decision theory (Robert, 2001), the optimal decision under uncertainty is

$$\delta^* = \operatorname*{argmin}_{\delta \in \Delta} E\{L(\delta, x)\} = \operatorname*{argmin}_{\delta \in \Delta} \int L(\delta, x)\, p(x)\, dx.$$
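
As a toy illustration of this rule, suppose ∆ is a finite grid and that p(x) is represented by samples (e.g., from Monte Carlo); the expected loss of each action is then a simple average, and δ∗ is the minimiser. The quadratic loss and the sampling distribution below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
x_samples = rng.normal(1.0, 0.5, size=10_000)   # samples representing p(x)
decisions = np.linspace(-2.0, 4.0, 61)          # finite decision space Delta

def loss(delta, x):
    return (delta - x) ** 2                     # illustrative quadratic loss

expected_loss = [loss(delta, x_samples).mean() for delta in decisions]
delta_star = decisions[int(np.argmin(expected_loss))]   # close to E(x) = 1.0 here
```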

Page 28:

Bayesian point estimators

Bayesian point estimators arise from the decision problem “what point u ∈ Rd summarises x∣y best?”. The optimal decision under uncertainty is

$$\hat{x}_L = \operatorname*{argmin}_{u \in \mathbb{R}^d} E\{L(u, x) \mid y\} = \operatorname*{argmin}_{u \in \mathbb{R}^d} \int L(u, x)\, p(x \mid y)\, dx,$$

where the loss L(u, x) measures the “dissimilarity” between u and x .

General desiderata:

1 L(u, x) ≥ 0, ∀u, x ∈ Rd ,

2 L(u, x) = 0 ⇐⇒ u = x ,

3 L strictly convex w.r.t. its first argument (for estimator uniqueness).

Page 29:

Bayesian point estimators - MMSE estimation

Example: the squared Euclidean distance L(u, x) = ∥u − x∥₂² defines the so-called minimum mean squared error (MMSE) estimator

$$x_{\text{MMSE}} = \operatorname*{argmin}_{u \in \mathbb{R}^d} \int \|u - x\|_2^2\; p(x \mid y)\, dx.$$

By differentiating w.r.t. u and equating to zero we obtain

$$\int (x_{\text{MMSE}} - x)\, p(x \mid y)\, dx = 0 \;\Longrightarrow\; x_{\text{MMSE}} \int p(x \mid y)\, dx = \int x\, p(x \mid y)\, dx,$$

hence xMMSE = E(x∣y) (recall that ∫ p(x∣y)dx = 1).
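
The following sketch contrasts xMMSE with xMAP on a scalar toy posterior p(x∣y) ∝ exp(−(y − x)²/2σ² − θ∣x∣), computed on a grid; the values of y, σ, θ are illustrative assumptions. The two estimators generally differ.

```python
import numpy as np

y, sigma, theta = 1.0, 0.5, 2.0
x = np.linspace(-5.0, 5.0, 100_001)
dx = x[1] - x[0]
log_post = -(y - x) ** 2 / (2 * sigma ** 2) - theta * np.abs(x)
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx                   # normalise so the grid density integrates to 1

x_mmse = (x * post).sum() * dx            # posterior mean E(x|y)
x_map = x[np.argmax(post)]                # posterior mode; soft(y, theta*sigma^2) = 0.5 here
```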

Page 30:

Bayesian point estimators - MAP estimation

The predominant Bayesian approach in imaging is MAP estimation

$$x_{\text{MAP}} = \operatorname*{argmax}_{x \in \mathbb{R}^d} p(x \mid y) = \operatorname*{argmin}_{x \in \mathbb{R}^d} \; -\log p(y \mid x) - \log p(x). \qquad (7)$$

In many problems xMAP can be computed efficiently by optimisation (Chambolle and Pock, 2016).

What is the loss function L associated with this estimator?

Page 31:

Posterior credible regions

Where does the posterior probability mass of x lie?

A set Cα is a posterior credible region with confidence level (1 − α) if

$$P[x \in C_\alpha \mid y] = 1 - \alpha.$$

For any α ∈ (0,1) there are infinitely many regions of the parameter space that verify this property.

The highest posterior density (HPD) region is decision-theoretically optimal in a compactness sense:

$$C^*_\alpha = \{x : \phi(x) \leq \gamma_\alpha\},$$

with φ(x) = −log p(x∣y) and γα ∈ R chosen such that ∫C∗α p(x∣y)dx = 1 − α holds.
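
On a grid, the HPD region can be computed by sorting grid points by decreasing posterior density and accumulating mass until it reaches 1 − α. A sketch, reusing the scalar toy posterior from the MMSE example above (all values illustrative):

```python
import numpy as np

y, sigma, theta, alpha = 1.0, 0.5, 2.0, 0.05
x = np.linspace(-5.0, 5.0, 100_001)
dx = x[1] - x[0]
log_post = -(y - x) ** 2 / (2 * sigma ** 2) - theta * np.abs(x)
post = np.exp(log_post - log_post.max())
post /= post.sum() * dx

order = np.argsort(post)[::-1]                 # grid points by decreasing density
k = int(np.searchsorted(np.cumsum(post[order]) * dx, 1.0 - alpha))
in_hpd = np.zeros(x.size, dtype=bool)
in_hpd[order[:k + 1]] = True                   # indicator of C*_alpha on the grid
gamma_alpha = -np.log(post[order[k]])          # threshold on phi(x) = -log p(x|y)
```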

Page 32:

Hypothesis testing

Hypothesis tests split the solution space into two meaningful regions, e.g.,

$$H_0 : x \in S, \qquad H_1 : x \notin S,$$

where S ⊂ Rd contains all solutions with some characteristic of interest.

We can then assess the degree of support for H0 vs. H1 by computing

$$P(H_0 \mid y) = \int_S p(x \mid y)\, dx, \qquad P(H_1 \mid y) = 1 - P(H_0 \mid y).$$

We can also reject H0 in favour of H1 with significance α ∈ [0,1] if

P(H0∣y) ≤ α.
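
With posterior samples (e.g., from MCMC) the integral over S is simply the fraction of samples falling in S. A toy sketch, where the samples and the set S are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
x_samples = rng.normal(0.8, 0.3, size=50_000)   # stand-in posterior samples
in_S = x_samples > 0.5                          # H0: x in S = (0.5, infinity)
p_H0 = in_S.mean()                              # Monte Carlo estimate of P(H0|y)
p_H1 = 1.0 - p_H0
alpha = 0.01
reject_H0 = p_H0 <= alpha                       # reject H0 at significance alpha
```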

Page 33:

Bayesian model selection

The Bayesian framework provides theory for comparing models objectively.

Given K alternative models M1,…,MK with posterior densities

$$\mathcal{M}_j : \; p_j(x \mid y) = p_j(y \mid x)\, p_j(x)/p_j(y),$$

we compute the (marginal) posterior probability of each model, i.e.,

$$p(\mathcal{M}_j \mid y) \propto p(y \mid \mathcal{M}_j)\, p(\mathcal{M}_j), \qquad (8)$$

where p(y ∣Mj) ≜ pj(y) = ∫ pj(y ∣x)pj(x)dx measures model-fit-to-data.

We then select for our inferences the “best” model, i.e.,

$$\mathcal{M}^* = \operatorname*{argmax}_{j \in \{1,\ldots,K\}} \; p(\mathcal{M}_j \mid y).$$
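
A toy sketch of this computation in a conjugate setting where the evidence is tractable: y = x + w with w ∼ N(0, σ²), and two candidate priors x ∼ N(0, τⱼ²), so that p(y∣Mⱼ) = N(0, τⱼ² + σ²) in closed form. All numbers are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

sigma = 0.5
y = np.array([1.2, -0.8, 2.1, 0.3])             # toy observations
taus = np.array([0.1, 1.0])                     # M1: tight prior, M2: loose prior
log_evidence = np.array([
    norm.logpdf(y, scale=np.sqrt(tau ** 2 + sigma ** 2)).sum() for tau in taus
])
log_post = log_evidence + np.log(0.5)           # uniform model prior p(Mj) = 1/2
post = np.exp(log_post - log_post.max())
post /= post.sum()                              # p(Mj|y), normalised
best = int(np.argmax(post))                     # here the looser prior M2 wins
```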

Page 34:

Bayesian model calibration

Alternatively, given a continuous class of models {Mθ, θ ∈ Θ} with

$$\mathcal{M}_\theta : \; p(x \mid y, \theta) = \frac{p(y \mid x, \theta)\, p(x \mid \theta)}{p(y \mid \theta)},$$

we compute the (marginal) posterior

$$p(\theta \mid y) = p(y \mid \theta)\, p(\theta)/p(y), \qquad (9)$$

where p(y ∣θ) = ∫ p(y ∣x , θ)p(x ∣θ)dx measures model-fit-to-data.

We then calibrate our model with the “best” value of θ, i.e.,

$$\theta_{\text{MAP}} = \operatorname*{argmax}_{\theta \in \Theta} \; p(\theta \mid y).$$
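
A sketch of this calibration in a conjugate toy model where p(y∣θ) is tractable: y = x + w with x ∼ N(0, 1/θ) and w ∼ N(0, σ²), so p(y∣θ) = N(0, 1/θ + σ²); with a flat prior p(θ), maximising p(θ∣y) reduces to maximising the marginal likelihood. All values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
sigma, theta_true, n = 0.5, 2.0, 500
y = rng.normal(0.0, 1.0 / np.sqrt(theta_true), n) + rng.normal(0.0, sigma, n)

thetas = np.linspace(0.1, 10.0, 1000)
log_marglik = np.array([
    norm.logpdf(y, scale=np.sqrt(1.0 / t + sigma ** 2)).sum() for t in thetas
])
theta_map = thetas[int(np.argmax(log_marglik))]   # should land near theta_true
```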

Page 35:

Bayesian model averaging

We can also use all models simultaneously!

Given K alternative models M1,…,MK with posterior densities

$$\mathcal{M}_j : \; p_j(x \mid y) = p_j(y \mid x)\, p_j(x)/p_j(y),$$

we marginalise w.r.t. the model selector j , i.e.,

$$p(x \mid y) = \sum_{j=1}^{K} p(x, \mathcal{M}_j \mid y) = \sum_{j=1}^{K} p(x \mid y, \mathcal{M}_j)\, p(\mathcal{M}_j \mid y), \qquad (10)$$

where the posterior probabilities p(Mj ∣y) control the relative importanceof each model.
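
A minimal sketch of (10) with two hypothetical per-model posteriors (Gaussian densities chosen purely for illustration) and given model weights: the averaged posterior is a mixture.

```python
import numpy as np
from scipy.stats import norm

x = np.linspace(-4.0, 6.0, 1001)
post_M1 = norm.pdf(x, loc=0.0, scale=0.5)    # stand-in for p(x|y, M1)
post_M2 = norm.pdf(x, loc=2.0, scale=1.0)    # stand-in for p(x|y, M2)
w = np.array([0.3, 0.7])                     # stand-in for p(M1|y), p(M2|y)
post_avg = w[0] * post_M1 + w[1] * post_M2   # p(x|y) as the mixture in (10)
```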

Page 36:

Bayesian model calibration

Similarly, given a continuous class of models {Mθ, θ ∈ Θ} with

$$\mathcal{M}_\theta : \; p(x \mid y, \theta) = \frac{p(y \mid x, \theta)\, p(x \mid \theta)}{p(y \mid \theta)},$$

and a prior p(θ), we marginalise θ

$$p(x \mid y) = \int_\Theta p(x, \theta \mid y)\, d\theta = \int_\Theta p(x \mid y, \theta)\, p(\theta \mid y)\, d\theta,$$

where again p(θ∣y) controls the relative contribution of each model.
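
Numerically, this marginalisation can be approximated on a grid over θ; below is a sketch with stand-in densities (the Gaussian and gamma choices are illustrative assumptions, not from the lecture).

```python
import numpy as np
from scipy.stats import gamma, norm

x = np.linspace(-5.0, 5.0, 1001)
thetas = np.linspace(0.1, 5.0, 200)
dtheta = thetas[1] - thetas[0]

w = gamma.pdf(thetas, a=2.0, scale=1.0)       # stand-in for p(theta|y)
w /= w.sum() * dtheta                         # normalise on the grid
cond = np.array([norm.pdf(x, loc=1.0, scale=1.0 / np.sqrt(t)) for t in thetas])
post_x = (w[:, None] * cond).sum(axis=0) * dtheta   # approximation of p(x|y)
```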

Page 37:

Summary

The Bayesian statistical paradigm provides a powerful mathematical framework to solve imaging problems...

It allows deriving optimal estimators for x...

As well as quantifying the uncertainty in the solutions delivered...

It supports hypothesis tests to inform decisions and conclusions...

And allows operating with partially unknown models...

And with several competing models...

So the problem is essentially solved, right?

Page 38:

Not really...

How do we compute all these probabilities?

Page 39:

Outline

1 Imaging inverse problems

2 Bayesian modelling

3 Bayesian inference

4 Conclusion

Page 40:

Conclusion

There are several mathematical frameworks to solve imaging problems.

Variational approaches offer excellent point estimation results and efficient computer algorithms with theoretical guarantees.

However, they struggle to go beyond point estimation.

Bayesian statistics provides a powerful framework to formulate more advanced inferences and to deal with other complications.

However, computing these probabilities is challenging, and Bayesian methods struggle to scale to large problems as a result.

Page 41:

Conclusion

In the next lecture... We will explore ways of accelerating Bayesian inference methods by integrating modern stochastic and variational approaches at algorithmic, methodological, and theoretical levels.

Thank you!

Page 42:

Bibliography:

Cai, X., Pereyra, M., and McEwen, J. D. (2017). Uncertainty quantification for radio interferometric imaging II: MAP estimation. ArXiv e-prints.

Chambolle, A. and Pock, T. (2016). An introduction to continuous optimization for imaging. Acta Numerica, 25:161–319.

Deledalle, C.-A., Vaiter, S., Fadili, J., and Peyré, G. (2014). Stein unbiased gradient estimator of the risk (SUGAR) for multiple parameter selection. SIAM Journal on Imaging Sciences, 7(4):2448–2487.

Durmus, A., Moulines, E., and Pereyra, M. (2018). Efficient Bayesian computation by proximal Markov chain Monte Carlo: when Langevin meets Moreau. SIAM J. Imaging Sci., 11(1):473–506.

Fernandez-Vidal, A. and Pereyra, M. (2018). Maximum likelihood estimation of regularisation parameters. In Proc. IEEE ICIP 2018.

Green, P. J., Latuszynski, K., Pereyra, M., and Robert, C. P. (2015). Bayesian computation: a summary of the current state, and samples backwards and forwards. Statistics and Computing, 25(4):835–862.

Moreau, J.-J. (1962). Fonctions convexes duales et points proximaux dans un espace Hilbertien. C. R. Acad. Sci. Paris Ser. A Math., 255:2897–2899.

Pereyra, M. (2015). Proximal Markov chain Monte Carlo algorithms. Statistics and Computing. Open access: http://dx.doi.org/10.1007/s11222-015-9567-4.

Page 43:

Pereyra, M., Bioucas-Dias, J., and Figueiredo, M. (2015). Maximum-a-posteriori estimation with unknown regularisation parameters. In Proc. Europ. Signal Process. Conf. (EUSIPCO) 2015.

Robert, C. P. (2001). The Bayesian Choice (second edition). Springer Verlag, New York.

Zhu, L., Zhang, W., Elnatan, D., and Huang, B. (2012). Faster STORM using compressed sensing. Nat. Meth., 9(7):721–723.
