SOS Boosting of Image Denoising Algorithms
The research leading to these results has received funding from the European Research Council under the European Union's Seventh Framework Program (ERC Grant agreement no. 320649) and from the Intel Collaborative Research Institute for Computational Intelligence.
Michael Elad
The Computer Science Department
The Technion – Israel Institute of Technology, Haifa 32000, Israel

* Joint work with Yaniv Romano
Image Denoising
Probably the most popular and heavily studied problem in image processing…
Why is it so popular? Here are a few explanations:
(i) It comes up in many applications
(ii) It is the simplest inverse problem, and thus a platform for new ideas
(iii) Many other problems can be recast as iterated denoising
(iv) It is misleadingly simple
Searching for "image and (denois* or (noise and removal))" in ISI Web-of-Science leads to ~5,000 journal papers
Boosting of Image Denoising Algorithms, by Yaniv Romano and Michael Elad
Leading Image Denoising Methods
Are built upon powerful patch-based (local) image models:
▪ Non-Local Means (NLM): exploits self-similarity within natural images
▪ K-SVD: sparse-representation modeling of image patches
▪ BM3D: combines a sparsity prior with non-local self-similarity
▪ Kernel regression: offers a local directional filter
▪ EPLL: exploits a GMM model of the image patches
▪ …
Today we present a way to improve various such state-of-the-art image denoising methods, simply by applying the original algorithm as a "black-box" several times
Boosting Methods for Denoising
In image denoising, there are two possible sources of error:
❑ Residual noise in the output image, and
❑ Residual content in the method noise
[Diagram: the denoiser maps the noisy image y to the denoised image x̂ = f(y); the method noise is the difference y − x̂]
Existing Boosting Algorithms
❑ Twicing [Tukey ('77), Charest et al. ('06)]:
x̂_{k+1} = x̂_k + f(y − x̂_k)
❑ TV denoising using the Bregman distance [Bregman ('67), Osher et al. ('05)]:
x̂_{k+1} = f(y + Σ_{j=1}^{k} (y − x̂_j))
❑ Method-noise whitening [Romano & Elad ('13)]
❑ Diffusion [Perona & Malik ('90), Coifman ('06), Milanfar ('12)]:
x̂_{k+1} = f(x̂_k)
❑ EPLL [Zoran & Weiss ('11), Sulam & Elad ('14)]
❑ SAIF [Talebi et al. ('12)] automatically chooses the local improvement mechanism: diffusion or twicing
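To make these recursions concrete, here is a small self-contained sketch (my own toy setup, not the authors' code): a linear smoothing filter W stands in for the black-box denoiser f, and the twicing and diffusion recursions are iterated. As argued later in the talk, twicing drifts back toward the noisy input y, while diffusion keeps smoothing the estimate.

```python
import numpy as np

def make_filter(n=32, sigma=2.0, c=0.3):
    """Toy spatially-smoothing filter: a row-stochastic Gaussian kernel,
    mixed with the identity so all eigenvalues stay in [c, 1]."""
    i = np.arange(n)
    K = np.exp(-0.5 * (i[:, None] - i[None, :]) ** 2 / sigma ** 2)
    S = K / K.sum(axis=1, keepdims=True)
    return c * np.eye(n) + (1 - c) * S

rng = np.random.default_rng(0)
n = 32
W = make_filter(n)
x_true = np.sin(2 * np.pi * np.arange(n) / n)
y = x_true + 0.3 * rng.standard_normal(n)

# Twicing: x_{k+1} = x_k + W(y - x_k)  -> converges back to the noisy y
x_twice = W @ y
for _ in range(200):
    x_twice = x_twice + W @ (y - x_twice)

# Diffusion: x_{k+1} = W x_k  -> keeps smoothing the estimate
x_diff = y.copy()
for _ in range(50):
    x_diff = W @ x_diff
```

Since W here is invertible, the twicing fixed point satisfies W(y − x*) = 0, i.e., x* = y exactly; the diffusion iterate becomes progressively smoother than the input.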
Strengthen - Operate - Subtract Boosting
❑ Given any denoiser, how can we improve its performance?
I. Strengthen the signal
II. Operate the denoiser
III. Subtract the previous estimation from the outcome

SOS formulation: x̂_{k+1} = f(y + x̂_k) − x̂_k
Strengthen - Operate - Subtract Boosting
❑ An improvement is expected, since SNR(y + x̂) > SNR(y); in the ideal case, where x̂ = x, we get SNR(y + x) = 2 · SNR(y)
❑ We suggest strengthening the underlying signal, rather than
▪ Adding/filtering the method noise, which tends to converge to the noisy image, or
▪ Operating on the denoising output again and again, which tends to lead to over-smoothing
❑ The SOS treats both sources of error created in image denoising
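The SOS recursion is easy to run with any black-box denoiser. The following minimal sketch (my own illustration; f is a toy linear filter W rather than a real denoiser) iterates x̂_{k+1} = f(y + x̂_k) − x̂_k and checks the iterates against the linear-case fixed point x̂* = (I + (I − W))^{-1} W y = (2I − W)^{-1} W y discussed later in the talk.

```python
import numpy as np

def sos(denoise, y, n_iters=100):
    """SOS boosting: strengthen (y + x), operate (denoise), subtract (x)."""
    x = denoise(y)                  # plain denoising as the initialization
    for _ in range(n_iters):
        x = denoise(y + x) - x      # x_{k+1} = f(y + x_k) - x_k
    return x

# Toy linear denoiser f(z) = W z: row-stochastic Gaussian smoothing,
# mixed with the identity so the eigenvalues of W lie in [0.3, 1]
n, sigma, c = 32, 2.0, 0.3
i = np.arange(n)
K = np.exp(-0.5 * (i[:, None] - i[None, :]) ** 2 / sigma ** 2)
W = c * np.eye(n) + (1 - c) * K / K.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
y = np.sin(2 * np.pi * i / n) + 0.3 * rng.standard_normal(n)

x_sos = sos(lambda z: W @ z, y, n_iters=100)
# Linear-case steady state: x* = (2I - W)^{-1} W y
x_star = np.linalg.solve(2 * np.eye(n) - W, W @ y)
```

Because the error obeys e_{k+1} = (W − I) e_k and the eigenvalues of W − I lie in [−0.7, 0] here, the iterates contract geometrically to x*.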
Image Denoising – A Matrix Formulation
❑ Observation: a denoising algorithm combines a non-linear part (decisions/switches) with spatially adaptive weighted averaging, i.e.,
x̂ = f(y) = W y
This is true for NLM, kernel regression, BM3D, K-SVD, and many other methods
❑ We study the convergence of the SOS using only the linear part
❑ What about sparsity-based denoising methods [Elad & Aharon ('06)]? We have shown that in this case
▪ W is symmetric and positive definite, and
▪ λ_min ≥ c > 0 and λ_max = 1,
so that ‖I − W‖₂ ≤ 1 − c < 1
Convergence Study
❑ For these denoising algorithms, the SOS boosting converges to
x̂* = (I + (I − W))^{-1} W y
Theorem: the SOS converges if ‖I − W‖₂ < 1, which holds true for kernel-based methods (bilateral filter, NLM, kernel regression) and for sparsity-based methods (K-SVD)
❑ What about the non-linear part and its influence? More on this can be found in our paper…
Generalization
❑ We introduce two parameters, ρ and τ, that modify
▪ The steady-state outcome,
▪ The requirements for convergence (the admissible eigenvalue range), and
▪ The rate of convergence
❑ The parameter ρ affects the steady-state outcome:
x̂_{k+1} = f(y + ρ x̂_k) − ρ x̂_k
❑ The second parameter, τ, controls the rate of convergence without affecting the steady state:
x̂_{k+1} = τ f(y + ρ x̂_k) − (τρ + τ − 1) x̂_k
Generalization
❑ By defining the error e_k = x̂_k − x̂*, the SOS yields
e_k = (τρ W − (τρ + τ − 1) I) e_{k−1}
❑ We derived a closed-form expression for the optimal (ρ, τ) setting
▪ Given ρ, what is the best τ leading to the fastest convergence? This is depicted by the dashed curve
[Plot: the largest eigenvalue of the error's transition matrix as a function of ρ and τ, with the optimal τ per ρ marked by a dashed curve]
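As a numeric illustration of the (ρ, τ) analysis (my own sketch, assuming the linear model with eigenvalues of W in [c, 1]): the per-mode error contraction factor is |τρλ − (τρ + τ − 1)|, and a grid search over τ for a fixed ρ recovers the value τ* = 2/(2 + ρ(1 − c)) that balances the two spectrum endpoints.

```python
import numpy as np

def contraction(lams, rho, tau):
    """Largest magnitude eigenvalue of the error transition matrix
    tau*rho*W - (tau*rho + tau - 1)*I, via the eigenvalues of W."""
    return np.max(np.abs(tau * rho * lams - (tau * rho + tau - 1)))

lams = np.linspace(0.3, 1.0, 200)    # assumed spectrum of W: [c, 1], c = 0.3
rho = 1.0

rate_plain = contraction(lams, rho, tau=1.0)      # plain SOS (tau = 1)
taus = np.linspace(0.1, 2.0, 2000)
rates = [contraction(lams, rho, t) for t in taus]
tau_best = taus[int(np.argmin(rates))]
rate_best = min(rates)
```

For ρ = 1 and c = 0.3 the plain rate is 1 − c = 0.7, while the searched τ ≈ 0.74 cuts the contraction factor to about 0.26, i.e., a markedly faster convergence at the same steady state.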
Graph Laplacian
❑ We can refer to the denoising matrix W as a spatially adaptive smoothing operator.
❑ A graph Laplacian operator for an image can be defined as
L = I − W
The intuition: the Laplacian computes the difference between a pixel and its neighborhood's weighted average.
❑ The image content is expected to reside along the eigenvectors corresponding to the small eigenvalues of L, while the noise is spread uniformly over the whole eigenspace.
❑ What can we do with L? Regularize inverse problems!
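The smoothness interpretation of L = I − W can be checked numerically. In this sketch (a toy filter of my own construction, not from the slides), the Laplacian quadratic form xᵀLx is small for a smooth signal and much larger for noise of the same energy.

```python
import numpy as np

# Toy adaptive-smoothing matrix W (row-stochastic Gaussian kernel)
n = 64
i = np.arange(n)
K = np.exp(-0.5 * (i[:, None] - i[None, :]) ** 2 / 2.0 ** 2)
W = K / K.sum(axis=1, keepdims=True)
L = np.eye(n) - W                        # graph Laplacian: L = I - W

smooth = np.sin(2 * np.pi * i / n)       # slowly varying "image content"
noise = np.random.default_rng(2).standard_normal(n)
noise *= np.linalg.norm(smooth) / np.linalg.norm(noise)   # equalize energy

q_smooth = smooth @ L @ smooth           # pixel minus neighborhood average
q_noise = noise @ L @ noise
```

The smooth signal is nearly unchanged by the weighted averaging, so its quadratic form is close to zero, while the noise is strongly penalized, which is exactly why xᵀLx is a useful regularizer.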
Graph Laplacian Regularization
❑ The regularization can be defined as [Elmoataz et al. ('08), Bougleux et al. ('09)]
x̂ = argmin_x ‖x − y‖₂² + ρ xᵀ L x
which seeks an estimate that is close to the noisy version, while promoting similar pixels to remain similar
❑ Another option is to integrate the filter into the data-fidelity term as well [Kheradmand & Milanfar ('13)]
x̂ = argmin_x (x − y)ᵀ W (x − y) + ρ xᵀ L x
using the adaptive filter as a weight matrix
Graph Laplacian Regularization
❑ Another natural option is to minimize the following cost function:
x̂* = argmin_x ‖x − W y‖₂² + ρ xᵀ L x
which seeks an estimate that is close to the denoised version. Its closed-form solution is the steady-state outcome of the SOS:
x̂* = (I + ρ(I − W))^{-1} W y = (I + ρL)^{-1} W y
⟹ The SOS boosting acts as a graph-Laplacian regularizer
❑ More on this topic can be found in our paper…
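This equivalence can be verified numerically. The sketch below (toy symmetric linear filter, my own setup) runs the ρ-generalized SOS recursion x̂_{k+1} = f(y + ρ x̂_k) − ρ x̂_k and compares its limit with the closed-form minimizer (I + ρL)^{-1} W y.

```python
import numpy as np

# Toy symmetric smoothing filter W with eigenvalues in (0, 1]
n = 32
i = np.arange(n)
K = np.exp(-0.5 * (i[:, None] - i[None, :]) ** 2 / 2.0 ** 2)
d = K.sum(axis=1)
W = K / np.sqrt(d[:, None] * d[None, :])   # symmetric normalization

rng = np.random.default_rng(3)
y = np.sin(2 * np.pi * i / n) + 0.3 * rng.standard_normal(n)

rho = 0.8
x = W @ y
for _ in range(300):
    x = W @ (y + rho * x) - rho * x        # generalized SOS step

Lap = np.eye(n) - W                        # graph Laplacian L = I - W
# Minimizer of ||x - W y||^2 + rho * x^T L x  =>  (I + rho*L) x = W y
x_reg = np.linalg.solve(np.eye(n) + rho * Lap, W @ y)
```

With ρ < 1 and the eigenvalues of W in (0, 1], the error contracts by at least the factor ρ per step, so 300 iterations land essentially on the regularized solution.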
Results
❑ We successfully boost several state-of-the-art denoising algorithms:
▪ K-SVD, NLM, BM3D, and EPLL
▪ Without any modifications, simply by applying the original software as a "black-box"
❑ We manually tuned two parameters of the iteration x̂_{k+1} = f_σ̂(y + ρ x̂_k) − ρ x̂_k:
▪ ρ – the signal-emphasis factor
▪ σ̂ – the noise level given as input to the denoiser (the noise level of y + ρ x̂_k is higher than that of y)
Quantitative Comparison
❑ Average PSNR* improvement over 5 images (higher is better):

*PSNR = 20·log₁₀(255/√MSE)

Noise std σ | K-SVD | NLM  | BM3D | EPLL
        10  | 0.13  | 0.44 | 0.01 | 0.13
        20  | 0.22  | 0.34 | 0.02 | 0.25
        25  | 0.26  | 0.41 | 0.03 | 0.26
        50  | 0.77  | 0.30 | 0.07 | 0.30
        75  | 1.26  | 0.56 | 0.11 | 0.32
       100  | 0.81  | 0.36 | 0.14 | 0.27
Visual Comparison: K-SVD
❑ Original K-SVD result, 29.06 dB (σ = 25)
Visual Comparison: K-SVD
❑ SOS K-SVD result, 29.41 dB (σ = 25)
Visual Comparison: EPLL
❑ Original EPLL results (σ = 25): Foreman 32.44 dB, House 32.07 dB
Visual Comparison: EPLL
❑ SOS EPLL results (σ = 25): Foreman 32.78 dB, House 32.38 dB
Visual Comparison: All
Noisy image KSVD (31.20) NLM (30.02) BM3D (31.88) EPLL (30.88)
SOS KSVD (31.91) SOS NLM (30.56) SOS BM3D (31.94) SOS EPLL (31.15)
Visual Comparison: All
Noisy image KSVD (33.72) NLM (31.64) BM3D (34.66) EPLL (33.62)
SOS KSVD (34.4) SOS NLM (32.3) SOS BM3D (34.7) SOS EPLL (34.1)
Conclusions
The SOS boosting algorithm is:
❑ Easy to use – we simply treat the denoiser f(·) as a "black-box"
❑ Applicable to a wide range of denoising algorithms f(·)
❑ Guaranteed to converge for many leading denoising algorithms
❑ Equipped with a straightforward stopping criterion
❑ Acting as an interesting graph-Laplacian regularizer
❑ Reducing the local-global gap in patch-based methods
❑ Guaranteed to improve any suboptimal denoiser
❑ Equipped with an automatic parameter setting based on MC-SURE
We are done…
Thank you!
Questions?
Sparsity Model – The Basics
❑ We assume the existence of a dictionary D ∈ ℝ^{n×m} whose columns are the atom signals
❑ Signals are modeled as sparse linear combinations of the dictionary atoms:
x = Dα
where α is sparse, meaning that it is assumed to contain mostly zeros
❑ The computation of α from x (or its noisy version) is called sparse coding
❑ OMP is a popular sparse-coding technique, especially for low-dimensional signals
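A minimal OMP sketch (my own illustration of the greedy idea, not the authors' code): at each step, pick the atom most correlated with the residual, then re-fit by least squares on the selected support.

```python
import numpy as np

def omp(D, x, k):
    """Orthogonal Matching Pursuit: approximate x with k atoms of D."""
    residual = x.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))    # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef           # orthogonalized residual
    alpha[support] = coef
    return alpha

# Approximate a synthetic 3-sparse code over a random normalized dictionary
rng = np.random.default_rng(4)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
alpha_true = np.zeros(50)
alpha_true[[3, 17, 41]] = [1.5, -2.0, 1.0]
x = D @ alpha_true
alpha_hat = omp(D, x, k=3)
```

With well-separated nonzero coefficients and a low-coherence dictionary, OMP typically (though not provably in this toy setting) recovers the true support; at minimum, each greedy step is guaranteed to shrink the residual.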
K-SVD Image Denoising [Elad & Aharon ('06)]
[Block diagram: noisy image → denoise each patch (sparse coding with OMP over the current dictionary) → update the dictionary (K-SVD) → repeat]
❑ Sparse-code each noisy patch, where R_i extracts the i-th patch from y:
α_i = argmin_z ‖D_{S_i} z − R_i y‖₂²
❑ The denoised patch is a linear combination of a few atoms:
p̂_i = D_{S_i} α_i = D_{S_i} (D_{S_i}ᵀ D_{S_i})^{-1} D_{S_i}ᵀ R_i y
K-SVD Image Denoising [Elad & Aharon ('06)]
❑ The denoised patches are merged into a globally reconstructed image by solving
x̂ = argmin_x μ‖x − y‖₂² + Σ_i ‖p̂_i − R_i x‖₂²
whose closed-form solution is
x̂ = (μI + Σ_i R_iᵀ R_i)^{-1} (μI + Σ_i R_iᵀ D_{S_i} (D_{S_i}ᵀ D_{S_i})^{-1} D_{S_i}ᵀ R_i) y = W y
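Once the supports S_i are fixed, the whole pipeline is linear in y. The 1-D toy below (my own sketch: the per-patch "dictionary on its support" is replaced by a fixed two-atom DCT-like basis) builds the closed-form solution explicitly and checks that it satisfies the stationarity condition of the cost above, and that it equals W y for an explicit matrix W.

```python
import numpy as np

n, p, mu = 16, 4, 0.5
# Fixed local basis: a stand-in for D_{S_i} after the support was chosen
t = np.arange(p)
B = np.stack([np.ones(p) / np.sqrt(p),
              np.cos(np.pi * (t + 0.5) / p)], axis=1)
P = B @ np.linalg.solve(B.T @ B, B.T)        # projection onto span(B)

# Patch-extraction operators R_i (all overlapping patches)
Rs = []
for i in range(n - p + 1):
    R = np.zeros((p, n))
    R[np.arange(p), i + np.arange(p)] = 1.0
    Rs.append(R)

rng = np.random.default_rng(5)
y = np.sin(2 * np.pi * np.arange(n) / n) + 0.2 * rng.standard_normal(n)

# Closed form: (mu*I + sum R_i^T R_i)^{-1} (mu*y + sum R_i^T p_i)
A = mu * np.eye(n) + sum(R.T @ R for R in Rs)
b = mu * y + sum(R.T @ (P @ (R @ y)) for R in Rs)
x_hat = np.linalg.solve(A, b)

# The equivalent global filter matrix W, with x_hat = W y
Wmat = np.linalg.solve(A, mu * np.eye(n) + sum(R.T @ P @ R for R in Rs))
```

The merging step is thus nothing more than weighted patch averaging, which is what justifies treating the whole denoiser as a single matrix W in the convergence analysis.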
Reducing the "Local-Global" Gap
"Patch-Disagreement as a Way to Improve K-SVD Denoising", ICASSP 2015
❑ It turns out that the SOS boosting reduces the local-global gap, a shortcoming of many patch-based methods:
▪ Local processing of patches vs. the global need for a whole denoised result
❑ We define the local disagreement as the difference between the locally (independently) denoised patch and the globally averaged patch, a way of reaching a consensus
❑ The disagreements
▪ Naturally exist, since each noisy patch is denoised independently
▪ Are based on the intermediate results
"Sharing the Disagreement"
❑ Inspired by the "consensus and sharing" problem from game theory:
▪ There are several agents, each aiming to minimize its individual cost (i.e., representing its noisy patch sparsely)
▪ These agents affect a shared objective term, describing the overall goal (i.e., obtaining the globally denoised image)
❑ Imitating this concept, we suggest sharing the disagreements
[Block diagram: noisy image → noisy patches → patch-based denoising → patch averaging → estimated image, with a per-patch disagreement fed back to the denoising stage]
Connection to SOS Boosting
❑ Interestingly, for a fixed filter matrix W, "sharing the disagreement" and the SOS boosting
x̂_{k+1} = f(y + x̂_k) − x̂_k
are equivalent
❑ The connection to the SOS is far from trivial, because
▪ The SOS is blind to the intermediate results (the independent denoised patches, before patch averaging)
▪ The intermediate results are crucial for the "sharing the disagreement" approach
⟹ The SOS boosting reduces the local-global gap
The General Case [Milanfar ('12)]
❑ Given any denoiser W, the MSE of x̂ = W y can be expressed by
MSE(x̂) = (1/n) E‖x − x̂‖₂² = ‖bias(x̂)‖₂² + var(x̂)
▪ bias(x̂) = E[x̂] − x = (W − I) x
▪ var(x̂) = E‖x̂ − E[x̂]‖₂² = σ_v² Tr(W Wᵀ)
❑ We define the eigen-decomposition W = V S Vᵀ
▪ V: orthogonal matrix, representing the "denoiser space"
▪ S: diagonal matrix, containing the eigenvalues λ_i
The General Case [Milanfar ('12)]
❑ Setting x = V b, we get
MSE(x̂) = Σ_{i=1}^n (λ_i − 1)² b_i²  +  σ_v² Σ_{i=1}^n λ_i²
where the first sum is ‖bias(x̂)‖₂² and the second is var(x̂)
❑ Tradeoff: λ_i → 1 reduces the bias but increases the variance
❑ The optimal filter (∇_λ MSE(x̂) = 0) is obtained for
λ_i^opt = b_i² / (b_i² + σ_v²)
which is the Wiener filter, but it requires knowledge of b_i²
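A quick numeric check of this bias-variance expression (my own sketch): for a single mode with coefficient b and noise level σ_v, the per-mode MSE (λ − 1)²b² + σ_v²λ² is minimized exactly at the Wiener shrinkage λ = b²/(b² + σ_v²).

```python
import numpy as np

b, sigma_v = 2.0, 0.5

def mode_mse(lam):
    """Per-mode MSE: squared bias plus variance."""
    return (lam - 1.0) ** 2 * b ** 2 + sigma_v ** 2 * lam ** 2

# Dense grid search over lambda versus the analytic Wiener value
lams = np.linspace(0.0, 1.0, 100001)
lam_grid = lams[int(np.argmin(mode_mse(lams)))]
lam_wiener = b ** 2 / (b ** 2 + sigma_v ** 2)
```

The grid minimizer coincides with b²/(b² + σ_v²), and the resulting MSE is below that of both extremes λ = 1 (no bias, full variance) and λ = 0 (no variance, full bias).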
MSE of the SOS Boosting
❑ In the case of the SOS boosting, the filter can be represented as
W_SOS(ρ) = (I + ρ(I − W))^{-1} W = V S̃ Vᵀ, where λ̃_i = λ_i / (1 + ρ(1 − λ_i))
❑ As a result, for ρ > 0, we get
MSE(x̂_SOS(ρ)) = Σ_{i=1}^n [(1 + ρ) / (1 + ρ(1 − λ_i))]² bias_i²(x̂)  +  Σ_{i=1}^n [1 / (1 + ρ(1 − λ_i))]² var_i(x̂)
where the bias factor is larger than 1 and the variance factor is smaller than 1
❑ A large ρ reduces the variance of x̂ but increases its bias
Gaining Improvement
❑ Let us define an improvement function
Φ(ρ) = MSE(x̂_SOS(ρ)) − MSE(x̂)
▪ An improvement is obtained if Φ(ρ) < 0
Theorem: for any denoiser W with suboptimal eigenvalues, i.e., λ_i ≠ b_i²/(b_i² + σ_v²), there exists ρ* such that Φ(ρ*) < 0
⟹ The SOS boosting is always able to improve a suboptimal denoiser
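The theorem can be illustrated numerically (my own single-mode sketch): take an under-smoothing denoiser whose eigenvalue λ sits above the Wiener value, map it through the SOS eigenvalue λ̃ = λ/(1 + ρ(1 − λ)), and scan Φ(ρ); a strictly negative value appears, with the best ρ pushing λ̃ onto the Wiener shrinkage.

```python
import numpy as np

b, sigma_v = 1.0, 0.5
lam = 0.95                               # an under-smoothing (suboptimal) mode
lam_opt = b ** 2 / (b ** 2 + sigma_v ** 2)   # Wiener value: 0.8

def mse(l):
    """Per-mode MSE: squared bias plus variance."""
    return (l - 1.0) ** 2 * b ** 2 + sigma_v ** 2 * l ** 2

def sos_eig(l, rho):
    """Eigenvalue of the SOS filter: lam / (1 + rho*(1 - lam))."""
    return l / (1.0 + rho * (1.0 - l))

rhos = np.linspace(0.0, 10.0, 1001)
phi = np.array([mse(sos_eig(lam, r)) - mse(lam) for r in rhos])
rho_star = rhos[int(np.argmin(phi))]
```

Here λ̃ reaches λ_opt = 0.8 exactly at ρ = 3.75, so the minimum of Φ is attained there and is negative, matching the theorem's promise of improvement for a suboptimal filter.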
Minimizing the MSE
❑ We do not have the true MSE
❑ But we can estimate it using SURE [Stein ('81)]
▪ This requires the analytical form of the divergence of f_ρ(y)
❑ Solution: Monte-Carlo SURE [Ramani et al. ('08)]
▪ Treats the denoiser as a black-box
▪ Estimates the divergence via a first-order difference approximation
MSE(f_ρ(y)) ≈ ‖f_ρ(y)‖₂² − 2 f_ρ(y)ᵀ y + 2σ_v² Tr(∇_y f_ρ(y)) + const
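The Monte-Carlo divergence estimate is simple to sketch (this follows the general recipe of Ramani et al., with a toy denoiser of my own): perturb the input with a random probe b and use a first-order difference, div f(y) ≈ bᵀ(f(y + εb) − f(y))/ε. For a linear denoiser f(y) = W y the divergence is exactly Tr(W), which the averaged estimate should approach.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 64
# Toy linear denoiser f(y) = W y (row-stochastic smoothing matrix)
i = np.arange(n)
K = np.exp(-0.5 * (i[:, None] - i[None, :]) ** 2 / 2.0 ** 2)
W = K / K.sum(axis=1, keepdims=True)
f = lambda z: W @ z

def mc_divergence(f, y, eps=1e-4, n_probes=200, rng=rng):
    """Monte-Carlo estimate of div f(y) = Tr(grad_y f(y)),
    treating the denoiser f strictly as a black-box."""
    est = 0.0
    for _ in range(n_probes):
        b = rng.standard_normal(y.size)
        est += b @ (f(y + eps * b) - f(y)) / eps
    return est / n_probes

y = rng.standard_normal(n)
div_est = mc_divergence(f, y)
div_true = np.trace(W)
```

Plugging such a divergence estimate into the SURE expression above gives an unsupervised proxy for the MSE, which is how the ρ parameter of the SOS can be tuned automatically without the clean image.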