Adaptive Boosting for Image Denoising: Beyond Low-rank Representation and Sparse Coding
Zixiang Xiong
Texas A&M University http://lena.tamu.edu
Denoising: An old topic with wide applications
§ Recovering 𝐱 from 𝐲 = 𝐱 + 𝐧 has been studied from several angles: IP, statistics, CV & ML, coding, and optimization
§ The classic Wiener filtering technique (1949) was first applied to image denoising in 1980
§ Wavelet theory found wide applications in image processing in the 1990s
§ Compression (with critically sampled wavelet transforms)
§ Denoising via wavelet shrinkage
§ Overcomplete representation is advantageous!
§ Breakthroughs
§ Non-local means (NLM) in 2005
§ Block-matching + 3D Wiener filtering (BM3D) in 2006
NLM and BM3D denoising
Built upon powerful non-local patch-based image models
§ To exploit non-local self-similarity
Image denoising research
§ Explosive growth since NLM (2005) & BM3D (2006)
[Figure: papers published on image denoising per year, according to Google Scholar]
Leading image denoising methods
Are built upon powerful patch-based image models
§ NLM (2005): Self-similarity within natural images
§ K-SVD (2006): Sparse representation modeling of image patches
§ BM3D (2006): Combines a sparsity prior and non-local self-similarity
§ EPLL (2009): Expected patch log likelihood
§ NCSR (2011): Non-local centralized sparse representation
§ SAIST (2013): Spatially adaptive iterative singular-value thresholding
§ WNNM (2014): Weighted nuclear norm minimization
§ …
Leading image denoising methods
Two main components: low-rank representation & sparse coding
§ Signals/images are decomposed into a low-rank representation
x = 𝐃α
where 𝐃 is a fat matrix with low rank and α is sparse, meaning that it contains mostly zeros
§ The computation of α from x (or its noisy version) is called sparse coding
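As a toy numerical illustration of this decomposition (all sizes and names here are hypothetical, not from the talk), a patch can be synthesized as x = 𝐃α with a fat dictionary and a mostly-zero coefficient vector:

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, k = 16, 64, 3                      # patch length, dictionary atoms, nonzeros
D = rng.standard_normal((m, n))          # fat dictionary (n > m)
alpha = np.zeros(n)                      # sparse code: mostly zeros
alpha[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

x = D @ alpha                            # patch = sparse combination of k atoms
print(np.count_nonzero(alpha))           # -> 3
```

Sparse coding is the inverse problem: given x (or its noisy version) and 𝐃, recover an α this sparse.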
Theoretical advance on low-rank matrix approx.
Nuclear norm minimization (NNM) [Candès & Recht '09]
§ The nuclear norm of a matrix 𝑿 ∈ R^{m×n} is ‖𝑿‖_* = Σ_i |σ_i(𝑿)|
§ The nuclear norm is the tightest convex relaxation of the rank penalty of a matrix
§ Let 𝒀 ∈ R^{m×n} be the given data matrix; the NNM problem aims to
min_𝑿 { ‖𝒀 − 𝑿‖_F² + λ ‖𝑿‖_* }
where λ is a positive regularization parameter
§ The closed-form solution (Cai, Candès & Shen '10):
𝑿* = U S_λ(Σ) V^T, with 𝒀 = U Σ V^T the SVD of 𝒀 and S_λ(Σ) = max(0, Σ − λ/2)
§ This is just soft thresholding in the SVD domain!
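The closed form above is a few lines of numpy. A minimal sketch (the test matrix with singular values [5, 3, 0.5] is a made-up example, not data from the talk):

```python
import numpy as np

def nnm_denoise(Y, lam):
    """Closed-form NNM solution: soft-threshold Y's singular values by lam/2."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam / 2, 0.0)) @ Vt

# Build Y with known singular values [5, 3, 0.5] via random orthogonal factors
rng = np.random.default_rng(1)
U, _, Vt = np.linalg.svd(rng.standard_normal((8, 8)))
Y = U[:, :3] @ np.diag([5.0, 3.0, 0.5]) @ Vt[:3, :]

X_hat = nnm_denoise(Y, lam=2.0)          # threshold lam/2 = 1 kills the 0.5 mode
s_hat = np.linalg.svd(X_hat, compute_uv=False)
print(np.round(s_hat[:3], 2))            # -> [4. 2. 0.]
```

Each singular value is shrunk by λ/2 and the small (noise-like) ones are zeroed, which is exactly the "soft thresholding in the SVD domain" remark.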
Theoretical advance on low-rank matrix approx.
Weighted nuclear norm minimization (WNNM) [Zhang et al. '14]
§ Extends NNM to allow non-uniform thresholding of σ_i(𝑿), to exploit a priori image info
§ The weighted nuclear norm of a matrix 𝑿 ∈ R^{m×n} is ‖𝑿‖_{w,*} = Σ_i |ω_i σ_i(𝑿)|, where 𝒘 = [ω_1, ω_2, …, ω_m] is the weighting vector of nonnegative thresholds
§ The WNNM problem aims at
argmin_𝑿 { ‖𝒀 − 𝑿‖_F² + λ ‖𝑿‖_{w,*} }
§ When the weights/thresholds satisfy 0 ≤ ω_1 ≤ ω_2 ≤ ··· ≤ ω_m (Zhang et al. '14),
𝑿* = U S_w(Σ) V^T, with S_w(Σ)_ii = max(0, Σ_ii − ω_i)
§ The threshold ω_i is chosen empirically as [Vetterli et al. '00]
ω_i = c / [σ_i(𝑿) + ε], with σ_i(𝑿) estimated from Σ_ii = σ_i(𝒀)
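The weighted variant is the same SVD shrinkage with per-value thresholds; a sketch using the empirical rule ω_i = c/(σ_i + ε) from the slide (c, ε, and the test matrix are illustrative choices, not the talk's tuned values):

```python
import numpy as np

def wnnm_denoise(Y, c=1.0, eps=1e-8):
    """Weighted singular-value thresholding with w_i = c / (s_i + eps)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    w = c / (s + eps)                    # large s_i -> small threshold (kept)
    return U @ np.diag(np.maximum(s - w, 0.0)) @ Vt

rng = np.random.default_rng(1)
U, _, Vt = np.linalg.svd(rng.standard_normal((8, 8)))
Y = U[:, :3] @ np.diag([5.0, 3.0, 0.5]) @ Vt[:3, :]

s_hat = np.linalg.svd(wnnm_denoise(Y), compute_uv=False)
# Strong modes are barely penalized (5 -> 4.8, 3 -> ~2.67); weak ones are zeroed
```

Note the weights are automatically non-descending in i (since singular values are sorted in descending order), so the closed-form solution above applies.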
What is next?
§ With so much progress being made
§ Is denoising dead? [Chatterjee & Milanfar '10]
Chatterjee and Milanfar. Is denoising dead? IEEE Trans. Image Proc., April 2010.
[Figure: papers published on image denoising, according to Google Scholar]
What is next?
§ With so much progress being made
§ Is denoising dead? [Chatterjee & Milanfar '10]
§ How can we do better than WNNM (2014)?
§ Beyond low-rank representation and sparse coding: adaptive boosting
§ To get good image denoising results, one needs to delicately balance mathematical rigor and engineering approximations
Boosting
[Diagram: noisy image 𝐲 → denoised image x̂ = f(𝐲); method noise 𝐲 − x̂]
§ Formulate denoising as a regularized optimization problem: iteratively minimize an objective function (the MSE)
§ Boost performance by adaptively feeding back the residual/method-noise image
Boosting
Multiple boosting algorithms have been explored in the past:
§ Twicing [Tukey '77, Charest et al. '06]
§ x̂^k = x̂^{k−1} + f(𝐲 − x̂^{k−1})
§ Diffusion [Perona-Malik '90, Coifman et al. '06, Milanfar '12]
§ Removes the noise leftovers that are found in the denoised image
§ x̂^k = f(x̂^{k−1})
§ Spatially adaptive iterative filtering (SAIF) [Talebi et al. '12]
§ Automatically chooses the local improvement mechanism:
§ Diffusion
§ Twicing
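The twicing and diffusion updates can be sketched in a few lines. Here a 3×3 box filter is a hypothetical stand-in for the denoiser f (any patch-based method slots in); the test image and noise level are made up:

```python
import numpy as np

def f(img):
    """Stand-in denoiser: a 3x3 box filter (any patch-based method fits here)."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

rng = np.random.default_rng(0)
x = np.zeros((32, 32)); x[8:24, 8:24] = 1.0      # clean piecewise-constant image
y = x + 0.2 * rng.standard_normal(x.shape)       # noisy observation

x1 = f(y)
x_twicing = x1 + f(y - x1)       # add back structure found in the method noise
x_diffusion = f(x1)              # denoise the previous estimate again

mse = lambda a: np.mean((a - x) ** 2)
print(mse(x1) < mse(y))          # one pass of f already reduces the error here
```

Twicing feeds the method noise back through the filter; diffusion filters the estimate again, trading residual noise for extra smoothing.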
Boosting
§ SOS [Romano & Elad '15] (Strengthen-Operate-Subtract)
§ x̂^k = f(𝐲 + x̂^{k−1}) − x̂^{k−1}
§ More general: use a parameter ρ to control signal emphasis
§ x̂^k = f(𝐲 + ρx̂^{k−1}) − ρx̂^{k−1}
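A minimal SOS sketch, with soft thresholding as a toy denoiser f and a made-up sparse test signal (neither is from the talk):

```python
import numpy as np

def f(v, t=0.5):
    """Toy denoiser: soft thresholding (a crude sparsity prior)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
x = np.zeros(100); x[[5, 40, 77]] = [4.0, -3.0, 5.0]   # sparse clean signal
y = x + 0.3 * rng.standard_normal(100)

# SOS: strengthen (y + rho*x_prev), operate (denoise), subtract (rho*x_prev)
rho = 1.0
x_hat = f(y)
for _ in range(5):
    x_hat = f(y + rho * x_hat) - rho * x_hat

mse = lambda a: np.mean((a - x) ** 2)
print(mse(x_hat) < mse(y))       # the boosted estimate beats the raw observation
```

Adding the previous estimate back strengthens the signal relative to the noise before the operator is applied; subtracting it afterwards removes the deliberate bias.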
Boosting
§ Annealing: WNNM [Zhang et al. '14]
§ x̂^{k+1} = f(ρx̂^k + (1−ρ)𝐲) = f(x̂^k + (1−ρ)(𝐲 − x̂^k))
§ WNNM uses a constant ρ for the whole image
§ ρ = 0.9 in experiments
Review of WNNM
[Diagram: the WNNM iteration, with a fixed feedback factor 1−ρ]
Basic idea of adaptive boosting (AB)
• Instead of using a fixed feedback/boosting factor 1−ρ, we adaptively change it from one iteration to another
• This is very intuitive
• As the denoising performance improves from iteration to iteration, there is less and less useful structure in the method/residual noise
• The feedback factor 1−ρ should decrease from one iteration to another
• How to do adaptive boosting systematically?
• Rank-1 based fixed-point analysis
Fixed-point analysis
§ Setup: 𝐲 = 𝐱 + 𝐧
§ 𝐲 ∈ R^{m×1}, noisy image patch arranged in a vector
§ 𝐱 ∈ R^{m×1}, clean image patch; 𝐧 ∈ R^{m×1}, AWGN noise 𝐧 ~ N(0, σ²)
§ A generic iterative patch-based denoising algorithm:
x̂^k = f(x̂^{k−1} + (1 − ρ^k)(𝐲 − x̂^{k−1})) = f(ρ^k x̂^{k−1} + (1 − ρ^k)𝐲),  k ∈ Z⁺
Ø f(·) is nonlinear in general, but most existing algorithms can be represented as row-stochastic positive-definite matrix operations W [Milanfar '13]
Ø Assuming convergence, set x̂^k = x̂^{k−1} = x̂*:
x̂* = W(ρ* x̂* + (1 − ρ*)𝐲)
§ The fixed-point solution: x̂* = (I − ρ*W)^{−1}(1 − ρ*)W𝐲
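The closed-form fixed point can be checked numerically. A sketch with a small hypothetical row-stochastic symmetric filter W (not the actual WNNM operator):

```python
import numpy as np

m, rho = 6, 0.8
A = np.eye(m) * 4 + np.ones((m, m))
W = A / A.sum(axis=1, keepdims=True)     # row-stochastic, symmetric, pos. def.

rng = np.random.default_rng(3)
y = rng.standard_normal(m)

# Iterate x^k = W(rho*x^{k-1} + (1-rho)*y); the map contracts at rate <= rho
x = W @ y
for _ in range(500):
    x = W @ (rho * x + (1 - rho) * y)

# Closed-form fixed point: (I - rho*W)^{-1} (1 - rho) W y
x_star = np.linalg.solve(np.eye(m) - rho * W, (1 - rho) * (W @ y))
print(np.allclose(x, x_star))            # -> True
```

With ρ < 1 the iteration x → ρWx + const has spectral radius below 1, so it converges to exactly the closed-form solution.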
Fixed-point analysis
Fixed point: x̂* = (I − ρ*W)^{−1}(1 − ρ*)W𝐲
§ Minimize the mean square error of the estimator x̂*:
MSE(x̂*) = E[‖x̂* − 𝐱‖²] = ‖bias(x̂*)‖² + var(x̂*)
§ Bias: bias(x̂*) = E(x̂*) − 𝐱 = [(I − ρ*W)^{−1}(1 − ρ*)W − I]𝐱 = (I − ρ*W)^{−1}(W − I)𝐱
• A row-stochastic positive-definite matrix admits an eigen-decomposition W = UΛU^T; write the clean patch as 𝐱 = U𝐛
‖bias‖² = ‖U(I − ρ*Λ)^{−1}(Λ − I)𝐛‖² = ‖(I − ρ*Λ)^{−1}(Λ − I)𝐛‖²
= Σ_{i=1}^m (λ_i − 1)² b_i² / (1 − ρ*λ_i)²   (Λ = diag[λ_1, …, λ_m])
Fixed-point analysis
§ Variance: var(x̂*) = tr[cov(x̂*)] = tr[cov((I − ρ*W)^{−1}(1 − ρ*)W𝐧)]
= σ² Σ_{i=1}^m (1 − ρ*)² λ_i² / (1 − ρ*λ_i)²
§ Overall MSE:
MSE = Σ_{i=1}^m [λ_i²(1 − ρ*)²σ² + (λ_i − 1)² b_i²] / (1 − ρ*λ_i)²
§ noise variance → σ²
§ patch property → b_i²
§ denoising algorithm → λ_i
§ Try to find the optimal ρ* that minimizes the MSE
§ Specialize the denoising matrix W to SVD-based soft thresholding and look for the optimal ρ*
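The bias/variance sums above can be sanity-checked by Monte Carlo. A sketch with hypothetical eigenvalues λ and a random clean patch (illustrative values, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(4)
m, sigma, rho = 5, 0.5, 0.7

Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
lam = np.array([1.0, 0.8, 0.5, 0.3, 0.1])        # eigenvalues of W
W = Q @ np.diag(lam) @ Q.T                       # symmetric filter W = U Lam U^T
x = rng.standard_normal(m)                       # clean patch
b = Q.T @ x                                      # x in the eigenbasis

# Closed-form MSE from the bias^2 + variance sums
mse_formula = np.sum((lam**2 * (1 - rho)**2 * sigma**2
                      + (lam - 1)**2 * b**2) / (1 - rho * lam)**2)

# Monte Carlo: apply the fixed-point estimator to many noisy draws of y = x + n
M = np.linalg.solve(np.eye(m) - rho * W, (1 - rho) * W)
Y = x + sigma * rng.standard_normal((20000, m))  # rows are noisy patches
mse_mc = np.mean(np.sum((Y @ M.T - x) ** 2, axis=1))
# mse_mc agrees with mse_formula to within Monte Carlo error
```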
SVD-based soft thresholding in WNNM
§ Group-level setup: stack m₁ similar blocks together
Y = X + N,  with Y, X, N ∈ R^{m×m₁}
§ SVD: Y = UΣV^T, with Σ = diag[s_1, …, s_m]
§ Denoiser: X̂ = U S_ω(Σ) V^T, where S_ω(Σ) = diag[max(s_1 − ω_1, 0), …, max(s_m − ω_m, 0)]
§ Rewriting: X̂ = U S_ω(Σ) Σ^{−1} U^T U Σ V^T = U S_ω(Σ) Σ^{−1} U^T Y = W Y
§ So W = U Λ U^T with Λ = S_ω(Σ) Σ^{−1}, i.e.
λ_i = max(1 − ω_i / s_i, 0) for i = 1, …, m
Rank-1 based fixed-point analysis
§ Rank-1 assumption: all nonlocal blocks similar to 𝐱 in the original image are identical! Y is a rank-1 matrix plus Gaussian noise
[Figure: nonlocal blocks similar to 𝐱 collected from the noisy image 𝐲]
Rank-1 based fixed-point analysis
§ Rank-1 assumption: all nonlocal blocks similar to 𝐱 in the original image are identical! Y is a rank-1 matrix plus Gaussian noise
§ The first singular value in Σ dominates the rest: s₁ ≫ s_i, for i = 2, …, m
Shabalin and Nobel. Reconstruction of a low-rank matrix in the presence of Gaussian noise. Journal of Multivariate Analysis, 2013.
Nadakuditi. OptShrink: An algorithm for improved low-rank signal matrix denoising by optimal, data-driven singular value shrinkage. IEEE Trans. Info. Theory, May 2014.
Rank-1 based fixed-point analysis
§ Since s₁ ≫ s_i for i = 2, …, m, we have s_i ≤ ω_i, so λ_i = max(1 − ω_i/s_i, 0) = 0 for i = 2, …, m (denoiser: W = UΛU^T)
§ The MSE then reduces to a single term:
MSE = Σ_{i=1}^m [λ_i²(1 − ρ*)²σ² + (λ_i − 1)² b_i²] / (1 − ρ*λ_i)² = [λ₁²(1 − ρ*)²σ² + (λ₁ − 1)² b₁²] / (1 − ρ*λ₁)²
§ The optimal ρ* that minimizes the MSE is
ρ* = [λ₁σ² − (1 − λ₁)b₁²] / (λ₁σ²)
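The optimality of ρ* can be confirmed by a grid scan of the single-term MSE. A sketch with made-up energies E_x = 1 and E_y = 1.25, plugged into the estimates λ₁ = √(E_x/E_y) and b₁² = E_x used on the next slide:

```python
import numpy as np

# Consistent toy values: E_x = 1, E_y = 1.25 => sigma^2 = E_y - E_x = 0.25
Ex, Ey = 1.0, 1.25
sigma2, lam1, b1sq = Ey - Ex, np.sqrt(Ex / Ey), Ex

# Closed-form optimum; with these estimates it equals sqrt(Ey)/(sqrt(Ex)+sqrt(Ey))
rho_star = (lam1 * sigma2 - (1 - lam1) * b1sq) / (lam1 * sigma2)

rhos = np.linspace(0.0, 0.99, 10000)
mse = (lam1**2 * (1 - rhos)**2 * sigma2
       + (lam1 - 1)**2 * b1sq) / (1 - rhos * lam1)**2
print(round(rho_star, 3), round(rhos[np.argmin(mse)], 3))   # both ~0.528
```

The scanned minimizer matches the closed-form ρ*, and it lies in (0.5, 1) as the convergence condition requires.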
Adaptive boosting (AB)
§ Given the formula for the optimal ρ* from the fixed-point analysis, ρ* = [λ₁σ² − (1 − λ₁)b₁²]/(λ₁σ²), estimate λ₁ = √(E_x/E_y) and b₁² = E_x (with E_y = E_x + σ²); this gives
ρ* = √E_y / (√E_y + √E_x)
§ b₁² stands for the energy strength in patch 𝐱 → patches with different characteristics should be assigned different ρ*'s
§ ρ* ∈ (0.5, 1), which satisfies the convergence condition
§ Using the same formula for ρ^k in the k-th iteration of AB, the adaptive boosting becomes
ρ^k = √E_{x̂^{k−1}} / (√E_{x̂^{k−1}} + √max(E_{x̂^{k−1}} − σ², 0))
§ The feedback factor 1 − ρ^k decreases as k increases ✓
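The ρ^k rule is a direct transcription; a sketch on hypothetical constant-valued "patches" (chosen only to make the energies easy to read):

```python
import numpy as np

def rho_k(x_prev, sigma):
    """Adaptive rho from the previous estimate's energy (mean squared value)."""
    E = float(np.mean(x_prev ** 2))
    return np.sqrt(E) / (np.sqrt(E) + np.sqrt(max(E - sigma ** 2, 0.0)))

sigma = 1.0
print(round(rho_k(np.full(10, 2.0), sigma), 3))   # E = 4.00: rho ~ 0.536
print(round(rho_k(np.full(10, 1.1), sigma), 3))   # E = 1.21: rho ~ 0.706
```

As the estimate's energy shrinks toward the noise floor σ², ρ^k grows toward 1, so the feedback factor 1 − ρ^k decays across iterations, matching the intuition above.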
Adaptive boosting
ρ^k = √E_{x̂^{k−1}} / (√E_{x̂^{k−1}} + √max(E_{x̂^{k−1}} − σ², 0))
Convergence analysis
Assuming the matrix approximation f(·) ≈ W:
§ Difference between the k-th estimate and the fixed point: 𝐞^k = x̂^k − x̂*
= W(ρ^k x̂^{k−1} + (1 − ρ^k)𝐲) − W(ρ* x̂* + (1 − ρ*)𝐲)
§ After a large number of iterations, replace ρ^k by ρ*:
𝐞^k = ρ*W(x̂^{k−1} − x̂*) = ρ*W𝐞^{k−1} = (ρ*)^k W^k 𝐞^0
§ We have 0 < ρ* < 1 and 0 ≤ ‖W‖ < 1 (the denoising operator is contractive)
§ 𝐞^k → 0 as k → ∞
§ Our AB algorithm converges
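The geometric decay of 𝐞^k is easy to observe numerically. A sketch with a hypothetical averaging matrix scaled by 0.95 so that ‖W‖ < 1 (a stand-in for a contractive denoiser, not the WNNM operator):

```python
import numpy as np

m, rho = 6, 0.7
A = np.eye(m) * 4 + np.ones((m, m))
W = 0.95 * A / A.sum(axis=1, keepdims=True)      # scaled so that ||W|| < 1

rng = np.random.default_rng(5)
y = rng.standard_normal(m)
x_star = np.linalg.solve(np.eye(m) - rho * W, (1 - rho) * (W @ y))

x, errs = np.zeros(m), []
for k in range(30):
    x = W @ (rho * x + (1 - rho) * y)
    errs.append(np.linalg.norm(x - x_star))

# ||e^k|| <= (rho * ||W||)^k ||e^0||: geometric decay to the fixed point
print(errs[-1] < 1e-3 * errs[0])                 # -> True
```

Each step shrinks the error by at least the factor ρ‖W‖ ≈ 0.665, so thirty iterations reduce it by several orders of magnitude.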
Test images
The 20 test images commonly used in experiments:
[Figure: thumbnails of the 20 test images]
Numerical results
AB gives the best PSNR result for every image (20 test images), and the improvement over the other methods increases with the noise variance
Visual comparison
[Figure: side-by-side denoised images]
Comparison with lower bound
AB performs closest to the Cramér-Rao lower bound [Chatterjee & Milanfar '10]
Remarks
• AB works as a preprocessing step in each iteration, before the image is denoised by the leading method
• The complexity of AB is negligible compared with the main denoising algorithm
• AB is block-based, to adapt to non-stationarity
• The idea of AB is very generic
• It can be combined with training/learning-based approaches
• AB is applicable to other representative methods
• Such as BM3D and K-SVD
Remarks
• AB is applicable to other representative methods
• Fresh PSNR results (in dB) from BM3D+AB, at sigma = 25:

Image         BM3D    BM3D+SOS   BM3D+AB
Foreman       33.41   33.48      33.49
Lena          32.02   32.04      32.05
House         32.90   32.90      32.93
FingerPrint   27.72   27.72      27.74
Peppers       31.87   31.89      31.90
Image denoising -- for real!
§ Images taken by low- and high-end smartphones (3264×2448 and 4160×3120)
Image denoising -- for real!
§ Images taken by low- and high-end smartphones: 1031×754 and 1259×771 crop windows
Image deblurring
§ Could be part of denoising
Image super-resolution

Face super-resolution for recognition

Image inpainting