Skeleton of Presentation


Page 1: Skeleton of Presentation


Page 2: Skeleton of Presentation

Skeleton of Presentation

Probability density function (pdf) estimation using isocontours/isosurfaces

Application to Image Registration

Application to Image Filtering

Circular/spherical density estimation in Euclidean space

Page 3: Skeleton of Presentation

PDF Estimation: Contemporary techniques

Histograms

Kernel density estimate

Mixture models

Parameter selection: bin width / bandwidth / number of components

Bias/variance tradeoff (large bandwidth: high bias; low bandwidth: high variance)

Sample-based methods

Do not treat a signal as a signal

Page 4: Skeleton of Presentation

New approach: isocontours

Continuous image representation: using some interpolant.

Trace out isocontours of the intensity function I(x,y) at several intensity values.
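A minimal sketch of this step (not the authors' implementation): trace isocontours of a sampled intensity function with NumPy and scikit-image. `skimage.measure.find_contours` uses marching squares, which matches a piecewise-linear model within each cell; the toy image and the level values below are arbitrary placeholders.

```python
import numpy as np
from skimage import measure

# Toy image: a smooth intensity function sampled on a 64 x 64 grid.
xx, yy = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
I = 0.5 + 0.5 * np.sin(2 * np.pi * xx) * np.cos(2 * np.pi * yy)

# Trace isocontours of the intensity function at several intensity values.
levels = np.linspace(0.1, 0.9, 9)
for alpha in levels:
    # Each call returns a list of (N, 2) arrays of (row, col) contour points.
    curves = measure.find_contours(I, alpha)
    print(f"level {alpha:.2f}: {len(curves)} isocontour(s)")
```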

Page 5: Skeleton of Presentation

[Figure] Level curves at intensity α and level curves at intensity α + Δ. P(α ≤ I ≤ α + Δ) = area of the brown region between the two sets of level curves.

Page 6: Skeleton of Presentation

New approach: isocontours


Assume a uniform density on (x,y)

Random variable transformation from (x,y) to (I,u)

Integrate out u to get the density of intensity I

Every point in the image domain contributes to the density.

$$p(\alpha)\,d\alpha = \frac{1}{|\Omega|}\iint_{\alpha \le I(x,y) < \alpha + d\alpha} dx\,dy
\;\;\Longrightarrow\;\;
p(\alpha) = \frac{1}{|\Omega|}\oint_{I(x,y)=\alpha} \frac{du}{|\nabla I(x,y)|},
\qquad |\nabla I(x,y)| = \sqrt{I_x^2(x,y) + I_y^2(x,y)}$$

Published in CVPR 2006, PAMI 2009.

u = direction along the level set (dummy variable)
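A hedged numerical sketch of this formula (not the authors' code): approximate the line integral by summing contour-segment lengths weighted by 1/|∇I| at the segment midpoints. Using `find_contours` and a nearest-pixel gradient lookup are simplifications chosen here for brevity.

```python
import numpy as np
from skimage import measure

def isocontour_density(I, alphas):
    """Approximate p(alpha) = (1/|Omega|) * integral_{I=alpha} du / |grad I|
    by summing contour-segment lengths weighted by 1/|grad I| at midpoints."""
    gy, gx = np.gradient(I)                        # gradients along rows and columns
    grad_mag = np.sqrt(gx ** 2 + gy ** 2) + 1e-12  # avoid division by zero
    area = I.shape[0] * I.shape[1]                 # |Omega| in pixel units
    p = []
    for alpha in alphas:
        total = 0.0
        for curve in measure.find_contours(I, alpha):   # (row, col) vertices
            seg_len = np.linalg.norm(np.diff(curve, axis=0), axis=1)
            mids = 0.5 * (curve[1:] + curve[:-1])       # segment midpoints
            r = np.clip(mids[:, 0].round().astype(int), 0, I.shape[0] - 1)
            c = np.clip(mids[:, 1].round().astype(int), 0, I.shape[1] - 1)
            total += np.sum(seg_len / grad_mag[r, c])
        p.append(total / area)
    return np.array(p)
```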

Page 7: Skeleton of Presentation

Joint density: isocontours


Page 8: Skeleton of Presentation

Joint density: isocontours

$$p_{I_1,I_2}(\alpha_1,\alpha_2) = \frac{1}{|\Omega|}\sum_{(x,y)\in C}\frac{1}{|\nabla I_1(x,y)|\,|\nabla I_2(x,y)|\,|\sin\theta(x,y)|}$$

where C = {(x,y) : I_1(x,y) = α1, I_2(x,y) = α2} and θ(x,y) = angle between the gradients of I_1 and I_2 at (x,y).

[Figure] Level curves at intensity α1 in I_1 and at intensity α2 in I_2.
P(α1 ≤ I_1 ≤ α1 + Δ1, α2 ≤ I_2 ≤ α2 + Δ2) = area of the black region (overlap of the two level regions).

Relationships between geometric and probabilistic entities.

Page 9: Skeleton of Presentation

Related work

Similar density estimator developed by Kadir and Brady (BMVC 2005) independently of us.

Similar idea: several differences in implementation, motivation, derivation of results and applications.


Page 10: Skeleton of Presentation

Densities may not exist: distributions do.

$$p_{I_1,I_2}(\alpha_1,\alpha_2) = \frac{1}{|\Omega|}\sum_{(x,y)\in C}\frac{1}{|\nabla I_1(x,y)|\,|\nabla I_2(x,y)|\,|\sin\theta(x,y)|},
\qquad C = \{(x,y) : I_1(x,y)=\alpha_1,\ I_2(x,y)=\alpha_2\}$$

$$p(\alpha) = \frac{1}{|\Omega|}\oint_{I(x,y)=\alpha}\frac{du}{|\nabla I(x,y)|}$$

Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where the image gradients run parallel.

Compute cumulative interval measures instead, e.g. P(α ≤ I ≤ α + Δ).

Page 11: Skeleton of Presentation

Computational issues: No binning problem


Page 12: Skeleton of Presentation

[Figure] Standard histograms vs. the isocontour method, at 32, 64, 128, 256, 512 and 1024 bins.

Page 13: Skeleton of Presentation


Page 14: Skeleton of Presentation

Histogramming with upsampling?

Randomized/digital approximation to area calculation.

Strict lower bound on the accuracy of the isocontour method, for a fixed interpolant.

Computationally more expensive than the isocontour method.
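A minimal sketch of this digital approximation (illustrative only): bilinearly upsample the image and histogram the dense samples, which approximates the areas between isocontours by counting sub-pixels. The bin count, upsampling factor and the [0, 1] intensity range are assumptions.

```python
import numpy as np
from scipy.ndimage import zoom

def upsampled_histogram(I, n_bins=128, factor=8):
    """Digital approximation to the areas between isocontours:
    bilinearly upsample the image, then histogram the dense samples."""
    I_up = zoom(I, factor, order=1)   # order=1 -> bilinear interpolation
    hist, _ = np.histogram(I_up, bins=n_bins, range=(0.0, 1.0), density=True)
    return hist
```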


Page 15: Skeleton of Presentation

Histogramming with Random upsampling?

[Figure] Results at 128 x 128 bins.

Page 16: Skeleton of Presentation

Choice of interpolant?

Simplest one: a linear interpolant within each half-pixel (level curves are line segments).

Low-order polynomial interpolants: high bias, low variance.

High-order polynomial interpolants: low bias, high variance.


Page 17: Skeleton of Presentation

Choice of interpolant?

Polynomial interpolant: the accuracy of the estimated density improves as the signal is sampled at finer resolution.

Assumptions on the signal allow a better interpolant:

For a bandlimited analog signal that is Nyquist-sampled, the sinc interpolant gives accurate reconstruction (Whittaker-Shannon sampling theorem).

Page 18: Skeleton of Presentation

Skeleton of Presentation

Probability density function (pdf) estimation using isocontours

Application to Image Registration

Application to Image Filtering

Circular/spherical density estimation in Euclidean space


Page 19: Skeleton of Presentation

Image Registration: Problem Definition


• Given two images of an object, to find the geometric transformation that “best” aligns one with the other, w.r.t. some image similarity measure.

• Mutual Information: well-known image similarity measure; Viola and Wells (IJCV 1995), Maes et al. (TMI 1997).

• Insensitive to illumination changes: useful in multimodality image registration.

Page 20: Skeleton of Presentation

Marginal entropies:
$$H(I_1) = -\sum_{\alpha_1} p_1(\alpha_1)\log p_1(\alpha_1), \qquad H(I_2) = -\sum_{\alpha_2} p_2(\alpha_2)\log p_2(\alpha_2)$$

Joint entropy:
$$H(I_1,I_2) = -\sum_{\alpha_1}\sum_{\alpha_2} p_{12}(\alpha_1,\alpha_2)\log p_{12}(\alpha_1,\alpha_2)$$

Conditional entropy: H(I_1 | I_2)

Mutual information:
$$MI(I_1,I_2) = H(I_1) - H(I_1 \mid I_2) = H(I_1) + H(I_2) - H(I_1,I_2)$$

Joint probability: p_{12}(α1, α2). Marginal probabilities: p_1(α1), p_2(α2).
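For reference, MI can be computed from any joint probability table p12, however that table was estimated (isocontours or histograms); this is a generic sketch, not tied to a particular estimator.

```python
import numpy as np

def mutual_information(p12):
    """MI(I1, I2) = H(I1) + H(I2) - H(I1, I2) from a joint probability table p12."""
    p12 = p12 / p12.sum()
    p1 = p12.sum(axis=1)   # marginal of I1
    p2 = p12.sum(axis=0)   # marginal of I2

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return H(p1) + H(p2) - H(p12.ravel())
```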

Page 21: Skeleton of Presentation

Functions of Geometric Transformation

p12(i, j), H(I1, I2) and MI(I1, I2) are all functions of the geometric transformation relating I1 and I2.

Hypothesis: if the alignment between the images is optimal, then the mutual information is maximal.


Page 22: Skeleton of Presentation

Registration experiments: Rotation

[Figure] Registration results at noise levels 0.05, 0.2 and 0.7.

Page 23: Skeleton of Presentation

[Figure] Results at noise levels 0.05, 0.2 and 0.7, with 32 and 128 bins. PVI = partial volume interpolation (Maes et al., TMI 1997).

Page 24: Skeleton of Presentation

Experiments: Affine registration

[Figure] PD slice, T2 slice, warped T2 slice, warped and noisy T2 slice.

Brute-force search for the maximum of MI.
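A hedged sketch of such a brute-force search over rotation angles, using SciPy's rotate and an ordinary joint histogram for brevity (the isocontour estimator from the earlier slides could be substituted for `np.histogram2d`); images are assumed to be scaled to [0, 1].

```python
import numpy as np
from scipy.ndimage import rotate

def mi_from_joint_hist(h):
    """Mutual information from a joint histogram (counts)."""
    p12 = h / h.sum()
    p1, p2 = p12.sum(axis=1), p12.sum(axis=0)
    nz = p12 > 0
    return (-(p1[p1 > 0] * np.log(p1[p1 > 0])).sum()
            - (p2[p2 > 0] * np.log(p2[p2 > 0])).sum()
            + (p12[nz] * np.log(p12[nz])).sum())

def best_rotation(I1, I2, angles, bins=32):
    """Brute-force search over rotation angles for the maximum of MI."""
    scores = []
    for theta in angles:
        I2r = rotate(I2, theta, reshape=False, order=1, mode='nearest')
        h, _, _ = np.histogram2d(I1.ravel(), I2r.ravel(),
                                 bins=bins, range=[[0, 1], [0, 1]])
        scores.append(mi_from_joint_hist(h))
    return angles[int(np.argmax(scores))], scores
```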

Page 25: Skeleton of Presentation

[Figure] MI with standard histograms vs. MI with our method, at noise level 0.7.

Parameters of the affine transformation: θ = 30, s = 0.3, t = 0.

Page 26: Skeleton of Presentation

Method                  Error in θ (avg., var.)   Error in s (avg., var.)   Error in t (avg., var.)
Histograms (bilinear)   3.7, 18.1                 0.7, 0                    0.43, 0.08
Isocontours             0, 0.06                   0, 0                      0, 0
PVI                     1.9, 8.5                  0.56, 0.08                0.49, 0.1
Histograms (cubic)      0.3, 49.4                 0.7, 0                    0.2, 0
2DPointProb             0.3, 0.22                 0, 0                      0, 0

32 bins

Page 27: Skeleton of Presentation

Skeleton of Presentation

Probability density function (pdf) estimation using isocontours

Application to Image Registration

Application to Image Filtering

Circular/spherical density estimation in Euclidean space


Page 28: Skeleton of Presentation

Anisotropic neighborhood filters (Kernel density based filters): Grayscale images

$$\hat I(a,b) = \frac{\displaystyle\sum_{(x,y)\in N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,I(x,y)}{\displaystyle\sum_{(x,y)\in N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)}$$

Central pixel (a,b); neighborhood N(a,b) around (a,b).

K: a decreasing function (typically Gaussian)

Parameter σ controls the degree of anisotropy of the smoothing.
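A minimal sketch of this filter (illustrative, not the authors' code), with a Gaussian K and a square neighborhood; the radius and σ values are placeholders for an image scaled to [0, 1].

```python
import numpy as np

def neighborhood_filter(I, radius=3, sigma=0.1):
    """Anisotropic neighborhood filter: each pixel becomes a weighted mean of its
    neighbors, weighted by K(I(x,y) - I(a,b); sigma) with K Gaussian."""
    H, W = I.shape
    out = np.zeros_like(I, dtype=float)
    for a in range(H):
        for b in range(W):
            r0, r1 = max(0, a - radius), min(H, a + radius + 1)
            c0, c1 = max(0, b - radius), min(W, b + radius + 1)
            patch = I[r0:r1, c0:c1]
            w = np.exp(-(patch - I[a, b]) ** 2 / (2 * sigma ** 2))
            out[a, b] = np.sum(w * patch) / np.sum(w)
    return out
```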

Page 29: Skeleton of Presentation

Anisotropic Neighborhood filters: Problems

Sensitivity to the parameter σ

Sensitivity to the SIZE of the neighborhood

Does not account for gradient information

Page 30: Skeleton of Presentation


Anisotropic Neighborhood filters: Problems

Treat pixels as independent samples

Page 31: Skeleton of Presentation

Continuous Image Representation

Discrete form:
$$\hat I(a,b) = \frac{\displaystyle\sum_{(x,y)\in N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,I(x,y)}{\displaystyle\sum_{(x,y)\in N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)}$$

Continuous form:
$$\hat I(a,b) = \frac{\displaystyle\iint_{N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,I(x,y)\,dx\,dy}{\displaystyle\iint_{N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,dx\,dy}$$

Interpolate in between the pixel values

Page 32: Skeleton of Presentation

Area between the isocontours at intensity α and α + Δ (divided by the area of the neighborhood) = Pr(α < Intensity < α + Δ | N(a,b)).

Continuous Image Representation


Page 33: Skeleton of Presentation

$$\hat I(a,b) = \frac{\displaystyle\iint_{N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,I(x,y)\,dx\,dy}{\displaystyle\iint_{N(a,b)} K\bigl(I(x,y)-I(a,b);\sigma\bigr)\,dx\,dy}
= \frac{\displaystyle\lim_{\Delta\to 0}\sum_{\alpha}\alpha\,\Pr\bigl(\alpha \le I < \alpha+\Delta \mid N(a,b)\bigr)\,K\bigl(\alpha-I(a,b);\sigma\bigr)}{\displaystyle\lim_{\Delta\to 0}\sum_{\alpha}\Pr\bigl(\alpha \le I < \alpha+\Delta \mid N(a,b)\bigr)\,K\bigl(\alpha-I(a,b);\sigma\bigr)}$$

Here Pr(α ≤ I < α + Δ | N(a,b)) = Area(α ≤ I < α + Δ | N(a,b)) / Area(N(a,b)): the areas between isocontours contribute the weights for the averaging.


Published in EMMCVPR 2009
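A hedged sketch of this area-weighted average for a single pixel. Instead of exact polygon areas between isocontours, it approximates Pr(α ≤ I < α + Δ | N(a,b)) by densely sampling a bilinear interpolant (the upsampling approximation mentioned earlier), and uses a Gaussian K for simplicity; intensities are assumed to lie in [0, 1] and the parameter values are placeholders.

```python
import numpy as np
from scipy.ndimage import zoom

def area_weighted_value(I, a, b, radius=3, sigma=0.1, factor=8, n_levels=256):
    """Filtered value at (a, b): a weighted mean over intensity levels alpha, where
    the weight of alpha is the (approximate) area between the isocontours at alpha
    and alpha + Delta inside N(a,b), times K(alpha - I(a,b); sigma)."""
    H, W = I.shape
    r0, r1 = max(0, a - radius), min(H, a + radius + 1)
    c0, c1 = max(0, b - radius), min(W, b + radius + 1)
    # Densely sample a bilinear interpolant of the neighborhood (area approximation).
    patch = zoom(I[r0:r1, c0:c1], factor, order=1)
    areas, edges = np.histogram(patch, bins=n_levels, range=(0.0, 1.0))
    alphas = 0.5 * (edges[:-1] + edges[1:])
    K = np.exp(-(alphas - I[a, b]) ** 2 / (2 * sigma ** 2))   # Gaussian for simplicity
    w = areas * K
    return np.sum(w * alphas) / np.sum(w)
```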

Page 34: Skeleton of Presentation

Extension to RGB images


Joint Probability of R,G,B = Area of overlap of isocontour pairs from R, G, B images

$$\bigl(\hat R(a,b),\hat G(a,b),\hat B(a,b)\bigr) =
\frac{\displaystyle\sum_{(R,G,B)} (R,G,B)\,\Pr\bigl((R,G,B)\mid N(a,b)\bigr)\,K\bigl((R,G,B)-(R(a,b),G(a,b),B(a,b));\sigma\bigr)}
{\displaystyle\sum_{(R,G,B)} \Pr\bigl((R,G,B)\mid N(a,b)\bigr)\,K\bigl((R,G,B)-(R(a,b),G(a,b),B(a,b));\sigma\bigr)}$$

Page 35: Skeleton of Presentation

Mean-shift framework

• A clustering method developed by Fukunaga & Hostetler (IEEE Trans. Inf. Theory, 1975).

• Applied to image filtering by Comaniciu and Meer (PAMI 2003).

• Involves independent update of each pixel by maximization of local estimate of probability density of joint spatial and intensity parameters.


Page 36: Skeleton of Presentation

Mean-shift framework

• One step of the mean-shift update around (a, b, c), where c = I(a,b):

1. Compute the weighted mean over the neighborhood:
$$(\bar a, \bar b, \bar c) = \frac{\displaystyle\sum_{(x,y)\in N(a,b)} \bigl(x,\, y,\, I(x,y)\bigr)\,w(x,y)}{\displaystyle\sum_{(x,y)\in N(a,b)} w(x,y)},
\qquad w(x,y) = \exp\!\Bigl(-\frac{(x-a)^2}{2\sigma_s^2}-\frac{(y-b)^2}{2\sigma_s^2}-\frac{(I(x,y)-c)^2}{2\sigma_r^2}\Bigr)$$

2. Update (a, b, c) to this weighted mean: updated center of the neighborhood and updated intensity value.

3. Repeat steps (1) and (2) until (a, b, c) stops changing.

4. Set I(a,b) = c.
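A minimal sketch of this update for one pixel (illustrative only); it uses a square neighborhood of the stated radius around the current center and the Gaussian weights w(x,y) from step 1. The default parameter values are placeholders for an image scaled to [0, 1].

```python
import numpy as np

def mean_shift_pixel(I, a, b, radius=3, sigma_s=3.0, sigma_r=0.1,
                     max_iter=20, tol=1e-4):
    """Mean-shift update of one pixel in the joint (x, y, intensity) space."""
    H, W = I.shape
    rows, cols = np.mgrid[0:H, 0:W]
    ra, cb, c = float(a), float(b), float(I[a, b])
    for _ in range(max_iter):
        # Step 1: Gaussian weights over the current square neighborhood.
        mask = (np.abs(rows - ra) <= radius) & (np.abs(cols - cb) <= radius)
        w = mask * np.exp(-((rows - ra) ** 2 + (cols - cb) ** 2) / (2 * sigma_s ** 2)
                          - (I - c) ** 2 / (2 * sigma_r ** 2))
        new_ra = np.sum(w * rows) / np.sum(w)
        new_cb = np.sum(w * cols) / np.sum(w)
        new_c = np.sum(w * I) / np.sum(w)
        # Step 3: stop when (a, b, c) stops changing.
        converged = abs(new_ra - ra) + abs(new_cb - cb) + abs(new_c - c) < tol
        ra, cb, c = new_ra, new_cb, new_c      # Step 2: update center and intensity
        if converged:
            break
    return c                                   # Step 4: set I(a, b) = c
```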

Page 37: Skeleton of Presentation

Our Method in Mean-shift Setting


Joint feature space: X(x,y) = x, Y(x,y) = y, I(x,y).

Page 38: Skeleton of Presentation

Our Method in Mean-shift Setting

$$\bigl(\hat X, \hat Y, \hat I\bigr)(a,b) =
\frac{\displaystyle\sum_{k\in N(a,b)} (X_k, Y_k, I_k)\,A_k\,K\bigl((X_k, Y_k, I_k)-(X(a,b), Y(a,b), I(a,b));\sigma\bigr)}
{\displaystyle\sum_{k\in N(a,b)} A_k\,K\bigl((X_k, Y_k, I_k)-(X(a,b), Y(a,b), I(a,b));\sigma\bigr)}$$

The index k runs over the facets of the tessellation induced by the isocontours and the pixel grid:

(X_k, Y_k) = centroid of facet #k.
I_k = intensity (from the interpolated image) at (X_k, Y_k).
A_k = area of facet #k.

Page 39: Skeleton of Presentation

Experimental Setup: Grayscale Images

• Piecewise-linear interpolation used for our method in all experiments.

• For our method, kernel K = the pillbox kernel (defined below).

• For discrete mean shift, kernel K = Gaussian.
• Parameters used: neighborhood radius ρ = 3, σ = 3.
• Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).

$$K(z;\sigma) = \begin{cases} 1 & \text{if } |z| \le \sigma \\ 0 & \text{if } |z| > \sigma \end{cases}$$
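The same pillbox kernel, as a short NumPy sketch:

```python
import numpy as np

def pillbox(z, sigma):
    """K(z; sigma) = 1 if |z| <= sigma, else 0."""
    return (np.abs(z) <= sigma).astype(float)
```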

Page 40: Skeleton of Presentation

[Figure] Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian kernel mean shift).

Method                        MSE
Noisy image                   181.27
Isocontour (ρ=3, σ=3)         110.95
Std. mean shift (ρ=3, σ=3)    175.27
Std. mean shift (ρ=5, σ=5)    151.27

Page 41: Skeleton of Presentation

[Figure] Original image, noisy image, denoised (isocontour mean shift), denoised (std. mean shift).

Method                             MSE
Noisy image                        190
Isocontour mean shift (ρ=3, σ=3)   113.8
Std. mean shift (ρ=3, σ=3)         184.77
Std. mean shift (ρ=5, σ=3)         153.5

Page 42: Skeleton of Presentation

Experiments on color images

• Use of pillbox kernels for our method.
• Use of Gaussian kernels for discrete mean shift.
• Parameters used: neighborhood radius ρ = 6, σ = 6.
• Noise model: independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1).


Page 43: Skeleton of Presentation

Experiments on color images

• Independent piecewise-linear interpolation on R,G,B channels in our method.

• Smoothing of R, G, B values done by coupled updates using joint probabilities.


Page 44: Skeleton of Presentation

[Figure] Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian kernel mean shift).

Method                        MSE
Noisy image                   572.24
Isocontour (ρ=3, σ=3)         319.88
Std. mean shift (ρ=3, σ=3)    547.96
Std. mean shift (ρ=5, σ=5)    496.7


Page 45: Skeleton of Presentation

[Figure] Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian kernel mean shift).

Method                             MSE
Noisy image                        547.9
Isocontour mean shift (ρ=3, σ=3)   306.14
Std. mean shift (ρ=3, σ=3)         526.8
Std. mean shift (ρ=5, σ=5)         477.25

Page 46: Skeleton of Presentation

Observations

• Discrete kernel mean shift performs poorly with small neighborhoods and small values of σ.

• Why? Small sample-size problem for kernel density estimation.

• Isocontour based method performs well even in this scenario (number of isocontours/facets >> number of pixels).

• Large σ or large neighborhood values not always necessary for smoothing.


Page 47: Skeleton of Presentation

Observations

• Superior behavior observed when comparing isocontour-based neighborhood filters with standard neighborhood filters for the same parameter set and the same number of iterations.


Page 48: Skeleton of Presentation

Skeleton of Presentation

Probability density function (pdf) estimation using isocontours

Application to Image Registration

Application to Image Filtering

Circular/spherical density estimation in Euclidean space


Page 49: Skeleton of Presentation

Circular/spherical density estimation in Euclidean space.

Examples of unit vector data:

1. Chromaticity vectors of color values.

2. Hue (from the HSI color scheme) obtained from the RGB values.

$$(r, g, b) = \frac{(R, G, B)}{\sqrt{R^2 + G^2 + B^2}}$$

$$\text{hue} = \arctan\!\left(\frac{\sqrt{3}\,(G - B)}{2R - G - B}\right)$$
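These two mappings, as a short sketch (arctan2 is used so the hue angle stays well defined in all quadrants):

```python
import numpy as np

def chromaticity(R, G, B):
    """(r, g, b) = (R, G, B) / sqrt(R^2 + G^2 + B^2)."""
    m = np.sqrt(R ** 2 + G ** 2 + B ** 2) + 1e-12
    return R / m, G / m, B / m

def hue(R, G, B):
    """Hue angle; arctan2 form of arctan(sqrt(3) * (G - B) / (2R - G - B))."""
    return np.arctan2(np.sqrt(3.0) * (G - B), 2.0 * R - G - B)
```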

Page 50: Skeleton of Presentation

Conventional approach

Convert RGB values to unit vectors:
$$v_i = (r_i, g_i, b_i) = \frac{(R_i, G_i, B_i)}{\sqrt{R_i^2 + G_i^2 + B_i^2}}$$

Estimate the density of the unit vectors:
$$p(v) = \frac{1}{N}\sum_{i=1}^{N} K(v; v_i, \kappa)$$

von Mises-Fisher kernel:
$$K(v; u, \kappa) = C(\kappa)\, e^{\kappa\, v^T u}, \qquad \kappa = \text{concentration parameter}, \quad C(\kappa) = \text{normalization constant}$$

voMF mixture models: Banerjee et al. (JMLR 2005).

Other popular kernels: Watson, cosine.
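A minimal sketch of this conventional vMF kernel density estimate on the unit sphere S² (for which the normalization constant is C(κ) = κ / (4π sinh κ)); the samples are assumed to be unit vectors, and very large κ will overflow sinh.

```python
import numpy as np

def vmf_kde(v, samples, kappa):
    """von Mises-Fisher KDE on the unit sphere S^2:
    p(v) = (1/N) * sum_i C(kappa) * exp(kappa * v . v_i),
    with C(kappa) = kappa / (4 * pi * sinh(kappa)) on S^2."""
    C = kappa / (4.0 * np.pi * np.sinh(kappa))   # overflows for very large kappa
    dots = samples @ v                           # samples: (N, 3) unit vectors, v: (3,)
    return C * np.mean(np.exp(kappa * dots))
```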

Page 51: Skeleton of Presentation

Chromaticity density estimation in Euclidean space.

Estimate the density of RGB using KDE/mixture models, e.g. a Gaussian KDE:
$$p(R,G,B) = \frac{1}{N}\sum_{i=1}^{N} \frac{1}{(2\pi\sigma^2)^{3/2}} \exp\!\Bigl(-\frac{(R-R_i)^2 + (G-G_i)^2 + (B-B_i)^2}{2\sigma^2}\Bigr)$$

Density of (magnitude, chromaticity) using a random-variable transformation, with m = √(R² + G² + B²) and (R, G, B) = m·v:
$$p(m, v) = m^2\, p(R, G, B)$$

Density of chromaticity (integrate out the magnitude):
$$p(v) = \int_0^{\infty} m^2\, p(R, G, B)\, dm$$

Projected normal estimator: Watson, "Statistics on spheres", 1983; Small, "The statistical theory of shape", 1995.

Density of chromaticity by conditioning on m = 1: for ‖v‖ = 1 each Gaussian kernel satisfies exp(−‖v − m_i v_i‖²/(2σ²)) ∝ exp((m_i/σ²) vᵀv_i), so the conditioned estimator is a variable-bandwidth voMF KDE (Bishop, "Neural networks for pattern recognition").

What's new? The notion that all estimation can proceed in Euclidean space.
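A hedged numerical sketch of this Euclidean-space route: a Gaussian KDE in RGB, followed by integrating out the magnitude along the ray m·v with a simple Riemann sum. The bandwidth σ, the integration limit m_max and the step count are placeholders, not values from the source.

```python
import numpy as np

def rgb_gaussian_kde(x, samples, sigma):
    """Isotropic Gaussian KDE in RGB space (samples: (N, 3), x: (3,))."""
    d2 = np.sum((samples - x) ** 2, axis=1)
    norm = (2.0 * np.pi * sigma ** 2) ** 1.5
    return np.mean(np.exp(-d2 / (2.0 * sigma ** 2))) / norm

def chromaticity_density(v, samples, sigma, m_max=3.0, n_steps=300):
    """p(v) = integral_0^inf m^2 p(m * v) dm, approximated by a Riemann sum."""
    m = np.linspace(1e-3, m_max, n_steps)
    vals = np.array([mm ** 2 * rgb_gaussian_kde(mm * v, samples, sigma) for mm in m])
    return np.sum(vals) * (m[1] - m[0])
```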

Page 52: Skeleton of Presentation

Hue density estimation in Euclidean space.


Estimate density of RGB using KDE/Mixture models

Use a random variable transformation to get the density of HSI (hue, saturation, intensity)

Integrate out S,I to get density of hue

Page 53: Skeleton of Presentation

Advantages of the proposed approach:

Consistency between densities of Euclidean and unit vector data (in terms of random variable transformation/conditioning).

Potential to use the large body of literature available for statistics of Euclidean data (example: Fast Gauss Transform, Greengard et al. (SIAM Sci. Computing 1991), Duraiswami et al. (IJCV 2003)).

Model selection can be done in Euclidean space.
