
Recognition, SVD, and PCA

Recognition

• Suppose you want to find a face in an image

• One possibility: look for something that looks sort of like a face (oval, dark band near top, dark band near bottom)

• Another possibility: look for pieces of faces (eyes, mouth, etc.) in a specific arrangement

Templates

• Model of a "generic" or "average" face
– Learn templates from example data

• For each location in image, look for template at that location
– Optionally also search over scale, orientation

Templates

• In the simplest case, based on intensity
– Template is average of all faces in training set
– Comparison based on e.g. SSD (see the sketch after this list)

• More complex templates
– Outputs of feature detectors
– Color histograms
– Both position and frequency information (wavelets)
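
A minimal sketch of the intensity-based comparison described above, assuming grayscale images stored as NumPy arrays (function and variable names here are illustrative, not from the slides):

    import numpy as np

    def ssd_match(image, template):
        # Slide the template over every location in the image and record
        # the sum of squared differences (lower SSD = better match).
        ih, iw = image.shape
        th, tw = template.shape
        scores = np.empty((ih - th + 1, iw - tw + 1))
        for y in range(scores.shape[0]):
            for x in range(scores.shape[1]):
                patch = image[y:y + th, x:x + tw]
                scores[y, x] = np.sum((patch - template) ** 2)
        return scores

    # The template is the average of the training faces:
    # template = training_faces.mean(axis=0)   # training_faces: hypothetical stack
    # scores = ssd_match(test_image, template)
    # best_y, best_x = np.unravel_index(scores.argmin(), scores.shape)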

Face Detection Results

[Figure: sample images showing detection of frontal/profile faces with a wavelet histogram template]

More Face Detection Results

Schneiderman and Kanade

Recognition Using Relations Between Templates

• Often easier to recognize a small feature
– e.g., lips easier to recognize than faces
– For articulated objects (e.g. people), template for whole class usually complicated

• So, identify small pieces and look for spatial arrangements
– Many false positives from identifying pieces

Graph Matching

[Figure: body-part model graph (head, body, arms, legs) alongside feature detection results in an image]

Graph Matching

[Figure: model graph with pairwise constraints between parts, e.g. "next to, right of" and "next to, below"]

Graph Matching

[Figure: combinatorial search assigning detected features to model parts]

Graph Matching

[Figure: combinatorial search, an assignment that satisfies the constraints (OK)]

Graph Matching

[Figure: combinatorial search, an assignment that violates a constraint]

Graph Matching

• Large search space
– Heuristics for pruning (a backtracking sketch follows this list)

• Missing features
– Look for maximal consistent assignment

• Noise, spurious features

• Incomplete constraints
– Verification step at end
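
The combinatorial search itself can be a plain backtracking loop. A sketch under the assumption that parts, candidate detections, and a pairwise-constraint check are given as ordinary Python data (all names illustrative):

    def match(parts, candidates, ok, assignment=None):
        # parts: model part names, e.g. ["head", "body", "arm1", ...]
        # candidates: dict mapping each part to its detected features
        # ok(pa, fa, pb, fb): True if the pairwise constraint holds
        # Returns the first fully consistent assignment, else None.
        assignment = assignment or {}
        if len(assignment) == len(parts):
            return assignment
        part = parts[len(assignment)]
        for feat in candidates[part]:
            # Prune early: check constraints against all parts placed so far
            if all(ok(part, feat, p, f) for p, f in assignment.items()):
                result = match(parts, candidates, ok,
                               {**assignment, part: feat})
                if result is not None:
                    return result
        return None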

Recognition

• Suppose you want to recognize a particular face

• How does this face differ from the average face?

How to Recognize Specific People?

• Consider variation from average face

• Not all variations equally important
– Variation in a single pixel relatively unimportant

• If image is a high-dimensional vector, want to find directions in this space with high variation

Principal Components Analysis

• Principal Components Analysis (PCA): approximating a high-dimensional data set with a lower-dimensional subspace

[Figure: 2-D data points shown with the original axes and the first and second principal components]

Digression: Singular Value Decomposition (SVD)

• Handy mathematical technique that has application to many problems

• Given any m×n matrix A, algorithm to find matrices U, V, and W such that

  A = U W V^T

  U is m×n and orthonormal
  V is n×n and orthonormal
  W is n×n and diagonal

SVD

• Treat as black box: code widely available (svd(A,0) in Matlab)

  A = U W V^T,  where W = diag(w_1, …, w_n)
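
For readers not using Matlab, numpy.linalg.svd plays the same black-box role; a minimal sketch with an arbitrary example matrix:

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])               # an arbitrary 3x2 matrix

    # full_matrices=False is the "economy" form, like svd(A,0) in Matlab:
    # U is m x n, w holds the n singular values, Vt is V transposed.
    U, w, Vt = np.linalg.svd(A, full_matrices=False)

    assert np.allclose(A, U @ np.diag(w) @ Vt)   # A = U W V^T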

SVD

• The w_i are called the singular values of A

• If A is singular, some of the w_i will be 0

• In general, rank(A) = number of nonzero w_i

• SVD is mostly unique (up to permutation of singular values, or if some w_i are equal)

SVD and Eigenvectors

• Let A = U W V^T, and let x_i be the i-th column of V

• Consider A^T A x_i (worked out below)

• So the squared elements of W are the eigenvalues of A^T A, and the columns of V are its eigenvectors

  A^T A x_i = (V W U^T)(U W V^T) x_i = V W^2 V^T x_i = V W^2 e_i = w_i^2 x_i

  (e_i is the i-th standard basis vector: V^T x_i = e_i because the columns of V are orthonormal)
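
A quick numerical check of this identity, as a sketch (the test matrix is arbitrary):

    import numpy as np

    A = np.random.default_rng(0).standard_normal((5, 3))
    U, w, Vt = np.linalg.svd(A, full_matrices=False)

    for i in range(3):
        x_i = Vt[i]     # i-th column of V (stored as a row of V^T)
        # x_i is an eigenvector of A^T A with eigenvalue w_i^2
        assert np.allclose(A.T @ A @ x_i, w[i] ** 2 * x_i)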

SVD and Inverses

• Why is SVD so useful?

• Application #1: inverses

• A^-1 = (V^T)^-1 W^-1 U^-1 = V W^-1 U^T

• This fails when some w_i are 0
– It's supposed to fail: singular matrix

• Pseudoinverse: if w_i = 0, set 1/w_i to 0 (!) (sketch after this list)
– "Closest" matrix to inverse
– Defined for all (even non-square) matrices
– Equal to (A^T A)^-1 A^T when A^T A is invertible
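
A sketch of the pseudoinverse built directly from the SVD (numpy also ships an equivalent, np.linalg.pinv):

    import numpy as np

    def pseudoinverse(A, tol=1e-12):
        # Moore-Penrose pseudoinverse via SVD: V W^+ U^T, where W^+
        # reciprocates each nonzero w_i and sets 1/w_i to 0 otherwise.
        U, w, Vt = np.linalg.svd(A, full_matrices=False)
        w_inv = np.zeros_like(w)
        keep = w > tol * w.max()           # treat tiny w_i as zero
        w_inv[keep] = 1.0 / w[keep]
        return Vt.T @ np.diag(w_inv) @ U.T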

SVD and Least Squares

• Solving Ax = b by least squares

• x = pseudoinverse(A) times b (worked example after this list)

• Compute pseudoinverse using SVD
– Lets you see if data is singular
– Even if not singular, ratio of max to min singular values (condition number) tells you how stable the solution will be
– Set 1/w_i to 0 if w_i is small (even if not exactly 0)
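
Putting the last two slides together, a worked least-squares solve; pseudoinverse is the sketch function defined earlier, and the line-fit data are made up for illustration:

    import numpy as np

    # Fit a line y = c0 + c1*t to noisy points (made-up data).
    t = np.array([0.0, 1.0, 2.0, 3.0])
    b = np.array([1.1, 2.9, 5.2, 6.8])
    A = np.column_stack([np.ones_like(t), t])       # design matrix

    w = np.linalg.svd(A, compute_uv=False)          # singular values only
    print("condition number:", w.max() / w.min())   # solution stability

    x = pseudoinverse(A) @ b                        # least-squares solution
    print("intercept, slope:", x)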

SVD and Matrix Similarity

• One common definition for the norm of a matrix is the Frobenius norm (first formula below)

• The Frobenius norm can be computed from the SVD (second formula below)

• So changes to a matrix can be evaluated by looking at changes to singular values

  ||A||_F^2 = Σ_i Σ_j a_ij^2

  ||A||_F^2 = Σ_i w_i^2
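
A one-assert numerical check that the two formulas agree (the example matrix is arbitrary):

    import numpy as np

    A = np.arange(6.0).reshape(2, 3)
    w = np.linalg.svd(A, compute_uv=False)              # singular values only
    assert np.isclose(np.sum(A ** 2), np.sum(w ** 2))   # ||A||_F^2 both ways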

SVD and Matrix Similarity

• Suppose you want to find the best rank-k approximation to A

• Answer: set all but the largest k singular values to zero

• Can form compact representation by eliminating columns of U and V corresponding to zeroed w_i (sketch below)
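
A sketch of the rank-k approximation; its optimality in the Frobenius norm is the classical Eckart-Young result:

    import numpy as np

    def rank_k_approx(A, k):
        # Keep only the k largest singular values and the matching
        # columns of U and V -- the compact representation.
        U, w, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :k] @ np.diag(w[:k]) @ Vt[:k, :]

    # The squared Frobenius error is the sum of the dropped w_i^2.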

SVD and Orthogonalization

• The matrix U V^T is the "closest" orthonormal matrix to A

• Yet another useful application of the matrix-approximation properties of SVD

• Much more stable numerically than Gram-Schmidt orthogonalization

• Example: find the rotation closest to a general affine matrix (sketch below)
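
A sketch of that last application: extracting the nearest rotation from the 3×3 linear part of an affine matrix (the sign fix guards against getting a reflection):

    import numpy as np

    def closest_rotation(M):
        # Nearest rotation to a 3x3 linear map: replace the singular
        # values of M with 1s, giving R = U V^T.
        U, _, Vt = np.linalg.svd(M)
        R = U @ Vt
        if np.linalg.det(R) < 0:       # got a reflection; flip one axis
            U[:, -1] *= -1
            R = U @ Vt
        return R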

SVD and PCA

• Principal Components Analysis (PCA): approximating a high-dimensional data set with a lower-dimensional subspace

[Figure: the same 2-D data points, original axes, and first and second principal components as before]

SVD and PCA

• Data matrix with points as rows, take SVD
– Subtract out the mean first ("whitening", i.e. centering the data)

• Columns of V_k are the principal components

• Value of w_i gives the importance of each component (see the sketch after this list)
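
A minimal PCA-via-SVD sketch, assuming a data matrix X with one point per row (all names illustrative):

    import numpy as np

    def pca(X, k):
        # PCA by SVD of the centered data matrix (points as rows).
        mean = X.mean(axis=0)
        U, w, Vt = np.linalg.svd(X - mean, full_matrices=False)
        return mean, Vt[:k], w[:k]     # components are rows of V^T;
                                       # w_i ranks their importance

    # Project a new point onto the first k components:
    # mean, comps, w = pca(X, k)
    # coeffs = (point - mean) @ comps.T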

PCA on Faces: "Eigenfaces"

[Figure: the average face, the first principal component, and other components; in all images except the average, "gray" = 0, "white" > 0, "black" < 0]

Using PCA for Recognition

• Store each person as coefficients of projection onto first few principal components (formula below)

• Compute projections of target image, compare to database ("nearest neighbor classifier")

  image ≈ Σ_{i=0}^{max} a_i · Eigenface_i
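
A sketch of the full recognition step, reusing the pca function above; the gallery/probe layout is an illustrative assumption:

    import numpy as np

    # faces: one flattened training image per row; labels[i] names row i.
    # mean, comps, _ = pca(faces, k=20)          # eigenfaces from above

    def encode(img, mean, comps):
        # Coefficients a_i of the projection onto the eigenfaces.
        return (img.ravel() - mean) @ comps.T

    def recognize(probe, gallery, labels, mean, comps):
        # Nearest-neighbor classifier in coefficient space.
        a = encode(probe, mean, comps)
        dists = np.linalg.norm(gallery - a, axis=1)   # gallery: stored coeffs
        return labels[int(np.argmin(dists))]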