Page 1:

Eigen Decomposition

Based on the slides by Mani Thomas

Modified and extended by Longin Jan Latecki

Page 2:

Introduction

Eigenvalue decomposition

Physical interpretation of eigenvalues/eigenvectors

Page 3:

What are eigenvalues? Given a matrix A, x is an eigenvector and λ is the corresponding eigenvalue if Ax = λx. A must be square, and the determinant of A − λI must equal zero: Ax − λx = 0 iff (A − λI)x = 0.

The trivial solution is x = 0. The non-trivial solution occurs when det(A − λI) = 0.

Are eigenvectors unique? If x is an eigenvector, then βx is also an eigenvector and λ is still the corresponding eigenvalue: A(βx) = β(Ax) = β(λx) = λ(βx).
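
As a quick sanity check of these identities, here is a minimal NumPy sketch (the matrix is an arbitrary example, not one from the slides) verifying Ax = λx, the scaling property, and det(A − λI) = 0:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # arbitrary symmetric example matrix

# numpy.linalg.eig returns eigenvalues and unit-norm eigenvectors (as columns)
eigvals, eigvecs = np.linalg.eig(A)
lam, x = eigvals[0], eigvecs[:, 0]

# Ax = lambda * x holds for the eigenpair
assert np.allclose(A @ x, lam * x)

# Any nonzero scalar multiple beta * x is also an eigenvector, same lambda
beta = -3.7
assert np.allclose(A @ (beta * x), lam * (beta * x))

# det(A - lambda * I) = 0 at an eigenvalue
assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
print("All eigenvalue identities verified.")
```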

Page 4:

Calculating the Eigenvectors/values. Expand det(A − λI) = 0 for a 2 × 2 matrix.

For a 2 × 2 matrix this is a simple quadratic equation with two solutions (possibly complex). This “characteristic equation” can be used to solve for λ:

det(A − λI) = det( [a11 a12; a21 a22] − λ·[1 0; 0 1] ) = det [a11 − λ, a12; a21, a22 − λ] = 0

(a11 − λ)(a22 − λ) − a12·a21 = 0

λ² − (a11 + a22)·λ + (a11·a22 − a12·a21) = 0

λ = ( (a11 + a22) ± √( (a11 + a22)² − 4·(a11·a22 − a12·a21) ) ) / 2
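
A small sketch of this derivation in NumPy terms (not part of the original slides; the 2 × 2 matrix is an arbitrary example): the quadratic below is exactly the characteristic equation above, and its roots should match what numpy.linalg.eigvals returns:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 4.0]])  # arbitrary 2x2 example
a11, a12 = A[0]
a21, a22 = A[1]

# Characteristic equation: lambda^2 - (a11+a22)*lambda + (a11*a22 - a12*a21) = 0
trace, det = a11 + a22, a11 * a22 - a12 * a21
disc = np.sqrt(trace**2 - 4 * det + 0j)  # +0j so complex roots are handled too
roots = np.array([(trace + disc) / 2, (trace - disc) / 2])

print("quadratic roots :", np.sort_complex(roots))
print("linalg.eigvals  :", np.sort_complex(np.linalg.eigvals(A).astype(complex)))
```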

Page 5:

Eigenvalue example. Consider

A = [1 2; 2 4]

The eigenvalues and the corresponding eigenvectors can be computed as follows. For λ = 0, one possible solution is x = (2, −1); for λ = 5, one possible solution is x = (1, 2). In detail:

det(A − λI) = (1 − λ)(4 − λ) − 2·2 = λ² − (1 + 4)·λ + (1·4 − 2·2) = λ² − 5λ = 0, so λ = 0, 5

For λ = 0: [1 2; 2 4] (x, y)ᵀ = (0, 0)ᵀ, giving x = (2, −1)

For λ = 5: (A − 5I)(x, y)ᵀ = [1 − 5, 2; 2, 4 − 5] (x, y)ᵀ = [−4 2; 2 −1] (x, y)ᵀ = (0, 0)ᵀ, giving x = (1, 2)
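
To double-check this worked example, here is a short NumPy sketch using the slide's own matrix. Note that numpy returns unit-length eigenvectors, so we compare directions (up to scale and sign) rather than raw coordinates:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)
print("eigenvalues:", eigvals)  # 0 and 5 (in some order)

# Expected (unnormalized) eigenvectors from the slide
expected = {0.0: np.array([2.0, -1.0]), 5.0: np.array([1.0, 2.0])}

for lam, v_exp in expected.items():
    i = np.argmin(np.abs(eigvals - lam))  # column matching this eigenvalue
    v = eigvecs[:, i]
    # Directions agree if the normalized vectors match up to a sign flip
    v_exp_unit = v_exp / np.linalg.norm(v_exp)
    assert np.allclose(np.abs(v), np.abs(v_exp_unit)), (lam, v)
print("Eigenvectors match the slide's solutions (up to scale and sign).")
```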

For more information: Demos in Linear algebra by G. Strang, http://web.mit.edu/18.06/www/

Page 6:

Page 7:

Physical interpretation. Consider a covariance matrix A, i.e., A = (1/n) S Sᵀ for some data matrix S.

The error ellipse has its major axis along the eigenvector with the larger eigenvalue and its minor axis along the eigenvector with the smaller eigenvalue:

A = [1 0.75; 0.75 1], with eigenvalues λ = 1.75 and λ = 0.25
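
A minimal sketch of this interpretation, assuming the covariance matrix as reconstructed above: the eigenvectors give the ellipse axis directions, and the square roots of the eigenvalues give the 1-sigma axis half-lengths:

```python
import numpy as np

A = np.array([[1.0, 0.75],
              [0.75, 1.0]])  # covariance matrix from the slide (as reconstructed)

# eigh is appropriate for symmetric matrices; eigenvalues come back ascending
eigvals, eigvecs = np.linalg.eigh(A)
print("eigenvalues :", eigvals)        # [0.25 1.75]
print("minor axis  :", eigvecs[:, 0])  # direction of smallest variance
print("major axis  :", eigvecs[:, 1])  # direction of largest variance

# Axis half-lengths of the 1-sigma error ellipse are sqrt(eigenvalue)
print("half-lengths:", np.sqrt(eigvals))
```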

Page 8:

Physical interpretation

Orthogonal directions of greatest variance in the data. Projections along PC1 (the first Principal Component) discriminate the data most along any one axis.

[Figure: data scattered over axes Original Variable A and Original Variable B, with the orthogonal principal directions PC 1 and PC 2 overlaid.]

Page 9:

Physical interpretation

The first principal component is the direction of greatest variability (covariance) in the data. The second is the next orthogonal (uncorrelated) direction of greatest variability: first remove all the variability along the first component, then find the next direction of greatest variability. And so on.

Thus the eigenvectors provide the directions of data variance, in decreasing order of their eigenvalues.

For more information: See Gram-Schmidt Orthogonalization in G. Strang’s lectures
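
As an illustrative sketch of this idea (not from the slides; the data is synthetic), here is a minimal PCA in NumPy: eigendecompose the sample covariance matrix and sort the eigenvectors by decreasing eigenvalue, so the columns come out ordered PC1, PC2, ...:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data with correlated variables (n samples x 2 features)
X = rng.normal(size=(500, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])

Xc = X - X.mean(axis=0)       # center the data
C = (Xc.T @ Xc) / len(Xc)     # with S = Xc.T this is (1/n) S S^T, as on Page 7

eigvals, eigvecs = np.linalg.eigh(C)  # ascending order for symmetric matrices
order = np.argsort(eigvals)[::-1]     # re-sort: decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("variance along PC1:", eigvals[0])
print("variance along PC2:", eigvals[1])
print("PC directions (columns):\n", eigvecs)

# Projecting onto the PCs decorrelates the data (off-diagonals become ~0)
Z = Xc @ eigvecs
print("covariance after projection:\n", np.round((Z.T @ Z) / len(Z), 6))
```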