A principal components analysis self-organizing map


Page 1: A principal components analysis self-organizing map

國立雲林科技大學 National Yunlin University of Science and Technology
Department of Information Management, Intelligent Database Systems Lab

A principal components analysis self-organizing map
Ezequiel Lopez-Rubio, Jose Munoz-Perez, Jose Antonio Gomez-Ruiz
Neural Networks 17 (2004) 261-270

Advisor: Dr. Hsu    Student: Sheng-Hsuan Wang

Page 2: Outline

─ Motivation
─ Objective
─ The ASSOM network
─ The PCASOM model
─ Experiments
─ Conclusion

Page 3: Motivation

The adaptive subspace self-organizing map (ASSOM) is an alternative to the standard principal component analysis (PCA) algorithm.
─ It looks for the most relevant features of the input data.
─ However, its training equations are complex.

A further concern is the separation ability of classical PCA and the ASSOM.

Page 4: Objective

This paper proposes a new self-organizing neural model that performs principal components analysis.
─ It is like the ASSOM, but has a broader capability to represent the input distribution.

Page 5: The ASSOM network

The ASSOM network stores a subspace in each node rather than just a single weight vector.

It is trained not on single samples but on sets of slightly translated, rotated and/or scaled signal or image samples, called episodes.

Each neuron of an ASSOM network represents a subset of the input data with a vector basis, which is adapted to capture the local geometry of the input data.

Page 6: The ASSOM network

[Figure: decomposition of an input vector into its orthogonal projection and its projection error]

Page 7: The ASSOM network: orthogonal projection

─ Consider a vector x and an orthonormal vector basis B = {b_h | h = 1, …, K}.

The vector x can be decomposed into two components: its orthogonal projection onto the subspace spanned by B, and the projection error (the residual orthogonal to that subspace).
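As a minimal NumPy sketch of this decomposition (the function name `decompose` and the row-wise basis layout are illustrative, not from the paper):

```python
import numpy as np

def decompose(x, B):
    """Split x into its orthogonal projection onto span(B) and the
    projection error, for an orthonormal basis B (rows = basis vectors):
    x_hat = sum_h (x . b_h) b_h,  error = x - x_hat."""
    x = np.asarray(x, dtype=float)
    B = np.asarray(B, dtype=float)
    x_hat = B.T @ (B @ x)   # orthogonal projection onto span(B)
    error = x - x_hat       # residual, orthogonal to span(B)
    return x_hat, error

# Example: project a 3-D vector onto the basis of the xy-plane.
x_hat, err = decompose([1.0, 2.0, 3.0], [[1, 0, 0], [0, 1, 0]])
```

Since B is orthonormal, x_hat and err are orthogonal and x = x_hat + err holds exactly.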

Page 8: The ASSOM network: episodes

The input vectors are grouped into episodes in order to be presented to the network.

An episode S(t) contains several time instants t_p ∈ S(t), each with an input vector x(t_p).

Episodes are sets of slightly translated, rotated, or scaled samples.
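A small sketch of how an episode of translated samples could be built for a 1-D signal (the helper `make_episode` and the use of circular shifts are assumptions for illustration):

```python
import numpy as np

def make_episode(signal, shifts):
    """Build an episode: slightly translated copies of a 1-D signal.
    Each row is one input vector x(t_p) of the episode S(t)."""
    return np.stack([np.roll(signal, s) for s in shifts])

# Three translated copies of a sampled sine wave form one episode.
signal = np.sin(np.linspace(0, 2 * np.pi, 16, endpoint=False))
episode = make_episode(signal, shifts=[-1, 0, 1])
```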

Page 9: The ASSOM network: training steps

─ Winner lookup.
─ Learning: basis vector rotation.
─ Dissipation: eliminates instability.
─ Orthonormalization: every basis is orthonormalized for good performance.
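The orthonormalization step can be sketched with a QR decomposition, a standard stand-in for the Gram-Schmidt procedure (this is an illustrative implementation, not the paper's code):

```python
import numpy as np

def orthonormalize(B):
    """Orthonormalize a basis (rows = basis vectors) via QR decomposition.
    The returned rows span the same subspace and are mutually orthonormal."""
    Q, _ = np.linalg.qr(np.asarray(B, dtype=float).T)
    return Q.T

B = orthonormalize([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
```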

Page 10: The ASSOM network: objective function

The objective function is the average expected spatially weighted normalized squared projection error over the episodes.

─ The Robbins-Monro stochastic approximation is used to minimize this objective, which leads to the basis vector rotation rule.

Page 11: The PCASOM network

Neuron weights updating
─ Use the covariance matrix to store the information.
─ The covariance matrix of an input vector x is defined from its expectation; in practice it is estimated from M input samples.
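The covariance equations on this slide were images and did not survive the transcript. A standard reconstruction, consistent with the unbiasedness claim on the next slide (notation assumed, not copied from the paper):

```latex
e = E[x], \qquad R = E\!\left[(x - e)(x - e)^{T}\right]
```

and, estimated from M input samples,

```latex
e_M = \frac{1}{M}\sum_{i=1}^{M} x_i, \qquad
R_M = \frac{1}{M-1}\sum_{i=1}^{M}\left(x_i - e_M\right)\left(x_i - e_M\right)^{T}.
```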

Page 12: The PCASOM network

The best approximation
─ It is an unbiased estimator with minimum variance.
─ If we obtain N new input samples, the estimates are updated to account for both the old M and the new N samples.
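Merging the statistics of M past samples with N new ones can be done without revisiting the old data. Below is a textbook pooled-moments update offered as a hedged sketch of this step (function name and signature are illustrative, not the paper's notation):

```python
import numpy as np

def merge_estimates(e_old, R_old, M, new_samples):
    """Merge mean e_old and unbiased covariance R_old of M past samples
    with N new samples, returning the pooled mean and covariance."""
    X = np.asarray(new_samples, dtype=float)
    N = len(X)
    e_new = X.mean(axis=0)
    e = (M * e_old + N * e_new) / (M + N)       # pooled mean
    # Work with scatter matrices: S = (M - 1) * R for unbiased estimators.
    S_old = (M - 1) * R_old
    S_new = (X - e_new).T @ (X - e_new)
    d = (e_old - e_new).reshape(-1, 1)
    # Cross term accounts for the shift between the two group means.
    S = S_old + S_new + (M * N / (M + N)) * (d @ d.T)
    return e, S / (M + N - 1)
```

By construction the result matches recomputing the mean and unbiased covariance over all M + N samples at once.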

Page 13: The PCASOM network

Page 14: The PCASOM network

The outputs of the algorithm are the new approximations of R and e.

Page 15: The PCASOM network

Page 16: The PCASOM network: competition

Competition among neurons
─ The neuron c that has the minimum sum of projection errors is the winner.
─ Orth(x, B) is the orthogonal projection of vector x on basis B.
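A minimal sketch of the winner lookup, for the single-input case (one projection error per neuron; the function `winner` and the list-of-bases layout are illustrative assumptions):

```python
import numpy as np

def winner(x, bases):
    """Index of the winning neuron: the one whose orthonormal basis B_i
    (rows = basis vectors) gives the smallest error ||x - Orth(x, B_i)||."""
    x = np.asarray(x, dtype=float)
    errors = []
    for B in bases:
        B = np.asarray(B, dtype=float)
        proj = B.T @ (B @ x)                 # Orth(x, B)
        errors.append(np.linalg.norm(x - proj))
    return int(np.argmin(errors))
```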

Page 17: The PCASOM network: topology

Network topology
─ A neighborhood function determines how strongly each unit i updates its vector ei and matrix Ri, depending on its lattice distance to the winner.
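The exact neighborhood function is not given in the transcript; SOM-family models commonly use a Gaussian of lattice distance, sketched here for a 1-D lattice (shape and parameter names are assumptions):

```python
import numpy as np

def neighborhood(i, c, sigma):
    """Gaussian neighborhood strength between unit i and winner c on a
    1-D lattice: exp(-d^2 / (2 * sigma^2)), equal to 1 at the winner."""
    return float(np.exp(-((i - c) ** 2) / (2.0 * sigma ** 2)))
```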

Page 18: The PCASOM network: summary

Summary
─ For every unit i, obtain the initial covariance matrix Ri(0).
─ For every unit i, build the vector ei(0) by using small random values.
─ At time instant t, select the input vectors x(t) and compute the winning neuron c.
─ For every unit i, update the vector ei and the matrix Ri.
─ Repeat until the convergence condition holds.
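The steps above can be sketched as one self-contained training loop. This is a hedged, simplified reading of the algorithm (exponential smoothing rate, Gaussian neighborhood, and eigendecomposition-based bases are assumptions, not the paper's exact update equations):

```python
import numpy as np

def train_pcasom(X, n_units=4, n_components=2, sigma=1.0, epochs=5, seed=0):
    """PCASOM-style training sketch. Each unit i keeps a mean e[i] and a
    covariance R[i]; its basis is the leading eigenvectors of R[i]; the
    winner minimizes the projection error of the centered input; updates
    are neighborhood-weighted on a 1-D lattice."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    e = rng.normal(scale=0.1, size=(n_units, dim))       # ei(0): small random
    R = np.stack([np.eye(dim) for _ in range(n_units)])  # Ri(0)
    lam = 0.05                                           # smoothing rate

    def basis(Ri):
        # Rows = leading eigenvectors of the unit's covariance matrix.
        w, V = np.linalg.eigh(Ri)
        return V[:, np.argsort(w)[::-1][:n_components]].T

    for _ in range(epochs):
        for x in X:
            # Winner lookup: minimum projection error of the centered input.
            errs = []
            for i in range(n_units):
                B = basis(R[i])
                d = x - e[i]
                errs.append(np.linalg.norm(d - B.T @ (B @ d)))
            c = int(np.argmin(errs))
            # Neighborhood-weighted update of ei and Ri.
            for i in range(n_units):
                h = np.exp(-((i - c) ** 2) / (2.0 * sigma ** 2))
                d = (x - e[i]).reshape(-1, 1)
                e[i] += lam * h * d.ravel()
                R[i] += lam * h * (d @ d.T - R[i])
    return e, R
```

A fixed epoch count stands in for the convergence condition, which the transcript does not specify.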

Page 19: Comparison with ASSOM

─ Solidly rooted in statistics.
─ The update equation is more stable: matrix sums rather than basis rotations.
─ Does not need episodes.
─ It has a wider capability to represent the input distribution.

Page 20: Drawback of the classical PCA

Page 21: Experiments

Convergence speed experiment
─ The relative error for an input vector x is the projection error norm for the best matching unit (BMU) divided by the norm of the input vector.
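This error measure can be written directly from its definition (the function name and row-wise basis layout are illustrative):

```python
import numpy as np

def relative_error(x, B):
    """Relative error of input x against the BMU's orthonormal basis B
    (rows = basis vectors): ||x - Orth(x, B)|| / ||x||."""
    x = np.asarray(x, dtype=float)
    B = np.asarray(B, dtype=float)
    proj = B.T @ (B @ x)
    return np.linalg.norm(x - proj) / np.linalg.norm(x)
```

The value is 0 when x lies in the BMU's subspace and 1 when x is orthogonal to it.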

Page 22: Experiments

Separation capability experiment

Page 23: Experiments

UCI benchmark databases experiment

Page 24: Experiments

UCI benchmark databases experiment (continued)

Page 25: Conclusions

A new self-organizing network that performs PCA
─ Related to the ASSOM.
─ Its training equations are much simpler.
─ Its input representation capability is broader.

Experiments show that the new model has better performance than the ASSOM network.

Page 26: Personal opinion

Valuable idea
─ A SOM based on PCA.

Contribution
─ Input data, cluster shape, and performance.

Drawback
─ Hard to implement.