Intelligent Database Systems Lab
National Yunlin University of Science and Technology
Advisor: Dr. Hsu    Student: Sheng-Hsuan Wang
Department of Information Management
A principal components analysis self-organizing map
Neural Networks 17 (2004) 261-270
Ezequiel Lopez-Rubio, Jose Munoz-Perez, Jose Antonio Gomez-Ruiz
Outline
Motivation
Objective
The ASSOM network
The PCASOM model
Experiments
Conclusion
Motivation
The adaptive subspace self-organizing map (ASSOM) is an alternative to the standard principal component analysis (PCA) algorithm
─ It looks for the most relevant features of the input data.
─ However, its training equations are complex.
The classical PCA and the ASSOM differ in their ability to separate input classes.
Objective
This paper proposes a new self-organizing neural model that performs principal components analysis
─ Like the ASSOM, but with a broader capability to represent the input distribution.
The ASSOM network
The ASSOM network uses subspaces in each node rather than just single weights.
The ASSOM network is based on training not just using single samples but sets of slightly translated, rotated and/or scaled signal or image samples, called episodes.
Each neuron of an ASSOM network represents a subset of the input data with a vector basis, which is adapted so that it captures the local geometry of the input data.
The ASSOM network
─ orthogonal projection: x̂ = Σh (xᵀbh) bh
─ projection error: x̃ = x − x̂
The ASSOM network: orthogonal projection
─ A vector x is projected on an orthonormal vector basis B = {bh | h = 1, …, K}.
─ The vector x can be decomposed into two vectors: the orthogonal projection and the projection error.
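The decomposition above can be sketched in a few lines of numpy; `decompose` and the example basis are illustrative names, not from the paper.

```python
import numpy as np

def decompose(x, B):
    """Split x into its orthogonal projection on the subspace spanned
    by the orthonormal basis B (rows b_h) and the projection error."""
    proj = sum((x @ b) * b for b in B)  # x_hat = sum_h (x . b_h) b_h
    return proj, x - proj               # x_tilde = x - x_hat

# Example: project onto the plane spanned by the first two axes of R^3.
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
x = np.array([3.0, 4.0, 5.0])
proj, err = decompose(x, B)
# proj = [3, 4, 0], err = [0, 0, 5]; proj and err are orthogonal
```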
The ASSOM network
The input vectors are grouped into episodes in order to be presented to the network.
An episode S(t) contains several time instants tp ∈ S(t), each with an input vector x(tp).
─ Episodes: sets of slightly translated, rotated, or scaled samples.
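As a minimal sketch of how a translated episode might be built from a signal (the helper `make_episode` and its parameters are illustrative assumptions, not the paper's procedure):

```python
import numpy as np

def make_episode(signal, start, width, shifts=(-1, 0, 1)):
    """Build an episode: windows cut from the signal at slightly
    translated positions, all showing the same underlying pattern."""
    return [signal[start + s : start + s + width] for s in shifts]

signal = np.sin(np.linspace(0.0, 2.0 * np.pi, 32))
episode = make_episode(signal, start=10, width=8)
# three slightly shifted views of the same waveform segment
```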
The ASSOM network
Winner lookup
Learning
─ Basis vectors rotation
─ Dissipation: eliminates instability
Orthonormalization
─ Orthonormalize every basis for good performance.
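The orthonormalization step can be sketched with classical Gram-Schmidt (a standard method; the paper does not prescribe this exact routine):

```python
import numpy as np

def orthonormalize(B):
    """Gram-Schmidt: return an orthonormal basis spanning the same subspace
    as the rows of B (assumed linearly independent)."""
    out = []
    for b in B:
        v = b - sum((b @ u) * u for u in out)  # remove components along earlier vectors
        out.append(v / np.linalg.norm(v))      # normalize
    return np.array(out)

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
Q = orthonormalize(B)
# Q @ Q.T is the 2x2 identity (up to rounding)
```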
The ASSOM network
The objective function is the average expected spatially weighted normalized squared projection error over the episodes.
─ The Robbins-Monro stochastic approximation is used to minimize the objective function, which leads to the basis vectors rotation.
The PCASOM network
Neuron weights updating
─ Use the covariance matrix to store the information.
─ The covariance matrix of the input vector x is defined as R = E[(x − E[x])(x − E[x])ᵀ].
─ With M input samples, R and the mean vector e are estimated from the sample averages.
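A minimal sketch of estimating R and e from M samples (the function name `estimate` is mine; this is the standard biased sample covariance):

```python
import numpy as np

def estimate(X):
    """Mean vector e and (biased) sample covariance R from M samples (rows of X)."""
    e = X.mean(axis=0)
    D = X - e
    return D.T @ D / len(X), e

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # M = 200 samples in R^3
R, e = estimate(X)
# R is symmetric and matches np.cov(X, rowvar=False, bias=True)
```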
The PCASOM network
The best approximation
─ It is an unbiased estimator with minimum variance.
─ If we obtain N new input samples, the estimates of R and e can be updated without reprocessing the original M samples.
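One standard way to fold N new samples into an existing estimate is to merge first and second moments; this is a sketch of the idea (for the biased covariance), not the paper's exact update equations:

```python
import numpy as np

def estimate(X):
    """Mean e and biased covariance R from the rows of X."""
    e = X.mean(axis=0)
    D = X - e
    return D.T @ D / len(X), e

def merge(R, e, M, X_new):
    """Fold N new samples (rows of X_new) into a mean/covariance estimate
    built from M earlier samples, without revisiting those samples."""
    N = len(X_new)
    s1 = M * e + X_new.sum(axis=0)                   # combined first moment
    s2 = M * (R + np.outer(e, e)) + X_new.T @ X_new  # combined second moment
    e_new = s1 / (M + N)
    return s2 / (M + N) - np.outer(e_new, e_new), e_new, M + N

rng = np.random.default_rng(1)
A, B = rng.normal(size=(60, 2)), rng.normal(size=(40, 2))
R0, e0 = estimate(A)
R1, e1, M1 = merge(R0, e0, len(A), B)
Rb, eb = estimate(np.vstack([A, B]))
# R1 and e1 match the batch estimate over all 100 samples
```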
The PCASOM network
The outputs of the algorithm are the new approximations of R and e.
The PCASOM network
Competition among neurons
─ The neuron c with the minimum sum of projection errors is the winner: c = arg mini Σ ‖x − Orth(x, Bi)‖².
─ Orth(x, B) is the orthogonal projection of the vector x on the basis B.
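The competition can be sketched for a single input vector (the paper sums errors over the current input vectors; function names and the example bases are mine):

```python
import numpy as np

def projection_error(x, B):
    """||x - Orth(x, B)|| for an orthonormal basis B (rows)."""
    return np.linalg.norm(x - B.T @ (B @ x))

def winner(x, bases):
    """Index of the neuron whose subspace reconstructs x best."""
    return int(np.argmin([projection_error(x, B) for B in bases]))

bases = [np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]),   # xy-plane
         np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])]   # xz-plane
x = np.array([1.0, 5.0, 0.5])
# x lies almost in the xy-plane, so neuron 0 wins the competition
```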
The PCASOM network
Network topology
─ Neighborhood function
─ Update the vector ei and the matrix Ri for every unit in the neighborhood.
The PCASOM network
Summary
─ For every unit i, obtain the initial covariance matrix Ri(0).
─ For every unit i, build the vector ei(0) by using small random values.
─ At time instant t, select the input vectors x(t). Compute the winning neuron c.
─ For every unit i, update the vector ei and the matrix Ri.
─ Check the convergence condition.
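The summary above can be sketched as a toy training loop. This is a deliberate simplification of the paper's method: the winner is picked by distance to the unit's mean rather than by projection error, the updates use an exponential moving average instead of the exact update equations, and the neighborhood function is omitted.

```python
import numpy as np

def train_pcasom(X, n_units, epochs=5, lr=0.1, seed=0):
    """Toy sketch: initialize e_i and R_i, find the winner for each
    input, and update the winner's statistics (simplified updates)."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    e = rng.normal(scale=0.01, size=(n_units, dim))   # small random e_i(0)
    R = np.stack([np.eye(dim)] * n_units)             # initial R_i(0)
    for _ in range(epochs):
        for x in X:
            c = int(np.argmin(np.linalg.norm(x - e, axis=1)))  # winning neuron
            d = x - e[c]
            e[c] += lr * d                                 # update e_c
            R[c] = (1 - lr) * R[c] + lr * np.outer(d, d)   # update R_c
    return e, R

X = np.tile([5.0, 5.0], (50, 1))        # trivially clustered data
e, R = train_pcasom(X, n_units=1)
# with a single unit, e converges to the data mean [5, 5]
```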
Comparison with ASSOM
Solidly rooted in statistics.
The update equations are more stable
─ matrix sums.
Does not need episodes.
It has a wider capability to represent the input distribution.
Drawback of the classical PCA
Experiments
Convergence speed experiment
─ Relative error for an input vector x: the projection error norm for the best matching unit (BMU) divided by the norm of the input vector.
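The relative error measure can be sketched as follows; the basis B stands in for the BMU's basis and is an illustrative choice of mine:

```python
import numpy as np

def relative_error(x, B):
    """Projection error norm for an orthonormal basis B, over ||x||."""
    return np.linalg.norm(x - B.T @ (B @ x)) / np.linalg.norm(x)

B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])          # the BMU's basis (illustrative)
x = np.array([3.0, 4.0, 5.0])
# error norm is 5 and ||x|| = sqrt(50), so the relative error is 1/sqrt(2)
```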
Experiments
Separation capability experiment
Experiments
UCI benchmark databases experiment
Conclusions
A new self-organizing network that performs PCA
─ Related to the ASSOM
─ Its training equations are much simpler
─ Its input representation capability is broader
Experiments show that the new model has better performance than the ASSOM network.
Personal opinion
Valuable idea
─ SOM based on PCA
Contribution
─ Input data
─ Cluster shape
─ Performance
Drawback
─ Hard to implement.