
Entropic graphs: Applications

Alfred O. Hero Dept. EECS, Dept. BME, Dept. Statistics

University of Michigan - Ann Arbor [email protected]

http://www.eecs.umich.edu/~hero

1. Dimension reduction and pattern matching
2. Entropic graphs for manifold learning
3. Simulation studies
4. Applications to face and digit databases

1. Dimension Reduction and Pattern Matching

• 128x128 images of faces
• Different poses, illuminations, facial expressions
• The set of all face images evolves on a lower dimensional embedded manifold in R^(16384)

Face Manifold

Classification on Face Manifold

Manifold Learning: What is it good for?

• Interpreting high dimensional data
• Discovery and exploitation of lower dimensional structure
• Deducing non-linear dependencies between populations
• Improving detection and classification performance
• Improving image compression performance

Background on Manifold Learning

1. Manifold intrinsic dimension estimation

1. Local KLE, Fukunaga, Olsen (1971)
2. Nearest neighbor algorithm, Pettis, Bailey, Jain, Dubes (1971)
3. Fractal measures, Camastra and Vinciarelli (2002)
4. Packing numbers, Kegl (2002)

2. Manifold Reconstruction

1. Isomap-MDS, Tenenbaum, de Silva, Langford (2000)
2. Locally Linear Embeddings (LLE), Roweis, Saul (2000)
3. Laplacian eigenmaps (LE), Belkin, Niyogi (2002)
4. Hessian eigenmaps (HE), Grimes, Donoho (2003)

3. Characterization of sampling distributions on manifolds

1. Statistics of directional data, Watson (1956), Mardia (1972)
2. Data compression on 3D surfaces, Kolarov, Lynch (1997)
3. Statistics of shape, Kendall (1984), Kent, Mardia (2001)

Sampling on a Domain Manifold

[Figure: a statistical sample is drawn from a sampling distribution on a 2-dim domain manifold; the embedding maps the domain sample to the observed sample.]

Learning 3D Manifolds
Ref: Tenenbaum et al. (2000)

• Sampling density f_y = Uniform on manifold

[Figure panels: N=400 and N=800 samples]

Ref: Roweis et al. (2000)

[Figure panels: Swiss Roll and S-Curve; sampled S-curve]

What is the shortest path between points A and B along the manifold?

The geodesic from A to B is the shortest path

The Euclidean path is a poor approximation

Geodesic Graph Path Approximation

[Figure: graph path between A and B]

Dijkstra's shortest path approximates the geodesic

k-NNG skeleton, k=4

ISOMAP (PCA) Reconstruction

• Compute k-NN skeleton on observed sample
• Run Dijkstra's shortest path algorithm between all pairs of vertices of the k-NN graph
• Generate geodesic pairwise distance matrix approximation
• Perform MDS on the geodesic distance matrix
• Reconstruct sample in manifold domain (a minimal sketch of these steps follows below)
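A minimal Python sketch of these steps (not the authors' implementation; assumes SciPy, a connected k-NN skeleton, and illustrative names isomap, k, d):

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial import cKDTree

def isomap(X, k=6, d=2):
    # 1. k-NN skeleton on the observed sample (column 0 of the query is the point itself)
    n = len(X)
    dist, idx = cKDTree(X).query(X, k + 1)
    rows = np.repeat(np.arange(n), k)
    W = csr_matrix((dist[:, 1:].ravel(), (rows, idx[:, 1:].ravel())), shape=(n, n))
    # 2-3. all-pairs shortest paths approximate the geodesic pairwise distance matrix
    G = dijkstra(W, directed=False)
    # 4. classical MDS: double-center the squared distances, take the top-d eigenvectors
    J = np.eye(n) - 1.0 / n
    B = -0.5 * J @ (G ** 2) @ J
    w, V = np.linalg.eigh(B)
    top = np.argsort(w)[::-1][:d]
    # 5. reconstructed sample in the d-dimensional manifold domain
    return V[:, top] * np.sqrt(np.maximum(w[top], 0))

For example, Y = isomap(X, k=6, d=2) recovers a 2-D parameterization of a sampled Swiss Roll.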

ISOMAP Convergence

• When the domain mapping is an isometry, the domain is open and convex, and the true domain dimension d is known, ISOMAP recovers the domain up to a rigid motion (de Silva et al., 2001)

• How to estimate d?

• How to estimate attributes of the sampling density?

How to Estimate d?

[Figure: residual variance vs. Isomap dimensionality (Landmark-ISOMAP residual curve) for the Abilene Netflow data set, Data Set 1.]

2. Entropic Graphs

• Sample X_n = {X_1, ..., X_n} in D-dimensional Euclidean space

• Euclidean MST with edge power weighting gamma:

  L_gamma(X_n) = min_T sum_{e in T} |e|^gamma,

  minimizing over all spanning trees T whose edge length matrix is derived from the pairwise distance matrix over X_n

• Euclidean k-NNG with edge power weighting gamma:

  L_gamma(X_n) = sum_i sum_{X_j in N_k(X_i)} |X_j - X_i|^gamma

• When Euclidean edge lengths are replaced by geodesic (graph) distances, obtain Geodesic MST (GMST)
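A short sketch of both graph lengths (my own helper names, assuming SciPy; since x^gamma is monotone for gamma > 0, the MST of the raw distances is also the MST of the powered distances):

import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial import cKDTree, distance_matrix

def mst_length(X, gamma=1.0):
    # Euclidean MST length with edge power weighting gamma
    T = minimum_spanning_tree(distance_matrix(X, X))
    return float((T.data ** gamma).sum())

def knn_length(X, k=4, gamma=1.0):
    # sum of gamma-powered edge lengths of the k-NN graph (column 0 is the point itself)
    dist, _ = cKDTree(X).query(X, k + 1)
    return float((dist[:, 1:] ** gamma).sum())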

Example: Uniform Planar Sample

Example: MST on Planar Sample

Example: k-NNG on Planar Sample

Convergence of Euclidean MST

Beardwood, Halton, Hammersley Theorem: for X_1, ..., X_n i.i.d. with density f on [0,1]^d and 0 < gamma < d,

  L_gamma(X_n) / n^((d-gamma)/d) -> beta_{d,gamma} * integral of f(x)^((d-gamma)/d) dx   (a.s.)

where beta_{d,gamma} is a constant that does not depend on f.

GMST Convergence Theorem (Ref: Costa&Hero:TSP2003): for samples on a smooth d-dimensional manifold embedded in R^D, the analogous limit holds with geodesic edge lengths; the growth exponent (d-gamma)/d reveals the intrinsic dimension d, and the limit determines the Renyi alpha-entropy of the sampling density with alpha = (d-gamma)/d.

k-NNG Convergence Theorem: an analogous a.s. limit holds for the gamma-weighted k-NNG length, with a different constant beta depending on k.

Shrinkwrap Interpretation

[Figure panels: n=400 and n=800]

Dimension = "shrinkage rate" of the graph length as the number of resampled points on M varies

Joint Estimation Algorithm

• Convergence theorem suggests the log-linear model

  log E[L_n] = a log n + b,  with slope a = (d - gamma)/d

• Use bootstrap resampling to estimate the mean graph length and apply least squares to jointly estimate the slope and intercept from the sequence {log E[L_n]}

• Extract d and H from the slope and intercept (see the sketch below)
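A minimal sketch of the joint estimate (my own names; the constant beta_{d,gamma} in the intercept is usually unknown and must be tabulated or simulated, here passed in as log_beta):

import numpy as np

def estimate_d_H(ns, mean_lengths, gamma=1.0, log_beta=0.0):
    # LS fit of the log-linear model log E[L_n] = a*log(n) + b (assumes 0 < a < 1)
    a, b = np.polyfit(np.log(ns), np.log(mean_lengths), 1)
    d = max(1, round(gamma / (1.0 - a)))      # slope a = (d - gamma)/d
    alpha = (d - gamma) / d
    H = (b - log_beta) / (1.0 - alpha)        # Renyi alpha-entropy estimate
    return d, H

Here mean_lengths can be bootstrap means of mst_length (or knn_length) over subsamples of sizes ns.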

3. Simulation Studies: Swiss Roll

[Figure panels: GMST and kNN (k=4) graphs on the Swiss Roll]

• n=400, f=Uniform on manifold

Estimates of GMST Length

[Figure: segment n = 786:799 of the mean MST length sequence E[L_n] (gamma=1, m=10) for the uniformly sampled Swiss Roll; bootstrap SE bars (83% CI).]

Log-log Linear Fit to GMST Length

[Figure: log(E[L_n]) vs. log(n) for the uniformly sampled Swiss Roll (gamma=1, m=10), with LS fit y = 0.53*x + 3.2.]

GMST Dimension and Entropy Estimates

• From the LS fit find:
  – Intrinsic dimension estimate d-hat = 2
  – Alpha-entropy estimate (alpha = (d - gamma)/d = 1/2)
  – Ground truth: d = 2 (the Swiss Roll is a 2-dim manifold)
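As a numerical check: with gamma = 1 the fitted slope a-hat = 0.53 gives d-hat = gamma/(1 - a-hat) = 1/0.47 ≈ 2.1, which rounds to the true dimension d = 2.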

MST/kNN Comparisons

[Figure panels: MST and kNN graphs for n=400 and n=800]

Entropic Graphs on S2 Sphere in 3D

• n=500, f=Uniform on manifold

[Figure panels: GMST and kNN]

k-NNG on Sphere S4 in 5D

Histogram of resampled d-estimates of k-NNG

N=1000 points uniformly distributed on S4 (sphere) in 5D

• k=7 for all algorithms

• kNN resampled 5 times

• Length regressed on 10 or 20 samples at end of mean length sequence

• 30 experiments performed

• ISOMAP always estimates d=5
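For concreteness, a hypothetical re-creation of this experiment using the helpers sketched earlier (knn_length and estimate_d_H); normalizing Gaussian vectors is a standard way to draw uniform points on a sphere:

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))
X /= np.linalg.norm(X, axis=1, keepdims=True)    # N=1000 uniform points on S4 in 5D

ns = np.arange(400, 1001, 50)                    # tail of the mean length sequence
lengths = [np.mean([knn_length(X[rng.choice(1000, n, replace=False)], k=7)
                    for _ in range(5)])          # kNN resampled 5 times
           for n in ns]
d_hat, H_hat = estimate_d_H(ns, lengths, gamma=1.0)   # intrinsic dimension of S4 is 4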

[Table: relative frequencies of correct d estimate vs. sample size n]

[Table: estimated entropy (n = 600) vs. true entropy]

kNN/GMST Comparisons

[Figure panels: GMST and 4-NN]

kNN/GMST Comparisons for Uniform Hyperplane

Improve Performance by Bootstrap Resampling

• Main idea: averaging of weak learners
  – Using fewer (N) samples per MST estimate, generate a large number (M) of weak estimates of d and H
  – Reduce bias by averaging these estimates (M >> 1, N = 1)
  – Better than optimizing the estimate of MST length (M = 1, N >> 1)
  (a code sketch follows below)

[Figure: illustration of the bootstrap resampling method; panels A, B: N = 1 vs. panel C: M = 1]

[Table: relative frequencies of correct d estimate using the GMST, with (N = 1) and without (M = 1) bias correction.]
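A sketch of the weak-learner averaging idea (my own loop and names; mst_length and estimate_d_H are the helpers sketched earlier):

import numpy as np

def averaged_d_hat(X, ns, M=30, gamma=1.0, seed=None):
    # average M weak estimates of d, each fit on small bootstrap subsamples
    rng = np.random.default_rng(seed)
    d_hats = []
    for _ in range(M):
        lengths = [mst_length(X[rng.choice(len(X), n, replace=False)], gamma)
                   for n in ns]
        d_hats.append(estimate_d_H(ns, lengths, gamma)[0])
    return float(np.mean(d_hats))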

kNN/GMST Comparisons for Uniform Hyperplane

4. Application: ISOMAP Face Database

• http://isomap.stanford.edu/datasets.html

• Synthesized 3D face surface
• Computer generated images representing 700 different angles and illuminations
• Subsampled to 64 x 64 resolution (D=4096)
• Disagreement over intrinsic dimensionality

– d=3 (Tenenbaum) vs d=4 (Kegl)

Resampling Histogram of d-hat

[Figure: mean GMST length function: d-hat = 3, H-hat = 21.1 bits. Mean kNNG (k=7) length: d-hat = 4, H-hat = 21.8 bits.]

Application: Yale Face Database

• Description of Yale face database B
  – Photographic folios of many people's faces
  – Each face folio contains images at 585 different illumination/pose conditions
  – Subsampled to 64 by 64 pixels (4096 extrinsic dimensions)

• Objective: determine intrinsic dimension and entropy of a typical face folio

Samples from Face database B

GMST for 3 Face Folios

Real valued intrinsic dimension estimates using 3-NN graph for face 1.

Real valued intrinsic dimension estimates using 3-NN graph for face 2.

Dimension Estimator Histograms for Face database B

Remarks on Yale Facebase B

• GMST LS estimation parameters
  – Local geodesic approximation used to generate the pairwise distance matrix
  – Estimates based on 25 resamplings over the 18 largest folio sizes

• To represent any folio we might hope to attain
  – a factor > 600 reduction in degrees of freedom (dim)
  – only 1/10 bit per pixel for compression
  – a practical parameterization/encoder?

Sample: MNIST Handwritten Digits

Application: MNIST Digit Database

Estimated intrinsic dimension

Histogram of intrinsic dimension estimates: GMST (left) and 5-NN (right) (M = 1, N = 10, Q = 15).

MNIST Digit Database

ISOMAP (k = 6) residual variance plot.

The digits database contains nonlinear transformations, such as width distortions of each digit, that are not adequately modeled by ISOMAP!

MNIST Digit Database

Conclusions

• Entropic graphs give accurate, global, and consistent estimators of dimension and entropy

• Manifold learning and model reduction
  – LLE, LE, HE estimate d by finding a local linear representation of the manifold
  – Entropic graphs estimate d from global resampling
  – Initialization of ISOMAP… with entropic graph estimator

• Computational considerations
  – GMST, kNN with pairwise distance matrix: O(E log E)
  – GMST with greedy neighborhood search: O(d n log n)
  – kNN with kdb tree partitioning: O(d n log n)

References

• A. O. Hero, B. Ma, O. Michel and J. D. Gorman, "Applications of entropic spanning graphs," IEEE Signal Processing Magazine, Sept. 2002.

• H. Neemuchwala, A. O. Hero and P. Carson, "Entropic graphs for image registration," to appear in European Journal of Signal Processing, 2003.

• J. Costa and A. O. Hero, "Manifold learning with geodesic minimal spanning trees," to appear in IEEE T-SP (Special Issue on Machine Learning), 2004.

• A. O. Hero, J. Costa and B. Ma, "Convergence rates of minimal graphs with random vertices," submitted to IEEE T-IT, March 2001.

• J. Costa, A. O. Hero and C. Vignat, "On solutions to multivariate maximum alpha-entropy problems," in Energy Minimization Methods in Computer Vision and Pattern Recognition (EMMCVPR), eds. M. Figueiredo, A. Rangarajan and J. Zerubia, Springer-Verlag, 2003.