Local Quadrature Reconstruction On Smooth Manifolds
LOCAL QUADRATURE RECONSTRUCTION ON SMOOTH MANIFOLDS
M.Tech Thesis Submitted by
Bhuwan Dhingra (Y8127167)
To the Department of Electrical Engineering
IIT Kanpur
Supervisors – Prof. Amitabha Mukerjee, Prof. K. S. Venkatesh
2
EXAMPLES
Image sets with a few degrees of freedom
Disk-Shaped Planar Robot –
Each image lies in ℝⁿ but has only 2 degrees of freedom; the images are sampled from a 2-d manifold
Other Examples:
n = 76 × 101 × 3, m = 1
n = 100 × 100 × 3, m = 2
n = 100 × 100 × 3, m = 2
3
MANIFOLDS
An m-dimensional manifold is a topological space which resembles Euclidean space near each point
The manifold itself may lie in ℝⁿ, but it is everywhere locally homeomorphic to ℝᵐ; generally n ≫ m
Homeomorphism – a continuous mapping with a continuous inverse
4
LATENT SPACE
Manifold points in ℝⁿ; global latent vectors in ℝᵐ
Cannot find global latent vectors for the sphere, torus, cylinder, etc., as these are not homeomorphic to any Euclidean space.
5
DIMENSIONALITY REDUCTION
Linear – Principal Components Analysis (PCA): finds a linear subspace in the direction of maximum data variance
6
NON-LINEAR DIMENSIONALITY REDUCTION (NLDR)
Kernel PCA (Schölkopf, 1999) – applies the kernel trick to project the data to a high-dimensional space, followed by ordinary PCA
ISOMAP (Tenenbaum, 2000) – preserves geodesic distances between points on the manifold
LLE (Saul, 2000) – each point is expressed as a linear combination of its nearest neighbors, and these relationships are preserved in a global low-dimensional embedding
7
NON-LINEAR DIMENSIONALITY REDUCTION (NLDR)
8
PCA VS NLDR
PCA
• Works only for data isometric to a hyperplane in ℝⁿ
• Provides an explicit mapping between the latent space and the manifold
NLDR
• Works for arbitrary non-linear manifolds
• Gives the embedding only for the training points, with no mapping between the two spaces
9
OUT-OF-SAMPLE POINTS
Out-of-Sample Extension – find the latent vector 𝒚𝒒 ∈ S for a new manifold point 𝒙𝒒 ∈ M
Out-of-Sample Reconstruction – find the manifold point 𝒙𝒒 ∈ M for a new latent vector 𝒚𝒒 ∈ S
10
OUT OF SAMPLE EXTENSION
(Bengio, 2004) – cast several popular NLDR methods into a unified framework, as special cases of Kernel PCA with data-dependent kernels
The Nyström method is used to approximate the out-of-sample extension
(Strange, 2011) –
11
OUT OF SAMPLE RECONSTRUCTION
Applications -
Video Frame Interpolation
12
OUT OF SAMPLE RECONSTRUCTION
Applications -
Generating Novel Views of an Object
13
OUT OF SAMPLE RECONSTRUCTION
Applications -
Robot Motion Planning – testing if a local path is collision-free
Local Planner
Reconstruct to see if collision-free
14
OUT OF SAMPLE RECONSTRUCTION
Existing Methods –
Linear interpolation: find the k nearest neighbors of the new point and minimize
ℰ(𝑊) = ‖𝒚𝒒 − ∑ᵢ 𝑤ᵢ𝒚ᵢ‖²
Reconstruction: 𝒙̂𝒒 = ∑ᵢ 𝑤ᵢ𝒙ᵢ
Equivalent to fitting a hyper-plane through a small neighborhood on the manifold
The least-squares solution gives the optimal weights; requires k > m points
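As a concrete sketch of this baseline (illustrative variable names; the sum-to-one constraint on the weights is assumed here, since it is what makes the fit an affine hyper-plane through the neighborhood):

```python
import numpy as np

def linear_reconstruction(y_q, Y, X, k):
    """Baseline linear interpolation: express y_q as an affine combination
    of its k nearest latent neighbors, then apply the same weights to the
    corresponding images. Y: (N, m) latent vectors, X: (N, n) images."""
    idx = np.argsort(np.linalg.norm(Y - y_q, axis=1))[:k]
    # weights minimizing ||y_q - sum_i w_i y_i||^2 subject to sum_i w_i = 1,
    # solved as one least-squares system with an appended constraint row
    A = np.vstack([Y[idx].T, np.ones(k)])
    b = np.append(y_q, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w @ X[idx]  # reconstruction with the same weights in image space
```

For data that actually lies on a hyper-plane, this reconstruction is exact, which is the sense in which the method fits a hyper-plane locally.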
15
OUT OF SAMPLE RECONSTRUCTION
Existing Methods –
Locally Smooth Manifold Learning (Dollár, 2006):
Learns a warping function on the manifold which, given a point, generates its neighbors, using a global regression
The computation time of LSML grows with the total number of points on the manifold
16
LOCAL QUADRATURE RECONSTRUCTION (LQR)
Consider a local patch on an m-dimensional hypersurface in ℝᵐ⁺¹ (co-dimension 1)
Take the point 𝑝 to be the origin and the tangent space to be spanned by the first m canonical vectors
𝒙 = [𝑥₁, 𝑥₂, …, 𝑥ₘ, 𝑓(𝑥₁, …, 𝑥ₘ)]
17
LOCAL DIFFERENTIAL GEOMETRIC MODEL
Smoothness of the manifold implies that 𝑓 has a Taylor expansion around the origin
For our choice of coordinate system, 𝑓(𝟎) = 0 and ∇𝑓(𝟎) = 𝟎
Hessian: 𝐻 = ∑ᵢ 𝜅ᵢ𝒗ᵢ𝒗ᵢᵀ
𝒗ᵢ: Principal Directions – span the tangent space; 𝜅ᵢ: Principal Curvatures
18
PRINCIPAL DIRECTIONS AND CURVATURES
19
LOCAL DIFFERENTIAL GEOMETRIC MODEL
The first m unit vectors of the coordinate system are chosen to lie along the principal directions
Ignoring higher-order terms gives the Quadrature Embedding
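Written out, truncating the Taylor series after the second-order term gives the quadrature embedding (standard notation: κᵢ are the principal curvatures, and the coordinate axes are aligned with the principal directions, so the Hessian is diagonal):

```latex
f(x_1,\dots,x_m) \;=\; \frac{1}{2}\sum_{i=1}^{m}\kappa_i x_i^{2} \;+\; O(\|x\|^{3})
\qquad\Longrightarrow\qquad
\hat{\boldsymbol{x}} \;=\; \Big[\,x_1,\;\dots,\;x_m,\;\tfrac{1}{2}\sum_{i=1}^{m}\kappa_i x_i^{2}\,\Big]
```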
20
QUADRATURE EMBEDDING OF SMOOTH MANIFOLDS
Generalization to m-dimensional Riemannian manifolds in ℝⁿ (Tyagi, 2012)
𝒛 = [𝑧₁ 𝑧₂ … 𝑧ₘ] ∈ 𝑇ₚ𝑀 – Tangent Space Components
𝑥̂ₘ₊ᵢ = ½ 𝒛ᵀ𝐻⁽ⁱ⁾𝒛 – Normal Space Components
21
QUADRATIC REGRESSION
If the tangent vectors are aligned with the principal directions:
𝑥̂ₘ₊ᵢ = ½ ∑ⱼ 𝜅ⱼ⁽ⁱ⁾ 𝑧ⱼ²
If not, we need cross-terms:
𝑥̂ₘ₊ᵢ = ½ ∑ⱼ ∑ₖ 𝑐ⱼₖ⁽ⁱ⁾ 𝑧ⱼ𝑧ₖ
In general, for robust estimation:
𝑥̂ₘ₊ᵢ = ℎ⁽ⁱ⁾(𝒛̂)
Need at least m(m+1)/2 + 1 points
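A sketch of the general case: the design matrix holds all m(m+1)/2 degree-2 monomials zⱼzₖ, and each normal component is fit by least squares over the neighborhood (function names are illustrative; no linear or constant terms, matching f(𝟎) = 0 and the vanishing gradient at the base point):

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_features(Z):
    """All degree-2 monomials z_j * z_k (j <= k) of the tangent-space
    coordinates: m(m+1)/2 columns for m input dimensions. Z: (N, m)."""
    N, m = Z.shape
    cols = [Z[:, j] * Z[:, k]
            for j, k in combinations_with_replacement(range(m), 2)]
    return np.stack(cols, axis=1)

def fit_normal_component(Z, x_norm):
    """Least-squares fit of one normal-space component as a quadratic
    form of the tangent coordinates."""
    A = quadratic_features(Z)
    c, *_ = np.linalg.lstsq(A, x_norm, rcond=None)
    return c
```

The column count makes the point-count requirement concrete: the regression is only determined once the neighborhood has at least m(m+1)/2 points.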
22
MOTIVATION FOR LQR
There are m(m+1)/2 curvature parameters for each of the normal space directions above
Estimating all of them would require a prohibitively large number of points in the neighborhood for regression
Claim: directions of high data variance exhibit high curvature
LQR therefore extracts only the top d principal components from the normal space
23
LQR
Tangent and Normal Space Estimation
• PCA
Linear Regression on Tangent Space
• Least-squares estimation on the first m principal components
Quadratic Regression on Normal Space
• Least-squares estimation along the next d principal components
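Putting the three stages together, a minimal end-to-end sketch (my reading of the pipeline, not the thesis code: names are illustrative, cross-terms are omitted from the quadratic basis for brevity, and a constant term is added because the PCA origin, the neighborhood mean, need not lie on the manifold):

```python
import numpy as np

def lqr_reconstruct(y_q, Y, X, k1, k2, d):
    """Sketch of Local Quadrature Reconstruction.
    Y: (N, m) latent vectors, X: (N, n) images, y_q: (m,) query."""
    m = Y.shape[1]
    nn = np.argsort(np.linalg.norm(Y - y_q, axis=1))
    # 1. Local PCA on the k1 nearest neighbors: first m principal
    #    directions span the tangent space, the next d the normal space.
    P = X[nn[:k1]]
    mu = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - mu, full_matrices=False)
    T, Nrm = Vt[:m], Vt[m:m + d]
    # 2. Linear regression on the tangent space: affine weights that
    #    reproduce y_q from its k2 latent neighbors (sum-to-one row).
    idx = nn[:k2]
    A = np.vstack([Y[idx].T, np.ones(k2)])
    b = np.append(y_q, 1.0)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    Z = (X[idx] - mu) @ T.T            # tangent coords of the neighbors
    z_q = w @ Z                        # tangent coords of the query
    # 3. Quadratic regression on the normal space: each normal component
    #    as a quadratic in the tangent coords (squares only here).
    F = np.concatenate([Z**2, Z, np.ones((k2, 1))], axis=1)
    C, *_ = np.linalg.lstsq(F, (X[idx] - mu) @ Nrm.T, rcond=None)
    h_q = np.concatenate([z_q**2, z_q, [1.0]]) @ C
    # 4. Back-project tangent + normal components into image space.
    return mu + z_q @ T + h_q @ Nrm
```

On a synthetic parabola in ℝ³, the quadratic normal-space term recovers exactly the curvature that linear interpolation misses.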
24
TANGENT AND NORMAL SPACE ESTIMATION
Eigenvectors of PCA on the k₁ nearest neighbors of the image shown
• 𝒖₁ … 𝒖ₘ – Tangent Vectors
• 𝒖ₘ₊₁ … – Normal Vectors
As the sampling density on the manifold increases, the tangent space found by PCA approaches the true tangent space (Tyagi, 2012)
k₁ = 14
25
2-D LINEAR LEAST SQUARES
k₂ = 7
26
QUADRATIC REGRESSION
Figure: Test Image vs. LQR vs. Linear reconstructions
27
ROTATING TEAPOT
n = 23028, m = 1
28
REGULARIZATION
Important to avoid overfitting, since the number of neighbors is not much greater than the number of parameters
Linear Regression: ℰ(𝑊) = ‖𝒚𝒒 − ∑ᵢ 𝑤ᵢ𝒚ᵢ‖² + λ_L 𝑊ᵀ𝑊
Quadratic Regression: the analogous least-squares objective with penalty weight λ_Q
29
FREE PARAMETERS
30
NUMBER OF NORMAL COMPONENTS
For setting d we use the following rule –
𝜆ᵢ are the eigenvalues of the covariance matrix from the PCA step
d is set to the minimum value such that the retained eigenvalues capture at least a fraction T of the data variance
T is the threshold of data variance we want to consider
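One plausible implementation of this rule (an assumption on my part: I read it as keeping the smallest d such that the first m + d eigenvalues reach the energy threshold T):

```python
import numpy as np

def choose_num_normal_components(eigvals, m, T=0.95):
    """Smallest d such that the first m + d PCA eigenvalues capture at
    least a fraction T of the total variance. eigvals: descending order."""
    ratios = np.cumsum(eigvals) / np.sum(eigvals)
    # first index whose cumulative variance ratio reaches T, then
    # count how many of those components lie beyond the m tangent dims
    total = int(np.searchsorted(ratios, T) + 1)
    return max(total - m, 0)
```

If the tangent components alone already exceed the threshold, the rule returns d = 0 and the method degenerates to the linear step.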
31
COMPLEXITY
k-NN search
PCA
Linear Regression
Quadratic Regression
Projection
32
ROTATING TEAPOT
Figure: Original Images vs. LQR vs. Linear Reconstructions
MSE_LQR = 80.44, MSE_Lin = 99.97
33
DISK-SHAPED PLANAR ROBOT
d was set with an energy threshold
34
DISK-SHAPED PLANAR ROBOT
Top – Original Images, Middle – LQR, Bottom - Linear
k₁ = 14, k₂ = 7, λ_L = 10⁻³, λ_Q = 1
35
DISK-SHAPED PLANAR ROBOT
Of 200 test images, LQR outperformed linear interpolation on 183
Size of each test point is drawn proportional to its error
36
SOME FAILURES
37
VIDEO COMPRESSION
Video sequences with few degrees of freedom are low-dimensional trajectories in the space of all images (m ≪ n)
NLDR methods can be used to assign latent vectors to each frame
Transmitter: run NLDR on the full set of frames; transmit all latent vectors but retain only a subset of the frames
Receiver: reconstruct the dropped frames from their latent vectors using LQR
38
FOREMAN VIDEO SEQUENCE
39
FRAME INTERPOLATION
40
FRAME INTERPOLATION
41
FEATURES OF LQR
Advantages –
Finds better reconstructions than linear interpolation by considering second-order terms
No training phase
Can be applied to any latent space generated by any NLDR algorithm
42
FEATURES OF LQR
Limitations –
The number of neighbors needed increases with the dimensionality of the manifold as m(m+1)/2
Need an exponentially greater total number of points on the manifold
Computation time increases accordingly
Overfitting due to the large number of parameters in the regression
Cannot be used for manifolds with a high value of m; e.g. the MNIST digits dataset, face datasets
43
THANK YOU
44
APPENDIX
45
EXAMPLES
Curves and surfaces –
Circle: n = 2, m = 1
Spiral: n = 3, m = 1
Swiss-Roll: n = 3, m = 2
46
LATENT VECTORS
Manifolds which are also globally homeomorphic to ℝᵐ can be endowed with an m-dimensional representation, called its Latent Vectors
Latent vectors are not unique
Latent space may be known explicitly from function generating data, or can be found using Dimensionality Reduction
47
DEFINITIONS
48
LOCAL QUADRATURE RECONSTRUCTION (LQR)
Restricted to a small neighborhood on the manifold like linear interpolation
Fits a differential geometric model to this neighborhood
Better reconstruction than linear interpolation since we retain up to second order terms in the Taylor series expansion
49
LQR
Tangent and Normal Space Estimation
• Extract principal components from the k₁ nearest neighbors of 𝒙₁ (the nearest neighbor of 𝒙𝒒)
• The first m span the Tangent Space at 𝑝 (Tyagi, 2012); the next d are orthogonal to the Tangent Space
Linear Regression on Tangent Space
• Linearly interpolate between the Latent Space and the Tangent Space to find the projection of the out-of-sample point onto 𝑇ₚ𝑀
• Both these spaces are m-dimensional
• Need k₂ > m points in the neighborhood
Quadratic Regression on Normal Space
• Fit a second-order equation along each of the d normal space components
• Least-squares regression over the k₂ nearest neighbors is used to find the optimal coefficients of the equation
• Need m(m+1)/2 + 1 points in the neighborhood
50
TANGENT AND NORMAL SPACE ESTIMATION (PCA)
𝑋 = [𝒙₁ 𝒙₂ … 𝒙ₖ₁]: the k₁ nearest neighbors of 𝒙𝒒
Eigendecomposition of the local covariance: 𝑈Λ𝑈ᵀ, with 𝑈 = [𝒖₁ 𝒖₂ … 𝒖ₖ₁], Λ = diag(𝜆₁, 𝜆₂, …, 𝜆ₖ₁)
Estimated Tangent Space: 𝑇ₚ𝑀 = span(𝒖₁, 𝒖₂, …, 𝒖ₘ)
Estimated Normal Space: 𝑁ₚ𝑀 = span(𝒖ₘ₊₁, 𝒖ₘ₊₂, …, 𝒖ₘ₊𝒅)
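In code, the estimation step might look like the following (a sketch; centering on the neighborhood mean is my assumption, since the slide leaves the centering implicit):

```python
import numpy as np

def tangent_normal_spaces(X_nbrs, m, d):
    """Estimate tangent/normal bases from the k1 nearest neighbors:
    eigenvectors of the local covariance, first m -> T_pM, next d -> N_pM.
    X_nbrs: (k1, n) neighbor matrix."""
    mu = X_nbrs.mean(axis=0)
    C = (X_nbrs - mu).T @ (X_nbrs - mu) / len(X_nbrs)
    eigvals, eigvecs = np.linalg.eigh(C)      # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]         # re-sort descending
    U = eigvecs[:, order]
    return U[:, :m], U[:, m:m + d], eigvals[order]
```

For points sampled from a plane in ℝ³, the two tangent vectors lie in the plane and the normal vector recovers its normal direction.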
51
SWISS ROLL
k₁ = 18
52
LINEAR REGRESSION
𝑌: the k₂ nearest neighbors of 𝒚𝒒 in the latent space
Need k₂ > m points
53
LEAST SQUARES WITH REGULARIZATION
We want to solve an overdetermined system of equations 𝐴𝒘 = 𝒃
Least Squares – minimize ‖𝐴𝒘 − 𝒃‖²
Regularization – minimize ‖𝐴𝒘 − 𝒃‖² + 𝜆𝒘ᵀ𝒘
Solution – 𝒘 = (𝐴ᵀ𝐴 + 𝜆𝐼)⁻¹𝐴ᵀ𝒃
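The closed-form solution can be checked in a few lines (generic 𝐴 and 𝒃; this is standard ridge regression, with 𝜆 = 0 recovering ordinary least squares):

```python
import numpy as np

def ridge_solve(A, b, lam):
    """Closed-form regularized least squares:
    argmin_w ||A w - b||^2 + lam * w^T w = (A^T A + lam I)^{-1} A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

Increasing 𝜆 shrinks the norm of the weight vector, which is exactly the overfitting control the regularization slide relies on.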
54
LQR VS LINEAR RECONSTRUCTION
55
IMAGE SETS
Modification – Use instead of for finding the tangent space and normal space components
56
EXTRAPOLATION VS INTERPOLATION
57
PLANAR ARTICULATED ROBOT ARM
MSE_LQR = 8.74, MSE_Lin = 9.63