Outline Motivation Examples Theoretical Problems Results Perspectives
Manifold Learning
Olivier Bousquet
Curves and Surfaces, Avignon, 2006
Olivier Bousquet Manifold Learning
1 Motivation
2 Examples
3 Theoretical Problems
4 Results
5 Perspectives
Acknowledgements
Most of the work presented here has been done by or in collaboration with
Ulrike von Luxburg
Matthias Hein (special thanks for the pictures)
Mikhail Belkin
Jean-Yves Audibert
Olivier Chapelle
Learning
Machine Learning: develop algorithms to automatically extract "patterns" or "regularities" from data (generalization)
Typical tasks
Clustering: find groups of similar points
Dimensionality reduction: project points into a lower-dimensional space while preserving structure
Semi-supervised: given labelled and unlabelled points, build a labelling function
Supervised: given labelled points, build a labelling function
None of these tasks is well-defined a priori
Clustering
[Figure: 2D scatter plot of unlabelled data points]
Identify the two ”groups”
Dimensionality Reduction
Objects in R^4096? But there are only 3 parameters: 2 angles and 1 for illumination. They span a 3-dimensional submanifold of R^4096!
Semi-supervised Learning
[Figure: 2D scatter plot with a few labelled points and many unlabelled (black) points]
Assign a label to the black points
Supervised Learning
[Figure: 2D scatter plot of labelled data points]
Build a function which predicts the label of all points in the space
Formal Definitions
Clustering: given {X1, ..., Xn}, build a function f : X → {1, ..., k}
Dimensionality reduction: given {X1, ..., Xn} ∈ R^D, build a function f : R^D → R^d
Semi-supervised: given {X1, ..., Xn} and {Y1, ..., Ym} with m ≪ n, build a function f : X → Y
Supervised: given {(X1, Y1), ..., (Xn, Yn)}, build f : X → Y
Questions
Key question: how to define "regularity", or similarity between points?
How to translate this into an algorithm?
Focus of this talk: consider and exploit the geometric structure of the data
Learning Theory: provide a framework for analyzing the behavior of these algorithms
Geometric Learning
Typically, one tries to implement the following simple idea
Clustering: two "close" points should be in the same cluster
Dimensionality reduction: "closeness" should be preserved
Semi-supervised/supervised: two "close" points should have the same label
Examples: k-means clustering, k-nearest neighbors...
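As an illustration of the "close points share a label" idea, the k-nearest-neighbors rule fits in a few lines. A minimal sketch (the data, seed, and parameter k are illustrative, not from the talk):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Label each test point by majority vote among its k nearest
    training points (Euclidean distance)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
        nearest = np.argsort(d)[:k]               # indices of the k closest
        preds.append(np.bincount(y_train[nearest]).argmax())  # majority label
    return np.array(preds)

# Two well-separated blobs: nearby points get the same label.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(1, 0.1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(knn_predict(X, y, np.array([[0.0, 0.0], [1.0, 1.0]])))  # [0 1]
```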
Manifold Learning
The notion of closeness can be further refined using the distribution of the data:
Probabilistic viewpoint: density shortens the distances
Cluster viewpoint: points in connected regions share the same properties
Manifold viewpoint: distance should be measured "along" the data manifold
Mixed version: two "close" points are those connected by a short path going through high-density regions
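The manifold viewpoint can be made concrete with a neighborhood graph: connect only nearby points and measure distance by shortest paths in the graph, so that distance is accumulated along the data. A sketch (the half-circle data and the threshold eps are illustrative assumptions):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

# Points on a half-circle: the Euclidean distance between the two endpoints
# is the chord (2.0), while the distance "along the data" is the arc length.
t = np.linspace(0, np.pi, 50)
X = np.c_[np.cos(t), np.sin(t)]

# Neighborhood graph: connect only points closer than a threshold eps,
# with edge weight equal to the Euclidean length of the edge.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
eps = 0.2
W = np.where(D < eps, D, 0.0)          # 0 entries mean "no edge" for csgraph
G = shortest_path(W, directed=False)   # path lengths along the graph

print(D[0, -1])   # chord length: 2.0
print(G[0, -1])   # close to the arc length pi ≈ 3.14
```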
Regularization
Adopt a functional viewpoint for convenience
We want to define functions that conform with the regularities mentioned above: values at close points should be close: Xi ∼ Xj ⇒ f(Xi) ∼ f(Xj)
Gradient norm: ∫ ‖∇f‖² dµ
Weighted gradient norm: ∫ ‖∇f‖² p^α dµ
Manifold gradient: ∫ ‖∇_M f‖² p^α dµ
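When p is known, these functionals can be approximated by Monte Carlo, since ∫ ‖∇f‖² p^α dµ = E_{X∼p}[‖∇f(X)‖² p(X)^(α−1)]. A toy sketch with an illustrative choice of p (standard 2D Gaussian) and f (f(x) = sin(x0)), not from the talk:

```python
import numpy as np

# Monte Carlo estimate of the weighted gradient norm  ∫ ||∇f||² p^α dµ
# via  ∫ ||∇f||² p^α dµ = E_{X~p}[ ||∇f(X)||² p(X)^(α-1) ].
rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 2))   # samples from p = N(0, I)

def p(x):
    """Density of N(0, I) in 2D."""
    return np.exp(-0.5 * (x ** 2).sum(axis=1)) / (2 * np.pi)

def grad_f_sq(x):
    """f(x) = sin(x0), so ||∇f(x)||² = cos(x0)²."""
    return np.cos(x[:, 0]) ** 2

for alpha in (1.0, 2.0):
    est = np.mean(grad_f_sq(X) * p(X) ** (alpha - 1))
    print(alpha, est)
# For alpha = 1 the exact value is (1 + e^{-2}) / 2 ≈ 0.568;
# larger alpha concentrates the penalty on high-density regions.
```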
Implementation of this idea
Problem: the manifold structure M is unknown, and the density is unknown!
Possible approaches: density estimation, manifold estimation, density estimation on a manifold?
None of these:
Manifolds or densities may not exist as such
Just focus on the smoothness to be enforced on the data points
Laplacian Regularization
Neighborhood graph: weighted graph with the xi as vertices and edge weights w(xi, xj) = h(‖xi − xj‖) (e.g. h(z) = e^{−z²}, h(z) = 1[z ≤ t], ...)
Regularizer:
N(f) = (1/2) Σ_{i,j=1}^n w(xi, xj) (f(xi) − f(xj))²
When w(xi, xj) is large (the points are close), f should take close values at xi and xj
Graph Laplacian: L = D − W, with Wij = w(xi, xj) and Dii = Σ_j Wij; then
N(f) = f^T L f
Variant: the normalized Laplacian L′ = I − D^{−1} W
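The identity between the sum-of-squared-differences regularizer and the quadratic form in L (note the factor 1/2: Σ_{i,j} w_ij (f_i − f_j)² = 2 f^T L f) can be checked numerically. A sketch with illustrative random data and Gaussian weights:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 2))                       # illustrative data points

# Neighborhood graph with Gaussian weights h(z) = exp(-z²)
D2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2)
np.fill_diagonal(W, 0.0)

Dg = np.diag(W.sum(axis=1))                        # degree matrix D_ii = sum_j W_ij
L = Dg - W                                         # unnormalized graph Laplacian
Ln = np.eye(len(x)) - np.linalg.inv(Dg) @ W        # normalized variant L' = I - D^{-1} W

f = rng.normal(size=len(x))                        # arbitrary function values on the points
N = 0.5 * (W * (f[:, None] - f[None, :]) ** 2).sum()
assert np.isclose(N, f @ L @ f)                    # (1/2) Σ_ij w_ij (f_i - f_j)² = f^T L f
assert np.allclose(L.sum(axis=1), 0)               # constants are in the nullspace of L
```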
Using Laplacian Regularizer
Dimensionality reduction: project onto the eigenvectors of L with the smallest nonzero eigenvalues
Clustering: threshold the eigenvectors of L, or project first and run k-means on the embedding
min_{f ⊥ 1, ‖f‖ = 1} f^T L f
Semi-supervised/supervised: use regularization
min_f Σ_{i=1}^n (f(xi) − yi)² + λ f^T L f
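In the semi-supervised variant the data-fit term runs over the labelled points only, and the objective is quadratic in the vector of function values: with J the diagonal indicator of labelled points, the optimality condition reads (J + λL) f = J y. A sketch on toy data (the two-cluster setup, J notation, and λ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two tight clusters with one labelled point each; the Laplacian term
# propagates the labels to the unlabelled points.
x = np.vstack([rng.normal(0, 0.2, (15, 2)), rng.normal(3, 0.2, (15, 2))])
W = np.exp(-((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

n = len(x)
y = np.zeros(n)
y[0], y[15] = -1.0, +1.0                          # one labelled point per cluster
J = np.zeros((n, n))
J[0, 0] = J[15, 15] = 1.0                         # indicator of labelled points

lam = 0.1
# First-order condition of  Σ_{i labelled} (f_i - y_i)² + λ f^T L f:
f = np.linalg.solve(J + lam * L, J @ y)

print(np.sign(f[:15]))   # all -1: the first cluster follows its labelled point
print(np.sign(f[15:]))   # all +1
```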
Outline Motivation Examples Theoretical Problems Results Perspectives
C
B
D
A
Olivier Bousquet Manifold Learning
[Figure: Laplacian Eigenmap of the example data (projection on the first two eigenvectors), showing the images of the points A, B, C, D]
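The embedding behind such a figure can be sketched as follows: build the graph Laplacian and keep the eigenvectors with the smallest nonzero eigenvalues (the circle data, threshold, and weights here are illustrative assumptions, not the talk's actual setup):

```python
import numpy as np

# Laplacian Eigenmap sketch: embed points using the eigenvectors of L
# with the smallest nonzero eigenvalues.
t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
x = np.c_[np.cos(t), np.sin(t)]                  # points on a circle

d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
W = np.where(d2 < 0.1, np.exp(-d2), 0.0)         # local Gaussian weights
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(1)) - W

vals, vecs = np.linalg.eigh(L)                   # eigenvalues in increasing order
embedding = vecs[:, 1:3]                         # skip the constant eigenvector
# Points that are neighbors on the circle stay neighbors in the embedding.
print(vals[:3], embedding.shape)
```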
IID Framework
Assume the data is sampled i.i.d. from an unknown distribution P
What happens as n → ∞?
To what do the empirical quantities converge, at what rate, and what does the limit mean?
Do we really obtain the desired effect?
Convergence properties
Two cases:
the neighborhood size is fixed (von Luxburg, Bousquet, Belkin 2005)
the neighborhood size goes to zero as n increases (Hein, Audibert 2005, 2006)
Theorem
On a compact manifold M with metric g and density p (w.r.t. µ), if t → 0 and n t^{d+4} / log n → ∞, then, almost surely,
lim_{n→∞} f^T L′ f = ∫_M ‖∇_M f‖² p² √(det g) dµ
Conclusion
The goal is not to identify the manifold, but to exploit the (approximate) low-dimensionality / clusteredness of the data
Transpose manifold ideas to graphs (finite sets of data points)
Very active area of research (the best semi-supervised algorithms use this idea), but:
Theory is still very limited
Many algorithmic issues (choice of the graph, weights, regularizer, ...)
Large application potential
Bibliography
Online resources: http://www.cse.msu.edu/~lawhiu/manifold/
U. von Luxburg, O. Bousquet, and M. Belkin. Limits of spectral clustering. In Advances in Neural Information Processing Systems (NIPS) 17. MIT Press, Cambridge, MA, 2005.
M. Hein, J.-Y. Audibert, and U. von Luxburg. From graphs to manifolds - weak and strong pointwise consistency of graph Laplacians. In Proceedings of the 18th Annual Conference on Learning Theory (COLT), pages 470-485. Springer, 2005.
M. Hein and J.-Y. Audibert. Intrinsic dimensionality estimation of submanifolds in Euclidean space. In Proceedings of the 22nd International Conference on Machine Learning (ICML), pages 289-296, 2005.
M. Hein. Uniform convergence of adaptive graph-based regularization. In Proceedings of the 19th Annual Conference on Learning Theory (COLT). Springer, 2006.