The Johnson-Lindenstrauss Lemma Meets Compressed Sensing
Transcript of The Johnson-Lindenstrauss Lemma Meets Compressed Sensing
The Johnson-Lindenstrauss Lemma Meets Compressed Sensing

Mark Davenport, Richard Baraniuk, Ron DeVore, Michael Wakin
dsp.rice.edu/cs
Compressed Sensing (CS)
- Observe a $K$-sparse signal $x \in \mathbb{R}^N$
- Acquire $M \ll N$ random measurements $y = \Phi x$
Randomness in CS
New signal models
New applications
Restricted Isometry Property

What makes a “good” CS matrix?

Let $\Sigma_K$ denote the set of all $K$-sparse signals. $\Phi$ has the RIP of order $K$ if there exists $\delta_K \in (0,1)$ such that
$$(1-\delta_K)\|x\|_2^2 \le \|\Phi x\|_2^2 \le (1+\delta_K)\|x\|_2^2$$
for all $x \in \Sigma_K$.
$\Sigma_K$ is a union of $K$-dimensional subspaces ($K$-planes).
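As an illustration (not a proof; certifying the RIP exactly is combinatorially hard), one can check empirically that a properly scaled Gaussian matrix nearly preserves the norm of random $K$-sparse vectors. This is a sketch assuming NumPy; the dimensions are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1000, 200, 10

# Gaussian matrix scaled so that E[||Phi @ x||^2] = ||x||^2
Phi = rng.normal(size=(M, N)) / np.sqrt(M)

ratios = []
for _ in range(500):
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)  # random K-sparse support
    x[support] = rng.normal(size=K)
    ratios.append(np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2)

# all ratios should cluster tightly around 1
print(f"min = {min(ratios):.3f}, max = {max(ratios):.3f}")
```

With $M = 200$ measurements the ratios concentrate near 1, which is exactly the behavior the RIP formalizes over all sparse vectors at once.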
Proof of RIP
A random matrix will satisfy the RIP for the largest possible $K$ (on the order of $M/\log(N/M)$) with high probability [Candès, Tao; Donoho]
Appeal to known results on singular values of random matrices [Davidson, Szarek; Litvak, Pajor, Rudelson, Tomczak-Jaegermann]
This is not light reading…
“Proof” of RIP
“It uses a lot of newer mathematical techniques, things that were developed in the 80's and 90's. Noncommutative geometry, random matrices … the proof is very… hip.” - Hal
Dimensionality Reduction
- Point dataset lives in a high-dimensional space
- Number of data points is small
- Compress the data to a few dimensions
- We do not lose information – we can still distinguish the data points
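A quick numerical illustration of this idea (a sketch assuming NumPy; the point counts and dimensions are arbitrary): randomly project a point cloud down and compare all pairwise distances before and after.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n_points, N, M = 40, 5000, 400

X = rng.normal(size=(n_points, N))           # point cloud in R^N
Phi = rng.normal(size=(M, N)) / np.sqrt(M)   # random compression to R^M
Y = X @ Phi.T

# ratio of squared pairwise distances after vs. before compression
ratios = [np.sum((Y[i] - Y[j]) ** 2) / np.sum((X[i] - X[j]) ** 2)
          for i, j in combinations(range(n_points), 2)]
print(f"distortion in [{min(ratios):.3f}, {max(ratios):.3f}]")
```

All $\binom{40}{2}$ distance ratios land close to 1, so the compressed points remain distinguishable even though the dimension dropped from 5000 to 400.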
Johnson-Lindenstrauss Lemma

For any $0 < \epsilon < 1$ and any finite set of points $Q \subset \mathbb{R}^N$, if $M = O(\epsilon^{-2} \log |Q|)$, then there exists a map $\Phi : \mathbb{R}^N \to \mathbb{R}^M$ such that for all $u, v \in Q$,
$$(1-\epsilon)\|u - v\|_2^2 \le \|\Phi u - \Phi v\|_2^2 \le (1+\epsilon)\|u - v\|_2^2.$$

The proof relies on a simple concentration of measure inequality: for a suitable random $\Phi$ and any fixed $x$, $\|\Phi x\|_2^2$ concentrates sharply around $\|x\|_2^2$.
Favorable JL Distributions
- Gaussian: entries i.i.d. $\mathcal{N}(0, 1/M)$
- Bernoulli: entries $\pm 1/\sqrt{M}$ with equal probability [Achlioptas]
- “Database-friendly”: entries $\sqrt{3/M} \cdot \{+1, 0, -1\}$ with probabilities $\{1/6, 2/3, 1/6\}$ [Achlioptas]
- Fast JL Transform: $\Phi = PHD$, where $P$ is a sparse Gaussian matrix, $H$ is a fast Hadamard transform, and $D$ is a random modulation [Ailon, Chazelle]
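As a sketch of the “database-friendly” construction (assuming the Achlioptas sparse distribution with entries $\sqrt{3}\cdot\{+1, 0, -1\}$ taken with probabilities $1/6, 2/3, 1/6$, scaled by $1/\sqrt{M}$), note that about two thirds of the entries are exactly zero, yet norms are still preserved:

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 2048, 256

# Achlioptas "database-friendly" matrix: roughly two thirds of entries are zero
vals = np.sqrt(3.0) * np.array([1.0, 0.0, -1.0])
Phi = rng.choice(vals, size=(M, N), p=[1/6, 2/3, 1/6]) / np.sqrt(M)

# each entry has variance 1/M, so E[||Phi @ x||^2] = ||x||^2
x = rng.normal(size=N)
ratio = np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2
print(f"zero fraction = {np.mean(Phi == 0):.2f}, norm ratio = {ratio:.3f}")
```

The sparsity is what makes this distribution cheap to store and multiply in a database setting, while the norm ratio still concentrates near 1.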
JL Meets CS [Baraniuk, DeVore, Davenport, Wakin]

Theorem: Suppose $\Phi$ is drawn from a JL-favorable distribution and $M = O(K \log(N/K))$. Then with probability at least $1 - e^{-cM}$, $\Phi$ satisfies the RIP of order $K$.

Key idea:
- construct a finite set of points $Q$
- apply the JL lemma (union bound on concentration of measure)
- show that the approximate isometry on $Q$ extends to an approximate isometry on all of $\Sigma_K$
Recall: RIP

$\Phi$ has the RIP of order $K$ if there exists $\delta_K \in (0,1)$ such that
$$(1-\delta_K)\|x\|_2^2 \le \|\Phi x\|_2^2 \le (1+\delta_K)\|x\|_2^2$$
for all $x \in \Sigma_K$.

- Fix a $K$-dimensional subspace
- Consider only unit-norm $x$ (the RIP inequalities are scale-invariant)
- Pick $Q$ such that for any unit-norm $x$ in the subspace there exists a $q \in Q$ with $\|x - q\|_2 \le \epsilon$ (a covering net of the unit sphere)
Bootstrapping
- Apply JL to $Q$ to get $(1-\epsilon)\|q\|_2^2 \le \|\Phi q\|_2^2 \le (1+\epsilon)\|q\|_2^2$ for all $q \in Q$
- Define $A$ to be the smallest number such that $\|\Phi x\|_2 \le (1+A)\|x\|_2$ for all $x$ in the subspace with $\|x\|_2 \le 1$
- For any such $x$, pick the closest $q \in Q$; then $\|\Phi x\|_2 \le \|\Phi q\|_2 + \|\Phi(x - q)\|_2 \le (1+\epsilon)\|q\|_2 + (1+A)\|x - q\|_2$
- Hence $A$ itself is bounded in terms of $\epsilon$ and the covering radius, and the approximate isometry on $Q$ extends to the whole subspace
How Many Points?
- For each $K$-dimensional space, we need $|Q| \sim (c/\delta)^K$ points to form the covering net
- There are $\binom{N}{K} \le (eN/K)^K$ spaces to consider
- How many measurements do we need to get the RIP with probability at least $1 - e^{-cM}$? Taking the union bound over all of these points shows that $M = O\!\left(K \log(N/K)/\delta^2\right)$ suffices.
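The counting above can be made concrete. The sketch below tallies the union bound numerically; the covering-number bound $(12/\delta)^K$, the constant $c$ in the concentration exponent, and the target failure probability are all placeholder assumptions for illustration, not the paper's exact values.

```python
import math

N, K, delta = 1000, 10, 0.1          # illustrative problem size

n_subspaces = math.comb(N, K)        # C(N, K) coordinate subspaces
net_size = (12 / delta) ** K         # covering number of a K-dim unit ball (assumed bound)
n_points = n_subspaces * net_size    # total points fed to the JL union bound

# per-point failure probability ~ 2*exp(-c*M*delta^2); solve the union bound for M
c, target = 0.1, 1e-6                # placeholder constant and target failure probability
M = math.log(2 * n_points / target) / (c * delta ** 2)

print(f"points ~ 2^{math.log2(n_points):.0f}, measurements M ~ {M:.0f}")
```

The point count is astronomically large, but it only enters through a logarithm, which is why $M$ ends up scaling like $K \log(N/K)/\delta^2$ rather than like the number of points itself.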
Universality
- Easy to see why random matrices are universal with respect to the sparsity basis: re-express your points in the new basis – the JL lemma provides a guarantee for an arbitrary set of points
- Gaussian, Bernoulli, others…
Summary
- Better understanding of the relevant geometry provides simple proofs of key CS / n-width results
- New conditions on what it takes to be a good CS matrix: concentration of measure around the mean
- New signal models: manifolds
- Natural setting for studying information scalability: detection, estimation, learning
Randomness in CS
New signal models
New applications
Manifold Compressive Sensing
- A manifold is a locally Euclidean topological space
- Typically, for signal processing: a nonlinear $K$-dimensional “surface” in the signal space $\mathbb{R}^N$, a potentially very low dimensional signal model
- Examples (all nonlinear): chirps, modulation schemes, image articulations
Stable Manifold Embedding
- Stability [Wakin, Baraniuk]: a random projection preserves pairwise distances between points on the manifold, up to small distortion
- Number of measurements required: linear in the manifold dimension $K$, and only logarithmic in $N$ and in the manifold's geometric properties (volume, curvature)
Example: Linear Chirps
[Figure: original signal, initial guess, initial error]
- $N = 256$
- $K = 2$ (start and end frequencies)
- $M = 5$: 55% success
- $M = 30$: 99% success
Manifold Learning
- Manifold learning algorithms for sampled data in $\mathbb{R}^N$: ISOMAP, LLE, HLLE, etc.
- A stable embedding preserves key properties in $\mathbb{R}^M$: ambient and geodesic distances; dimension and volume of the manifold; path lengths and curvature; topology, local neighborhoods, and angles; etc.
- Can we learn these properties from projections in $\mathbb{R}^M$? If so, we gain savings in computation, storage, and acquisition costs.
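A small sketch of this idea on a hypothetical example (assuming NumPy; the pulse-translation manifold and all sizes are illustrative choices, not the talk's dataset): sample a 1-D manifold of shifted pulses in $\mathbb{R}^N$, project randomly to $\mathbb{R}^M$, and check that local neighborhoods, the raw material of manifold learning algorithms, survive the projection.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, n_samples = 1024, 64, 100

# a 1-D manifold in R^N: circular translations of a narrow Gaussian pulse
t = np.linspace(0.0, 1.0, N, endpoint=False)
def pulse(shift):
    d = (t - shift + 0.5) % 1.0 - 0.5       # circular distance to the shift
    return np.exp(-d ** 2 / 0.005)

shifts = np.linspace(0.0, 1.0, n_samples, endpoint=False)
X = np.array([pulse(s) for s in shifts])

Phi = rng.normal(size=(M, N)) / np.sqrt(M)  # random projection to R^M
Y = X @ Phi.T

# the nearest neighbor of a sample should still be an adjacent shift after projection
i = 10
nn = np.argsort(np.linalg.norm(Y - Y[i], axis=1))[1]  # index 0 is the point itself
print(f"nearest neighbor of sample {i} after projection: {nn}")
```

Because nearest-neighbor graphs built in $\mathbb{R}^M$ match those built in $\mathbb{R}^N$, algorithms like ISOMAP can in principle run directly on the projected data.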
Example: Manifold Learning
[Figure: embeddings learned by ISOMAP, HLLE, and Laplacian Eigenmaps from data in $\mathbb{R}^{4096}$ and from random projections in $\mathbb{R}^M$ with M = 15, 15, and 20]
Randomness in CS
New signal models
New applications
Detection – Matched Filter
- Testing for the presence of a known signal $s$ in white Gaussian noise: $y = s + n$ versus $y = n$
- Sufficient statistic for detecting $s$: $t = \langle y, s \rangle$
Compressive Matched Filter
- Now suppose we have CS measurements $y = \Phi(s + n)$
- When $\Phi$ is an orthoprojector ($\Phi\Phi^T = I$), $\Phi n$ remains white noise
- The new sufficient statistic is simply the compressive matched filter (smashed filter?): $t = \langle y, \Phi s \rangle$
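A minimal simulation of this detector (a sketch assuming NumPy; the orthoprojector is built from a QR factorization, and the sizes, noise level, and threshold are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, sigma, trials = 1000, 100, 0.05, 500

s = rng.normal(size=N)
s /= np.linalg.norm(s)                      # known unit-norm signal

Q, _ = np.linalg.qr(rng.normal(size=(N, M)))
Phi = Q.T                                   # random orthoprojector: Phi @ Phi.T = I_M

proj_s = Phi @ s
# since Phi is an orthoprojector, Phi @ n is again white; draw it directly in R^M
t1 = [proj_s @ (proj_s + sigma * rng.normal(size=M)) for _ in range(trials)]  # signal present
t0 = [proj_s @ (sigma * rng.normal(size=M)) for _ in range(trials)]           # noise only

# threshold the compressive matched-filter statistic halfway between the two means
thr = proj_s @ proj_s / 2
print(f"P_D = {np.mean(np.array(t1) > thr):.2f}, P_F = {np.mean(np.array(t0) > thr):.2f}")
```

Even with a 10x reduction in dimension, the statistic $\langle y, \Phi s\rangle$ separates the two hypotheses cleanly at this noise level.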
CMF – Performance
- ROC curve for the Neyman-Pearson detector: $P_D = Q\!\left(Q^{-1}(P_F) - \|\Phi s\|_2/\sigma\right)$
- From the JL lemma, for a random orthoprojector, $\|\Phi s\|_2^2 \approx (M/N)\,\|s\|_2^2$
- Thus $P_D \approx Q\!\left(Q^{-1}(P_F) - \sqrt{M/N}\,\|s\|_2/\sigma\right)$
Generalization – Classification
- More generally, suppose we want to classify among several possible signals $s_1, \ldots, s_P$
- A nearest-neighbor classifier depends only on the distances between $y$ and the candidates $\Phi s_i$; by the JL lemma these distances are preserved
CMF as an Estimator
- How well does the compressive matched filter estimate the output of the true matched filter?
- With high probability, $\langle \Phi x, \Phi s \rangle$ approximates $\langle x, s \rangle$ to within an error on the order of $\epsilon \|x\|_2 \|s\|_2$ [Alon, Gibbons, Matias, Szegedy; Davenport, Baraniuk, Wakin]
Conclusions (dsp.rice.edu/CS)
- Random matrices work for CS for the same reason that they work for the JL lemma
- Another way to characterize “good” CS matrices
- Allows us to extend CS to new signal models: manifold/parametric models
- Allows us to extend CS to new settings: detection, classification/learning, estimation