A Single-letter Characterization of Optimal Noisy Compressed Sensing
Transcript of A Single-letter Characterization of Optimal Noisy Compressed Sensing
A Single-letter Characterization of Optimal Noisy Compressed Sensing
Dongning Guo
Dror Baron
Shlomo Shamai
Setting
• Replace samples by more general measurements based on a few linear projections (inner products)
[Figure: measurement vector obtained from a sparse signal; labels: measurements, sparse signal, # non-zeros]
Signal Model
• Signal entry Xn = Bn Un
• iid Bn ~ Bernoulli(ε): sparse
• iid Un ~ PU
[Figure: PU and Bernoulli(ε) feed a multiplier, producing PX]
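A runnable sketch of this signal model (the length N, the value of ε, and the choice PU = N(0,1) are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, eps = 10_000, 0.1                # signal length and sparsity level (assumed values)

# X_n = B_n * U_n: a Bernoulli multiplier switches a P_U sample on or off;
# P_U is taken to be standard normal here purely for illustration.
B = rng.binomial(1, eps, N)         # iid B_n ~ Bernoulli(eps)
U = rng.standard_normal(N)          # iid U_n ~ P_U
X = B * U

print(np.mean(X != 0))              # fraction of nonzeros, close to eps
```

Since U is nonzero with probability one, the fraction of nonzero entries of X concentrates around ε.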
Measurement Noise
• Measurement process is typically analog
• Analog systems add noise, non-linearities, etc.
• Assume Gaussian noise for ease of analysis
• Can be generalized to non-Gaussian noise
Noise Model
• Noiseless measurements denoted y0
• Noisy measurements: y0 plus Gaussian noise
• Unit-norm columns; SNR = γ
[Figure: noiseless measurements corrupted by noise at SNR γ]
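A sketch of one common way to realize this noise model; the 1/√SNR noise scaling and all dimensions are assumptions for illustration, not the slide's exact expressions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, eps, snr = 1000, 500, 0.1, 10.0      # illustrative sizes; snr plays the role of the SNR parameter

# Measurement matrix with unit-norm columns
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)

x = rng.binomial(1, eps, N) * rng.standard_normal(N)
y0 = Phi @ x                               # noiseless measurements
z = rng.standard_normal(M) / np.sqrt(snr)  # Gaussian noise, scaled so the SNR equals snr
y = y0 + z                                 # noisy measurements
```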
• Model the measurement process as a channel
• Measurements provide information!
[Figure: CS measurement and CS decoding play the roles of channel encoder and channel decoder in a source encoder → channel encoder → channel → channel decoder → source decoder chain]
Allerton 2006 [Sarvotham, Baron, & Baraniuk]
Single-Letter Bounds
• Theorem [Sarvotham, Baron, & Baraniuk 2006]: for a sparse signal with rate-distortion function R(D), the measurement rate is lower-bounded in terms of the SNR and the distortion D
• Numerous single-letter bounds:
– [Aeron, Zhao, & Saligrama]
– [Akcakaya & Tarokh]
– [Rangan, Fletcher, & Goyal]
– [Gastpar & Reeves]
– [Wang, Wainwright, & Ramchandran]
– [Tune, Bhaskaran, & Hanly]
– …
Goal: Precise Single-letter Characterization of Optimal CS
What Single-letter Characterization?
• Ultimately, what can one say about Xn given Y? (the channel posterior is a sufficient statistic)
• Very complicated
• Want a simple characterization of its quality
• Large-system limit
Main Result: Single-letter Characterization
• Result 1: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to a degraded scalar channel observation of xn (easy to compute)
• Estimation quality from (Y, Φ) is just as good as from this noisier scalar observation
[Figure: channel posterior and its scalar degradation]
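The "easy to compute" claim can be made concrete: for a Bernoulli-Gaussian prior, the posterior given a scalar Gaussian observation has closed form. Below is a sketch; the channel V = √s·X + Z with Z ~ N(0,1), the effective SNR s, and PU = N(0,1) are assumptions for illustration rather than the slide's exact expressions.

```python
import numpy as np

def scalar_posterior(v, s, eps):
    """Posterior under V = sqrt(s)*X + Z, Z ~ N(0,1), where X = B*U,
    B ~ Bernoulli(eps), U ~ N(0,1) (an assumed Gaussian P_U)."""
    f1 = np.exp(-v**2 / (2 * (s + 1))) / np.sqrt(s + 1)  # density of v given B=1 (up to a shared constant)
    f0 = np.exp(-v**2 / 2)                               # density of v given B=0
    p_nz = eps * f1 / (eps * f1 + (1 - eps) * f0)        # P(B = 1 | v)
    x_mmse = p_nz * np.sqrt(s) * v / (s + 1)             # E[X | v]
    return p_nz, x_mmse

p_nz, x_hat = scalar_posterior(np.array([0.0, 3.0]), s=10.0, eps=0.1)
```

A larger observation magnitude makes a nonzero entry more likely, so p_nz grows with |v|.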
Details
• The degradation η ∈ (0,1) is the fixed point of a scalar equation [shown on slide]
• Take-home point: degraded scalar channel
• Non-rigorous, owing to the replica method with a symmetry assumption, as used in CDMA detection [Tanaka 2002; Guo & Verdu 2005]
• Related analysis [Rangan, Fletcher, & Goyal 2009]: MMSE estimate (not the posterior) using [Guo & Verdu 2005], extended to several CS algorithms, particularly LASSO
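The slide's exact fixed-point equation appears only as an image. Purely as an illustration, the sketch below iterates one common replica-style form, 1/η = 1 + (N/M)·γ·mmse(ηγ), where mmse(s) is the MMSE of a scalar Gaussian channel with a Bernoulli-Gaussian prior; the equation, the load factor N/M, and the Gaussian PU are all assumptions, not the slide's statement.

```python
import numpy as np

rng = np.random.default_rng(2)

def mmse_scalar(s, eps, n=100_000):
    """Monte-Carlo MMSE of X = B*U (B ~ Bern(eps), U ~ N(0,1)) from V = sqrt(s)*X + Z."""
    x = rng.binomial(1, eps, n) * rng.standard_normal(n)
    v = np.sqrt(s) * x + rng.standard_normal(n)
    f1 = np.exp(-v**2 / (2 * (s + 1))) / np.sqrt(s + 1)  # density given B=1 (up to a shared constant)
    f0 = np.exp(-v**2 / 2)                               # density given B=0
    x_hat = (eps * f1 / (eps * f1 + (1 - eps) * f0)) * np.sqrt(s) * v / (s + 1)
    return float(np.mean((x - x_hat) ** 2))

def degradation(gamma, load, eps, iters=30):
    """Fixed-point iteration of an assumed form eta = 1/(1 + load*gamma*mmse(eta*gamma))."""
    eta = 1.0
    for _ in range(iters):
        eta = 1.0 / (1.0 + load * gamma * mmse_scalar(eta * gamma, eps))
    return eta

eta = degradation(gamma=10.0, load=2.0, eps=0.1)
```

Because the MMSE is strictly positive, the iteration stays inside (0,1), matching the stated range of the degradation.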
Decoupling
Decoupling Result
• Result 2: in the large-system limit, any arbitrary (constant) number L of input elements decouple
• Take-home point: "interference" from each individual signal entry vanishes
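A loose numerical illustration of the take-home point (not the paper's argument): for an i.i.d. matrix with unit-norm columns, the cross-coupling between the columns of two entries shrinks as the number of measurements M grows, on the order of 1/√M.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_cross_coupling(M, N=50, trials=20):
    """Average |phi_m . phi_n| between two unit-norm columns of an iid Gaussian matrix."""
    vals = []
    for _ in range(trials):
        Phi = rng.standard_normal((M, N))
        Phi /= np.linalg.norm(Phi, axis=0)
        vals.append(abs(Phi[:, 0] @ Phi[:, 1]))
    return float(np.mean(vals))

small_M, large_M = mean_cross_coupling(M=50), mean_cross_coupling(M=5000)
```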
Sparse Measurement Matrices
Sparse Measurement Matrices [Baron, Sarvotham, & Baraniuk]
• LDPC measurement matrix (sparse)
• Mostly zeros in Φ; nonzeros ~ P
• Each row contains ≈ Nq randomly placed nonzeros
• Fast matrix-vector multiplication → fast encoding / decoding
[Figure: sparse matrix]
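A sketch of such a sparse matrix and its fast multiply. The sizes, q, and the ±1 nonzero values are illustrative assumptions; the slide only says the nonzeros follow some distribution P.

```python
import numpy as np

rng = np.random.default_rng(4)
M, N, q = 250, 1000, 0.02            # each row gets about N*q = 20 nonzeros

# LDPC-like sparse matrix stored row-wise: column indices and values per row
row_cols = [rng.choice(N, size=int(N * q), replace=False) for _ in range(M)]
row_vals = [rng.choice([-1.0, 1.0], size=int(N * q)) for _ in range(M)]

def sparse_matvec(x):
    # Touches only the ~N*q nonzeros per row: O(M*N*q) work instead of O(M*N)
    return np.array([v @ x[c] for c, v in zip(row_cols, row_vals)])

x = rng.binomial(1, 0.1, N) * rng.standard_normal(N)
y = sparse_matvec(x)
```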
CS Decoding Using BP [Baron, Sarvotham, & Baraniuk]
• Measurement matrix represented by a graph
• Estimate input iteratively
• Implemented via nonparametric BP [Bickson, Sommer, …]
[Figure: bipartite graph connecting measurements y to signal x]
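The nonparametric BP decoder itself is involved. As a runnable stand-in, here is approximate message passing (AMP), a later scalar simplification of such graph-based iterative decoders, using the Bernoulli-Gaussian conditional-mean denoiser. Everything here (dimensions, noise level, and the AMP recursion in place of nonparametric BP) is an illustrative substitution, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, eps, sigma = 1000, 500, 0.1, 0.05

A = rng.standard_normal((M, N)) / np.sqrt(M)    # roughly unit-norm columns
x = rng.binomial(1, eps, N) * rng.standard_normal(N)
y = A @ x + sigma * rng.standard_normal(M)

def denoise(r, tau2):
    # E[X | X + N(0, tau2) = r] for X = B*U, B ~ Bern(eps), U ~ N(0,1)
    f1 = np.exp(-r**2 / (2 * (1 + tau2))) / np.sqrt(1 + tau2)
    f0 = np.exp(-r**2 / (2 * tau2)) / np.sqrt(tau2)
    p = eps * f1 / (eps * f1 + (1 - eps) * f0)
    return p * r / (1 + tau2)

x_hat, z = np.zeros(N), y.copy()
for _ in range(25):
    tau2 = np.mean(z**2)                        # estimate of the effective noise variance
    r = x_hat + A.T @ z                         # pseudo scalar-channel observation
    d = 1e-6                                    # numerical derivative for the Onsager term
    onsager = np.mean((denoise(r + d, tau2) - denoise(r - d, tau2)) / (2 * d))
    x_hat = denoise(r, tau2)
    z = y - A @ x_hat + (N / M) * onsager * z   # corrected residual

mse = float(np.mean((x_hat - x) ** 2))
```

The per-entry "pseudo observation" r behaves like a scalar Gaussian channel output, which is exactly the equivalence the main result describes.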
Identical Single-letter Characterization w/ BP
• Result 3: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to the same degraded scalar channel observation (identical degradation)
• Sparse matrices are just as good
• BP is asymptotically optimal!
Decoupling Between Two Input Entries (N=500, M=250, ε=0.1, γ=10)
[Figure: empirical joint density of two input entries]
CS-BP vs Other CS Methods (N=1000, ε=0.1, q=0.02)
[Figure: MMSE versus number of measurements M for CS-BP and other methods]
Conclusion
• Single-letter characterization of CS
• Decoupling
• Sparse matrices just as good
• Asymptotically optimal CS-BP algorithm