Source Localization on a budget
Volkan Cevher
volkan@rice.edu
Rice University
Collaborators: Petros, Rich, Anna, Martin, Lance
Localization Problem

• Goal: Localize targets by fusing measurements from a network of sensors
  – collect time signal data
  – communicate signals across the network
  – solve an optimization problem

[Cevher, Duarte, Baraniuk; EUSIPCO 2007 | Model and Zibulevsky; SP 2006 | Cevher et al.; ICASSP 2006 | Malioutov, Cetin, and Willsky; IEEE TSP 2005 | Chen et al.; Proc. of IEEE 2003]
Digital Revolution
• Goal: Localize targets by fusing measurements from a network of sensors
  – collect time signal data
  – communicate signals across the network
  – solve an optimization problem
Digital Data Acquisition
Foundation: Shannon/Nyquist sampling theorem

“If you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data.”
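The theorem can be checked numerically: a periodic signal whose spectral content lies below the Nyquist frequency is recovered exactly from its samples by sinc (FFT-based) interpolation. A minimal sketch; the signal, 16 Hz sampling rate, and frequencies are illustrative choices, not from the talk:

```python
import numpy as np
from scipy.signal import resample

# Bandlimited periodic signal: 3 Hz and 5 Hz components over a 1 s window
f = lambda t: np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 5 * t)

N = 16                       # fs = 16 Hz; Nyquist only requires fs > 10 Hz here
t_coarse = np.arange(N) / N
samples = f(t_coarse)        # the "densely enough" sampled data

# FFT-based (sinc) interpolation reconstructs the signal on a finer grid
M = 128
t_fine = np.arange(M) / M
reconstructed = resample(samples, M)

print(np.max(np.abs(reconstructed - f(t_fine))))  # machine precision: perfect reconstruction
```

Sampling below 10 Hz instead would alias the 5 Hz component and the reconstruction would fail, which is exactly the regime the rest of the talk works around.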
Major Trends in Sensing
higher resolution / denser sampling
large numbers of sensors
increasing # of modalities / mobility
• Goal: Localize targets by fusing measurements from a network of sensors
  – collect time signal data → requires potentially high-rate (Nyquist) sampling
  – communicate signals across the network → potentially large communication burden
  – solve an optimization problem → e.g., MLE

Need compression
Problems of the Current Paradigm
Approaches
• Do nothing / Ignore
  – be content with the existing approaches
  – generalizes well
  – robust
Approaches
• Finite Rate of Innovation
• Sketching / Streaming
• Compressive Sensing

[Vetterli, Marziliano, Blu | Blu, Dragotti, Vetterli, Marziliano, Coulot | Gilbert, Indyk, Strauss, Cormode, Muthukrishnan | Donoho | Candes, Romberg, Tao | Candes, Tao]
Approaches

• Finite Rate of Innovation
• Sketching / Streaming
• Compressive Sensing

SPARSITY
Agenda
• A short review of compressive sensing
• Localization via dimensionality reduction
– Experimental results
• A broader view of localization
• Conclusions
A Short Review of Compressive Sensing
Theory
Compressive Sensing 101

• Goal: Recover a sparse or compressible signal from measurements
• Problem: Random projection is not full rank, but satisfies the Restricted Isometry Property (RIP)
  – iid Gaussian
  – iid Bernoulli
  – …
• Solution: Exploit the sparsity / compressibility geometry of the acquired signal via convex optimization or a greedy algorithm
Concise Signal Structure

• Sparse signal: only K out of N coordinates nonzero
  – model: union of K-dimensional subspaces aligned with the coordinate axes
• Compressible signal: sorted coordinates decay rapidly to zero (power-law decay)
  – well-approximated by a K-sparse signal (simply by thresholding)
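“Simply by thresholding” can be made concrete: the best K-term approximation keeps the K largest-magnitude coordinates and zeroes the rest. A short numpy sketch with an illustrative power-law signal (the decay exponent and dimensions are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 1024, 32

# Compressible signal: power-law-decaying magnitudes, randomly permuted and signed
decay = np.arange(1, N + 1) ** -1.5
x = rng.permutation(decay) * rng.choice([-1.0, 1.0], size=N)

# Best K-term approximation: keep the K largest-magnitude entries
idx = np.argsort(np.abs(x))[-K:]
x_K = np.zeros(N)
x_K[idx] = x[idx]

# Only ~3% of the coordinates are kept, yet the relative error is small
rel_err = np.linalg.norm(x - x_K) / np.linalg.norm(x)
print(rel_err)  # on the order of a few percent
```

The faster the power-law decay, the smaller this K-term error, which is what makes the recovery guarantees later in the talk meaningful for real signals.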
Restricted Isometry Property (RIP)

• Preserve the structure of sparse/compressible signals
• RIP of order 2K implies: for all K-sparse x1 and x2,
  (1 − δ) ‖x1 − x2‖² ≤ ‖Φ(x1 − x2)‖² ≤ (1 + δ) ‖x1 − x2‖²
• A random subGaussian (iid Gaussian or Bernoulli) matrix has the RIP with high probability if the number of measurements M = O(K log(N/K))
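The near-isometry is easy to observe empirically for an iid Gaussian matrix scaled by 1/√M. This sketch samples random K-sparse unit vectors and records how much their norms get distorted; it is an illustration under assumed dimensions, not a certification of the RIP (verifying the RIP exactly is intractable):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 512, 128, 8                  # M on the order of K log(N/K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # iid Gaussian, entries ~ N(0, 1/M)

# Distortion of the norm of many random K-sparse unit vectors
ratios = []
for _ in range(200):
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)
    x[support] = rng.standard_normal(K)
    x /= np.linalg.norm(x)
    ratios.append(np.linalg.norm(Phi @ x) ** 2)

# Both extremes stay close to 1: (1 - δ) ≤ ‖Φx‖² ≤ (1 + δ)
print(min(ratios), max(ratios))
```

The concentration tightens as M grows, matching the M = O(K log(N/K)) scaling on the slide.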
Recovery Algorithms

• Goal: given y = Φx (possibly plus noise n), recover x
• Convex optimization formulations
  – basis pursuit, Dantzig selector, Lasso, …
• Greedy algorithms
  – orthogonal matching pursuit (OMP), iterative thresholding (IT), compressive sampling matching pursuit (CoSaMP)
  – at their core: iterative sparse approximation
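A minimal orthogonal matching pursuit, the simplest member of the greedy family, shows the "iterative sparse approximation" core directly. The dimensions are illustrative, and real implementations add stopping rules and noise handling:

```python
import numpy as np

def omp(Phi, y, K):
    """Orthogonal matching pursuit: greedily grow the support,
    re-solving a least-squares problem on it at every step."""
    M, N = Phi.shape
    residual, support = y.copy(), []
    for _ in range(K):
        # pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(N)
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(2)
N, M, K = 64, 32, 3
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

x_hat = omp(Phi, Phi @ x, K)
print(np.linalg.norm(x_hat - x))  # ≈ 0: exact recovery from noise-free measurements
```

With noise-free measurements and well-conditioned random Φ this recovers x exactly, matching the "exact recovery" claim on the next slide.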
Performance of Recovery

• Using convex methods, IT, or CoSaMP
• Sparse signals
  – noise-free measurements: exact recovery
  – noisy measurements: stable recovery
• Compressible signals
  – recovery as good as the best K-sparse approximation:
    ‖x̂ − x‖₂ ≤ C₁ ‖x − x_K‖₁ / √K + C₂ ‖n‖₂
    (CS recovery error ≤ signal K-term approximation error + noise)
Universality

• Random measurements can be used for signals sparse in any basis: x = Ψα, where α is the sparse coefficient vector with K nonzero entries
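Universality in a sketch: the same Gaussian Φ used for coordinate-sparse signals also works after composing with any orthonormal basis Ψ, here a DCT. The dimensions are illustrative, and the recovery loop is a compact OMP on the composed matrix ΦΨ:

```python
import numpy as np
from scipy.fft import idct

rng = np.random.default_rng(3)
N, M, K = 128, 48, 4

Psi = idct(np.eye(N), norm="ortho", axis=0)      # orthonormal DCT basis as columns
alpha = np.zeros(N)                              # sparse coefficient vector
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha                                  # dense in time, sparse in the DCT basis

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # generic random Gaussian measurements
y = Phi @ x

# Recover alpha through the composed matrix A = Phi @ Psi with K OMP steps
A = Phi @ Psi
residual, support = y.copy(), []
for _ in range(K):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef
alpha_hat = np.zeros(N)
alpha_hat[support] = coef

print(np.linalg.norm(alpha_hat - alpha))  # ≈ 0: Phi was chosen without knowing Psi
```

The point of universality is that Φ was drawn with no knowledge of Ψ; the same measurements would serve a wavelet-sparse or Fourier-sparse signal equally well.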
Signal recovery is not always required.
ELVIS:
Enhanced Localization via Incoherence and Sparsity
(Back to Localization)
An Important Detail
• Solve two entangled problems for localization
– estimate source locations
– estimate source signals
ELVIS

• Instead, solve one localization problem
  – estimate source locations by exploiting random projections of observed signals
  – estimate source signals
• Bayesian model order selection & MAP estimation in a decentralized sparse approximation framework that leverages
– source sparsity
– incoherence of sources
– spatial sparsity of sources
[VC, Boufounos, Baraniuk, Gilbert, Strauss; IPSN’09]
Problem Setup
• Discretize space into a localization grid with N grid points
  – fixes the localization resolution
  – the P sensors do not have to be on grid points
Localization as Sparse Approximation

(figure: the localization grid, actual sensor measurements, the true target location, and the local localization dictionary)
Multiple Targets

(figure: the localization grid, actual sensor measurements, and 2 true target locations)
Local Dictionaries

• Sample the sparse / compressible signal using CS
  – Fourier sampling [Gilbert, Strauss]
• Calculate the local dictionary at sensor i using its measurements
  – for all grid positions n = 1, …, N: assume that the target is at grid position n
  – for all sensors j = 1, …, P: use the Green’s function to estimate the signal sensor j would measure if the target were at position n
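A toy version of the construction, heavily simplified and purely illustrative: a 1-D grid, integer-sample circular delays standing in for the Green's function, and a known white source signature (whose autocorrelation decays fast, as the next slide requires). All positions, rates, and sizes are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(4)
c, fs = 340.0, 8000.0                    # propagation speed (m/s), sample rate (Hz)
grid = np.linspace(0.0, 50.0, 26)        # N = 26 candidate positions on a line
sensors = np.array([0.0, 21.3, 47.1])    # P = 3 sensor positions (off-grid is fine)
T = 256

source = rng.standard_normal(T)          # white signature -> fast-decaying autocorrelation
delay = lambda p, s: int(round(abs(p - s) / c * fs))   # integer-sample "Green's" delay

def predicted(n):
    """Stacked signals the P sensors would measure for a source at grid point n."""
    col = np.concatenate([np.roll(source, delay(grid[n], s)) for s in sensors])
    return col / np.linalg.norm(col)

# Localization dictionary: one column per hypothesized grid position
D = np.column_stack([predicted(n) for n in range(len(grid))])

true_n = 17                              # simulate measurements for a source at grid[17]
y = np.concatenate([np.roll(source, delay(grid[true_n], s)) for s in sensors])

# Single-target sparse approximation reduces to picking the best-matching column
n_hat = int(np.argmax(np.abs(D.T @ y)))
print(n_hat)  # 17
```

Because the white signature decorrelates under any delay mismatch, only the column built from the true position correlates strongly with the stacked measurements, which is precisely the incoherence condition on the next slide.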
Valid Dictionaries

• Sparse approximation works when the columns of the dictionary are mutually incoherent
• True when the target signal has a fast-decaying autocorrelation
• Extends to multiple targets with small cross-correlations
Typical Correlation Functions

(figures: autocorrelation functions of a Toyota Prius, an Isuzu Rodeo, and a Chevy Camaro, and the cross-correlations Rodeo vs. Prius, Rodeo vs. Camaro, and Camaro vs. Prius)
An Important Issue

(figure: the localization grid and actual sensor measurements)

Need to send the measurements across the network
Enter Compressive Sensing
• Sparse localization vector ⇒ acquire and transmit compressive measurements of the actual observations without losing information
So Far…

• Use random projections of observed signals in two ways:
  – create local sensor dictionaries that sparsify source locations
  – create intersensor communication messages

(figure: K targets on an N-dimensional grid; dictionary populated using recovered signals; random iid measurement matrix)
ELVIS Highlights

• Use random projections of observed signals in two ways:
  – create local sensor dictionaries that sparsify source locations → sample at the source sparsity
  – create intersensor communication messages → communicate at the spatial sparsity
• Robust to (i) quantization (1-bit quantization – see paper) and (ii) packet drops
• Provable greedy estimation for ELVIS dictionaries
  – bearing pursuit: computationally efficient reconstruction
• No signal reconstruction

(figure: the ELVIS dictionary)
Experiments
Field Data Results: 5-vehicle convoy

>100× sub-Nyquist
Sensing System Problems

• Common theme so far…
  sensors → representations → metrics → “do our best”
• Purpose of deployment
  – multi-objective: sensing, lifetime, connectivity, coverage, reliability, etc.

Competition among Objectives

• Limited resources → conflicts in objectives
Diversity of Objectives

• Multiple objectives
  – localization: area
  – lifetime: time
  – connectivity: probability
  – coverage: area
  – reliability: probability
• Unifying framework
  – utility
Optimality
• Pareto efficiency
– Economics / optimization literature
• Pareto Frontier
– a fundamental limit for achievable utilities
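For a finite set of candidate designs, the Pareto frontier is simple to compute: a design is Pareto-efficient if no other design is at least as good in every objective and strictly better in one. A toy sketch with made-up (utility, utility) sensor portfolios, not data from the talk:

```python
from typing import List, Tuple

def pareto_frontier(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the points not dominated by any other point
    (both utilities are to be maximized)."""
    frontier = []
    for p in points:
        dominated = any(
            q != p and q[0] >= p[0] and q[1] >= p[1] and (q[0] > p[0] or q[1] > p[1])
            for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical sensor portfolios: (localization utility, lifetime utility)
portfolios = [(0.9, 0.2), (0.7, 0.7), (0.2, 0.9), (0.5, 0.5), (0.6, 0.6)]
print(pareto_frontier(portfolios))  # [(0.9, 0.2), (0.7, 0.7), (0.2, 0.9)]
```

The dominated portfolios (0.5, 0.5) and (0.6, 0.6) waste budget: (0.7, 0.7) beats them in both objectives, which is the sense in which the frontier is a fundamental limit on achievable utilities.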
Pareto Frontiers for Localization

• Mathematical framework for multi-objective design → best sensor portfolio
• Elements of the design
  – constant budget → optimization polytope
  – sensor dictionary → measurement type (bearing or range) and error, sensor reliability, field-of-view, sensing range, and mobility
  – random deployment → optimize expected / worst-case utilities
  – communications → bearing or range

(figure: example sensors at $10, $30, $200, $1M, $5M price points)

[VC, Kaplan; IPSN’09, TOSN’09]
Statement of Results – 1

• Theory to predict the localization performance with management
  – signals → performance characterizations
  – key idea: duality among sensors ⇔ existence of a reference sensing system
• Provable diminishing returns
• Optimal heterogeneity
  – sparse solutions, bounded by the # of objectives
  – key theorems: concentration of resources; dominating sensor pairs
• Solution algorithms
  – integer optimization
Statement of Results – 2
Conclusions
• CS
– sensing via dimensionality reduction
• ELVIS
– source localization via dimensionality reduction
– provable and efficient recovery via bearing pursuit
• Current work
– clock synchronization
– sensor position errors via linear filtering
• Pareto Frontiers w/ ELVIS: reactive systems
Questions?
Volkan Cevher / volkan@rice.edu