
Action Recognition on Distributed Body Sensor Networks via Sparse Representation

Allen Y. Yang <yang@eecs.berkeley.edu>

June 28, 2008. CVPR4HB


New Paradigm of Distributed Pattern Recognition

Centralized Recognition

powerful processors
(virtually) unlimited memory
(virtually) unlimited bandwidth
observations in controlled environments
simple sensor management

Distributed Recognition

mobile processors
limited onboard memory
band-limited communication
high percentage of occlusion/outliers
complex sensor networks

Main constraints for sensor network applications

1 Conserve power.

2 Adapt to network configuration changes.


d-WAR: Distributed Wearable Action Recognition

Architecture: eight sensors on the human body.

Locations are given and fixed.

Each sensor carries a triaxial accelerometer and a biaxial gyroscope.

Sampling frequency: 20 Hz.

Figure: Readings from 8 x-axis accelerometers and x-axis gyroscopes for a stand-kneel-stand sequence.


Challenges

1 Simultaneous segmentation and classification.

2 Individual sensors are not sufficient to classify full-body motions: a single sensor on the upper body cannot recognize lower-body motions, and vice versa.

3 Simulate sensor failure and network congestion using different subsets of active sensors.

4 Identity independence:

Figure: Same actions performed by two subjects.


Outline

1 Overview: classification framework via ℓ1-minimization.

2 Individual sensor: obtains limited classification ability and becomes active only when certain events are locally detected.

3 Global classifier: adapts to changes in the set of active sensors in the network and boosts multi-sensor recognition.


Sparse Representation

Sparsity

A signal is sparse if most of its coefficients are (approximately) zero.

1 Sparsity in frequency domain

Figure: 2-D DCT transform.

2 Sparsity in spatial domain

Figure: Gene microarray data.
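As a quick illustration of the frequency-domain sparsity in item 1 above, the sketch below (not part of the original slides; the synthetic image and the 99% energy threshold are made up for the example) applies a 2-D DCT to a smooth image and counts how few coefficients carry nearly all of the energy.

```python
# Sketch: frequency-domain sparsity via the 2-D DCT (synthetic image, illustrative only).
import numpy as np
from scipy.fft import dctn

# A smooth synthetic "image": low-frequency content only.
n = 64
u = np.linspace(0, 1, n)
X, Y = np.meshgrid(u, u)
img = np.sin(2 * np.pi * X) + 0.5 * np.cos(2 * np.pi * 3 * Y)

C = dctn(img, norm='ortho')                        # 2-D DCT coefficients
energy = np.sort(np.abs(C).ravel())[::-1] ** 2
k = np.searchsorted(np.cumsum(energy) / energy.sum(), 0.99) + 1
print(f"{k} of {C.size} DCT coefficients capture 99% of the energy")
```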


Sparsity in human visual cortex [Olshausen & Field 1997, Serre & Poggio 2006]

1 Feed-forward: no iterative feedback loop.

2 Redundancy: on average 80-200 neurons for each feature representation.

3 Recognition: information exchange between stages is not about individual neurons, but about how many neurons fire together as a group.


Sparsity and ℓ1-Minimization

1 “Black gold” age [Claerbout & Muir 1973, Taylor, Banks & McCoy 1979]

Figure: Deconvolution of spike train.

2 Basis pursuit [Chen & Donoho 1999]: Given y = Ax and x unknown,

x∗ = arg min ‖x‖1, subject to y = Ax

3 The Lasso (least absolute shrinkage and selection operator) [Tibshirani 1996]

x∗ = arg min ‖x‖1, subject to ‖y − Ax‖2 ≤ ε
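Both programs above are convex and can be handed to off-the-shelf solvers. The following sketch (illustrative, not from the talk) solves the equality-constrained basis pursuit problem (the Lasso variant would add the ε-tolerance) as a linear program via the standard split x = u − v with u, v ≥ 0, and checks that it recovers a randomly generated sparse vector; the matrix sizes and sparsity level are arbitrary choices.

```python
# Sketch: basis pursuit  min ||x||_1  s.t.  y = Ax,  solved as a linear program.
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. Ax = y by writing x = u - v with u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                        # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])                 # A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
m, n, k = 30, 80, 5                           # underdetermined system, k-sparse signal
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x_hat = basis_pursuit(A, A @ x_true)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```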


Sparse Representation Encodes Membership

1 Notation
Training: for K classes, collect training samples {v_{1,1}, ..., v_{1,n_1}}, ..., {v_{K,1}, ..., v_{K,n_K}} ⊂ R^D.
Test: given a new sample y ∈ R^D, solve for label(y) ∈ {1, 2, ..., K}.

2 Subspace model: assume y belongs to Class i:

y = α_{i,1} v_{i,1} + α_{i,2} v_{i,2} + ... + α_{i,n_i} v_{i,n_i} = A_i α_i,

where A_i = [v_{i,1}, v_{i,2}, ..., v_{i,n_i}].

3 However, Class i is the unknown variable we need to solve for:

Sparse representation: y = [A_1, A_2, ..., A_K] [α_1; α_2; ...; α_K] = A x.

Sparse representation encodes membership! One such solution is

x_0 = [0, ..., 0, α_i^T, 0, ..., 0]^T ∈ R^n.
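A toy sketch of the membership-encoding idea (illustrative only; the class subspaces, dimensions, and the l1_solve helper are made-up stand-ins, not the authors' code): build A by stacking per-class training matrices, solve the ℓ1 program for a test sample drawn from one class, and read the label off the block of x with the largest ℓ1 norm.

```python
# Sketch: sparse representation encodes class membership (toy subspace data).
import numpy as np
from scipy.optimize import linprog

def l1_solve(A, y):
    """min ||x||_1 s.t. Ax = y, as a linear program over x = u - v with u, v >= 0."""
    m, n = A.shape
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(1)
D, K, n_i = 20, 5, 8                       # ambient dimension, classes, samples per class

# Each class lies on a low-dimensional subspace of R^D; A_i = [v_{i,1}, ..., v_{i,n_i}].
bases = [rng.standard_normal((D, 3)) for _ in range(K)]
A_blocks = [B @ rng.standard_normal((3, n_i)) for B in bases]
A = np.hstack(A_blocks)                    # A = [A_1, A_2, ..., A_K]

true_class = 2
y = A_blocks[true_class] @ rng.standard_normal(n_i)   # y = A_i alpha_i for Class i

x = l1_solve(A, y)
blocks = np.split(x, K)                    # delta_i(x): coefficients belonging to Class i
label = int(np.argmax([np.linalg.norm(b, 1) for b in blocks]))
print("estimated label:", label, "true label:", true_class)
```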


Distributed Action Recognition

1 Training samples: manually segment and normalize to duration h.

2 On each sensor node i, stack each training action into vector form

v_i = [x(1), ..., x(h), y(1), ..., y(h), z(1), ..., z(h), θ(1), ..., θ(h), ρ(1), ..., ρ(h)]^T ∈ R^{5h}

3 Full-body motion

Training sample: v = [v_1; ...; v_8]    Test sample: y = [y_1; ...; y_8] ∈ R^{8·5h}

4 Action subspace: if y is from Class i,

y = [y_1; ...; y_8] = α_{i,1} [v_1; ...; v_8]_1 + ... + α_{i,n_i} [v_1; ...; v_8]_{n_i} = A_i α_i.
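To make the vectorization concrete, here is a small sketch (illustrative; the random arrays stand in for real 20 Hz sensor readings, and all names and shapes are assumptions) that stacks one node's h samples of 3-axis accelerometer and 2-axis gyroscope data into v_i ∈ R^{5h} and concatenates the eight node vectors into a full-body sample.

```python
# Sketch: stack one node's readings into v_i in R^{5h}, then all 8 nodes into R^{8*5h}.
import numpy as np

h = 40                                   # normalized action duration (samples at 20 Hz)
rng = np.random.default_rng(0)

def node_vector(accel, gyro):
    """accel: (h, 3) x/y/z readings; gyro: (h, 2) theta/rho readings -> (5h,) vector."""
    # Concatenate channel by channel: [x(1..h), y(1..h), z(1..h), theta(1..h), rho(1..h)].
    return np.concatenate([accel[:, 0], accel[:, 1], accel[:, 2],
                           gyro[:, 0], gyro[:, 1]])

# Eight nodes, each with simulated readings (placeholders for real sensor data).
node_vecs = [node_vector(rng.standard_normal((h, 3)), rng.standard_normal((h, 2)))
             for _ in range(8)]
v_full = np.concatenate(node_vecs)       # full-body training/test sample
print(v_full.shape)                      # (8 * 5h,) = (1600,)
```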


Sparse Representation

1 However, label(y) = i is the unknown membership to solve for:

Sparse representation: y = [A_1, A_2, ..., A_K] [α_1; α_2; ...; α_K] = A x ∈ R^{8·5h}.

2 One solution: x = [0, 0, ..., α_i^T, 0, ..., 0]^T.

Sparse representation encodes membership.

3 Two problems: directly solving the linear system is intractable, and we must seek the sparsest solution.


Dimensionality Reduction

1 Construct Fisher/LDA features R_i ∈ R^{10×5h} on each node i:

ỹ_i = R_i y_i = R_i A_i x = Ã_i x ∈ R^{10}

2 Globally,

[ỹ_1; ...; ỹ_8] = diag(R_1, ..., R_8) [y_1; ...; y_8] = R A x = Ã x ∈ R^{8·10},

with R = diag(R_1, ..., R_8) block diagonal.

3 During the transformation, the data matrix A and x remain unchanged.
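A minimal sketch of this projection step (illustrative; the random R_i stand in for the trained Fisher/LDA matrices, and the shapes are assumptions): each node applies its own R_i locally, and the global projection is just the block-diagonal stack of the R_i, so the two routes agree and each node only ever has to transmit 10 numbers.

```python
# Sketch: per-node 10-D feature projection and the global block-diagonal projection.
import numpy as np
from scipy.linalg import block_diag

h, d, num_nodes = 40, 10, 8              # 5h-dim node vectors projected to d = 10
rng = np.random.default_rng(0)

# Placeholder projections: in the talk these are Fisher/LDA matrices R_i trained per node.
R_nodes = [rng.standard_normal((d, 5 * h)) for _ in range(num_nodes)]

# Local features: y_tilde_i = R_i y_i on each node.
y_nodes = [rng.standard_normal(5 * h) for _ in range(num_nodes)]
y_tilde_local = [R @ y for R, y in zip(R_nodes, y_nodes)]

# Global features: block-diagonal R applied to the stacked full-body vector.
R = block_diag(*R_nodes)                 # shape (8*10, 8*5h)
y_tilde_global = R @ np.concatenate(y_nodes)

# Both routes give the same result, so each node can project and transmit only 10 numbers.
print(np.allclose(np.concatenate(y_tilde_local), y_tilde_global))  # True
```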


Seeking the Sparsest Solution: ℓ1-Minimization

1 Ideal solution: ℓ0-minimization

(P0) x∗ = arg min ‖x‖0, subject to y = Ax,

where ‖·‖0 simply counts the number of nonzero entries. However, solving (P0) is NP-hard in general.

2 Compressed sensing: under mild conditions, (P0) is equivalent to

(P1) x∗ = arg min ‖x‖1, subject to y = Ax.

3 ℓ1-ball: ℓ1-minimization is convex; its solution equals that of ℓ0-minimization.

Reference: Robust face recognition via sparse representation. (in press) PAMI, 2008.


Distributed Recognition Framework

1 Distributed Sparse Representation

[ỹ_1; ...; ỹ_8] = R [ [v_1; ...; v_8]_1, ..., [v_1; ...; v_8]_n ] x  ⇔  ỹ_i = R_i [v_{i,1}, ..., v_{i,n}] x for i = 1, ..., 8

2 Local classifier: for each node i, solve

ỹ_i = R_i [v_{i,1}, ..., v_{i,n}] x ∈ R^{10}

3 Adaptive global classifier: suppose only the first L sensors are active (L ≤ 8):

[ỹ_1; ...; ỹ_L] = diag(R_1, ..., R_L) [ [v_1; ...; v_L]_1, ..., [v_1; ...; v_L]_n ] x ∈ R^{10L}

4 The representation x and training matrix A remain invariant.
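Adapting to the active-sensor set thus amounts to keeping only the corresponding row blocks. The sketch below (illustrative; the random matrices are placeholders for the real training data and projections) assembles the reduced dictionary and observation for an arbitrary subset of active nodes, without touching x or the stored training samples.

```python
# Sketch: build the reduced linear system when only a subset of sensors is active.
import numpy as np

h, d, num_nodes, n = 40, 10, 8, 60        # n = total number of training samples
rng = np.random.default_rng(0)

# Per-node training matrices A_i = [v_{i,1}, ..., v_{i,n}] and projections R_i (placeholders).
A_nodes = [rng.standard_normal((5 * h, n)) for _ in range(num_nodes)]
R_nodes = [rng.standard_normal((d, 5 * h)) for _ in range(num_nodes)]

def reduced_system(active, y_nodes):
    """Stack only the active nodes: returns (A_tilde, y_tilde) for the l1 program."""
    A_tilde = np.vstack([R_nodes[i] @ A_nodes[i] for i in active])   # shape (10*L, n)
    y_tilde = np.concatenate([R_nodes[i] @ y_nodes[i] for i in active])
    return A_tilde, y_tilde

# Example: only sensors 1, 2 and 7 report (0-indexed below); x and the training data are untouched.
y_nodes = [rng.standard_normal(5 * h) for _ in range(num_nodes)]
A_tilde, y_tilde = reduced_system([0, 1, 6], y_nodes)
print(A_tilde.shape, y_tilde.shape)       # (30, 60) (30,)
```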


Rejection of Invalid Actions (Segmentation)

1 Multiple segmentation hypotheses

2 Outlier rejection

Outlier Rejection

When the ℓ1-solution is not sparse or is not concentrated on one subspace, the test sample is rejected as invalid.

Sparsity Concentration Index:

SCI(x) = (K · max_i ‖δ_i(x)‖1 / ‖x‖1 − 1) / (K − 1) ∈ [0, 1],

where δ_i(x) keeps the coefficients of x associated with Class i and sets the rest to zero.
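A small sketch of this rejection rule (illustrative; the 0.5 threshold is a made-up example, not a value reported in the talk): compute SCI(x) from the per-class coefficient blocks δ_i(x) and reject candidate segments whose ℓ1 solution does not concentrate on any one class.

```python
# Sketch: Sparsity Concentration Index and outlier rejection.
import numpy as np

def sci(x, K):
    """SCI(x) = (K * max_i ||delta_i(x)||_1 / ||x||_1 - 1) / (K - 1), in [0, 1]."""
    blocks = np.split(x, K)                         # delta_i(x): coefficients of Class i
    peak = max(np.linalg.norm(b, 1) for b in blocks) / np.linalg.norm(x, 1)
    return (K * peak - 1) / (K - 1)

K = 5
# A solution concentrated on one class block vs. a diffuse (invalid) one.
x_good = np.concatenate([np.zeros(8), np.array([1.0, -0.5, 0.3, 0, 0, 0, 0, 0]), np.zeros(24)])
x_bad = np.random.default_rng(0).standard_normal(40) * 0.1

tau = 0.5                                           # hypothetical rejection threshold
for name, x in [("concentrated", x_good), ("diffuse", x_bad)]:
    s = sci(x, K)
    print(f"{name}: SCI = {s:.2f} ->", "accept" if s > tau else "reject")
```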


Experiment

Precision vs Recall:

Sensors    2      7      2,7    1,2,7   1-3,7,8   1-8
Prec [%]   89.8   94.6   94.4   92.8    94.6      98.8
Rec  [%]   65     61.5   82.5   80.6    89.5      94.2

Segmentation results using all 8 sensors:

Figure: (a) Stand-Sit-Stand, (b) Sit-Lie-Sit, (c) Rotate-Left, (d) Go-Downstairs.


Confusion Tables

Figure: Confusion tables for (a) sensors 1-8 and (b) sensor 7 only.


Conclusion

1 Mixture subspace model: 10-D action spaces suffice for data communication.

2 Accurate recognition via sparse representation:
Full-body network: 99% accuracy with 95% recall.
One sensor kept on the upper body and one on the lower body: 94% accuracy and 82% recall.
Reduced to single sensors: 90% accuracy.

3 Applications: beyond Wii controllers and iPhones.

Eldertech: fall detection, mobility monitoring.

Energy expenditure: lifestyle-related chronic diseases.


http://www.eecs.berkeley.edu/~yang/software/WAR/index.html


Acknowledgments

Collaborators

Berkeley: Ruzena Bajcsy, Shankar Sastry, Sameer Iyengar

Cornell: Philip Kuryloski

UT-Dallas: Roozbeh Jafari

Funding Support

NSF TRUST Center

ARO MURI: Heterogeneous Sensor Networks (HSN)

Telecom Italia Lab at Berkeley

References

Wright, Yang, Ganesh, Sastry, and Ma. Robust face recognition via sparse representation. PAMI, 2008 (in press).

Donoho. For most large underdetermined systems of equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution. 2004.

Candès. Compressive sampling. 2006.

Donoho. Neighborly polytopes and sparse solution of underdetermined linear equations. 2004.
