Real-Time Vision on a Mobile Robot Platform

Mohan Sridharan (joint work with Peter Stone)
The University of Texas at Austin
smohan@ece.utexas.edu

Motivation

Computer vision is challenging: "state-of-the-art" approaches are often not applicable to real systems because of computational and/or memory constraints.

Focus: efficient algorithms that work in real time on mobile robots.

Overview

Complete vision system developed on a mobile robot.

Challenges to address: color segmentation, object recognition, line detection, and illumination invariance.

On-board processing – computational and memory constraints.

Test Platform – Sony ERS-7

20 degrees of freedom.

Primary sensor – CMOS camera.

IR and touch sensors, accelerometers.

Wireless LAN.

Soccer on a 4.5 x 3 m field – play humans by 2050!

The Aibo Vision System – I/O

Input: image pixels in the YCbCr color space. Frame rate: 30 fps. Resolution: 208 x 160.

Output: distances and angles to objects.

Constraints: on-board processing (576 MHz CPU); rapidly varying camera positions.
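These numbers imply a tight per-pixel budget; a quick back-of-the-envelope check (all figures from the slide):

```python
# Per-frame processing budget for the ERS-7 camera:
# 208x160 pixels, 30 fps, 576 MHz CPU (figures from the slide).
width, height, fps, cpu_hz = 208, 160, 30, 576e6

pixels_per_frame = width * height                 # 33,280 pixels
frame_time_s = 1.0 / fps                          # ~33.3 ms per frame
cycles_per_pixel = cpu_hz / (fps * pixels_per_frame)

print(pixels_per_frame)                # 33280
print(round(frame_time_s * 1000, 1))   # 33.3 (ms per frame)
print(round(cycles_per_pixel))         # 577 cycles per pixel, for everything
```

Roughly 580 CPU cycles per pixel have to cover segmentation, blob formation, object recognition, and line detection, which is why every phase below is designed around constant-time operations.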

Robot’s view of the world…

Vision System – Flowchart…

Vision System – Phase 1: Segmentation.

Color Segmentation:
Hand-label discrete colors.
Intermediate color maps.
NNr (nearest-neighbor) weighted average – master color cube.
128 x 128 x 128 color map – 2 MB.
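The master color cube can be sketched as a plain table lookup; the labels, the trained region, and the helper names below are illustrative stand-ins, not the authors' actual code or data:

```python
# Sketch of table-lookup color segmentation. The cube cell marked "orange"
# is a made-up example region, not real training data.
BITS = 7                 # 128 cells per channel
SIZE = 1 << BITS         # 128; 128^3 one-byte cells = 2 MB, as on the slide

# Master color cube: one byte per (Y, Cb, Cr) cell; label 0 = "unknown".
color_cube = bytearray(SIZE ** 3)

def cell_index(y, cb, cr):
    """Flatten a quantized (Y, Cb, Cr) triple into a cube index."""
    return (((y << BITS) + cb) << BITS) + cr

# Hypothetical offline training: label one region of the cube "orange" (1).
for y in range(100, 128):
    for cb in range(40, 80):
        for cr in range(80, 120):
            color_cube[cell_index(y, cb, cr)] = 1

def classify(y8, cb8, cr8):
    """Classify one 8-bit YCbCr pixel by constant-time table lookup."""
    shift = 8 - BITS
    return color_cube[cell_index(y8 >> shift, cb8 >> shift, cr8 >> shift)]

print(classify(220, 100, 180))  # 1 (inside the hand-labeled region)
print(classify(10, 10, 10))     # 0 (unknown)
```

Quantizing each channel to 7 bits keeps the map at 2 MB while making per-pixel classification a single indexed read, which fits the cycle budget above.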

Vision System – Phase 1: Segmentation.

Use a perceptually motivated color space – LAB.

Offline training in LAB – generate an equivalent YCbCr cube.

Reduces the problem to a table lookup. Robust performance with shadows and highlights: YCbCr – 82%, LAB – 91%.
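For reference, a minimal sketch of the standard sRGB-to-LAB conversion (D65 white point); it illustrates the perceptually motivated space the slide names, not the authors' exact training pipeline:

```python
# Standard sRGB -> CIE LAB conversion (D65 white point), one pixel at a time.
def srgb_to_lab(r, g, b):
    """Convert one sRGB pixel (components in [0, 1]) to (L*, a*, b*)."""
    def linearize(c):  # undo sRGB gamma
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearize(r), linearize(g), linearize(b)
    # Linear RGB -> XYZ (sRGB matrix, D65 white)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    def f(t):  # CIE nonlinearity
        eps = (6 / 29) ** 3
        return t ** (1 / 3) if t > eps else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

L, a, b = srgb_to_lab(1.0, 1.0, 1.0)  # white
print(round(L), round(a), round(b))   # 100 0 0
```

Because distances in LAB track perceived color differences, labels generalize better across shadows and highlights; doing this conversion offline and baking the result into a YCbCr cube keeps the run-time cost at a table lookup.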

Sample Images – Color Segmentation.

Sample Video – Color Segmentation.

Some Problems…

Sensitive to illumination – requires frequent re-training. The robot needs to detect and adapt to changes.

Off-board color labeling is time consuming. Autonomous color learning is possible…

Vision System – Phase 2: Blobs.

Run-length encoding: starting point and length in pixels.

Region merging: combine run-lengths of the same color; maintain properties (pixels, runs).

Bounding boxes: abstract representation by four corners; maintains properties for further analysis.
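The steps above can be sketched as follows; this is a minimal illustration (the real system also merges only *connected* runs, which needs union-find over vertically adjacent runs of the same color):

```python
# Sketch of Phase 2: run-length encoding plus naive per-color bounding boxes.
def run_length_encode(row):
    """Encode one row of color labels as (start_column, length, label) runs."""
    runs, start = [], 0
    for col in range(1, len(row) + 1):
        if col == len(row) or row[col] != row[start]:
            runs.append((start, col - start, row[start]))
            start = col
    return runs

def bounding_boxes(image):
    """Per-color bounding boxes from run-lengths (ignores connectivity)."""
    boxes = {}  # label -> [min_row, min_col, max_row, max_col]
    for r, row in enumerate(image):
        for start, length, label in run_length_encode(row):
            if label == 0:
                continue  # skip background
            box = boxes.setdefault(label, [r, start, r, start + length - 1])
            box[0], box[1] = min(box[0], r), min(box[1], start)
            box[2], box[3] = max(box[2], r), max(box[3], start + length - 1)
    return boxes

print(run_length_encode([0, 0, 1, 1, 1, 0, 2, 2]))
# [(0, 2, 0), (2, 3, 1), (5, 1, 0), (6, 2, 2)]
print(bounding_boxes([[0, 1, 1], [0, 1, 0], [2, 2, 0]]))
```

Runs compress the segmented image dramatically (a few hundred runs instead of 33,280 pixels), so all later reasoning operates on a small abstract representation.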

Sample Images – Blob Detection.

Vision System – Phase 2: Objects.

Object recognition: heuristics on size, shape, and color; previously stored bounding-box properties; domain knowledge to remove spurious blobs.

Distances and angles: computed from known object geometry.
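Distance-from-known-geometry can be sketched with a pinhole camera model; the focal length and beacon height below are made-up illustrative values, not the ERS-7 calibration:

```python
# Sketch: under a pinhole model, an object of known physical size appears
# smaller in the image the farther away it is, so apparent size gives range.
import math

FOCAL_LENGTH_PX = 200.0   # hypothetical focal length in pixels
BEACON_HEIGHT_M = 0.4     # hypothetical known beacon height in meters

def distance_and_angle(pixel_height, pixel_center_x, image_width=208):
    """Estimate range from apparent size, and bearing from image position."""
    distance = FOCAL_LENGTH_PX * BEACON_HEIGHT_M / pixel_height
    angle = math.atan2(pixel_center_x - image_width / 2, FOCAL_LENGTH_PX)
    return distance, math.degrees(angle)

d, a = distance_and_angle(pixel_height=80, pixel_center_x=104)
print(round(d, 2), round(a, 1))  # 1.0 0.0  (1 m away, dead ahead)
```

A beacon that fills 80 pixels of height is estimated at 1 m under these assumed constants; a centered blob yields a bearing of 0 degrees.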

Sample Images – Objects.

Vision System – Phase 3: Lines.

Popular approaches (Hough transform, convolution kernels) are computationally expensive.

Domain knowledge: scan lines – green-white transitions mark candidate edge pixels.

Vision System – Phase 3: Lines.

Incremental least-squares fit for lines: efficient, easy to implement, and reasonably robust to noise.

Lines provide orientation information. Line intersections can be used as markers – inputs to localization. Ambiguity is removed through prior position knowledge.
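An incremental least-squares fit maintains running sums so that each new candidate edge pixel updates the line estimate in constant time; a minimal sketch (not the authors' implementation):

```python
# Sketch of an incremental least-squares line fit: O(1) per added point.
class IncrementalLineFit:
    def __init__(self):
        self.n = self.sx = self.sy = self.sxx = self.sxy = 0.0

    def add(self, x, y):
        """Fold one candidate edge pixel into the running sums."""
        self.n += 1
        self.sx += x
        self.sy += y
        self.sxx += x * x
        self.sxy += x * y

    def fit(self):
        """Return (slope, intercept) of y = m*x + c minimizing squared error."""
        denom = self.n * self.sxx - self.sx * self.sx
        m = (self.n * self.sxy - self.sx * self.sy) / denom
        c = (self.sy - m * self.sx) / self.n
        return m, c

line = IncrementalLineFit()
for x in range(10):
    line.add(x, 2 * x + 1)        # noiseless points on y = 2x + 1
m, c = line.fit()
print(round(m, 6), round(c, 6))   # 2.0 1.0
```

Because only five scalars are stored per line, many candidate lines can be fit simultaneously within the memory budget; vertical lines would need the x/y roles swapped, which this sketch omits.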

Sample Images – Objects + Lines.

Some Problems…

The system needs to be re-calibrated for illumination changes and natural light variations (day/night).

Re-calibration is very time consuming – more than an hour each time…

Cannot achieve overall goal – play humans. That is not happening anytime soon, but still…

Illumination Sensitivity – Samples.

Trained under one illumination:

Under different illumination:

Illumination Sensitivity – Movie…

Illumination Invariance - Approach.

Three discrete illuminations – bright, intermediate, dark.

Training (performed offline): a color map for each illumination; normalized RGB (rgb – only r and g used) sample distributions for each illumination.
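The training step can be sketched as building a normalized (r, g) chromaticity histogram per illumination; the bin count and sample pixels below are illustrative, not the parameters from the paper:

```python
# Sketch: normalized rg-chromaticity histogram for one illumination condition.
BINS = 16  # illustrative resolution

def rg_histogram(pixels):
    """Normalized 2-D histogram over (r, g) = (R, G) / (R + G + B)."""
    hist = [[0.0] * BINS for _ in range(BINS)]
    for R, G, B in pixels:
        s = R + G + B
        if s == 0:
            continue  # pure black has no chromaticity
        r_bin = min(int(R / s * BINS), BINS - 1)
        g_bin = min(int(G / s * BINS), BINS - 1)
        hist[r_bin][g_bin] += 1
    total = sum(map(sum, hist))
    return [[v / total for v in row] for row in hist]

# Hypothetical sample: a patch that is uniformly reddish.
h = rg_histogram([(200, 50, 50)] * 100 + [(180, 60, 40)] * 50)
print(abs(sum(map(sum, h)) - 1.0) < 1e-9)  # True: it is a distribution
```

Dividing out the sum (R + G + B) discards overall intensity, so only two dimensions (r, g) need storing per illumination, which is the storage argument the talk makes for normalized RGB.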

Illumination Invariance – Training.

Illumination: bright – color map

Illumination Invariance – Training.

Illumination: bright – map and distributions.

Illumination Invariance – Testing.

Testing – KL divergence as a distance measure: robust to artifacts; performed on-board the robot, about once a second. Parameter estimation is described in the paper.

Works for conditions not trained for… the paper has numerical results.
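The test-time comparison can be sketched as picking the stored illumination whose distribution is closest in KL divergence to the current image statistics; the three-bin histograms below are tiny illustrative stand-ins for the real (r, g) distributions:

```python
# Sketch: classify the current illumination by minimum KL divergence
# against the stored per-illumination distributions.
import math

def kl_divergence(p, q, eps=1e-9):
    """D_KL(p || q) over two discrete distributions (flattened histograms)."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

stored = {  # hypothetical trained distributions, one per illumination
    "bright":       [0.70, 0.20, 0.10],
    "intermediate": [0.40, 0.40, 0.20],
    "dark":         [0.10, 0.30, 0.60],
}
current = [0.65, 0.25, 0.10]  # hypothetical histogram from recent frames

best = min(stored, key=lambda name: kl_divergence(current, stored[name]))
print(best)  # bright
```

The epsilon term keeps empty bins from producing infinities; running this comparison about once a second is cheap because the histograms are small and fixed-size, so the robot can swap color maps as soon as the closest illumination changes.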

Adapting to Illumination changes – Video

Some Related Work…

CMU vision system – basic implementation: James Bruce et al., IROS 2000.

German Team vision system – scan lines: Röfer et al., RoboCup 2003.

Mean shift – color segmentation: Comaniciu and Meer, PAMI 2002.

Conclusions

A complete real-time vision system with on-board processing.

Implemented new or modified versions of vision algorithms.

Good performance on challenging problems: segmentation, object recognition, and illumination invariance.

Future Work…

Autonomous color learning. AAAI-05 paper available online.

Working in more general environments, outside the lab.

Automatic detection of and adaptation to illumination changes.

Still a long way to go to play humans.

Autonomous Color Learning – Video

More videos online www.cs.utexas.edu/~AustinVilla/

THAT’S ALL FOLKS

www.cs.utexas.edu/~AustinVilla/

Question – 1: So, what is new??

Robust color space for segmentation. Domain-specific object recognition + line detection. Towards illumination invariance. Complete vision system – closed loop.

Admittedly, we cannot compare directly with other teams, but overall performance has been good at competitions…

Vision – 1: Why LAB??

Robust color space for segmentation; perceptually motivated. Tackles minor changes – shadows, highlights. Used in robot rescue…

Vision – 2: Edge pixels + Least Squares??

Conventional approaches are time consuming. Scan lines are faster: they reduce the number of colors needing bounding boxes. Least squares is easier to implement – and fast too.

Admittedly, we have not compared with any other method…

Vision – 3: Normalized RGB ??

YCbCr separates luminance, but does not work well in practice on the Aibo.

Normalized RGB (rgb): reduces the number of dimensions (less storage); more robust to minor variations.

Admittedly, we have only compared against YCbCr – LAB works, but needs more storage and computation…
