Google Glass, The META and Co. - How to calibrate your Optical See-Through Head Mounted Displays
Introduction to Optical See-Through HMD Calibration
Jens Grubert (TU Graz), Yuta Itoh (TU Munich)
9th Sep 2014
Theory
14:15 Introduction to OST Calibration
15:00 coffee break
15:15 Details of OST Calibration
16:15 coffee break
Practice
16:30 Hands-on session: calibration of OST HMDs
17:30 Discussion: experiences, feedback
17:50 wrap-up, mailing list
18:00 end of tutorial
Part 1
Theory: Introduction
Non Optical See-Through
Optical See-Through (OST)
Head Mounted Displays (HMD)
Issues on OST-HMD
Photo by Mikepanhu
Consistency
Photo by javier1949
The Lack of Consistencies
Spatial
Visual
Temporal
Social
Temporal Inconsistency in OST-HMD
“How fast is fast enough?: a study of the effects of latency in direct-touch pointing tasks” Jota et al., CHI’13
https://www.youtube.com/watch?v=PCbSTj7LjJg
“latencies down to 2.38 ms are required to alleviate user perception when dragging”
Temporal Inconsistency in OST-HMD
Digital Light Processing Projector
“Minimizing Latency for Augmented Reality Displays: Frames Considered Harmful” Zheng et al. ISMAR’14
Visual Consistency
Occlusion, Depth, Color, Shadow,
Lee and Woo, 2009
Kiyokawa et al. 2003
Liu et al. 2008
Wide Field of View, etc.
Visual Consistency
“Pinlight Displays: Wide Field of View Augmented Reality Eyeglasses using Defocused Point Light Sources” Maimone et al., TOG’14
Social Consistency
Image from Google Glass: Don't Be A Glasshole | Mashable
Social Consistency
Image from http://www.thephoblographer.com/
Spatial registration Calibration
Spatial Inconsistency in OST-HMD
Calibration of Eye&OST-HMD
Find 3D-2D Projection
Camera Calibration Analogy
K: Intrinsic matrix
Find 3D-2D Projection:
OST-HMD’s Screen to Camera
P = K [R | t]
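As a concrete sketch of this model (all numeric values below are illustrative, not from the tutorial), composing P from K and [R | t] and projecting a 3D point looks like:

```python
import numpy as np

# Illustrative intrinsics and extrinsics (made-up values).
K = np.array([[800.0,   0.0, 320.0],   # fx, shear, cx
              [  0.0, 800.0, 240.0],   #     fy,    cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # no rotation, for simplicity
t = np.array([[0.0], [0.0], [1.0]])     # translation along the view axis

P = K @ np.hstack([R, t])               # 3x4 projection matrix P = K [R | t]

# Project a 3D point (homogeneous coordinates) to 2D pixel coordinates.
X = np.array([0.0, 0.0, 2.0, 1.0])
x = P @ X
u, v = x[0] / x[2], x[1] / x[2]         # u = 320.0, v = 240.0
```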
We cannot see what you see!
In the Eye of the Beholder…
P = K*[R t] ?
– Intensive user interaction
– User-dependent noise
Manual alignment
[AZU97]
P is a Black Box
Find 3D-2D Projection:
3D → 2D
–Medium user interaction
–User-dependent noise
SPAAM: Single Point Active Alignment Method
[TN00][GTN02]
N times
You got a perfect P!!!
Oops, sorry I touched your HMD…
Essential Difficulties
1. Data acquisition
2. Dynamic parameter changes
Data collection: SPAAM, MPAAM, Stereo Calibration
Part 2: Overview
State of the art
Practical tips
Confirmation Methods
Evaluation
Data Collection: SPAAM
Data Collection: MPAAM
Data Collection: Stereo
Confirmation Methods
Keyboard
Voice
Handheld
Waiting
State of the Art
Practical Tips
Theory
14:15 Introduction to OST Calibration
15:00 coffee break
15:15 Details of OST Calibration
16:15 coffee break
Practice
16:30 Hands-on session: calibration of OST HMDs
17:30 Discussion: experiences, feedback
17:50 wrap-up, mailing list
18:00 end of tutorial
Part 1
Theory: Details
Data Collection Methods: SPAAM Single Point Active Alignment Method
Eye-HMD Calibration
P: 3D → 2D
P is a Black Box
Find 3D-2D Projection:
Say P is a Perspective Projection
2D-3D correspondences give P
Only users can see the 2D points!
–Medium user interaction
–User-dependent noise
SPAAM: Single Point Active Alignment Method
[TN00][GTN02]
N times
SPAAM: Single Point Active Alignment Method
Minimum 6 pairs
Better 16~20 pairs
Better distributed along the Z axis
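The linear estimation step behind SPAAM can be sketched with a standard Direct Linear Transform solve (a generic sketch, not the authors' code; `estimate_P_dlt` is a name chosen here):

```python
import numpy as np

def estimate_P_dlt(pts3d, pts2d):
    """Estimate the 3x4 projection P from 3D-2D pairs via the Direct
    Linear Transform, the linear core of SPAAM. Needs >= 6 pairs in a
    non-degenerate (non-coplanar) configuration; P is found up to scale."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Each aligned pair contributes two homogeneous linear
        # equations in the 12 entries of P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    # The right singular vector of the smallest singular value
    # minimizes ||A p|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)
```

With exact synthetic correspondences this recovers P up to scale; with noisy user alignments, the well-distributed 16~20 pairs recommended above matter.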
Data Collection Methods: Stereo Calibration
SPAAM: Calibration for a Single Display
How to calibrate stereo systems?
How to calibrate stereo systems?
Idea 1: Calibrate each eye individually
Calibrate each eye individually
How to calibrate stereo systems?
Idea 2: Calibrate both eyes simultaneously
Why? Save time
Calibrate both eyes simultaneously
Idea
1. Display 2D objects with disparity in the left and right eye; they appear as a single object at a certain distance
2. Align the virtual object with a physical 3D object to get point correspondences for both eyes
[GSW00]
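The depth-from-disparity idea in step 1 can be sketched for a simple parallel-axis stereo model (a simplification; the function name and numbers below are illustrative):

```python
def disparity_for_depth(f_px, ipd_m, depth_m):
    """Horizontal pixel disparity that makes a fused virtual object
    appear at a given depth, for a parallel-axis pinhole model:
    d = f * b / Z (f: focal length in pixels, b: interpupillary
    distance in meters, Z: target depth in meters)."""
    return f_px * ipd_m / depth_m

# E.g., an 800 px focal length and 65 mm IPD need 26 px of disparity
# to place the fused object at 2 m.
d = disparity_for_depth(800.0, 0.065, 2.0)  # 26.0
```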
Challenges for Simultaneous Alignment
• Shape of the virtual object
• Occlusion of physical target
• Vergence-accommodation conflict
Simultaneous calibration can be significantly faster
Perceptual issues might hinder quality calibration
Stereo Calibration Takeaways
Data Collection Methods: Multi-Point Collection
Idea
SPAAM:
align a single point multiple times
Multi-Point Active Alignment (MPAAM):
align several points concurrently, but only once
Why?
save time
Example: SPAAM
Example: MPAAM
MPAAM Variants
• Align all points at once
• Minimum of six points
• Vary spatial distribution
[TMX07]
MPAAM Variants
• Align all points at once
• Minimum of six points
• Vary spatial distribution
• Missing: tradeoff # points - # calibration steps [GTM10]
Performance
• MPAAM can be conducted significantly faster than SPAAM (on average 84 seconds vs. 154 seconds for SPAAM) [GTM10]
• MPAAM has comparable accuracy in the calibrated range
MPAAM Takeaways
MPAAM can be an alternative to SPAAM if
• Working volume can be covered by calibration body
• Need for repeated calibration (e.g., after HMD slips)
Confirmation Methods
The user has to confirm the 2D-3D match
I take the pair NOW!
How to make confirmation stable?
I take the pair NOW!
Different confirmation options
Keyboard
Voice
Handheld
Waiting
[MDW11]
Less motion is better [MDW11]
Evaluation: User in the Loop
Evaluation Questions
• How accurate is the overlay given the current calibration? [MGT01] [GTM10]
• How much do the calibration results vary between calibrations? [ASO11]
• What is the impact of individual error sources on the calibration results? (head pointing accuracy, body sway, confirmation methods ...) [AXH11]
How accurate is the overlay given the current calibration?
Popular Approaches
• Use a camera
• Ask the user
User in the Loop Evaluation
Qualitative feedback: “overlay looks good”
Quantitative feedback
Quantitative Feedback
McGarrity et al. [MGT01]:
• Use a tracked evaluation board
• Ask the AR system to superimpose an object on the board
• Ask the user to indicate where she perceives the object on the board
• Offset: the distance between the two positions
Quantitative Feedback
• Drawback of stylus approach: evaluation only within arm’s reach
Alternatives
• Use a laser pointer + human operator instead (beware pointing accuracy) [GTM10]
• Use a projector / large display + indirect pointing (e.g., mouse)
Quantitative Feedback
Benefits:
• Only way to approximate how the user herself perceives the augmentation
Drawbacks:
• Only valid for the current view (distance, orientation)
• Additional pointing error introduced
Takeaways
• Quantitative user feedback is the only way to approximate how large the registration error is for individual users
• Feedback methods introduce additional (pointing) errors
• Make sure to test for all relevant working distances
Evaluation: Error Measurements
OST-HMD Calibration
Projection Matrix P: 3D → 2D
Ideal Case
3D-2D pairs: S
Eye positions: (camera center)
2D Projection Error
A wrong projection yields a reprojection error
3D Eye Positions [m]: errors > 10 cm
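The 2D projection error above is usually reported as a mean reprojection error; a minimal sketch (generic numpy, function name chosen here):

```python
import numpy as np

def mean_reprojection_error(P, pts3d, pts2d):
    """Mean 2D distance (in pixels) between the points projected by
    the 3x4 matrix P and the measured 2D points."""
    errs = []
    for Xp, (u, v) in zip(pts3d, pts2d):
        x = P @ np.append(np.asarray(Xp, dtype=float), 1.0)
        errs.append(np.hypot(x[0] / x[2] - u, x[1] / x[2] - v))
    return float(np.mean(errs))
```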
Semi-Automatic Calibration Approaches
Motivation
Can the calibration process be shortened?
User-guided see-through calibration is too tedious
https://www.flickr.com/photos/stuartncook/4613088809/in/photostream/
Observation
We have to estimate 11 parameters
→ At least 6 point correspondences needed
Reminder: Collecting Correspondences
Idea
Separate out parameters that are independent of the user?
The user would need to collect fewer point correspondences, making the task faster and easier.
Reminder: Calibration Parameters of the Pinhole Camera
TCS: Tracking Coordinate System
EDCS: Eye-Display Coordinate System
Rotation and translation between the Tracking Coordinate System and the Eye-Display Coordinate System: 6 parameters for the center of projection
5 intrinsic parameters of the eye-display optical system:
focal length (x, y), shear, principal point (x, y)
(+ more if you want to model distortion)
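The 5 + 6 = 11 parameters correspond to the 11 degrees of freedom of P (12 entries, up to scale). A sketch of assembling the intrinsic part (the function name is chosen here for illustration):

```python
import numpy as np

def intrinsic_matrix(fx, fy, shear, cx, cy):
    """Upper-triangular intrinsic matrix K built from the 5 intrinsic
    parameters: focal length (x, y), shear, principal point (x, y).
    Together with the 6 extrinsic parameters (rotation + translation
    between TCS and EDCS) this gives the 11 parameters of P."""
    return np.array([[fx, shear, cx],
                     [0.0,  fy,  cy],
                     [0.0, 0.0, 1.0]])
```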
Separate intrinsic + extrinsic parameters
[OZT04]:
1. Determine ALL parameters (including distortion) via camera, without user intervention
2. Update the center of projection in a user phase
State of the art: Automatic Method
Utilizes 3D Eye Localization [IK14]
– Interaction-free, thus does not bother users
– More accurate than a realistic SPAAM setup
INDICA: Interaction-free DIsplay CAlibration
1. Estimate a 2D iris ellipse (iris detector + fitting by RANSAC)
2. Back-project it to a 3D circle
[SBD12]
3D Eye Position Estimation
[NNT11]
Manual (SPAAM)
Interaction Free (INDICA Recycle)
Interaction Free (INDICA Full)
World to HMD (eye) Projection: 3D → Screen (2D)
Simple: no user interaction
Accurate: better than degraded manual calibrations
Summary of INDICA
Calibration of OST-HMDs using3D eye position
Practical Tips
How many control points for SPAAM?
• A minimum of 6 can lead to unstable and inaccurate results
• The more the better? Not necessarily: 16-20 control points are sufficient if the points are equally distributed in all three dimensions
(Figure: calibration error [mm] vs. number of control points [CAR94])
Calibration Volume
If possible, calibrate the working volume you want to operate in
Quality of Tracking System
Ensure the best calibration possible for your external tracking system
Ensure a low latency
Summary of Part 2
Reducing user errors:
- Data collection
- Confirmation
- Evaluation
Manual to automatic: state of the art
Practical tips
References 1/2
[AXH11] Axholt, M. (2011). Pinhole Camera Calibration in the Presence of Human Noise.
[ASO11] Axholt, M., Skoglund, M. A., O'Connell, S. D., Cooper, M. D., Ellis, S. R., & Ynnerman, A. (2011, March). Parameter estimation variance of the single point active alignment method in optical see-through head mounted display calibration. In Virtual Reality Conference (VR), 2011 IEEE (pp. 27-34). IEEE.
[AZU97] Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
[CAR94] Chen, L., Armstrong, C. W., & Raftopoulos, D. D. (1994). An investigation on the accuracy of three-dimensional space reconstruction using the direct linear transformation technique. Journal of biomechanics, 27(4), 493-500.
[NNT11] Nitschke, C., Nakazawa, A., & Takemura, H. (2011). Image-based Eye Pose and Reflection Analysis for Advanced Interaction Techniques and Scene Understanding. CVIM, 2011(31), 1-16.
[GTM10] Grubert, J., Tuemler, J., Mecke, R., & Schenk, M. (2010). Comparative User Study of two See-through Calibration Methods. In VR (pp. 269-270).
[GTN02] Genc, Y., Tuceryan, M., & Navab, N. (2002, September). Practical solutions for calibration of optical see-through devices. In Proceedings of the 1st International Symposium on Mixed and Augmented Reality (p. 169). IEEE Computer Society.
References 2/2
[MAE14] Moser, K. R., Axholt, M., & Swan II, J. E. (2014, March). Baseline SPAAM calibration accuracy and precision in the absence of human postural sway error. In Virtual Reality (VR), 2014 IEEE (pp. 99-100). IEEE.
[MGT01] McGarrity, E., Genc, Y., Tuceryan, M., Owen, C., & Navab, N. (2001). A new system for online quantitative evaluation of optical see-through augmentation. In ISAR 2001 (pp. 157-166). IEEE.
[MDW11] Maier, P., Dey, A., Waechter, C. A., Sandor, C., Tönnis, M., & Klinker, G. (2011). An empiric evaluation of confirmation methods for optical see-through head-mounted display calibration. In International Symposium on Mixed and Augmented Reality (ISMAR), 2011 IEEE.
[OZT04] Owen, C. B., Zhou, J., Tang, A., & Xiao, F. (2004, November). Display-relative calibration for optical see-through head-mounted displays. In Mixed and Augmented Reality, 2004. ISMAR 2004. Third IEEE and ACM International Symposium on (pp. 70-78). IEEE.
[SBD12] Świrski, L., Bulling, A., & Dodgson, N. (2012, March). Robust real-time pupil tracking in highly off-axis images. In Proceedings of the Symposium on Eye Tracking Research and Applications (pp. 173-176). ACM.
[TN00] Tuceryan, M., & Navab, N. (2000). Single point active alignment method (SPAAM) for optical see-through HMD calibration for AR. In Augmented Reality, 2000 (ISAR 2000). Proceedings. IEEE and ACM International Symposium on (pp. 149-158). IEEE.
Online References
Up to date references for the field of optical see-through calibration can be
found here:
http://www.mendeley.com/groups/4218141/calibration-of-optical-see-through-head-mounted-displays/overview/