Implementation of Gesture Recognition in the Immersive Visualization Environment
Transcript of Implementation of Gesture Recognition in the Immersive Visualization Environment
Implementation of Gesture Recognition in the Immersive Visualization Environment
By Danny Catacora
Under the guidance of: Judith Terrill and Terence Griffin
National Institute of Standards and Technology
Problem Statement:
• To add a set of gestures that are intuitive to users, and whose actions can be easily reconfigured to apply to different visualizations
Outline:
• Introduction to the IVE in the RAVE
• Importance of adding gestures to the IVE
• Procedure consisted of designing a gesture, recording data, and coding it
• Created Wii-like gestures that are easily defined and recognized
• Future work includes more complicated gestures
Introduction: IVE?
• Immersive Visualization Environment
• Produces visualizations from large amounts of computed data
• A visualization technology that allows human interaction
Introduction:
RAVE?
• Reconfigurable Automatic Virtual Environment
• Takes up an entire room
• Consists of a wand, a headset, and three screens
Importance of Problem:
• The IVE has flexible and intuitive navigation methods, but lacks pointer gestures
• Newer interactions are needed
• Applicable, basic gestures should be investigated
Research & Design: Game Technology:
• The Wii:
o Uses a remote controller to recognize gestures
• Xbox Kinect:
o Removes the controller; recognizes human movement and actions
Improving 3D Gesture Recognition with Spatially Convenient Input Devices
Procedure: Steps for each gesture:
• Gesture definition:
o Specify what happens when the gesture is performed
o Define what the gesture program will recognize/look for
• Data collection:
o Figure out how each variable will respond to the gesture
o Determine correlations between gestures and changes in position, speed, and acceleration
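The data-collection step above looks for changes in position, speed, and acceleration. As a minimal sketch (not the actual NIST code; the `Vec3` type, function names, and the fixed sample interval `dt` are assumptions), speed and acceleration can be estimated from consecutive wand position samples by finite differences:

```cpp
#include <cassert>

// Illustrative 3D position sample from the wand tracker.
struct Vec3 { double x, y, z; };

// Velocity between two consecutive samples taken dt seconds apart.
Vec3 velocity(const Vec3& prev, const Vec3& curr, double dt) {
    return { (curr.x - prev.x) / dt,
             (curr.y - prev.y) / dt,
             (curr.z - prev.z) / dt };
}

// Acceleration from three consecutive samples (central difference).
Vec3 acceleration(const Vec3& a, const Vec3& b, const Vec3& c, double dt) {
    return { (c.x - 2.0 * b.x + a.x) / (dt * dt),
             (c.y - 2.0 * b.y + a.y) / (dt * dt),
             (c.z - 2.0 * b.z + a.z) / (dt * dt) };
}
```

The same differencing would apply to the yaw, pitch, and roll channels to get angular rates.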
Procedure: Steps for each gesture:
• Program implementation of gesture:
o Write code that will recognize the gesture accurately
o Use algorithms and data analysis to write the code in C++
• Gesture testing:
o Test the written code for the gesture in both environments
o See if calibration for the gesture works
Data Interpretation:
• Had the program record the movement of the wand
• Wand data could be analyzed for specific changes in the x, y, z, yaw, pitch, and roll values
• Code was written to catch the patterns of gestures
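The pattern-catching described above can be sketched as threshold tests over a window of recorded samples. This is a hypothetical example, not the NIST recognizer: the `WandSample` field layout mirrors the six recorded values, but the swipe definition and threshold defaults are assumptions standing in for the calibrated values:

```cpp
#include <cmath>
#include <vector>

// One recorded wand sample: the six values the program logged.
struct WandSample { double x, y, z, yaw, pitch, roll; };

// Illustrative pattern: a horizontal swipe is a large net x displacement
// over the window while vertical drift stays small. Thresholds would
// come from the calibration step.
bool isHorizontalSwipe(const std::vector<WandSample>& window,
                       double minTravel = 0.3, double maxDrift = 0.1) {
    if (window.size() < 2) return false;
    double dx = window.back().x - window.front().x;
    double dy = window.back().y - window.front().y;
    return std::fabs(dx) >= minTravel && std::fabs(dy) <= maxDrift;
}
```

Rotational gestures would test the yaw, pitch, and roll channels the same way.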
Results:
• Resulting algorithms and code that recognize gestures
• More complex gestures were later defined from these basic gestures
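One plausible way to build complex gestures from the basic ones, sketched here as an assumption (the talk does not describe the mechanism, and the gesture names are made up), is to define a compound gesture as an ordered sequence of already-recognized basic gestures:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: a compound gesture matches when the most
// recently recognized basic gestures equal its pattern, in order.
bool matchesCompound(const std::vector<std::string>& recognized,
                     const std::vector<std::string>& pattern) {
    if (recognized.size() < pattern.size()) return false;
    size_t offset = recognized.size() - pattern.size();
    for (size_t i = 0; i < pattern.size(); ++i)
        if (recognized[offset + i] != pattern[i]) return false;
    return true;
}
```

With this scheme the basic recognizers stay unchanged; only new sequences need to be declared.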
Results:
• Displaying gesture recognition in the RAVE
• Display evolved from terminal output, to pop-ups, to a 3D arrow
Conclusion:
• In the end, able to successfully identify a total of 15 gestures
• Actions are completely user-definable
• Gestures will help current interaction within the IVE
• Set a basis for more complex gestures in the future
Future Research:
• Eventually remove the wires, glasses, or the remote
• Establish a more user-friendly interaction between human and machine in the RAVE
Acknowledgments:
• Judith Terrill - NIST
• Terence Griffin - NIST
• John Hagedorn - NIST
• Steven Satterfield - NIST
• Elizabeth Duval - Montgomery Blair High School
Thank You. Questions?