Mixed Reality Systems – Lab IV: Augmented Reality
Christoph Anthes

Overview
• ARToolKit
• Combining ARToolKit with OpenSG
  • Initialisation
  • ARToolKit Loop
  • Helper Functions
  • Tracking Objects
  • Interaction with Objects
• Combining ARToolKit with inVRs

ARToolKit
• Current version 2.72.1, available on the SourceForge page
• C and C++ API
• Cross-platform API (Windows, Linux, MacOS, IRIX)
• OpenGL is used for rendering, GLUT for event handling
• The video library used depends on the chosen platform
• Architecture – general picture
  • Built on GLUT with OpenGL, C and C++
  • Makes use of the device-specific video and graphics drivers
  • Often used only as an independent tracking library
  • Your own application is designed to be built on top of OpenGL and ARToolKit
  • Interfaces to larger tracking libraries exist (e.g. OpenTracker)
From http://www.hitl.washington.edu/artoolkit/documentation/

ARToolKit
• Architecture – ARToolKit focus
  • AR module
    • Core module with marker tracking, calibration and parameter collection
  • Video module
    • Collection of video routines for capturing the video input frames
    • Wrapper around the standard platform SDK video capture routines
  • Gsub module
    • Graphics routines based on the OpenGL and GLUT libraries
  • Gsub_lite module
    • Replaces Gsub with a more efficient collection of graphics routines
    • Independent of any particular windowing toolkit
From http://www.hitl.washington.edu/artoolkit/documentation/

ARToolKit
• Coordinate systems
  • Several coordinate systems are used
  • The most important are camera and marker coordinates
  • From camera to screen coordinates, a transformation via a distortion function can be performed
  • The z-axis of the marker points upward
  • The z-axis of the camera points into the scene
  • The top left corner of the screen is (0, 0)
  • arGetTransMat()
    • Returns the coordinates of the marker in the camera coordinate system
  • arMatrixInverse()
    • Returns the coordinates of the camera in the marker coordinate system
From http://www.hitl.washington.edu/artoolkit/documentation/
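
A minimal sketch of how these two calls fit together, assuming the ARToolKit 2.x C API; the marker width and the surrounding code are assumptions, and the inverse of the 3x4 pose is computed here with arUtilMatInv() (arMatrixInverse() is the variant that operates on the general ARMat type):

    #include <AR/ar.h>

    /* Sketch: pose of the marker in camera coordinates and its inverse,
       the pose of the camera in marker coordinates. markerInfo stems
       from a previous arDetectMarker() call. */
    void getPoses(ARMarkerInfo* markerInfo)
    {
        double centre[2] = {0.0, 0.0};  /* marker centre offset */
        double width     = 80.0;        /* marker width in mm (assumption) */
        double markerToCam[3][4];       /* marker in camera coordinates */
        double camToMarker[3][4];       /* camera in marker coordinates */

        arGetTransMat(markerInfo, centre, width, markerToCam);
        arUtilMatInv(markerToCam, camToMarker);
    }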

ARToolKit
• Basic Application
From http://www.hitl.washington.edu/artoolkit/documentation/

ARToolKit
• Corresponding function calls in the plain ARToolKit API
From http://www.hitl.washington.edu/artoolkit/documentation/
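
Since the original slide shows these calls only as a figure, here is a hedged reconstruction of the canonical simpleTest-style skeleton from the ARToolKit documentation; the configuration strings, file names, threshold and marker width are the stock sample values, not necessarily those used in the lab:

    #include <AR/gsub.h>
    #include <AR/video.h>
    #include <AR/param.h>
    #include <AR/ar.h>

    static int pattId;
    static int threshold = 100;

    static void mainLoop(void)
    {
        ARUint8*      frame;
        ARMarkerInfo* markerInfo;
        int           markerNum;
        double        centre[2] = {0.0, 0.0};
        double        trans[3][4];

        if ((frame = arVideoGetImage()) == NULL) return;    /* step 2: capture */
        argDrawMode2D();
        argDispImage(frame, 0, 0);                          /* video background */
        arDetectMarker(frame, threshold, &markerInfo, &markerNum); /* step 3 */
        arVideoCapNext();
        if (markerNum > 0)
            arGetTransMat(&markerInfo[0], centre, 80.0, trans);    /* step 4 */
        /* ... draw the virtual objects with OpenGL here (step 5) ... */
        argSwapBuffers();
    }

    int main(int argc, char** argv)
    {
        ARParam wparam, cparam;
        int     xsize, ysize;

        arVideoOpen("");                                    /* step 1: video */
        arVideoInqSize(&xsize, &ysize);
        arParamLoad("Data/camera_para.dat", 1, &wparam);    /* calibration */
        arParamChangeSize(&wparam, xsize, ysize, &cparam);
        arInitCparam(&cparam);
        pattId = arLoadPatt("Data/patt.hiro");              /* marker pattern */
        argInit(&cparam, 1.0, 0, 0, 0, 0);
        arVideoCapStart();
        argMainLoop(NULL, NULL, mainLoop);                  /* steps 2-5 */
        return 0;
    }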

ARToolKit
• Connections to scene graphs in general
  • Three steps
    • Initialising the camera
    • Transforming the object
    • “Real occluders” – slightly advanced
  • OpenSG
    • Creation of a new node with fscEdit
  • OpenSceneGraph
    • Extension of the scene view class
    • Display of the occluding geometry
    • Overwriting of the colour buffer with the video image
    • Finally, display of the recognised objects
• Two big approaches
  • OSGART (http://www.artoolworks.com/community/osgart/index.html)
  • OSGAR (http://www.gvu.gatech.edu/ael/projects/ARSceneGraph.html)

Combining ARToolKit and OpenSG
• OpenSG provides examples for interconnecting either ARToolKit or ARToolKit Plus
• We are going to work with ARToolKit
• So we first need the additional include files
  • gsub.h – contains the main display functions used in ARToolKit
  • video.h – provides multi-platform video input support for ARToolKit
  • param.h – contains the principal routines for loading, saving, and modifying camera parameters
  • ar.h – provides image analysis and marker detection routines

Combining ARToolKit and OpenSG
• If we take a look at our example, we start with a set of forward declarations, some of which we have not seen in the previous OpenSG or inVRs tutorials
• Additionally, more methods are used at the end of the code
• We start with the setup and cleanup methods:
  • initARToolkit() is used for the initialisation of the ARToolKit components of the example
  • initOpenSG() initialises the OpenSG setup of the example
  • initGlut() registers the GLUT callbacks, as we have seen in previous examples
  • setupCamera() provides the setup for our interconnected webcam
  • cleanupARToolkit() stops the video capture of ARToolKit and closes the video stream processing
  • cleanupOpenSG() frees used variables, stops the binding to ARToolKit, and calls osgExit()
• Let’s have a detailed look at the setup methods

Combining ARToolKit and OpenSG
• initARToolkit()
  • Wraps and calls the different internal setup functions for ARToolKit
  • The cleanup method is registered as a callback at program termination
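
A minimal sketch of what such a wrapper could look like; the individual setup helpers are hypothetical names, and atexit() is one way to realise the “callback at program termination” mentioned above:

    #include <cstdlib>

    void setupVideo();        // opens the video device (hypothetical helper)
    void setupCamera();       // loads and converts the camera parameters
    void setupMarkers();      // loads the marker pattern files (hypothetical)
    void cleanupARToolkit();  // stops capture, closes the video stream

    void initARToolkit()
    {
        setupVideo();
        setupCamera();
        setupMarkers();
        std::atexit(cleanupARToolkit);  // cleanup at program termination
    }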

Combining ARToolKit and OpenSG
• setupCamera()
  • The camera setup is defined in this function
  • The camera parameters from the calibration file are parsed and evaluated
  • A conversion has to take place in order to write the data out in the right format
  • The ModelViewMatrix and the ProjectionMatrix are set
  • This setup is stored inside an OpenSG camera object
  • The camera parameters are stored in binary files, e.g. currentParams.mat

Combining ARToolKit and OpenSG
• setupCamera()
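
The original slide shows the lab code; as a stand-in, here is a minimal sketch of the steps described above. The calibration file name and the frame size are assumptions, and the conversion into the OpenSG camera is only indicated:

    #include <AR/param.h>
    #include <AR/ar.h>
    #include <cstdlib>

    ARParam cparam;  // global camera parameters (assumption)

    void setupCamera()
    {
        ARParam wparam;
        // parse and evaluate the parameters from the calibration file
        if (arParamLoad("currentParams.mat", 1, &wparam) < 0)
            std::exit(EXIT_FAILURE);
        // convert to the size of the actual video frames (assumed 640x480)
        arParamChangeSize(&wparam, 640, 480, &cparam);
        arInitCparam(&cparam);
        // ... convert cparam into projection/model-view matrices (see
        // argConvGLcpara() below) and store them in the OpenSG camera
    }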

Combining ARToolKit and OpenSG
• initOpenSG()
  • The standard OpenSG setup is performed
  • A GLUT window is created and initialised
  • A root node with an anonymous group core is created
  • The previously described camera setup is triggered
  • A background object is interconnected with a video texture
  • The SimpleSceneManager is initialised and interconnected with the window and the root node of the scene
  • The background image as well as the camera are attached to the viewport of the just-generated window
  • All changes are committed to OpenSG
  • Finally, the OpenSG cleanup function is registered to be triggered at the termination of the application

Combining ARToolKit and OpenSG
• initOpenSG()
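
Again as a stand-in for the code shown on the slide, a minimal OpenSG 1.x sketch of these steps; the variable names, the window handling and the omitted background/viewport wiring are assumptions:

    #include <OpenSG/OSGGLUT.h>
    #include <OpenSG/OSGGLUTWindow.h>
    #include <OpenSG/OSGSimpleSceneManager.h>
    #include <OpenSG/OSGGroup.h>
    #include <cstdlib>

    OSG::SimpleSceneManager* mgr = NULL;
    OSG::NodePtr             root;

    void setupCamera();     // the camera setup described before
    void cleanupOpenSG();   // frees variables, detaches ARToolKit, osgExit()

    void initOpenSG(int glutWinId)
    {
        // wrap the previously created GLUT window for OpenSG
        OSG::GLUTWindowPtr gwin = OSG::GLUTWindow::create();
        OSG::beginEditCP(gwin);
        gwin->setId(glutWinId);
        gwin->init();
        OSG::endEditCP(gwin);

        // root node with an anonymous group core
        root = OSG::Node::create();
        OSG::beginEditCP(root);
        root->setCore(OSG::Group::create());
        OSG::endEditCP(root);

        setupCamera();

        mgr = new OSG::SimpleSceneManager;
        mgr->setWindow(gwin);
        mgr->setRoot(root);
        // ... attach the video background and the calibrated camera to
        // the viewport, then commit all changes to OpenSG

        std::atexit(cleanupOpenSG);  // cleanup at application termination
    }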

Combining ARToolKit and OpenSG
• Then we have the ARToolKit processing methods
  • In the captureFrame() method the image is retrieved from the camera
  • The detectMarkers() function triggers the marker detection and outputs the number of found markers
  • applyMarkerTrans() applies the transformation from a given marker to an OpenSG transformation core
• ARToolKit loop
  • These steps represent steps 2-4 of our ARToolKit loop
  • Step 1, the initialisation, was given with the previous set of functions
  • Step 5, the rendering, is performed by OpenSG
  • They have to be processed frame by frame, thus they are called inside the display loop of the application

Combining ARToolKit and OpenSG
• captureFrame()
  • This method retrieves an image which was captured by ARToolKit from the incoming video stream
  • The image data is set in an OpenSG image and an update notification is issued
  • This frame will later on be rendered as a background image
  • It is also used by ARToolKit for the image processing which is performed in the next step
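
A minimal sketch of such a captureFrame(), assuming the globals introduced in the previous sketches; the frame size and pixel format are assumptions (ARToolKit delivers different formats depending on the platform):

    #include <AR/video.h>
    #include <OpenSG/OSGImage.h>

    ARUint8*      videoImage = NULL;  // raw frame, reused for marker detection
    OSG::ImagePtr bgImage;            // image behind the video background

    bool captureFrame()
    {
        ARUint8* frame = arVideoGetImage();
        if (frame == NULL)
            return false;             // no new frame available yet
        videoImage = frame;           // keep for detectMarkers()
        OSG::beginEditCP(bgImage);
        // re-set the image data; this issues the update notification
        bgImage->set(OSG::Image::OSG_RGB_PF, 640, 480, 1, 1, 1, 0.0, frame);
        OSG::endEditCP(bgImage);
        return true;
    }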

Combining ARToolKit and OpenSG
• detectMarkers()
  • In this method the markers visible in the image are detected
  • The number of detected markers as well as the markers themselves are returned by reference
  • A threshold parameter determines the binarisation of the image
  • Output describing the number of detected markers as well as the IDs of the markers is generated on the console
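
A minimal sketch of such a detectMarkers(); the threshold value and the global videoImage are assumptions:

    #include <AR/ar.h>
    #include <cstdio>

    extern ARUint8* videoImage;     // frame grabbed in captureFrame()

    bool detectMarkers(ARMarkerInfo*& markers, int& markerNum)
    {
        const int threshold = 100;  // binarisation threshold (assumption)
        if (arDetectMarker(videoImage, threshold, &markers, &markerNum) < 0)
            return false;
        std::printf("%d marker(s) detected\n", markerNum);
        for (int i = 0; i < markerNum; ++i)
            std::printf("  marker id: %d\n", markers[i].id);
        return true;
    }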

Combining ARToolKit and OpenSG
• applyMarkerTrans()
  • Extracts the transformation information from a marker and applies it to a given OpenSG model
  • By using an STL map, a binding between the detected markers and their scene graph objects is maintained
  • The marker transformation is requested from ARToolKit via arGetTransMat()
  • It is then converted into an OpenSG matrix
  • If a marker is found in the map, it becomes activated again and the transformation matrix just retrieved is applied to an OpenSG object
  • The changes are finally committed

Combining ARToolKit and OpenSG
• applyMarkerTrans()
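
As a stand-in for the code on the slide, a sketch of applyMarkerTrans() under the assumptions used so far; the marker width, the shape of the pattern map and the conversion helper are assumptions:

    #include <AR/ar.h>
    #include <OpenSG/OSGTransform.h>
    #include <map>

    extern std::map<int, OSG::TransformPtr> patternMap;   // marker id -> core

    OSG::Matrix arMatrixToOSGMatrix(double trans[3][4]);  // helper, see below

    void applyMarkerTrans(ARMarkerInfo* marker)
    {
        double centre[2] = {0.0, 0.0};
        double trans[3][4];
        arGetTransMat(marker, centre, 80.0, trans);   // marker in camera coords

        std::map<int, OSG::TransformPtr>::iterator it =
            patternMap.find(marker->id);
        if (it == patternMap.end())
            return;                                   // marker not registered

        OSG::Matrix m = arMatrixToOSGMatrix(trans);
        OSG::beginEditCP(it->second);
        it->second->setMatrix(m);                     // apply the marker pose
        OSG::endEditCP(it->second);
        // changes are committed at the end of the display loop
    }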

Combining ARToolKit and OpenSG
• Additional methods which are used as helper functions and appear in the code
  • The method arMatrixToOSGMatrix() performs a data conversion from ARToolKit to OpenSG
  • The argConvGLcpara() method is used for the conversion of ARToolKit camera parameters to OpenGL parameters
  • getTranslation() returns the translation vector of a transformation matrix

Combining ARToolKit and OpenSG
• arMatrixToOSGMatrix()
  • Some very basic reformatting of an ARToolKit matrix to an OpenSG matrix is performed in this method
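
A sketch of this reformatting, assuming ARToolKit’s row-major 3x4 pose array and an OpenGL-style column-major indexing of OSG::Matrix; the indexing convention is an assumption worth checking against the lab code:

    #include <OpenSG/OSGMatrix.h>

    OSG::Matrix arMatrixToOSGMatrix(double trans[3][4])
    {
        OSG::Matrix m;
        m.setIdentity();                 // bottom row stays (0 0 0 1)
        for (int row = 0; row < 3; ++row)
            for (int col = 0; col < 4; ++col)
                m[col][row] = (OSG::Real32)trans[row][col];
        return m;
    }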

Combining ARToolKit and OpenSG
• argConvGLcpara() is used to transform the ARToolKit intrinsic camera parameter matrix format to an OpenGL matrix format
  • More details on camera calibration and the parameters are given in the computer vision class in the winter semester

Combining ARToolKit and OpenSG
• Additional methods which are used as helper functions and appear in the code
  • createPattern() connects a marker with an OpenSG sub scene graph
  • createBackground() creates an image background based on the image gathered from the ARToolKit video stream
  • createModel() loads a sub scene graph from disk and equips it with an additional transformation core
• Before we start coding, let’s have a more detailed look at these methods

Combining ARToolKit and OpenSG
• createPattern()
  • This helper method interconnects a pattern from ARToolKit with a sub scene graph provided by OpenSG; it uses createModel() as a helper
  • The pattern is then registered in an STL map
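
A minimal sketch along these lines; the file names and the shape of the map are assumptions:

    #include <AR/ar.h>
    #include <OpenSG/OSGTransform.h>
    #include <map>

    std::map<int, OSG::TransformPtr> patternMap;

    OSG::NodePtr createModel(const char* file, OSG::TransformPtr& trans);

    OSG::NodePtr createPattern(const char* pattFile, const char* modelFile)
    {
        int pattId = arLoadPatt(pattFile);    // e.g. "Data/patt.hiro"
        if (pattId < 0)
            return OSG::NullFC;               // pattern file not found

        OSG::TransformPtr trans;
        OSG::NodePtr node = createModel(modelFile, trans);
        patternMap[pattId] = trans;           // register the pattern in the map
        return node;
    }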

Combining ARToolKit and OpenSG
• createBackground()
  • This method creates the image background of OpenSG based on the data gathered from the ARToolKit video stream
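
A minimal OpenSG 1.x sketch: an ImageBackground fed from the image that captureFrame() updates; the frame size and pixel format are assumptions:

    #include <OpenSG/OSGImage.h>
    #include <OpenSG/OSGImageBackground.h>

    extern OSG::ImagePtr bgImage;   // updated each frame in captureFrame()

    OSG::ImageBackgroundPtr createBackground()
    {
        bgImage = OSG::Image::create();
        OSG::beginEditCP(bgImage);
        bgImage->set(OSG::Image::OSG_RGB_PF, 640, 480);  // empty frame for now
        OSG::endEditCP(bgImage);

        OSG::ImageBackgroundPtr bg = OSG::ImageBackground::create();
        OSG::beginEditCP(bg);
        bg->setImage(bgImage);      // background shows the video image
        OSG::endEditCP(bg);
        return bg;
    }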

Combining ARToolKit and OpenSG
• createModel()
  • This is a helper method which simply loads a model and attaches it to a node with a ComponentTransform core
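
A minimal sketch of such a createModel(). Note a deliberate simplification: the slides attach the model below a ComponentTransform core, while this sketch uses a plain Transform core so that the marker pose can later be set as a whole matrix:

    #include <OpenSG/OSGTransform.h>
    #include <OpenSG/OSGSceneFileHandler.h>

    OSG::NodePtr createModel(const char* file, OSG::TransformPtr& trans)
    {
        OSG::NodePtr model = OSG::SceneFileHandler::the().read(file);

        trans = OSG::Transform::create();
        OSG::NodePtr transNode = OSG::Node::create();
        OSG::beginEditCP(transNode);
        transNode->setCore(trans);
        transNode->addChild(model);   // the model sits below the transform
        OSG::endEditCP(transNode);
        return transNode;
    }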
• Now we should know all the necessary helper functions and will go on with the main function and the display function

Tracking Objects
• Our main function in this example is very simple, since most of the processing is performed in the display loop
  • The initialisation of OpenSG and ARToolKit is performed
  • The video loop and the display loop are triggered
• We now insert our first snippet into main in order to display an object on a marker (a sketch of the result follows below)
• Compile and execute now
Snippet 1-1
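
A minimal sketch of how main() could look after Snippet 1-1, built from the functions sketched above; initGlut() returning the GLUT window id and the file names are assumptions:

    #include <OpenSG/OSGBaseFunctions.h>
    #include <OpenSG/OSGNode.h>
    #include <AR/video.h>
    #include <GL/glut.h>

    int  initGlut(int argc, char** argv);  // creates window, registers callbacks
    void initARToolkit();
    void initOpenSG(int glutWinId);
    OSG::NodePtr createPattern(const char* pattFile, const char* modelFile);

    int main(int argc, char** argv)
    {
        OSG::osgInit(argc, argv);
        int winId = initGlut(argc, argv);
        initARToolkit();
        initOpenSG(winId);

        // Snippet 1-1: display an object on a marker (names assumed)
        createPattern("Data/patt.hiro", "frog.wrl");

        arVideoCapStart();   // start the video loop
        glutMainLoop();      // display loop; never returns
        return 0;
    }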

Tracking Objects
• If we now take a closer look at our display loop, we can see our three ARToolKit steps at its beginning (a sketch follows below)
• We should now enhance the scene by adding two objects and removing the first one
• If you compile and execute your code now, you should be able to see two objects
• Now we want to perform some interaction between the objects
Snippet 2-1
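
A sketch of the display callback with the three per-frame ARToolKit steps, assuming the functions and the SimpleSceneManager from the previous sketches:

    void display()
    {
        if (captureFrame())                            // step 2: grab frame
        {
            ARMarkerInfo* markers   = NULL;
            int           markerNum = 0;
            if (detectMarkers(markers, markerNum))     // step 3: detect
                for (int i = 0; i < markerNum; ++i)
                    applyMarkerTrans(&markers[i]);     // step 4: transform
            arVideoCapNext();                          // release the frame
        }
        mgr->redraw();                                 // step 5: OpenSG renders
    }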

Tracking Objects
• With the next snippet we check the proximity of the two objects and change their scale
• First we retrieve the transformations of the objects and calculate the distance (a combined sketch of both snippet parts follows below)
Snippet 2-2 – First Part

Interaction with Objects
• We calculate the scale based on the distance and apply it to the transformation matrices of our two objects
• Afterwards the changes have to be committed
Snippet 2-2 – Second Part
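
A combined sketch of the two parts of Snippet 2-2; the distance-to-scale mapping, the translation indexing and the variable names are assumptions:

    #include <OpenSG/OSGTransform.h>
    #include <algorithm>

    extern OSG::TransformPtr transA, transB;  // cores of the two objects

    void scaleByProximity()
    {
        // first part: retrieve the translations and calculate the distance
        const OSG::Matrix& mA = transA->getMatrix();
        const OSG::Matrix& mB = transB->getMatrix();
        OSG::Vec3f posA(mA[3][0], mA[3][1], mA[3][2]);
        OSG::Vec3f posB(mB[3][0], mB[3][1], mB[3][2]);
        float dist = (posA - posB).length();

        // second part: the closer the markers, the larger the objects
        float scale = std::max(1.0f, 400.0f / std::max(dist, 1.0f));
        OSG::Matrix scaleM;
        scaleM.setIdentity();
        scaleM[0][0] = scaleM[1][1] = scaleM[2][2] = scale;

        OSG::Matrix m = mA;
        m.mult(scaleM);               // scale in the marker's local frame
        OSG::beginEditCP(transA);
        transA->setMatrix(m);
        OSG::endEditCP(transA);
        // ... same for transB; afterwards the changes are committed
    }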

Interaction with Objects
• If you execute your code now, you should see something like this (most likely with a different user)

Interaction with Objects
• In the next step things become more complicated: an object should move from one marker to another
• Actually we want to have a frog jump from a stone on one marker to a stone on a different marker
• First we deactivate the scaling part of the object by inserting the following snippet
Snippet 3-1

Interaction with Objects
• We initialise a frog model and other nodes for our animation
Snippet 3-2

Interaction with Objects
• Now we trigger the initialisation in the main method
• For the animation of the motion from one stone to another we install a timer
• Inside the display loop, delta values are calculated in order to determine the time since the last frame
• An overall counter is incremented (a sketch of this timing follows below)
Snippet 3-3
Snippet 3-4
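
A minimal sketch of the per-frame timing, using GLUT’s elapsed-time query; the variable names are assumptions:

    #include <GL/glut.h>

    static int lastTime  = 0;  // ms timestamp of the previous frame
    static int deltaTime = 0;  // ms since the last frame
    static int counter   = 0;  // ms accumulated in the current state

    void updateTiming()
    {
        int now   = glutGet(GLUT_ELAPSED_TIME);
        deltaTime = now - lastTime;   // time since the last frame
        lastTime  = now;
        counter  += deltaTime;        // overall counter
    }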

Interaction with Objects
• For animating a jumping frog from one stone to another, we implement a state machine with 5 states (a skeleton follows below)
  • SITTING – the frog sits for a given time before it can try to jump; afterwards it automatically switches to the WAITING state
  • WAITING – two markers have to be close enough to make the frog jump; if the proximity check returns positively, it will jump
  • JUMPING – in this state a change in the scene graph is performed: the frog node is attached to a transformation node
  • FLYING – in this state the actual trajectory of the frog is calculated and applied to the transformation node; once the target is reached, the state changes to LANDING
  • LANDING – as in the JUMPING state, a change in the scene graph hierarchy is performed; the state machine switches back to SITTING
• The whole state machine is implemented inside a single snippet, which we will now go through step by step
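
A skeleton of such a state machine; the helper functions are hypothetical stand-ins for the snippet parts discussed on the following slides, and counter/deltaTime come from the timing sketch above:

    enum FrogState { SITTING, WAITING, JUMPING, FLYING, LANDING };
    static FrogState state = SITTING;

    bool markersCloseEnough();           // proximity check (hypothetical)
    void attachFrogToFlightNode();       // scene graph change while jumping
    bool moveFrogTowardsTarget(int dt);  // trajectory step, true when landed
    void attachFrogToMarker();           // scene graph change while landing

    void updateFrog(int deltaTime)
    {
        switch (state)
        {
        case SITTING:                    // sit for a fixed time
            if (counter > 2000) { state = WAITING; counter = 0; }
            break;
        case WAITING:                    // jump once the markers are close
            if (markersCloseEnough()) state = JUMPING;
            break;
        case JUMPING:                    // re-attach frog below flight node
            attachFrogToFlightNode();
            state = FLYING; counter = 0;
            break;
        case FLYING:                     // move along the trajectory
            if (moveFrogTowardsTarget(deltaTime)) state = LANDING;
            break;
        case LANDING:                    // re-attach frog to the new marker
            attachFrogToMarker();
            state = SITTING; counter = 0;
            break;
        }
    }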

Interaction with Objects
• State machine – SITTING – WAITING
  • We check the time passed in the SITTING state and switch to the WAITING state after 2000 ms; the counter is reset afterwards
  • In the WAITING state we check the marker proximity and switch to the JUMPING state if successful
Snippet 3-5 Part 1

Interaction with Objects
• State machine – JUMPING
  • We detach the frog from the stone and attach it to a transformation core; the transformation matrix of the core is initialised with the marker’s transformation matrix
  • The state changes to FLYING and the counter is reset
Snippet 3-5 Part 2

Interaction with Objects
• State machine – FLYING
  • We retrieve the target position and the current position and determine the path of the frog
  • In case the flying time becomes too high, we switch to the LANDING state
  • If the distance between the frog and the target becomes too big, it goes back
Snippet 3-5 Part 3

Interaction with Objects
• State machine – FLYING continued
  • If everything went all right, we determine the direction vector by normalising the path
  • We scale this normalised vector to move the frog at a constant speed based on the current deltaTime
  • Now we calculate a transformation matrix based on this vector
  • Finally, the generated transformation is applied to the frog transformation (a sketch follows below)
Snippet 3-5 Part 4
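
A minimal sketch of this FLYING step; the speed, the landing threshold and the globals are assumptions:

    #include <OpenSG/OSGTransform.h>
    #include <OpenSG/OSGVector.h>

    extern OSG::TransformPtr flightTrans;  // transformation node of the frog
    extern OSG::Pnt3f        frogPos;      // current position
    extern OSG::Pnt3f        targetPos;    // position of the target stone

    bool moveFrogTowardsTarget(int deltaTime)
    {
        OSG::Vec3f path = targetPos - frogPos;  // remaining path
        if (path.length() < 5.0f)               // close enough: land
            return true;

        OSG::Vec3f dir = path;
        dir.normalize();                        // direction vector
        frogPos += dir * (0.3f * deltaTime);    // constant speed (assumption)

        OSG::Matrix m;
        m.setIdentity();
        m.setTranslate(frogPos[0], frogPos[1], frogPos[2]);
        OSG::beginEditCP(flightTrans);
        flightTrans->setMatrix(m);              // apply on the frog transform
        OSG::endEditCP(flightTrans);
        return false;
    }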

Interaction with Objects
• State machine – LANDING
  • The frog is detached from the transformation node and attached to the new marker
  • The counter is reset and the state is set back to the initial SITTING state
• At the end of each iteration of the display loop the changes are committed
Snippet 3-5 Part 5

Interaction with Objects
• This is how it should look
• The final step is fully up to you to implement
  • Have a frog jumping between three markers
  • You will have to load an additional marker
  • You will have to determine the distances between the three different markers and find the closest one
  • Let the frog hop from one marker to the closest marker

Combining ARToolKit with inVRs
• inVRs provides a simple binding to ARToolKit which implements interaction with markers and the scene
• The next lab will show some basic desktop interaction with ARToolKit
• To enable full AR support for the inVRs framework, the input interface as well as the output interface can be enhanced
• Things to do at home
  • Try to make two frogs switch stones when they look at each other
  • Take a look at multi-pattern calibration
  • Try to interconnect inVRs with ARToolKit and write an ARToolKit marker input device
  • Take a look at ARToolKit Plus – what are the advantages?
  • Can you manage to interconnect ARToolKit Plus with OpenSG or inVRs?

Useful Links
• ARToolKit
  • http://artoolkit.sourceforge.net/ – SourceForge entry page
  • http://www.hitl.washington.edu/artoolkit/ – GPL version
  • http://www.artoolworks.com/ – commercial version
• ARToolKit Plus web page
  • http://studierstube.icg.tu-graz.ac.at/handheld_ar/artoolkitplus.php
• OpenSG web page
  • http://www.opensg.org/
• inVRs web page
  • http://www.invrs.org/

Thank You!