Vipul Divyanshu: Documentation on Kinect and Motion Tracking


Skeletal Tracking, Facial Tracking and Animation

Project Documentation

Vipul Divyanshu
IIL/2012/14
Summer Internship
Mentor: Imagineer India Innovation Labs

Tasks at hand:

*Skeletal Tracking

*Facial Tracking

*Skeletal animation and integration with the model (additional)

Tools Explored:

*OpenNI

*Microsoft Kinect SDK

*Ogre 3D

*Unity

*Blender

Analysis of the tools and what was explored:

OpenNI:- OpenNI (Open Natural Interaction) is a multi-language, cross-platform framework that defines APIs for writing applications utilizing Natural Interaction. OpenNI APIs are composed of a set of interfaces for writing NI applications. The main purpose of OpenNI is to form a standard API that enables communication with both:

Vision and audio sensors (the devices that ‘see’ and ‘hear’ the figures and their surroundings).

Vision and audio perception middleware (the software components that analyse the audio and visual data that is recorded from the scene, and comprehend it). For example, software that receives visual data, such as an image, returns the location of the palm of a hand detected within the image.

OpenNI supplies a set of APIs to be implemented by the sensor devices, and a set of APIs to be implemented by the middleware components.

Page 2: Vipul divyanshu documentation  on Kinect and Motion Tracking

A major plus point of OpenNI is that we can run the skeletal tracking on recorded Kinect video in the .oni format.

Constraints of OpenNI:-

What I found was that, for an easier start to the project, the Microsoft Kinect SDK would be more useful for full-body skeletal tracking and its animation in C#. This is because a model cannot easily be loaded into OpenNI directly from Maya or Blender; instead, SDKs like Ogre 3D must be used.

Microsoft Kinect SDK:- The Microsoft Kinect SDK comes with classes for skeletal tracking, and it is much easier to initialise the Kinect and get the skeletal joint points, just by reading the skeleton's Joints collection in C#. For developing in C++, the Microsoft Kinect SDK and OpenNI stand at the same level.
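For illustration, here is a minimal sketch of that flow in C# (Kinect for Windows SDK 1.x, Microsoft.Kinect assembly; it assumes a Kinect is plugged in and just prints the head joint as a placeholder for real processing):

using System;
using Microsoft.Kinect;

class SkeletonDemo
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.KinectSensors[0]; // assumes a connected Kinect
        sensor.SkeletonStream.Enable();                      // turn on skeletal tracking
        sensor.SkeletonFrameReady += OnSkeletonFrame;
        sensor.Start();
        Console.ReadLine();                                  // run until Enter is pressed
        sensor.Stop();
    }

    static void OnSkeletonFrame(object s, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);
            foreach (Skeleton sk in skeletons)
            {
                if (sk.TrackingState != SkeletonTrackingState.Tracked) continue;
                Joint head = sk.Joints[JointType.Head]; // any joint can be read this way
                Console.WriteLine("Head at ({0:F2}, {1:F2}, {2:F2})",
                    head.Position.X, head.Position.Y, head.Position.Z);
            }
        }
    }
}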

I developed in C# and found XNA and the Microsoft Kinect SDK very useful, both for getting the skeletal data and for interacting with a 3D model in .fbx format.

The exploration of the Kinect SDK in the facial animation field is still left, as it wasn't a part of my work package.

FBX:- The .fbx format used with C# is very flexible and has no problem with object collision.

Constraints:-

Working directly with data recorded from Kinect Studio (.xed) hasn't been well explored, due to a lack of material online, and that can be restricting.

For facial tracking, it cannot be detected whether the eyes are open or not; the rest of the features are detected properly.

Ogre 3D:- Ogre can be used for animation, in sync with OpenNI or the Kinect SDK. I did not explore it much, but it has a vast amount of potential for facial animation.

Constraints:- It requires a fairly detailed study of the model and the class hierarchy to perform the animation; it is easier for an experienced person.

Things Learnt and coded:

OpenNI: I learnt the basics of extracting the skeletal joints, how to work with the recorded data, the class hierarchy of OpenNI and its attributes, and understood the data flow. I wrote and analysed code for getting the skeletal data of the user.

Microsoft Kinect SDK: I learnt the basics of extracting the skeletal joints, how to work with the recorded data, the class hierarchy of the Microsoft Kinect SDK and its attributes, and understood the data flow. I then learnt C#. Using the Kinect SDK I wrote a few code blocks to extend the samples, and three complete programs for skeletal tracking and user interface. I wrote the code for controlling the cursor and for colouring the detected player in a different colour; their screenshots can be found below. I understood how to load and work with a 3D model in C# and control it with user motion, and locked the model at the knee so that it doesn't slip and has a more natural motion flow. I also learnt the sample for facial tracking and understood the data structure for getting the facial points.

Ogre: I learnt the data flow of Ogre and explored just the tip of the iceberg, as my part was just the tracking.

Initial Software Development Kits (SDKs) suggested: OpenNI, Ogre and Maya

Installation of OpenNI:

In Ubuntu:-

First, the following requirements need to be installed before installing OpenNI.

Requirements:

1) GCC 4.x ( http://gcc.gnu.org/releases.html )

sudo apt-get install g++

2) Python 2.6+/3.x ( http://www.python.org/download/ ) This may already be installed, depending on the Linux distro being used.

sudo apt-get install python

3) LibUSB 1.0.8 ( http://sourceforge.net/projects/libusb/ )

sudo apt-get install libusb-1.0-0-dev

4) FreeGLUT3 ( http://freeglut.sourceforge.net/index.php#download )

sudo apt-get install freeglut3-dev

5) JDK 6.0 ( http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u26-download-400750.html )

sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"

sudo apt-get update

sudo apt-get install sun-java6-jdk

Optional Requirements (To build the documentation):

1) Doxygen ( http://www.stack.nl/~dimitri/doxygen/download.html#latestsrc )


sudo apt-get install doxygen

2) GraphViz ( http://www.graphviz.org/Download_linux_ubuntu.php )

sudo apt-get install graphviz

Optional Requirements (To build the Mono wrapper):

1) Mono ( http://www.go-mono.com/mono-downloads/download.html )

sudo apt-get install mono-complete

Download OpenNI Modules:

Download the OpenNI modules appropriate for your operating system from http://75.98.78.94/Downloads/OpenNIModules.aspx

Download the latest unstable binaries for these:

OpenNI binaries

OpenNI compliant middleware binaries

OpenNI compliant hardware binaries

For Ubuntu Linux 12.04 64-bit, the files will be (as of 19 June 2012):

openni-bin-dev-linux-x64-v1.5.4.0.tar

nite-bin-linux-x64-v1.5.2.21.tar.bz2

sensor-bin-linux-x64-v5.1.2.1.tar.bz2

Make a new folder called kinect:

mkdir kinect

cd kinect

Extract the downloaded files into it. (3 folders are now created).
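For example, assuming the archive names above, the extraction commands would be:

tar -xf openni-bin-dev-linux-x64-v1.5.4.0.tar
tar -xjf nite-bin-linux-x64-v1.5.2.21.tar.bz2
tar -xjf sensor-bin-linux-x64-v5.1.2.1.tar.bz2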

1. OpenNI-Bin-Linux64-v1.5.4.0

2. Sensor-Bin-Linux64-v5.1.2.1

3. Nite-1.5.2.1

Rename the folders as OpenNI, Sensor and Nite respectively.

Install OpenNI and Sensor Kinect: (run sudo ./install.sh )

cd kinect

cd OpenNI

sudo ./install.sh

(Every step should now show OK! )

cd ../Sensor

sudo ./install.sh

(Every step should now show OK! )


Install NITE:

cd ../Nite

Enter the kinect/Nite/Data folder and edit each of the three XML files in there, changing the key data from

key=""

to

key="0KOIk2JeIBYClPWVnMoRKn5cdY4="

(use Gedit)
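For reference, the element being edited in each sample XML file looks roughly like this (quoted from memory of the NITE data files, so treat the exact layout as an assumption):

<License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>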

sudo ./install.sh

(Every step should now show OK! )

Test if install succeeded:

Test out some samples from OpenNI. Run NiViewer:

cd ~/kinect/OpenNI/Samples/Bin/x64-Release/

./NiViewer

(If a Kinect is connected, this will show the depth map and image stream in a window)

Download Sample streams from OpenNI.

If a Kinect is not connected, you can run NiViewer on some pre-recorded .oni files from OpenNI.

ONI Files (OpenNI sample streams recorded onto a file)

http://75.98.78.94/Downloads/OpenNIModules.aspx (extract as skeletonrec.oni and MultipleHands.oni)

Now run NiViewer from the ~/kinect/OpenNI/Samples/Bin/x64-Release/ folder with the .oni file as argument.


./NiViewer ~/skeletonrec.oni

(This will show a window with the sample)

(skeletonrec.oni from NiViewer)

Sample Program: (Sample-NiUserTracker)

Run it similarly to NiViewer:

./Sample-NiUserTracker ~/skeletonrec.oni


(skeletonrec.oni from Sample-NiUserTracker)

*********************************************************************************

Installation in Windows:

OpenNI and NITE installation can be painful if not done properly. Let's do it step by step:

Step 0

Uninstall any previously installed drivers, such as CLNUI.

Step 1

Download the Kinect drivers and unzip them. Open the unzipped folder and navigate to the Bin folder. Run the .msi Windows installer.

The drivers are now installed on your PC.

Step 2

Download and install the latest stable or unstable OpenNI Binaries from OpenNI website.

Step 3

Download and install the latest stable or unstable OpenNI Compliant Middleware Binaries (NITE) from OpenNI website.

During installation, provide the following (free) PrimeSense key: 0KOIk2JeIBYClPWVnMoRKn5cdY4=

Step 4

Download and install the latest stable or unstable OpenNI Compliant Hardware Binaries from OpenNI website.

Both stable and unstable releases have worked for me. If you have trouble installing the unstable releases, just try the stable ones.

Step 5

Plug in your Kinect device and connect it to your PC's USB port. Wait until the driver software is found and applied. Navigate to the Device Manager (Control Panel). You should see something like the following:

Step 6 

Navigate to C:\Program Files\OpenNI\Samples\Bin\Release (or C:\Program Files (x86)\OpenNI\Samples\Bin\Release) and try out the existing demo applications. Try the demos found in C:\Program Files\Prime Sense\NITE\Samples\Bin\Release (or C:\Program Files (x86)\Prime Sense\NITE\Samples\Bin\Release), too. If they work properly, then you are done! Congratulations!

Step 7 

You have successfully installed Kinect on your Windows PC! Read the documentation and familiarize yourself with the OpenNI and NITE APIs. You'll find the proper assemblies in:

C:\Program Files\OpenNI\Bin (or C:\Program Files (x86)\OpenNI\Bin) and C:\Program Files\Prime Sense\NITE\Bin (or C:\Program Files (x86)\Prime Sense\NITE\Bin)

OpenNI is the primary assembly you'll need when developing Natural User Interface applications.

Installation of OGRE:

For installation of Ogre on Linux and Windows, the following link can be very useful:

http://www.ogre3d.org/tikiwiki/Installing+the+Ogre+SDK

With these suggested tools, the Kinect can be exploited very well for skeletal tracking and its animation. From what I understood, these are limited to C++.

For starters, the Microsoft Kinect SDK is also very useful and a very well developed tool.

For using the Kinect SDK just follow the link:

http://www.microsoft.com/en-us/kinectforwindows/develop/developer-downloads.aspx

In the toolkit I went through the different examples and understood them and the code flow.

I decided to do the project in C# rather than in C++, because in C# it was easily possible to import a Maya or Blender model in .fbx format, and the model can then be easily transformed in the sense of orientation and bone movement.

Having decided that, I used Visual Studio and C# to track the skeleton.

The model was imported and animated with the help of the Microsoft XNA Game Studio SDK, which basically provides an easy way to import the model.

This was done by using the SDK's Microsoft.Kinect library.
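As an illustration of the XNA side, here is a minimal sketch of loading and drawing an .fbx model through the content pipeline ("dude" is a hypothetical asset name, and the Kinect-driven bone animation is omitted):

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class ModelGame : Game
{
    GraphicsDeviceManager graphics;
    Model model;

    public ModelGame()
    {
        graphics = new GraphicsDeviceManager(this);
        Content.RootDirectory = "Content"; // the .fbx is compiled by the content pipeline
    }

    protected override void LoadContent()
    {
        model = Content.Load<Model>("dude"); // hypothetical asset name
    }

    protected override void Draw(GameTime gameTime)
    {
        GraphicsDevice.Clear(Color.CornflowerBlue);
        // Simple fixed camera; in the real program the bones follow the Kinect skeleton.
        Matrix world = Matrix.Identity;
        Matrix view = Matrix.CreateLookAt(new Vector3(0, 2, 5), Vector3.Zero, Vector3.Up);
        Matrix projection = Matrix.CreatePerspectiveFieldOfView(
            MathHelper.PiOver4, GraphicsDevice.Viewport.AspectRatio, 0.1f, 100f);
        model.Draw(world, view, projection);
        base.Draw(gameTime);
    }
}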

Other tools and SDKs installed and used:


Blender: Link http://www.blender.org/download/get-blender/

FBX Converter: Link http://usa.autodesk.com/adsk/servlet/pc/item?id=10775855&siteID=123112

Unity: http://unity3d.com/unity/download/

The first skeletal tracking code I wrote, after going through the documentation and understanding the class hierarchy, was for simple head and hand tracking, in C#.

Here is a snapshot of it in action.

Next, using C#, I built on it the code for tracking a full player and colouring them in a specific colour when detected. Note the different colours for the user and the background.
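A minimal sketch of the underlying idea (Kinect SDK 1.x; the skeleton stream must be enabled so that each depth pixel carries a player index, and the colours here are only illustrative):

using Microsoft.Kinect;

static class PlayerColouring
{
    // Tints detected users green and leaves the background dark grey.
    // Call from a DepthFrameReady handler; 'bgra' backs the displayed bitmap.
    public static void ColourPlayers(DepthImageFrame frame, byte[] bgra)
    {
        short[] depth = new short[frame.PixelDataLength];
        frame.CopyPixelDataTo(depth);
        for (int i = 0; i < depth.Length; i++)
        {
            // The low bits of each depth pixel hold the player index (0 = background).
            int player = depth[i] & DepthImageFrame.PlayerIndexBitmask;
            bgra[4 * i + 0] = (byte)(player == 0 ? 64 : 0);   // blue
            bgra[4 * i + 1] = (byte)(player == 0 ? 64 : 255); // green
            bgra[4 * i + 2] = (byte)(player == 0 ? 64 : 0);   // red
            bgra[4 * i + 3] = 255;                            // alpha
        }
    }
}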


Using the Avateering sample in the Kinect SDK as a base, I built on it, in C#, the code to track the model and animate it with the motion of the human in front of the Kinect. Here are a few snapshots of it.
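For reference, a minimal sketch of reading the bone-orientation data that such avateering builds on (Kinect SDK 1.5+; mapping these rotations onto the bones of the .fbx model is the involved part and is omitted here):

using Microsoft.Kinect;

static class BoneReader
{
    // Returns the absolute rotation of the left elbow bone of a tracked skeleton.
    // Avateering-style code converts such matrices into XNA bone transforms.
    public static Matrix4 LeftElbowRotation(Skeleton skeleton)
    {
        BoneOrientation bone = skeleton.BoneOrientations[JointType.ElbowLeft];
        return bone.AbsoluteRotation.Matrix; // HierarchicalRotation is also available
    }
}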

Snapshot of the default model provided in the toolkit.


But notice that it has been locked at the knee for a better and more natural flow.

This snapshot shows the model's natural, life-like walking with respect to the user.


This depicts the neck and hand movement of the model following the user's movement.

This picture shows the animation of a different character (a zombie).


The picture below is of an improved version of my previous simple head and hand tracker. In this improved code I can control the cursor by my hand movement, where I have used the Euclidean distance between the head joint and the right-hand joint to move the cursor left or right.
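A sketch of that distance test (the 0.45 m threshold and the MoveCursorLeft/MoveCursorRight helpers are hypothetical placeholders for the actual values and cursor calls):

using System;
using Microsoft.Kinect;

static class CursorControl
{
    static double Distance(SkeletonPoint a, SkeletonPoint b)
    {
        double dx = a.X - b.X, dy = a.Y - b.Y, dz = a.Z - b.Z;
        return Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }

    public static void Update(Skeleton sk)
    {
        SkeletonPoint head = sk.Joints[JointType.Head].Position;
        SkeletonPoint hand = sk.Joints[JointType.HandRight].Position;
        // Only react when the hand is far enough from the head to be deliberate.
        if (Distance(head, hand) > 0.45)            // metres; assumed threshold
        {
            if (hand.X > head.X) MoveCursorRight(); // hand to the right of the head
            else MoveCursorLeft();
        }
    }

    // Hypothetical stand-ins for the real cursor calls (e.g. Win32 SetCursorPos).
    static void MoveCursorRight() { }
    static void MoveCursorLeft() { }
}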

For facial tracking, the Kinect SDK has been used. The samples were well studied and understood, and are quite simple to begin with. Here is the facial tracking snapshot.
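A sketch based on the toolkit's face-tracking sample (Microsoft.Kinect.Toolkit.FaceTracking; the colour/depth buffers and tracked skeleton are assumed to come from the usual AllFramesReady handler, and the exact overloads should be checked against the installed toolkit version):

using Microsoft.Kinect;
using Microsoft.Kinect.Toolkit.FaceTracking;

static class FaceDemo
{
    // Runs one face-tracking pass over the latest colour and depth frames.
    // (In a real program the FaceTracker would be created once and reused.)
    public static void TrackFace(KinectSensor sensor, byte[] colorPixels,
                                 short[] depthPixels, Skeleton skeleton)
    {
        using (FaceTracker tracker = new FaceTracker(sensor))
        {
            FaceTrackFrame face = tracker.Track(
                sensor.ColorStream.Format, colorPixels,
                sensor.DepthStream.Format, depthPixels,
                skeleton);
            if (face.TrackSuccessful)
            {
                var shape = face.GetProjected3DShape(); // 2D points for eyes, nose, mouth, ...
            }
        }
    }
}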

For facial animation, the Ogre3D SDK has been studied, and the plan is to use the facial points returned by the tracking function to animate the corresponding character in the model.


But I have still been unable to match the face points with the vector position values of the actual face. This is something left to work on, and can be done with a bit more work and some time in hand.

Vipul Divyanshu

IIL/2012/14