
User Guide for Dual Depth Sensor Configuration (printable version)

From iPiSoft Wiki. Categories: iPi Recorder, iPi Mocap Studio

    For the impatient:

- Get two Microsoft Kinect sensors, or ASUS Xtion Live sensors, or PrimeSense Carmine 1.08 sensors. To choose, see Depth Sensors Comparison.
- Get a computer with a DirectX 10 video card and at least two USB 2.0 or 3.0 controllers.
- Download iPi Recorder and iPi Mocap Studio (http://www.ipisoft.com/downloads_index.php).
- Install iPi Recorder and iPi Mocap Studio.
- Find a suitable space (9 by 5 feet minimum).
- The actor should be dressed in casual slim clothing; avoid shiny fabrics.
- Connect the depth sensors to the PC. Note that both sensors can't be connected through one and the same USB 2.0/3.0 controller.
- Point them at the actor (the angle between the two depth sensors can be in the range of 60-90 degrees, or near 180 degrees).
- For calibration, take a flat rectangular piece of cardboard and use iPi Recorder to record a calibration video. Please ensure that the calibration board is visible to both sensors. Turn the calibration board to the left and to the right, holding it vertically in front of yourself on outstretched arms.
- Record a video of the actor's performance. Please ensure that the whole body, including arms and legs, is visible to both sensors during the performance. Start from a T-pose; then the actor performs.
- Run iPi Mocap Studio to process the actor performance video.
- Import your character into iPi Mocap Studio (File->Import Target Character) to adapt the animation to your character rig.
- Export your animation in the desired format.

    Contents

1 System Requirements
  1.1 iPi Recorder
  1.2 iPi Mocap Studio
2 Software Installation
  2.1 iPi Recorder
    2.1.1 Components
  2.2 iPi Mocap Studio
3 Recording Video from Two Depth Sensors
  3.1 Environment
  3.2 Actor Clothing
  3.3 Recording Process
4 Calibration
  4.1 The first configuration
    4.1.1 Preparations
    4.1.2 Recording Calibration Video
    4.1.3 Processing Calibration Video
  4.2 The second configuration
    4.2.1 Preparations
    4.2.2 Recording Calibration Video
    4.2.3 Processing Calibration Video
5 Recording Actor's Performance
6 Performance Tips
  6.1 Recommended layout of an action video
  6.2 T-pose
  6.3 Takes
  6.4 Iterations


  6.5 Ian Chisholm's hints on motion capture
    6.5.1 Three handy hints for acting out mocap
    6.5.2 Takes
    6.5.3 Naming conventions
7 Processing Video from Two Depth Sensors
8 Clean-up
  8.1 Cleaning up tracking gaps
  8.2 Cleaning up individual frames
  8.3 Tracking errors that cannot be cleaned up using iPi Studio
  8.4 Tracking refinement
  8.5 Post-processing: Jitter Removal
  8.6 Post-processing: Trajectory Filtering
9 Export and Motion Transfer
  9.1 Default iPi Character Rig
  9.2 Motion Transfer and Custom Rigs
  9.3 MotionBuilder
  9.4 3D MAX Biped
  9.5 Maya
  9.6 FBX
  9.7 COLLADA
  9.8 LightWave
  9.9 SoftImage|XSI
  9.10 Poser
  9.11 DAZ 3D
  9.12 iClone
  9.13 Valve Source Engine SMD
  9.14 Valve Source Filmmaker
    9.14.1 DMX
    9.14.2 Old way involving Maya
  9.15 Blender
  9.16 Cinema4D
  9.17 Evolver
  9.18 Second Life
  9.19 Massive
  9.20 IKinema WebAnimate
  9.21 Jimmy|Rig Pro
10 Troubleshooting
  10.1 Installation problems
  10.2 Two Kinects don't work together
  10.3 How to report bugs and issues
  10.4 How to send a video to iPiSoft tech support
11 Video Materials
12 USB controllers
  12.1 USB hubs
  12.2 Potential issues with USB controllers
  12.3 Known compatibility issues

    System Requirements

    iPi Recorder

Computer (desktop or laptop):
- CPU: x86-compatible (Intel Pentium 4 or higher, AMD Athlon or higher); dual- or quad-core is preferable
- Operating system: Windows 8, 7, XP SP3, Vista (x86 or x64)
- USB: at least two USB 2.0 or USB 3.0 controllers. For more info see USB controllers.
- ExpressCard slot (for laptops). Optional, but highly recommended. It allows you to install an external USB controller in case of compatibility issues between cameras and built-in USB controllers, or if all USB ports are in fact connected to a single USB controller.


- Storage system: HDD, SSD, or RAID with a write speed of at least 55 MByte/sec (a quick way to check this is sketched below)

Two Microsoft Kinect sensors, or ASUS Xtion Live sensors, or PrimeSense Carmine 1.08 sensors. To choose, see Depth Sensors Comparison.

Optional: active USB 2.0 extension cables (http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=active+USB+2.0+extension+cable&x=14&y=22)

Optional: Kinect Adjustable Tripod Stands (http://www.amazon.com/gp/product/B004TJSLEK/ref=as_li_tf_tl?ie=UTF8&tag=wwwipisoftcom-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B004TJSLEK)

Minimum required space: 3m by 3m (10 by 10 feet)
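The 55 MByte/sec figure is easy to sanity-check before a shoot. Below is a minimal, hypothetical Python sketch (not part of iPi Recorder; the test file name and size are example values) that writes a test file and reports the sustained write speed. Run it from a folder on the drive you plan to record to.

```python
# Hypothetical helper: rough sustained-write-speed check for the recording drive.
# Not an iPiSoft tool; file name and test size are example values.
import os
import time

def write_speed_mb_per_s(path="ipi_speed_test.bin", size_mb=512, chunk_mb=8):
    chunk = os.urandom(chunk_mb * 1024 * 1024)      # incompressible test data
    start = time.perf_counter()
    with open(path, "wb", buffering=0) as f:
        for _ in range(size_mb // chunk_mb):
            f.write(chunk)
        os.fsync(f.fileno())                        # make sure data reaches the disk
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

if __name__ == "__main__":
    speed = write_speed_mb_per_s()
    verdict = "OK" if speed >= 55 else "below the 55 MByte/sec requirement"
    print(f"Sustained write speed: {speed:.1f} MB/s ({verdict})")
```

A result comfortably above 55 MB/s leaves headroom for background disk activity during recording.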

    iPi Mocap Studio

Computer (desktop or laptop):
- CPU: x86-compatible (Intel Pentium 4 or higher, AMD Athlon or higher); dual- or quad-core is preferable
- Operating system: Windows 8, 7, XP SP3, Vista (x86 or x64)
- Video card: Direct3D 10-capable (Shader Model 4.0) gaming-class graphics card. Intel integrated graphics is not supported. For more info see Cameras and accessories#Video_Card.

Note that before you start working with two depth sensors, it is highly recommended to first get good results with the single depth sensor solution: User Guide for Single Depth Sensor Configuration.

    Software Installation

    iPi Recorder

    Before installation:

Unplug all cameras from the computer.

Download (http://files.ipisoft.com/iPiRecorderSetup.exe) and run the setup package of the latest version of iPi Recorder. You will be presented with the following dialog.


1. Select the needed components.
2. Read and accept the license agreement by checking the appropriate checkbox.
3. Press the Install button to begin installation.

Note. Most of the components require administrative privileges because they install device drivers or write to Program Files and other system folders. On Windows Vista/7 you will be presented with UAC prompts when appropriate during installation. If you plan to use iPi Recorder under a user account that has no administrative rights, you can pre-install the other components separately using an administrator's account.

Important!

1. You can plug only one MS Kinect / ASUS Xtion / PrimeSense Carmine sensor into one USB controller. The bandwidth of a single USB controller is not enough to record from 2 sensors.
2. You can plug no more than 2 Sony PS Eye cameras into one USB controller; otherwise you will not be able to capture at 60 fps with 640 x 480 resolution.

For more info see USB controllers.

Once installation is complete, iPi Recorder will launch automatically. Continue with the user's guide to learn how to use the software.

    Components

If a component is already installed, it has no checkbox and is marked with an ALREADY INSTALLED label. You do not need to install all optional components in advance. All of them can be installed separately at a later time. The component descriptions below contain the corresponding download links.

Microsoft .NET Framework 4 - Client. This is a required component and cannot be unchecked. It is the basic infrastructure for running .NET programs; iPi Recorder is a .NET program.

Web installer: http://www.microsoft.com/en-us/download/details.aspx?id=17113
Standalone installer: http://www.microsoft.com/en-us/download/details.aspx?id=24872

Playstation3 Eye Webcam :: WinUSB Drivers Registration. Check if you plan to work with Sony PS Eye cameras. Device drivers for the PS Eye camera.


64-bit OS: http://files.ipisoft.com/drivers/PlayStation3_Eye_iPi-x64.msi
32-bit OS: http://files.ipisoft.com/drivers/PlayStation3_Eye_iPi-x86.msi

ASUS Xtion / PrimeSense Carmine :: OpenNI Redistributable and ASUS Xtion / PrimeSense Carmine :: PrimeSense Sensor. Check if you plan to work with ASUS Xtion, ASUS Xtion Live, or PrimeSense Carmine depth sensors. Device drivers and software libraries for ASUS Xtion / ASUS Xtion Live / PrimeSense Carmine.

64-bit OS: http://files.ipisoft.com/3dparty/openni-win64-1.5.2.23-redist.msi, http://files.ipisoft.com/3dparty/sensor-win64-5.1.0.41-redist.msi
32-bit OS: http://files.ipisoft.com/3dparty/openni-win32-1.5.2.23-redist.msi, http://files.ipisoft.com/3dparty/sensor-win32-5.1.0.41-redist.msi

(Windows 7, 8) Microsoft Kinect :: MS Kinect SDK 1.5. Check if you plan to work with Microsoft Kinect depth sensors. Device drivers and software libraries for Microsoft Kinect. Requires Windows 7 or later.

    http://www.microsoft.com/en-us/download/details.aspx?id=29866

(Windows XP, Vista) Microsoft Kinect :: PrimeSense psdrv3.sys Driver Registration. Check if you plan to work with Microsoft Kinect depth sensors. Alternative device drivers for Microsoft Kinect.

64-bit OS: http://files.ipisoft.com/drivers/KinectPsdrv3_iPi-x64.msi
32-bit OS: http://files.ipisoft.com/drivers/KinectPsdrv3_iPi-x86.msi

Note. iPi Recorder does support working with Kinect sensors on Windows 7 using the PrimeSense driver. If you have installed and used it for Kinect with iPi Recorder 1.x, you can continue using it with iPi Recorder 2.

iPi Recorder 2.x.x.x. This is a required component and cannot be unchecked. This is iPi Recorder itself.

    iPi Mocap Studio

    Before installation:

    Ensure your video card supports DirectX 10. Otherwise, iPi Mocap Studio will not run on your system.

Download (http://files.ipisoft.com/iPiMocapStudioSetup.exe) and run the latest setup package of iPi Mocap Studio. You will be presented with the following dialog:


1. Read and accept the license agreement by checking the corresponding checkbox.
2. Press the Install button to begin installation.

Once installation is complete, iPi Mocap Studio will launch automatically.

All components are required for installation. Please note that the installation of Microsoft .NET Framework 3.5 SP1 requires an Internet connection. If needed, you can download the offline installer for Microsoft .NET (http://www.microsoft.com/en-us/download/details.aspx?id=25150) separately and run it before the iPi Mocap Studio setup. The other components are included with the iPi Mocap Studio setup.

After installation, iPi Mocap Studio will start. You will be prompted to enter your license key or start a 30-day free trial period:

    For more info about license protection see License.

    Recording Video from Two Depth Sensors


    Environment

For a single or dual depth sensor configuration, you need a minimum of 10 feet by 10 feet of space (3 meters by 3 meters). In a smaller space the actor simply won't fit into the cameras' view. The capture area is about 7 by 7 feet (2 by 2 meters), for both the single and the dual configuration.

It is convenient to put each Kinect sensor on a chair or a table. The picture below will help you to understand the sensor field of view and the possible distance from the actor:

[Figure: sensor field of view and distance from actor - side view and top view]
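The space figures above follow from simple trigonometry. As an illustration only (the 57-degree horizontal field of view is an assumed Kinect-class value, not a number from this guide), the width of the view at distance d is 2 * d * tan(FOV/2):

```python
# Illustrative only: approximate view width of a depth sensor at a given distance.
# The 57-degree horizontal FOV is an assumed Kinect-class value, not an iPi spec.
import math

def view_width(distance_m, fov_deg=57.0):
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

for d in (2.0, 2.5, 3.0):
    print(f"at {d:.1f} m the sensor covers a strip about {view_width(d):.1f} m wide")

# At 2.5-3 m this gives roughly 2.7-3.3 m of coverage, which is why a ~2 x 2 m
# capture area needs about 3 x 3 m of free space.
```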

The #Calibration section below describes the two recommended mutual configurations of the sensors.

    Actor Clothing

The current version uses only depth information to track motion, so the clothing requirements are:

- no restrictions on clothing colors (just avoid shiny fabrics)
- please wear slim clothes to reduce noise in the resulting animation

    Recording Process

Please record a video using the iPi Recorder application. It supports recording with Sony PlayStation Eye cameras, depth sensors (Kinect), and DirectShow-compatible webcams (USB and FireWire).

iPi Recorder is a stand-alone application and does not require a powerful video card. You may choose to install it on a notebook PC for portability. Since it is free, you can install it on as many computers as you need.

Please run iPi Recorder and complete the setup and background recording steps following the instructions: iPi Recorder Setup

    Calibration

Calibration is the process of computing accurate camera positions and orientations from a video of the user waving a small glowing object (called a marker). This step is essential and required for a multi-camera system setup.

Important. Once you have calibrated the camera system, you should not move your cameras for subsequent video shoots. If you move at least one camera, you need to perform calibration again.

    There are two possible arrangements of the two sensors:

1. the angle between the sensors is between 60 and 90 degrees;
2. the angle between the sensors is near 180 degrees, which means the sensors are placed opposite each other.

    The first configuration


[Figure: the first configuration - top view and side view]

    Preparations

Please note that starting from version 2.4.1.156, iPi Mocap Studio also supports calibration with the aid of a glowing marker such as a flashlight. For additional information please see this article.

For calibration purposes, a flat rectangular board (veneer/plywood/cardboard/pasteboard/foamboard) is used:

- the horizontal size should be at least 0.5 m (1 m - 1.3 m is recommended);
- the vertical size should be at least 0.7 m (1 m - 1.5 m is recommended);
- hold it vertically in front of yourself on outstretched arms;
- make sure that this calibration board is clearly visible to both sensors (the amount of yellow in the area of the board is low).

[Figures: good and bad calibration boards]
Good: good size, good depth data.
Bad: too many yellow (no depth data) points, too small, too close to the human body.

Important! Yellow color in the depth map means "depth here is unknown". Thus it is important to minimize the amount of yellow points by using appropriate materials and clothes.


    Recording Calibration Video

After you have finished the preparations and sensor setup, you're ready to record the calibration video:

- Start recording by pressing the "Start" button.
- Go to the capture area with the prepared calibration board.
- Move the board back and forth, tipping and banking it slightly, plus slight side-to-side rotation.
- Stop the recording by pressing the "Stop" button.

Note that it is important to have variety in board positions: closer to and farther from the sensors.

[Figures: good and bad calibration recordings]
Good: the calibration plane is clearly visible in both sensors.
Bad: the calibration plane is not visible in the left sensor.

Please open the captured video in iPi Studio and check the following:


- The floor (ground) is clearly visible on the depth maps from both sensors.
- The calibration board is clearly visible on the depth maps most of the time.
- The calibration board is moving and has variety in its distance to the sensors.
- The amount of yellow points is relatively small.

    Processing Calibration Video

After opening the calibration video in iPi Studio, please make sure that the ground plane is determined correctly for both sensors (to switch between sensors, use the "Camera 1" and "Camera 2" buttons in the top toolbar).

    After that it is convenient to turn off visualization of background points (View -> Hide Background).

Set the beginning of the Region of Interest (ROI) to the point where the calibration plane is clearly visible in both sensors.


Adjust the end of the ROI: the length of the ROI should be at least 5 seconds (150 frames), and the ROI should contain a good variety of horizontal view angles for the calibration plane.

Go to the Calibration tab and click the "Calibrate Based on 3D Plane" button (a sketch of the plane-fitting idea behind this step follows at the end of this section).

Please wait for the calibration process to finish.

Evaluate the result: turn on the "Show Depth From All Sensors" option in the "View" menu, then rotate the scene and check how the 3D points from different sensors fit with each other.


Save the result: go to the Scene tab, click the "Save scene..." button, and choose a file name for your scene parameters.
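For the curious, the idea behind plane-based calibration is that both sensors see the same flat board, so a plane fitted to each sensor's depth points (together with the ground plane) constrains the relative sensor pose. iPi Mocap Studio's actual algorithm is not published here; the sketch below only shows the standard least-squares plane fit (via SVD) that such a step could build on, using made-up points.

```python
# Illustrative only: least-squares plane fit to 3D points, the kind of primitive
# a "Calibrate Based on 3D Plane" step could build on. Not iPi's actual code.
import numpy as np

def fit_plane(points):
    """Return (centroid, unit normal) of the best-fit plane through an Nx3 array."""
    centroid = points.mean(axis=0)
    # The plane normal is the direction of least variance of the centered points.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Synthetic example: a tilted board sampled with a little depth noise.
rng = np.random.default_rng(0)
u, v = rng.uniform(-0.5, 0.5, (2, 500))
board = np.column_stack([u, v, 0.3 * u + 0.1 * v]) + rng.normal(0, 0.005, (500, 3))
centroid, normal = fit_plane(board)
print("centroid:", np.round(centroid, 3), "normal:", np.round(normal, 3))
```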

    The second configuration

[Figure: the second configuration - top view and side view]


    Preparations

In this case a flat rectangular board (veneer/plywood/cardboard/pasteboard) is also used, but the board is held in one nearly straight arm, side by side with the body. Each Kinect sensor sees a different side of the board, so the thickness of the board should not exceed 3 cm.

    Recording Calibration Video

It is also important to have variety in board positions: closer to one sensor and farther from the other, and vice versa.

[Figures: good and bad calibration recordings]
Good: the calibration plane is clearly visible in both sensors.
Bad: the calibration plane is not visible in the left sensor.

    Processing Calibration Video

It is mostly similar to the processing for the first configuration.

    Recording Actor's Performance

After completing the Setup and Background recording steps, press the Record button to begin video recording.

As soon as the recorder starts, go to the capture area and stand in a T-pose (http://www.trinity3d.com/media/axyz-design/metropoly2/T-Pose_models/Casual/MeCaT0005/image4.jpg):

After that you can act out the desired motions. If you record several takes of one actor, you do not need to record the T-pose again.

To stop recording, press the Stop button.


    Performance Tips

    Recommended layout of an action video

- Enter the actor.
- Strike a T-pose.
- Action.

    T-pose

It is preferable to have the actor strike a T-pose before the actual action. The software needs the T-pose for building the actor appearance model during tracking. If you make several takes of one actor, you do not need to re-record the T-pose before each take.

When using depth sensors, it is recommended to face the palms down, as this corresponds to the default orientation of the model's hand bones. When using color cameras, it is recommended to face the palms forward, as it helps the software in determining the right color for the model's hands.

    Takes

A take (http://en.wikipedia.org/wiki/Take) is a concept originating from cinematography. In a nutshell, a take is a single continuous recorded performance.

    Usually it is a good idea to record multiple takes of the same motion, because a lot of things can go wrong for purely artistic reasons.

    Iterations

A common problem with motion capture is clipping in the resulting 3D character animation, for example arms entering the body of the animated computer-generated character. Many CG characters have various items and attachments like a bullet-proof vest, fantasy armor, or a helmet. It can be easy for an actor to forget about the shape of the CG model.

For this reason, you may need to schedule more than one motion capture session for the same motions. The recommended approach is:

1. Record the videos.
2. Process the videos in iPi Studio.
3. Import your target character into iPi Studio and review the resulting animation.
4. Give feedback to the actor.
5. Schedule another motion capture session if needed.

    Ian Chisholm's hints on motion capture

Ian Chisholm is a machinima (http://en.wikipedia.org/wiki/Machinima) director and actor and the creator of the critically acclaimed Clear Skies (http://www.clearskiesthemovie.com/) machinima series. Below are some hints from his motion capture guide


    (http://www.ipisoft.com/forum/viewtopic.php?f=3&t=5421) based on his experience with motion capture for Clear Skies III.

    Three handy hints for acting out mocap:

1. Don't weave and bob around like you're in a normal conversation; it looks terrible when finally onscreen. You need to be fairly (but not completely) static when acting.

2. If you are recording several lines in one go, make sure you have lead-in and lead-out between each one, i.e. stand still! Otherwise the motions blend into each other and it's hard to pick a start and end point for each take.

3. Stand a bit like a gorilla: have your arms out from your sides. Well, obviously not quite that much. But anyway, if you don't, you'll find the arms clip slightly into the models and they look daft.

If you have a lot of capture to do, you need to strike a balance between short and long recordings. Aim for 30 seconds to 2 minutes. Too long is a pain to work on later due to the fiddliness of setting up takes, and too short means you are forever setting up T-poses.

    Takes

Because motion capture is not a perfect art, and neither is acting, it's best to perform multiple takes. I found that three was the best amount for most motion capture. Take fewer if it's a basic move, take more if it's complex and needs to be more accurate. It will make life easier for you in the processing stage if you signal the break between takes; I did this by reaching out one arm and holding up fingers to show which take it was.

    Naming conventions

As it's the same actor looking exactly the same each and every time, and there is no sound, and the capture is in low-res 320x200, you really need to name the files very clearly so that you later know which act, scene, character, and line(s) the capture is for.

My naming convention was based on act, scene, character, page number of the scene, line number, and take number. You end up with something unpleasant to read like A3S1_JR_P2_L41_t3, but it's essential when you've got 1500 actions to record.
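If you adopt a similar scheme, generating the names programmatically keeps them consistent across hundreds of takes. A minimal sketch (the fields and format simply mirror the example name above; the helper itself is hypothetical):

```python
# Hypothetical helper for take file names in the style "A3S1_JR_P2_L41_t3":
# act, scene, character initials, page, line, take.
def take_name(act, scene, character, page, line, take):
    return f"A{act}S{scene}_{character}_P{page}_L{line}_t{take}"

print(take_name(3, 1, "JR", 2, 41, 3))   # -> A3S1_JR_P2_L41_t3
```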

    Processing Video from Two Depth Sensors

Processing of the performance video is generally the same as for the single Kinect solution: User Guide for Single Kinect Sensor Configuration#Processing Video from Depth Sensor.

The main difference is that after loading the captured video into iPi Studio you should load your scene parameters:

- switch to the Scene tab
- click the "Load scene..." button
- select the file with the scene parameters that was saved during the calibration process

That's all. After that you're ready for processing:

- Position the timeline slider to the frame where the actor is in the T-pose.
- Adjust the actor height using the appropriate slider on the Actor tab.


- Select the Move tool on the toolbar.
- Move the actor model left or right to roughly match the actor silhouette on the video.
- Etc.

    Clean-up

Once initial tracking is performed on all (or part) of your video, you can begin cleaning up tracking errors (if any). Post-processing should be applied after clean-up.

    Cleaning up tracking gaps

Tracking errors usually happen in a few specific video frames and propagate to multiple subsequent frames, resulting in tracking gaps. Examples of problematic frames:

- Occlusion (like one hand not visible in any of the cameras).
- Indistinctive pose (like hands folded on the chest).
- Very fast motion with motion blur.

    To clean up a sequence of incorrect frames (a tracking gap), you should use backward tracking:

1. Go toward the last frame of the tracking gap, to a frame where the actor pose is distinctive (no occlusion, no motion blur, etc.).
2. If necessary, use the Rotate, Move, and Inverse Kinematics tools to edit the character pose to match the actor pose on the video.
3. Turn off Trajectory Filtering (set it to zero) so that it does not interfere with your editing.
4. Click the Refit Pose button to get a better fit of the character pose.
5. Click the Track Backward button.
6. Stop backward tracking as soon as it comes close to the nearest good frame.
7. If necessary, go back to the remaining parts of the tracking gap and use forward and backward tracking to clean them up.

    Cleaning up individual frames

To clean up individual frames you should use a combination of editing tools (Rotate, Move, and Inverse Kinematics) and the Refit Pose button.

Note: after a Refit Pose operation iPi Studio automatically applies Trajectory Filtering to produce a smooth transition between frames. As a result, the pose in the current frame is affected by nearby frames. This may look confusing. If you want to see the exact result of the Refit Pose operation in the current frame, you should turn off Trajectory Filtering (set it to zero), but do not forget to change it back to a suitable value later.

    Tracking errors that cannot be cleaned up using iPi Studio

Not all tracking errors can be cleaned up in iPi Studio using automatic tracking and the Refit Pose button.

Frames immediately affected by occlusion sometimes cannot be corrected. Recommended workarounds:
- Manually edit the problematic poses (without using the Refit Pose button).
- Record a new video of the motion and try to minimize occlusion.
- Record a new video of the motion using more cameras.

Frames immediately affected by motion blur sometimes cannot be corrected. Recommended workarounds:
- Manually edit the problematic poses (without using the Refit Pose button).
- Edit the problematic poses in an external animation editor.
- Record a new video of the motion using a higher frame rate.

Frames affected by strong shadows on the floor sometimes cannot be corrected. A typical example is push-ups. This is a limitation of the current version of markerless mocap technology; iPiSoft is working to improve tracking in future versions of iPi Studio.

Some other poses can be recognized incorrectly by iPi Studio. This is a limitation of the current version of markerless mocap technology; iPiSoft is working to improve tracking in future versions of iPi Studio.

    Tracking refinement

After the primary tracking and cleanup are complete, you can optionally run the Refine pass (see the Refine Forward and Refine Backward buttons). It slightly improves the accuracy of pose matching, and can automatically correct minor tracking errors. However, it takes a bit more time than the primary tracking, so it is not recommended for quick-and-dirty tests.

Important. Refine should be applied with the same tracking parameters (e.g. feet tracking, head tracking) as the primary tracking, in order not to lose previously tracked data.

Important. Refine should be applied before applying motion controller data. Also, if you plan to manually edit the animation (beyond automatic cleanup with Refit Pose), do this after applying Refine.

In contrast to the primary tracking, this pass does no pose prediction, and bases its computations solely on the current pose in a frame. Essentially, running Refine is equivalent to automatically applying Refit Pose to a range of frames that were previously tracked.

    Post-processing: Jitter Removal

The Jitter Removal filter is a powerful post-processing filter. It should be applied after cleaning up tracking gaps and errors. It is recommended that you always apply the Jitter Removal filter before exporting animation.

The Jitter Removal filter suppresses unwanted noise and at the same time preserves sharp, dynamic motions. By design, this filter should be applied to relatively large segments of animation (no less than 50 frames).

    Range of frames affected by Jitter Removal is controlled by current Region of Interest.

You can configure Jitter Removal options for specific body parts. The default setting for Jitter Removal aggressiveness is 1 (one tick of the corresponding slider). Oftentimes, you can get better results by applying a slightly more aggressive Jitter Removal for the torso and legs. Alternatively, you may want to use less aggressive Jitter Removal settings for sharp motions like martial arts moves.

The Jitter Removal filter makes an internal backup of all data produced by the tracking and clean-up stages. Therefore, you can re-apply Jitter Removal multiple times. Each subsequent run works off the original tracking/clean-up results and overrides previous runs.

    Post-processing: Trajectory Filtering

The Trajectory Filter is a traditional digital signal filter. Its purpose is to filter out minor noise that remains after the Jitter Removal filter.

The Trajectory Filter is very fast. It is applied on-the-fly to the current Region of Interest.

The default setting for the Trajectory Filter is 1. Higher settings result in multiple passes of the Trajectory Filter. It is recommended that you leave it at the default setting.

The Trajectory Filter can be useful for gluing together multiple segments of animation processed with different Jitter Removal options: change the Region of Interest to cover all of your motion (e.g. multiple segments processed with different jitter removal settings); change the Trajectory Filtering setting to 0 (zero); then change it back to 1 (or another suitable value).
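As a rough intuition for what such a filter does, it can be thought of as a small smoothing pass applied per joint channel over time; higher settings simply repeat the pass. The sketch below shows a generic moving-average version on synthetic data and is not iPi Mocap Studio's actual filter.

```python
# Illustrative only: a generic per-channel smoothing pass over joint-angle
# trajectories. This is NOT iPi Mocap Studio's Trajectory Filter implementation.
import numpy as np

def smooth_trajectory(angles, passes=1, window=5):
    """angles: (frames, channels) array; apply a moving average `passes` times."""
    kernel = np.ones(window) / window
    out = angles.astype(float).copy()
    for _ in range(passes):
        for c in range(out.shape[1]):
            out[:, c] = np.convolve(out[:, c], kernel, mode="same")
    return out

# Noisy synthetic elbow curve: smooth motion plus per-frame jitter.
t = np.linspace(0, 2 * np.pi, 150)
noisy = np.column_stack([np.sin(t) * 45 + np.random.normal(0, 1.5, t.size)])
clean = smooth_trajectory(noisy, passes=1)
print("frame-to-frame jitter before/after:",
      np.std(np.diff(noisy[:, 0])), np.std(np.diff(clean[:, 0])))
```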

    Export and Motion Transfer

Use the File->Export Animation menu item to export all animation frames within the Region of Interest.

To export the animation for a specific take, right-click on the take and select the Export Animation item from the pop-up menu.

Default iPi Character Rig

The default skeleton in iPi Studio is optimized for markerless motion capture. It may or may not be suitable as a skeleton for your character. The default iPi skeleton in T-pose has non-zero rotations for all joints. Please note that the default iPi skeleton with zero rotations does not represent a meaningful pose and looks like a random pile of bones.


[Figures: default rig and bone names]

By default iPi Studio exports a T-pose (or a reasonable default pose for a custom rig after motion transfer) in the first frame of the animation. If this is not desired, switch off the "Export T-pose in first frame" checkbox.

    Motion Transfer and Custom Rigs

iPi Studio has integrated motion transfer technology. You can import your character into iPi Studio via the File->Import Target Character menu item and your motion will be transferred to your character. You may need to assign bone mappings on the Export tab for motion transfer to work correctly. You can save your motion transfer profile to an XML file for future use. iPi Studio has pre-configured motion transfer profiles for many popular rigs (see below). If you export the animation to a format different from the one your target character was imported in, only the rig will be exported. If you use the same format for export, the skin will be exported as well.

    MotionBuilder

Use the Export Animation for MotionBuilder menu item to export your motion in MotionBuilder-friendly BVH format. The MotionBuilder-friendly skeleton in T-pose has zero rotations for all joints, with bone names consistent with MotionBuilder conventions. This format may also be convenient for use with other apps like Blender.

    3D MAX Biped

    Use the Export Animation for 3D MAX menu item to export your motion in 3D MAX-friendly BVH format.

Create a Biped character in 3D MAX (Create->Systems->Biped). Go to the Motion tab. Click the Motion Capture button and import your BVH file.


[Screenshots: Steps 1-4]

    Our user Cra0kalo created an example Valve Biped rig for use with 3D MAX (http://www.ipisoft.com/forum/viewtopic.php?f=2&t=4928) . It may be useful if you work with Valve Source Engine characters.

    Maya

Latest versions of Maya (starting with Maya 2011) have a powerful biped animation subsystem called "HumanIK". Animations exported from iPi Studio in MotionBuilder-friendly format (the Export Animation for MotionBuilder menu item) should work fine with Maya 2011 and HumanIK. The following video tutorials can be helpful:

- Maya HumanIK Mocap retarget with iPi Mocap Studio, by Wes McDermott (http://vimeo.com/57823600)
- Non-Destructive Live Retargeting Maya 2011 New Features (http://www.youtube.com/watch?v=l1hV8BqsCEA)
- Motion Capture Workflow With Maya 2011 (http://vimeo.com/24351989)
- Humanik Maya 2012 Part 6 (http://www.youtube.com/watch?v=IP6AwFNkmm8)

For older versions of Maya please see the #Motion Transfer and Custom Rigs section. The recommended format for import/export with older versions of Maya is FBX.


    FBX

iPi Studio supports the FBX format for import/export of animations and characters. By default, iPi Studio exports animations in FBX 6.0 format using FBX SDK 2012. If your target character is in FBX 7.0 or newer format, iPi Studio will export the retargeted animation in FBX 2012 format.

Some applications do not use the latest FBX SDK and may have problems importing newer-version FBX files. In case of problems, you can use Autodesk's free FBX Converter (http://usa.autodesk.com/adsk/servlet/pc/item?siteID=123112&id=10775855) to convert your animation file to an appropriate FBX version.

    COLLADA

iPi Studio supports the COLLADA (http://collada.org) format for import/export of animations and characters. The current version of iPi Studio exports COLLADA animations as matrices. If you encounter incompatibilities with other applications' implementation of the COLLADA format, we recommend using Autodesk's free FBX Converter (http://usa.autodesk.com/adsk/servlet/pc/item?siteID=123112&id=10775855) to convert your data between the FBX and COLLADA formats. FBX is known to be more universally supported in many 3D graphics packages.

    LightWave

Recommended format for importing target characters from LightWave to iPi Studio is FBX. Recommended format for bringing animations from iPi Studio to LightWave is BVH or FBX.

    SoftImage|XSI

Our user Eric Cosky published a tutorial on using iPi Studio with SoftImage|XSI:

    http://www.ipisoft.com/forum/viewtopic.php?f=13&p=9660#p9660

    Poser

Export your Poser character in T-pose in BVH format (File->Export). Import your Poser character skeleton into iPi Studio (File->Import Target Character). Your animation will be transferred to your Poser character. Now you can use File->Export Animation to export your animation in BVH format for Poser.

Poser 8 has a bug with incorrect wrist animation import. The bug can be reproduced as follows: export a Poser 8 character in T-pose in BVH format; import your character back into Poser 8; note how the wrists are twisted unnaturally as a result.

A workaround for the wrists bug is to chop off the wrists from your Poser 8 skeleton (for instance using BVHacker) before importing the Poser 8 target character into iPi Studio. Missing wrists should not cause any problems during motion transfer in iPi Studio if your BVH file is edited correctly. Poser will ignore the missing wrists when importing the resulting motion, so the resulting motion will look right in Poser (wrists in the default pose as expected).


[Screenshots: Steps 1-4]

    DAZ 3D

The workflow for DAZ 3D is very similar to Poser. Import your DAZ 3D character skeleton into iPi Studio (File->Import Target Character). Your animation will be transferred to your DAZ 3D character. Now you can use File->Export Animation to export your animation in BVH format for DAZ 3D.

IMPORTANT: You can use a DAZ character in COLLADA (.dae) format for preview, but it is strongly recommended that you use the DAZ character in BVH format for motion transfer. DAZ3D has a problem with the COLLADA (.dae) format: DAZ3D Studio does not export all bones into COLLADA (.dae). In particular, the following bones are not exported: eyeBrow, bodyMorphs. DAZ3D Studio does not use bone names when importing motions; instead, DAZ3D Studio just takes rotations from the list of angles as though it was a flat list with exactly the same positions as in the DAZ3D internal skeleton. As a result, when you transfer the motion to a COLLADA character and import it back into DAZ3D, the motion will look wrong. iPi Studio displays a warning about this. To avoid this problem, import your DAZ target character in BVH format - DAZ3D Studio is known to export characters in BVH format correctly (with all bones).

You can improve the accuracy of motion transfer by doing some additional preparation of your DAZ 3D skeleton in BVH format. For DAZ 3D Michael 4.0 and similar characters, you may need to clamp thigh joint rotation to zero to avoid unnatural leg bending. For DAZ 3D Victoria 4.0, you may need to adjust the foot joint rotation to change the default high-heels foot pose to a more natural foot pose.


[Screenshots: Steps 1-5]

    iClone

The current version of iPi Studio can only export animation in iClone-compatible BVH format. The iMotion format is not supported as of yet.


[Screenshots: Steps 1-4]

That means you will need iClone PRO to be able to import the motion into iClone. The Standard and EX versions of iClone do not have the BVH Converter and therefore cannot import BVH files.

The workflow for iClone is straightforward. Export your animation using the Export Animation for iClone menu item. Go to the Animation tab in iClone and launch the BVH Converter. Import your BVH file with the Default profile, click Convert, and save the resulting animation in iMotion format. Now your animation can be applied to iClone characters.

iClone expects an animation sampled at 15 frames per second. For other frame rates, you may need to create a custom BVH Converter profile by copying the Default profile and editing the Frame Rate setting.
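The documented route is the custom BVH Converter profile described above. If you would rather downsample the exported BVH yourself before importing it into iClone, the hypothetical helper below sketches one way to do it; it is not an iPiSoft or Reallusion tool, the file names are examples, and it assumes a standard BVH layout (a MOTION section followed by "Frames:", "Frame Time:", and one line of values per frame).

```python
# Hypothetical helper (not an iPi or iClone tool): keep every Nth frame of a BVH.
def downsample_bvh(src_path, dst_path, keep_every=2):
    with open(src_path) as f:
        lines = f.read().splitlines()
    m = next(i for i, l in enumerate(lines) if l.strip() == "MOTION")
    header = lines[:m + 1]                        # HIERARCHY ... MOTION
    frame_time = float(lines[m + 2].split()[-1])  # "Frame Time: 0.033333"
    frames = lines[m + 3:]                        # one channel-value line per frame
    kept = frames[::keep_every]
    with open(dst_path, "w") as f:
        f.write("\n".join(header) + "\n")
        f.write(f"Frames: {len(kept)}\n")
        f.write(f"Frame Time: {frame_time * keep_every:.6f}\n")
        f.write("\n".join(kept) + "\n")

# Example: a 30 fps export becomes 15 fps for iClone.
# downsample_bvh("walk_30fps.bvh", "walk_15fps.bvh", keep_every=2)
```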

The BVH Converter in iClone 4 has a bug that causes distortion of the legs animation. iPi Studio exports an iClone-optimized BVH correctly, as can be verified by reviewing the exported BVH motion in BVHacker, MotionBuilder, or another third-party application. No workaround is known. We recommend that you contact the iClone developers about this bug, as it is out of the control of iPi Soft.


[Screenshot: Step 5]

    Valve Source Engine SMD

Import the .smd file for your Valve Source Engine character into iPi Studio via the File->Import Target Character menu item. Your animation will be transferred to your character. Now you can use File->Export Animation to export your animation in Valve Source Engine SMD format.

Our user Cra0kalo created an example Valve Biped rig for use with 3D MAX (http://www.ipisoft.com/forum/viewtopic.php?f=2&t=4928). It may be useful if you wish to apply more than one capture through MotionBuilder or edit the custom keyframes in MAX.

    Valve Source Filmmaker

    DMX

    First, you need to import your character (or its skeleton) into iPi Mocap Studio, for motion transfer.

    There are currently 3 ways of doing this:

1. You can import an animation DMX (in the default pose) into iPi Mocap Studio. Since it has a skeleton, it should be enough for motion transfer. To create an animation DMX with the default pose, you can add your character to your scene in Source Filmmaker and export a DMX for the corresponding animation node:
   - open the "Animation Set Editor" tab;
   - click "+" -> "Create Animation Set for New Model";
   - choose a model and click "Open";
   - export the animation for your model, in ASCII DMX format. There is a checkbox named Ascii in the top area of the export dialog.

2. Alternatively, you can just import an SMD file with your character into iPi Mocap Studio. For example, SMD files for all Team Fortress 2 characters can be found in your SDK in a location similar to the following (you need to have the Source SDK installed): C:\Program Files (x86)\Steam\steamapps\\sourcesdk_content\tf\modelsrc\player\pyro\parts\smd\pyro_model.smd.

3. If you created a custom character in Maya, you should be able to export it in DMX model format. (Please see the Valve documentation (https://developer.valvesoftware.com/wiki/Maya) on how to do this.)

Then you can import your model DMX into iPi Mocap Studio. The current version of iPi Mocap Studio cannot display the character skin, but it should display the skeleton. The skeleton should be enough for motion transfer.

To export animation in DMX, you should just use the "General..." export menu item in iPi Mocap Studio and choose DMX from the list of supported formats. You may also want to uncheck the "Export T-pose in first frame" option on the "Export" tab in iPi Mocap Studio.

Now you can import your animation into Source Filmmaker. There will be some warnings about missing channels for face bones, but you can safely ignore them.


[Screenshots: Steps 1-6]

    Old way involving Maya


This was used until iPi Mocap Studio got DMX support, and it may still be useful in case of any trouble with DMX. Please see the following video tutorial series:

    http://www.youtube.com/playlist?list=PLD4409518E1F04270

    Blender

iPi Studio can export animations in Blender-friendly BVH format (File->Export animation for Blender).

    Cinema4D

If you have experience with Cinema4D please help to expand this Wiki by posting Cinema4D import/export tips to the Community Tutorial (http://www.ipisoft.com/forum/viewforum.php?f=3) section of our user forum.

    Evolver

iPi Studio supports importing of skinned Evolver (http://www.evolver.com) characters in COLLADA or FBX format. Import your Evolver character skeleton into iPi Studio (File->Import Target Character). Your animation will be retargeted to your Evolver character. Now you can use File->Export Animation to export your animation.

    Evolver offers several different skeletons for Evolver characters. Here is an example motion transfer profile for Evolver "Gaming"skeleton: evolver_game.profile.xml (http://www.ipisoft.com/downloads/motion_transfer_profiles/evolver_game.profile.xml)

    Second Life

Import your Second Life character skeleton into iPi Studio (File->Import Target Character). Your animation will be transferred to your Second Life character. Now you can use File->Export Animation to export your animation in BVH format for Second Life.

The SecondLife documentation (http://wiki.secondlife.com/wiki/How_to_create_animations#Setting_up_Poser) contains a link to useful SL avatar files (http://static-secondlife-com.s3.amazonaws.com/downloads/avatar/avatar_mesh.zip). The ZIP file includes a BVH of the "default pose". Be sure to have that.

    See the discussion on our Forum for additional details: http://www.ipisoft.com/forum/viewtopic.php?f=2&p=7845


    Massive

    Please see our user forum for a discussion of animation import/export for Massive (http://www.massivesoftware.com/) :

    http://ipisoft.com/forum/viewtopic.php?f=12&t=3233

    IKinema WebAnimate

    Please see the following video tutorial on how to use iPi Studio with IKinema WebAnimate (http://www.ikinema.com/webanimate) :

    http://www.youtube.com/watch?v=a-yJ-O02SLU

    Jimmy|Rig Pro

    Please see the following video tutorial on how to use iPi Studio with Jimmy|Rig Pro (http://www.origamidigital.com/typolight/index.php/home.html) :

    http://www.youtube.com/watch?v=wD1keDh3fCk

    Troubleshooting

    Installation problems

Potential problem: after installation, iPi Mocap Studio crashes on first start.

Possible cause: very often, this is caused by an incompatible video card. Another possible reason is a broken .NET Framework installation or a broken DirectX installation.

Solution: check the system requirements and make sure your operating system and .NET Framework are up to date.

    Two Kinects don't work together

Potential problem: two Kinect sensors do not work together.

Possible cause: most probably, both Kinects were plugged into one USB controller. In this case the bandwidth of a single USB controller is not enough to handle video from 2 Kinects.

Solution: each Kinect should be plugged into a separate USB controller. Please refer to the documentation: User_Guide_for_Dual_Kinect_Sensor_Configuration#Software_Installation

    How to report bugs and issues

    When reporting bugs and issues, please specify the following info:

- the exact version of your operating system;
- the exact model of your video card (you can use GPU-Z (http://www.techpowerup.com/downloads/SysInfo/GPU-Z/) to find out the model of your video card);
- the number and models of your cameras.
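On Windows you can collect most of this information automatically. The sketch below is a hypothetical helper (not an iPiSoft tool) that uses Python's platform module and the standard wmic command-line utility; adapt or skip it if wmic is unavailable on your system.

```python
# Hypothetical helper to gather the info requested above on a Windows machine.
# Uses the built-in `wmic` command; not an iPiSoft tool.
import platform
import subprocess

def wmic_names(wmi_class):
    out = subprocess.run(["wmic", "path", wmi_class, "get", "Name"],
                         capture_output=True, text=True).stdout
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

print("Operating system:", platform.platform())
print("Video card(s):   ", "; ".join(wmic_names("Win32_VideoController")))

devices = wmic_names("Win32_PnPEntity")
keywords = ("Kinect", "Xtion", "Carmine", "Camera", "Eye")
cameras = [d for d in devices if any(k in d for k in keywords)]
print("Cameras/sensors: ", "; ".join(cameras) or "(list your cameras manually)")
```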

    You can post your bug reports on our User Forum (http://www.ipisoft.com/forum/) or send them to iPiSoft tech support email.

    How to send a video to iPiSoft tech support

Sending your videos to iPiSoft tech support can be helpful if you experience a problem with iPiSoft's system. iPiSoft promises to use your video only for debugging and not to disclose it to third parties.

To send a video, please upload it to a file sharing server like filefactory.com (http://www.filefactory.com/) and send us the link. The other way is to use peer-to-peer technology.

If you cannot send a video because of its huge size, consider sending screenshots. Screenshots are less informative than video, but they are still helpful for diagnosing various problems with tracking.

    Video Materials

    From Jimer Lins: Part 1 - Setting up your Kinects and Calibrating

    From Jimer Lins: Part 2 - Recording the Action


    From Jimer Lins: Part 3 - Processing the Recorded Action

    From Jimer Lins: see other parts (http://www.youtube.com/playlist?list=PLD4409518E1F04270)

    USB controllers

All modern computers (e.g. dual-core and better) based on Intel, AMD, and Nvidia chipsets have two high-speed USB (USB 2.0) controllers on board. That should give you enough bandwidth to be able to record with (the arithmetic behind these limits is sketched after the list):

1. 2 depth sensors (MS Kinect, or ASUS Xtion, or PrimeSense Carmine),
2. or 4 cameras at 640x480 (raw Bayer format) at 60 FPS,
3. or 6 cameras at 640x480 (raw Bayer format) at 40 FPS.
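These limits are simple bandwidth arithmetic. The sketch below works through the camera numbers; the practical USB 2.0 throughput figure in the comment is a general ballpark assumption, not an iPi specification.

```python
# Back-of-the-envelope bandwidth check for cameras on one USB 2.0 controller.
# Raw Bayer 640x480 is 1 byte per pixel. USB 2.0 signals at 480 Mbit/s,
# roughly 35-40 MB/s of usable payload in practice (assumed ballpark figures).
def camera_mb_per_s(width=640, height=480, fps=60, bytes_per_pixel=1):
    return width * height * bytes_per_pixel * fps / 1e6

per_cam_60 = camera_mb_per_s(fps=60)   # ~18.4 MB/s
per_cam_40 = camera_mb_per_s(fps=40)   # ~12.3 MB/s
print(f"one camera, 640x480 raw Bayer @ 60 fps:   {per_cam_60:.1f} MB/s")
print(f"two such cameras on one controller:       {2 * per_cam_60:.1f} MB/s")
print(f"three cameras @ 40 fps on one controller: {3 * per_cam_40:.1f} MB/s")
# Two controllers therefore handle 4 cameras @ 60 fps or 6 cameras @ 40 fps,
# while a third 60 fps camera on the same controller would not fit.
```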

You can check how many USB 2.0 controllers you have in the Universal Serial Bus controllers section in Device Manager. The operating system will display USB 2.0 controllers as USB Enhanced Host Controllers (not to be mistaken for USB 1.0 Universal Host Controllers).


You should make sure that you have depth sensors or cameras evenly distributed between the available USB 2.0 controllers. For example, if you have 4 cameras, you should connect 2 cameras to the first controller and 2 cameras to the second controller. Device Manager usually shows USB cameras as USB Composite Device on the Advanced tab of the USB controller properties. If you have too many cameras connected to a single USB 2.0 controller, you should re-plug them to different USB ports.

    USB hubs

If several devices are connected to a USB hub and working in parallel, then the USB bandwidth is shared among the devices. Connecting all cameras via a single USB 2.0 hub is OK only for low resolution video recording (320x240). To be able to record video at 640x480 and 60 frames per second, you should avoid connecting more than 2 cameras via a single USB controller.

    Potential issues with USB controllers

Some USB chips have compatibility issues with specific cameras, which may result in cameras not being detected, low frame rate, etc. See the next section on known issues.

Many modern laptops have one of the integrated USB 2.0 controllers reserved for the docking station. That may effectively halve your USB bandwidth. So laptops with a large number of USB ports (4 or more) are preferable. Desktop PCs should not have such a problem.

Important. Before buying a specific laptop model it is strongly recommended to ensure it has a sufficient number of available built-in USB controllers and that they are compatible with the cameras you intend to use. Find it at your local retailer, and bring cameras with you to test. It may be easier if a friend or neighbour has such a laptop. Or ask at our forum (http://forum.ipisoft.com); maybe someone has already used this model with our software or can recommend a specific model for your camera configuration.

If you plan on getting an additional USB controller, please be aware of potential problems with USB 2.0 controllers available on the market. Many USB 2.0 controllers are based on a defective NEC USB 2.0 chip that is not capable of standard USB 2.0 speeds. That's a well-known bug specific only to the NEC USB 2.0 chip. By contrast, all NEC USB 3.0 chips are good and do not have this problem.

Some manufacturers sell what they call a "USB 2.0 ExpressCard" controller for laptops. Many such ExpressCard controllers are in fact internal USB hubs in ExpressCard form factor. That means they do not add actual USB bandwidth to the system. All USB 3.0 ExpressCards, by contrast, are real controllers.

If you plan on getting an additional high-speed USB controller, we strongly recommend that you get a USB 3.0 controller, just to be on the safe side.

When setting up a USB 3.0 controller, make sure you configure it to handle USB 2.0 traffic (as opposed to passing USB 2.0 to the integrated USB 2.0 controllers in your chipset). There should be a corresponding setting in the driver properties or in the BIOS.

    Known compatibility issues

Combinations of camera and USB chip listed below have known compatibility issues. Unfortunately, there is no guarantee that other combinations will work for sure, but most will. If you need an additional USB controller for connecting a specific camera, be sure not to get one based on an incompatible USB chip.

    Guide for Dual Depth Sensor Configuration (printable version) - iPiS... http://wiki.ipisoft.com/User_Guide_for_Dual_Depth_Sensor_Con

    31 18/12/2013 0

  • 8/10/2019 User Guide for Dual Depth Sensor Configuration (Printable Version) - IPiSoft Wiki

    31/31

Camera | USB chip | Symptoms | Comments / possible solution
ASUS Xtion (Live) | NEC PD720200 (USB3) | Camera is not properly recognized by the system. | This firmware update (http://reconstructme.net/2012/10/13/asus-xtion-usb-3-0-hotfix/) makes the device behave the same as PrimeSense Carmine 1.08 with this controller.
ASUS Xtion (Live) | VIA VT6212L (USB2) | Frame drops of color picture in 640x480@30 (depth + color). |
PrimeSense Carmine 1.08 | NEC PD720200 (USB3) | Frame drops in 640x480@30 (depth + color). | Stable work in 640x480@30 (depth only) and 320x240 (depth + color).
PrimeSense Carmine 1.08 | VIA VT6212L (USB2) | Frame drops in 640x480@30 (depth + color). | Stable work in 640x480@30 (depth only) and 320x240 (depth + color).
PrimeSense Carmine 1.08 | VIA VL800 (USB3) | Frame drops, reduced frame rate, freezes in depth + color modes. | Stable work in depth-only modes.
Sony Playstation Eye | VIA VL800 (USB3) | Great number of bad frames and frame drops. Camera is invisible to iPi Recorder. |
Sony Playstation Eye | VIA VT6212L (USB2) | Great number of frame drops in 640x480@60 mode. | 2 cameras can work smoothly at lower FPS (up to 640x480@50).

Retrieved from "http://wiki.ipisoft.com/index.php?title=User_Guide_for_Dual_Depth_Sensor_Configuration_(printable_version)&oldid=1177"
Categories: iPi Recorder, iPi Mocap Studio

This page was last modified on 11 September 2012, at 14:57.
