User Guide for Multiple PS Eye Cameras Configuration

From iPi Docs

Revision as of 11:56, 3 November 2014

System Requirements

iPi Recorder

  • Computer (desktop or laptop):
    • CPU: x86 compatible (Intel Pentium 4 or higher, AMD Athlon or higher, 2GHz), dual- or quad- core is preferable
    • Operating system: Windows 10 / 8.1 / 8 / 7 (32-bit or 64-bit)
    • USB: at least two USB 2.0 or USB 3.0 controllers
      For more info see USB controllers
    • ExpressCard or eSATA slot (for laptops)
      Optional, but highly recommended. It allows you to install an external USB controller in case of compatibility issues between cameras and built-in USB controllers, or if all USB ports are in fact connected to a single USB controller.
    • Storage system: HDD or SSD or RAID with write speed:
      • For 4 cameras at 60 fps, 640 x 480 resolution: not less than 17 MByte/sec
      • For 6 cameras at 60 fps, 640 x 480 resolution: not less than 25 MByte/sec
      • For 8 cameras at 60 fps, 640 x 480 resolution: not less than 35 MByte/sec
      • For 12 cameras at 60 fps, 640 x 480 resolution: not less than 50 MByte/sec
      • For 16 cameras at 60 fps, 640 x 480 resolution: not less than 70 MByte/sec
Note: These numbers are valid for the "background subtraction" compression mode (the default compression mode for PS Eye cameras in iPi Recorder). This compression involves additional CPU load. If your CPU is a bottleneck, you can switch to uncompressed mode, which reduces CPU load but requires a 3-5 times higher write speed.
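As a rough cross-check of the table above, required write speed scales roughly linearly with camera count. The sketch below assumes about 1 byte per pixel of raw camera data and roughly 4x compression in background-subtraction mode; both figures are illustrative assumptions, not official iPi numbers, so when in doubt use the table values above.

```python
# Rough estimate of required disk write speed for N PS Eye cameras at
# 640 x 480 / 60 fps. Assumes 1 byte/pixel raw data and ~4x compression
# in background-subtraction mode (illustrative assumptions only).

def estimated_write_speed_mb(cameras, width=640, height=480, fps=60,
                             bytes_per_pixel=1, compression=4.0):
    raw_bytes_per_sec = cameras * width * height * fps * bytes_per_pixel
    return raw_bytes_per_sec / compression / 1e6  # MByte/sec

for n in (4, 6, 8, 12, 16):
    print(f"{n} cameras: ~{estimated_write_speed_mb(n):.0f} MByte/sec")
```

The estimates come out slightly above the official table, which is the safe direction when sizing storage.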

iPi Mocap Studio

  • Computer (desktop or laptop):
    • CPU: x86/x64 compatible (Intel Pentium 4 or higher, AMD Athlon or higher), dual- or quad- core is preferable.
    • Operating system: Windows 10 / 8.1 / 8 / 7 (32-bit or 64-bit).
    • Video card: DirectX 11 capable gaming-class graphics card.
GPUz example.gif

Software Installation

iPi Recorder

Important! Please unplug all cameras from the computer before installation.

Download and run the setup package of the latest version of iPi Recorder. You will be presented with the following dialog.

iPi Recorder 3 Setup.png
  1. Select the components you need
  2. Read and accept the license agreement by checking the appropriate checkbox
  3. Press the Install button to begin installation
Note: Most of the components require administrative privileges because they install device drivers or write to Program Files and other system folders. You will be presented with UAC prompts when appropriate during installation. If you plan to use iPi Recorder under a user account without administrative rights, you can pre-install the other components separately from an administrator's account.
  1. You can plug only one depth sensor into one USB controller. A single USB controller's bandwidth is not enough to record from 2 sensors.
  2. You can plug no more than 2 Sony PS Eye cameras into one USB controller; otherwise you will not be able to capture at 60 fps at 640 x 480 resolution.
For more info see USB controllers.
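The two rules above can be expressed as a small helper that estimates how many USB controllers a given device mix needs. The function name and logic are illustrative only, not part of iPi Recorder:

```python
import math

# Minimum number of USB controllers for a device mix, following the
# rules above: at most 2 PS Eye cameras per controller, and each depth
# sensor needs a controller of its own. Illustrative sketch only.

def min_usb_controllers(ps_eye_cameras=0, depth_sensors=0):
    return math.ceil(ps_eye_cameras / 2) + depth_sensors

print(min_usb_controllers(ps_eye_cameras=6))                  # 6-camera setup
print(min_usb_controllers(ps_eye_cameras=4, depth_sensors=1))
```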

When installation is complete, iPi Recorder will launch automatically. Continue with the user's guide to learn how to use the software.


If a component is already installed, it has no checkbox and is marked with an ALREADY INSTALLED label. You do not need to install all optional components in advance. All of them can be installed separately at a later time. The component descriptions below contain the corresponding download links.

  • (Windows 8, 8.1, 10) Microsoft Kinect 2: MS Kinect SDK 2.0. Check this if you plan to work with Kinect 2 for Windows or Kinect for Xbox One depth sensors, but do not plan to connect multiple Kinects to a single PC.
    Device drivers and software libraries for Microsoft Kinect 2. Requires 64-bit Windows 8+ and USB 3.0.
  • iPi Recorder 4.x.x.x. This is a required component and cannot be unchecked.
    iPi Recorder itself.

iPi Mocap Studio

Download and run the latest setup package of iPi Mocap Studio. You will be presented with the following dialog:

iPi Mocap Studio 3 Setup.png

All components are required for installation.

Note: The installation of Microsoft .NET Framework 4.5.1 requires an Internet connection. If needed, you can download the offline installer for Microsoft .NET separately and run it before the iPi Mocap Studio setup.

Other components are included with iPi Mocap Studio setup.

Note: Shell Extensions for Video and Project files are needed to show thumbnails and previews for iPi video and project files in Windows Explorer.
  1. Press the Install button to begin installation.
  2. You will be prompted to read and accept the license agreement(s) by checking the corresponding checkbox.
    iPi Mocap Studio 3 Setup Accept License.png
  3. When installation is complete, you will be prompted to launch iPi Mocap Studio.
    iPi Mocap Studio 3 Setup Launch.png
  4. As soon as iPi Mocap Studio launches, you will be prompted to enter your license key or start a 30-day free trial period.
    For more info about license protection see Licensing Policy.
    Welcome to ipistudio dlg.png
  5. Ensure that your graphics hardware is set to maximum performance with iPi Mocap Studio.

Recording Video with Multiple PS Eye Cameras



For a multiple PlayStation Eye configuration, you need a minimum of 13 feet by 13 feet (4 meters by 4 meters) of space. In a smaller space, the actor simply won't fit into the cameras' field of view.

For 640 x 480 camera resolution, the capture area can be as big as 20 feet by 20 feet (7 meters by 7 meters). That should be enough for capturing motions like running, dancing, etc.


A light-colored background (light walls and a light floor) is recommended for markerless motion capture. iPi Desktop Motion Capture is designed to work with real-life backgrounds. A multi-camera configuration (3 cameras and up) can handle a certain amount of background clutter. Please keep in mind that the system can be confused if your background has large objects of the same color as the actor's clothes.

Environment and clothing.jpg

Using a green or blue backdrop may improve results, but you are not required to use a backdrop if you have a reasonable office or home environment with light-colored walls and bright lighting.


For best results, your environment should have multiple light sources for uniform, ambient lighting. Typical office lighting with multiple light sources located on the ceiling should be quite suitable for markerless motion capture. In a home environment, you may need to use additional light sources to achieve more uniform lighting.

Please note that the system cannot work in direct sunlight. If you plan a motion capture session outdoors you should choose a cloudy, overcast day.

Actor Clothing

The actor should be dressed in a solid-color long-sleeve shirt, solid-color trousers (or jeans) and solid-color shoes. Deep, saturated colors are preferable. Casual clothes like jeans should be OK for use with the markerless mocap system. iPi Desktop Motion Capture uses clothing color to separate the actor from the background and therefore cannot work with totally arbitrary clothing.

Recommended shirt (torso) colors are black, blue or green. Red is not recommended because it can blend with human skin color, making it difficult for the system to see hands placed over the torso. Black is useful for reducing self-shadows on the torso. If you have bright uniform lighting, you can get better results with a primary-color (blue or green) shirt.

Recommended jeans/trousers color is blue.

Recommended shoe color is black.

iPi Desktop Motion Capture has an option of using a T-shirt over a long-sleeve shirt for actor clothing. However, a simple long-sleeve shirt may result in more accurate motion capture.

Recording Process

Video recording

Scene Set-up

The general rule of thumb is to place most of the cameras at a height of 1 - 1.5 m, and one out of every 4 at a greater height of 2 - 2.5 m. However, for specific motions other setups may be more beneficial. For instance, when an actor is lying on the floor or crawling, you can improve tracking quality by placing more of the cameras (up to half) higher, up to ceiling level.

That said, the system is flexible, and you can get decent results with any reasonable setup, even when you cannot follow all recommendations due to limitations of space or room geometry.

5 and More Cameras Configuration

You can set up 5 or more cameras in a full-circle or a half-circle configuration, depending on available space. You can improve accuracy by placing one or two cameras high over the ground (like 3 meters high).

Recommended configuration for 6-camera full-circle setup:


Four Camera Configuration

You can set up 4 cameras in a half-circle or a full-circle configuration, depending on available space. You can improve accuracy by placing one of the cameras high over the ground (like 3 meters high).

Recommended configuration for 4-camera setup in half circle:


Three Camera Configuration

Recommended configuration for 3-camera setup is a half-circle:


Virtual view of the same scene:


Camera Setup

Install the cameras on tripods and connect cables.


Sony PlayStation Eye cameras do not have a standard tripod mounting screw, so you will have to use some kind of ad hoc solution. The simplest approach is to fix the cameras to the tripods using sticky tape.

When mixing active and passive USB cables, make sure cable connection order is correct (computer->active cable->passive cable->camera).

If you're using the PlayStation Eye camera, make sure you have the lens set to the wide setting.




Recording Actor's Performance

Recording actor's performance

Processing Video from Multiple PlayStation Eye Cameras

Processing video files in iPi Mocap Studio

Manual Clean-up

Once initial tracking is performed on all (or part) of your video, you can begin cleaning up tracking errors (if any). Automatic Refinement and Filtering should be applied after clean-up.

Cleaning up tracking gaps

Clean-up Steps

Tracking errors usually happen in a few specific video frames and propagate to multiple subsequent frames, resulting in tracking gaps. Examples of problematic frames:

  • Occlusion (like one hand not visible in any of the cameras).
  • Indistinctive pose (like hands folded on chest).
  • Very fast motion with motion blur.

To clean up a sequence of incorrect frames (a tracking gap), you should use backward tracking:

  1. Go to the last frame of the tracking gap, to a frame where the actor's pose is distinctive (no occlusion, no motion blur, etc.).
  2. If necessary, use Rotate, Move and IK (Inverse Kinematics) tools to edit character pose to match actor pose on video.
  3. Turn off Trajectory Filtering (set it to zero) so that it does not interfere with your editing.
  4. Click Refit Pose button to get a better fit of character pose.
  5. Click Track Backward button.
  6. Stop backward tracking as soon as it comes close to the nearest good frame.
  7. If necessary, go back to remaining parts of tracking gap and use forward and backward tracking to clean them up.

Individual body parts tracking

Tracking tab individual body parts.png

In most cases tracking errors affect some of the limbs. The Individual Body Parts Tracking settings on the Tracking tab allow you to redo tracking for specified body parts.

  • Tracking will be done for selected body parts only.
  • Unselected body parts will keep the same rotations.

Cleaning up individual frames

To clean up individual frames you should use a combination of editing tools (Rotate, Move and Inverse Kinematics) and Refit Pose button.

Note: after a Refit Pose operation, iPi Mocap Studio automatically applies Trajectory Filtering to produce a smooth transition between frames. As a result, the pose in the current frame is affected by nearby frames. This may look confusing. If you want to see the exact result of the Refit Pose operation in the current frame, turn off Trajectory Filtering (set it to zero), but do not forget to change it back to a suitable value later.

Tracking errors that cannot be cleaned up in iPi Mocap Studio

Not all tracking errors can be cleaned up in iPi Mocap Studio using automatic tracking and Refit Pose button.

  • Frames immediately affected by occlusion sometimes cannot be corrected. Recommended workarounds:
    • Manually edit problematic poses (not using Refit Pose button).
    • Record a new video of the motion and try to minimize occlusion.
    • Record a new video of the motion using more cameras.
  • Frames immediately affected by motion blur sometimes cannot be corrected. Recommended workarounds:
    • Manually edit problematic poses (not using Refit Pose button).
    • Edit problematic poses in some external animation editor.
    • Record a new video of the motion using higher framerate.
  • Frames affected by strong shadows on the floor sometimes cannot be corrected. A typical example is push-ups. This is a limitation of the current version of markerless mocap technology. iPi Soft is working to improve tracking in future versions of iPi Mocap Studio.

Automatic Refinement and Filtering

Automatic Refinement and Filtering should be applied after Manual Clean-up, if there were tracking errors.

This final step is called Post-Processing and includes:
  1. Tracking Refinement
  2. Jitter Removal
  3. Trajectory Filtering

Tracking refinement

After the primary tracking and cleanup are complete, you can optionally run the Refine pass (see Refine Forward and Refine Backward buttons). It slightly improves accuracy of pose matching, and can automatically correct minor tracking errors. However, it takes a bit more time than the primary tracking, so it is not recommended for quick-and-dirty tests.

Important! Refine should be applied:
  • Using the same tracking parameters as the primary tracking (e.g. feet tracking, head tracking) in order not to lose previously tracked data.
  • Before applying motion controller data.
  • If you plan to manually edit the animation (not related to automatic cleanup with Refit Pose).

In contrast to the primary tracking, Refine does no pose prediction; it is based only on the current pose in a frame. Essentially, running Refine is equivalent to automatically applying Refit Pose to a range of frames which were previously tracked.

Post-processing: Jitter Removal

  • Jitter Removal filter is a powerful post-processing filter. It should be applied after cleaning up tracking gaps and errors.
  • It is recommended that you always apply Jitter Removal filter before exporting animation.
  • Jitter Removal filter suppresses unwanted noise and at the same time preserves sharp, dynamic motions. By design, this filter should be applied to relatively large segments of animation (no less than 50 frames).
  • Range of frames affected by Jitter Removal is controlled by current Region of Interest (ROI).
  • You can configure Jitter Removal options for specific body parts. Default setting for Jitter Removal “aggressiveness” is 1 (one tick of corresponding slider). Oftentimes, you can get better results by applying a slightly more aggressive Jitter Removal for torso and legs. Alternatively, you may want to use less aggressive Jitter Removal settings for sharp motions like martial arts moves.
  • Jitter Removal filter makes an internal backup of all data produced by the tracking and clean-up stages. Therefore, you can re-apply Jitter Removal multiple times. Each subsequent run works off the original tracking/clean-up results and overrides previous runs.
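The backup-and-override behavior described in the last bullet can be sketched as follows. The class and method names are hypothetical, and the smoothing pass is a simple stand-in, not iPi's actual filter:

```python
# Sketch of the re-apply semantics: every run of jitter removal starts
# from a backup of the original tracking/clean-up results, so later runs
# override earlier ones instead of compounding. Illustrative only.

class JitterRemoval:
    def __init__(self, tracked_frames):
        self._backup = list(tracked_frames)   # original tracking results
        self.current = list(tracked_frames)

    def apply(self, aggressiveness=1):
        data = list(self._backup)             # always start from the backup
        for _ in range(aggressiveness):       # stand-in smoothing pass
            data = [(data[max(i - 1, 0)] + data[i] +
                     data[min(i + 1, len(data) - 1)]) / 3
                    for i in range(len(data))]
        self.current = data
        return self.current
```

Because `apply` never reads `self.current`, calling it twice with the same setting gives the same result, mirroring the "each run overrides previous runs" rule.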

Post-processing: Trajectory Filtering

  • Trajectory Filter is a traditional digital signal filter. Its purpose is to filter out minor noise that remains after Jitter Removal filter.
  • Trajectory Filter is very fast. It is applied on-the-fly to current Region of Interest (ROI).
  • Default setting for Trajectory Filter is 1. Higher settings result in multiple passes of Trajectory Filter. It is recommended that you leave it at the default setting.
  • Trajectory Filter can be useful for “gluing” together multiple segments of animation processed with different Jitter Removal options: change the Region of Interest (ROI) to cover all of your motion (e.g. multiple segments processed with different jitter removal setting); change Trajectory Filtering setting to 0 (zero); then change it back to 1 (or other suitable value).
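As a simple illustration of a multi-pass digital filter of this kind, the sketch below uses a plain 3-tap moving average (not iPi's actual implementation): a setting of N just runs the same pass N times, and a setting of 0 leaves the data untouched, matching the "gluing" trick described above.

```python
# Illustrative multi-pass smoothing: a 3-tap moving average applied
# 'setting' times. Not iPi's actual Trajectory Filter.

def trajectory_filter(samples, setting=1):
    out = list(samples)
    for _ in range(setting):
        out = [(out[max(i - 1, 0)] + out[i] +
                out[min(i + 1, len(out) - 1)]) / 3
               for i in range(len(out))]
    return out

print(trajectory_filter([0, 9, 0, 9, 0], setting=0))  # unchanged
print(trajectory_filter([0, 9, 0, 9, 0], setting=1))
```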

Export and Motion Transfer

Animation Export

To export tracked motion, follow the simple steps below.

  1. Select Export tab
  2. Select rig from the list of available rigs or import your custom model
    Note: The motions will be automatically transferred to the selected rig (except the Default iPi Rig, which does not require motion transfer; see details on motion transfer below).
  3. Press Export button or use File > Export Animation menu item to export all animation frames from within Region of Interest (ROI).
    Note: To export animation for specific take, right-click on take and select Export Animation item from pop-up menu.
  4. Select output file format

Motion Transfer

Default iPi Character Rig

The default skeleton in iPi Mocap Studio is optimized for markerless motion capture. It may or may not be suitable as a skeleton for your character. Default iPi skeleton in T-pose has non-zero rotations for all joints. Please note that default iPi skeleton with zero rotations does not represent a meaningful pose and looks like a random pile of bones.

Default rig
Bone names

By default iPi Mocap Studio exports a T-pose (or a reasonable default pose for a custom rig after motion transfer) in the first frame of animation. If this is not desired, switch off the Export T-pose in first frame checkbox.

Other rigs

iPi Mocap Studio has integrated motion transfer technology that allows you to automatically transfer motion to a custom rig.

  1. Select Export tab
  2. Select rig from the list of available rigs or import your custom model
    Note: The motions will be automatically transferred to the selected rig (except the Default iPi Rig, which does not require motion transfer). You will be able to see the transferred motion in the viewport.
  3. You may need to assign bone mappings on the Export tab for motion transfer to work correctly.
  4. You can save your motion transfer profile to XML file for future use.
Tip: iPi Mocap Studio has pre-configured motion transfer profiles for many popular rigs (see below).
Note: If you export animation to a format different from the one your target character was imported in, only the rig will be exported. If you use the same format for export, the skin will be exported as well.

Starting with version 3.5, iPi Mocap Studio supports rotating an imported character into the proper orientation. This is useful for many popular characters, including the Unreal Engine standard character.


Starting with version 3.5, iPi Mocap Studio can map hips motion either to Root/Ground or to Hips/Pelvis. This is useful for game engine characters, including standard Unity 3D Engine and Unreal Engine characters.


Export Pipelines for Popular 3D Packages


MotionBuilder

Select the MotionBuilder target character on the Export tab and export the animation to BVH or FBX.

export mb.png

3ds Max Biped

  1. Select the 3ds Max Biped target character on the Export tab and export the animation to BVH or FBX.
  2. Create a Biped character in 3ds Max (Create > Systems > Biped).
  3. Put your Biped character into your 3D scene.
  4. Go to the Motion tab. Click the Motion Capture button and import your BVH or FBX file.
Step 1
Step 2
Step 3
Step 4

Our user Cra0kalo created an example Valve Biped rig for use with 3ds Max. It may be useful if you work with Valve Source Engine characters.


Maya

The latest versions of Maya (starting with Maya 2011) have a powerful biped animation subsystem called "HumanIK". Animations exported from iPi Mocap Studio in MotionBuilder-friendly format should work fine with Maya 2011 and HumanIK. The following video tutorials can be helpful:

For older versions of Maya please see the #Other_rigs section. Recommended format for import/export with older versions of Maya is FBX.


FBX

iPi Mocap Studio supports the FBX format for import/export of animations and characters. When exporting animation, you are presented with several options:

  • Which version of FBX format to use, ranging from 6.1 (2010 product line) to 7.4 (2015 product line)
  • Produce text or binary file

The default values are defined by an imported character (if any), otherwise set to recently used values.

Some applications do not use the latest FBX SDK and may have problems importing FBX files of newer versions. In case of such problems, you can use Autodesk's free FBX Converter to convert your animation file to an appropriate FBX version.


COLLADA

iPi Mocap Studio supports the COLLADA format for import/export of animations and characters. The current version of iPi Mocap Studio exports COLLADA animations as matrices. If you encounter incompatibilities with other applications' implementations of the COLLADA format, we recommend using Autodesk's free FBX Converter to convert your data between the FBX and COLLADA formats. FBX is known to be more universally supported in many 3D graphics packages.


LightWave

Recommended format for importing target characters from LightWave into iPi Mocap Studio is FBX. Recommended format for bringing animations from iPi Mocap Studio to LightWave is BVH or FBX.


SoftImage|XSI

Our user Eric Cosky published a tutorial on using iPi Mocap Studio with SoftImage|XSI:



Poser

  1. Export your Poser character in T-pose in BVH format (File > Export).
  2. Import your Poser character skeleton into iPi Mocap Studio. Your animation will be transferred to your Poser character.
  3. Export your animation to BVH format.
  4. Import the exported BVH into Poser.
Tip: Poser 8 has a bug with incorrect wrist animation on import. The bug can be reproduced as follows: export a Poser 8 character in T-pose in BVH format; import your character back into Poser 8; note how the wrists are twisted unnaturally as a result.
A workaround for the wrists bug is to chop off the wrists from your Poser 8 skeleton (for instance, using BVHacker) before importing the Poser 8 target character into iPi Mocap Studio. Missing wrists should not cause any problems during motion transfer in iPi Mocap Studio if your BVH file is edited correctly. Poser will ignore the missing wrists when importing the resulting motion, so the motion will look right in Poser (wrists in default pose as expected).
Step 1
Step 2
Step 3
Step 4


DAZ Studio

  1. In DAZ Studio, reset your character to T-pose and correct the feet positions as shown on the picture below.
  2. Export your DAZ character to BVH with default options.
  3. Import your DAZ character skeleton into iPi Mocap Studio. Your animation will be transferred to your DAZ character.
  4. Export your animation to BVH format.
  5. Import the exported BVH into DAZ Studio.
Note: You can use a DAZ character in COLLADA (.dae) format for preview, but it is strongly recommended that you use a DAZ character in BVH format for motion transfer. DAZ Studio has a problem with the COLLADA (.dae) format: it does not export all bones (in particular, the eyeBrow and bodyMorphs bones are not exported). DAZ Studio does not use bone names when importing motions; instead, it just takes rotations from the list of angles as though it were a flat list with exactly the same positions as in the DAZ internal skeleton. As a result, when you transfer the motion to a COLLADA character and import it back into DAZ Studio, the motion will look wrong. iPi Mocap Studio displays a warning about this. To avoid this problem, import your DAZ target character in BVH format - DAZ Studio is known to export characters in BVH format correctly (with all bones).
Tip: You can improve accuracy of motion transfer by doing some additional preparation of your DAZ 3D skeleton in BVH format. For DAZ 3D Michael 4.0 and similar characters, you may need to clamp thigh joint rotation to zero to avoid unnatural leg bending. For DAZ 3D Victoria 4.0, you may need to adjust foot joint rotation to change the default “high heels“ foot pose to a more natural foot pose.
Step 1
Step 2
Step 3
Step 4
Step 5

iClone 3

Note: Modern versions of iClone work smoothly with iPi Soft. The instructions below are only for iClone 3.
Note: The current version of iPi Mocap Studio can only export animation in iClone-compatible BVH format. The iMotion format is not supported. That means you will need iClone PRO to be able to import the motion into iClone. The Standard and EX versions of iClone do not have the BVH Converter and therefore cannot import BVH files.

Workflow for iClone is straightforward.

  1. Select iClone target character on Export tab and export animation to BVH.
  2. Go to Animation tab in iClone and launch BVH Converter.
  3. Import your BVH file with the Default profile and click Convert.
  4. Save the resulting animation in iMotion format. Now your animation can be applied to iClone characters.
Tip: iClone expects an animation sampled at 15 frames per second. For other frame rates, you may need to create a custom BVH Converter profile by copying the Default profile and editing the Frame Rate setting.
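The frame-rate mismatch is easy to picture: converting a 30 fps or 60 fps recording down to 15 fps amounts to keeping every 2nd or 4th frame. This sketch only illustrates the arithmetic; the actual resampling is done by the BVH Converter profile inside iClone, not by your own code:

```python
# Frame decimation arithmetic for the 15 fps that iClone expects.
# Illustrative only; the real conversion happens in iClone's BVH Converter.

def decimate(frames, src_fps, dst_fps=15):
    assert src_fps % dst_fps == 0, "non-integer ratios need a custom profile"
    step = src_fps // dst_fps
    return frames[::step]

print(decimate(list(range(8)), src_fps=60))  # keeps every 4th frame
```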
Note: The BVH Converter in iClone 4 has a bug that causes distortion of leg animation. iPi Mocap Studio exports iClone-optimized BVH correctly, as can be verified by reviewing the exported BVH motion in BVHacker, MotionBuilder, or another third-party application. No workaround is known. We recommend that you contact the iClone developers about this bug, as it is out of iPi Soft's control.
Step 1
Step 2
Step 3
Step 4
Step 5

Valve Source Engine SMD

Transfer motions to your Valve Source Engine character (stored in .smd file) and export your animation in Valve Source Engine SMD format.

Our user Cra0kalo created an example Valve Biped rig for use with 3ds Max. It may be useful if you wish to apply more than one capture through MotionBuilder or edit the custom keyframes in 3ds Max.

Valve Source Filmmaker


First, you need to import your character (or its skeleton) into iPi Mocap Studio, for motion transfer.

There are currently 3 ways of doing this:

  1. You can import an animation DMX (in default pose) into iPi Mocap Studio. Since it has a skeleton, it should be enough for motion transfer. To create an animation DMX with default pose, you can add your character to your scene in Source Filmmaker and export DMX for corresponding animation node:
    • open Animation Set Editor Tab;
    • click + > Create Animation Set for New Model;
    • choose a model and click Open;
    • export animation for your model, in ASCII DMX format;
      There is a checkbox named Ascii in the top area of the export dialog.
  2. Alternatively, you can just import an SMD file with your character into iPi Mocap Studio. For example, SMD files for all Team Fortress 2 characters can be found in your SDK in a location similar to the following (you need to have the Source SDK installed): C:\Program Files (x86)\Steam\steamapps\<your steam name>\sourcesdk_content\tf\modelsrc\player\pyro\parts\smd\pyro_model.smd.
  3. If you created a custom character in Maya, you should be able to export it in DMX model format. (Please see the Valve documentation on how to do this.)

Then you can import your model DMX into iPi Mocap Studio. The current version of iPi Mocap Studio cannot display the character skin, but it should display the skeleton, which is enough for motion transfer.

To export animation in DMX, press Export Animation button on the Export tab in iPi Mocap Studio and choose DMX from the list of supported formats. You may also want to uncheck Export T-pose in first frame option on the Export tab in iPi Mocap Studio.

Now you can import your animation into Source Filmmaker. There will be some warnings about missing channels for face bones but you can safely ignore them.

Step 1
Step 2
Step 3
Step 4
Step 5
Step 6
Old way involving Maya

This method was used before iPi Mocap Studio got DMX support, and it may still be useful in case of any trouble with DMX. Please see the following video tutorial series:



Blender

Select the Blender target character on the Export tab and export the animation to BVH format.

export blender.png


Cinema 4D

If you have experience with Cinema 4D, please help to expand this Wiki by posting Cinema 4D import/export tips to the Community Tutorials section of our user forum.


Evolver

Transfer motions to your Evolver character (stored in a COLLADA or FBX file) and export your animation.

Evolver offers several different skeletons for Evolver characters. Here is an example motion transfer profile for Evolver "Gaming" skeleton: evolver_game.profile.xml

Second Life

Transfer motions to your Second Life character (stored in BVH file) and export your animation in BVH format.

The SecondLife documentation contains a link to useful SL avatar files. The ZIP file includes a BVH of the "default pose"; be sure to have that file.

See the discussion on our Forum for additional details: http://www.ipisoft.com/forum/viewtopic.php?f=2&p=7845


Massive

Please see our user forum for a discussion of animation import/export for Massive:


IKinema WebAnimate

Please see the following video tutorial on how to use iPi Mocap Studio with IKinema WebAnimate:


Jimmy|Rig Pro

Please see the following video tutorial on how to use iPi Mocap Studio with Jimmy|Rig Pro:


Video Materials

For video materials, please refer to our Gallery. Video tutorials are also available.