The rig should be oriented so that it faces the FRONT view, with its spine along the UP axis, standing on the floor:
Blender: looking along -Y, +Z is up, the floor is the XY plane.
Maya: looking along +Z, +Y is up, the floor is the XZ plane.
It doesn't matter whether the rig is in an A-pose, T-pose, or any other pose, as long as the conditions above are met.
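To illustrate the two conventions above, a point can be mapped between Blender's Z-up and Maya's Y-up coordinate systems. This is a minimal sketch of the standard axis swap; the function names are our own for illustration and are not part of any move.ai tooling:

```python
def blender_to_maya(x, y, z):
    """Convert a point from Blender's convention (+Z up) to Maya's (+Y up).
    Blender's up axis (Z) becomes Maya's up axis (Y); Blender's depth
    axis (Y) maps to Maya's -Z so 'front' stays 'front'."""
    return (x, z, -y)

def maya_to_blender(x, y, z):
    """Inverse mapping, back to Blender's convention."""
    return (x, -z, y)

# A point one unit 'up' in Blender is one unit 'up' in Maya.
print(blender_to_maya(0, 0, 1))  # (0, 1, 0)
```

The round trip `maya_to_blender(*blender_to_maya(x, y, z))` returns the original point, which is a quick way to verify the mapping.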
We preserve the rest/bind pose of uploaded rigs, so they remain backward-compatible, i.e. you can apply the animation directly in the 3D app of your choice.
move.ai_ rigs are scaled to the actor's proportions, i.e. if the actor is 1.8m tall, the move.ai_ rig will also be 1.8m tall. The same applies to all the limbs (if the actor's forearm is 0.3m long, the move.ai_ rig's forearm will also be 0.3m long).
Hips should be the bone located between the heads of RightUpLeg and LeftUpLeg; translation will be applied to it. Do not confuse this bone with the Root you may have in your rig, which is usually located between the Feet bones! However, if your Root is located between the RightUpLeg and LeftUpLeg heads, feel free to use it.
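A quick way to sanity-check a candidate Hips bone is to compare its head position against the midpoint of the two upper-leg heads. This is a plain-Python sketch with made-up placeholder coordinates; the bone names match the ones above, but the positions are illustrative only:

```python
def midpoint(a, b):
    """Midpoint of two 3D points given as (x, y, z) tuples."""
    return tuple((p + q) / 2 for p, q in zip(a, b))

# Placeholder head positions (metres) for a Z-up rig.
right_up_leg_head = (-0.1, 0.0, 0.9)
left_up_leg_head = (0.1, 0.0, 0.9)

# The Hips bone should sit approximately here, up at pelvis height,
# NOT down between the feet where a Root bone often lives.
expected_hips = midpoint(right_up_leg_head, left_up_leg_head)
print(expected_hips)  # (0.0, 0.0, 0.9)
```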
Yes, make sure to map all bones (including fingers) so that they turn green. We do not yet have an option to bypass any of the bones, but the ability to bypass finger mapping is on our TODO list.
Let's say you want to create a missing pinky. Duplicate the ring finger joint chain and move it slightly to the position where the pinky finger should be. Rename the bones to something that makes more sense, like pinky_1.L, etc., and that's it. Double-check that the first Pinky joint has the same parent joint as the first Ring joint.
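The same steps can be sketched on a toy joint hierarchy. This is plain Python, not the Blender or Maya API; the joint names and positions are invented for the example:

```python
# Each joint maps to (parent_name, (x, y, z) head position).
rig = {
    "hand.L":   (None,       (0.00, 0.0, 0.00)),
    "ring_1.L": ("hand.L",   (0.02, 0.0, 0.08)),
    "ring_2.L": ("ring_1.L", (0.02, 0.0, 0.11)),
}

def duplicate_chain(rig, old_prefix, new_prefix, offset):
    """Copy a joint chain, rename it, and shift it by `offset`.
    Parents inside the chain are remapped to the new names; the
    first joint keeps the original chain's parent."""
    ox, oy, oz = offset
    for name, (parent, (x, y, z)) in list(rig.items()):
        if not name.startswith(old_prefix):
            continue
        new_name = name.replace(old_prefix, new_prefix, 1)
        new_parent = (parent.replace(old_prefix, new_prefix, 1)
                      if parent and parent.startswith(old_prefix)
                      else parent)
        rig[new_name] = (new_parent, (x + ox, y + oy, z + oz))
    return rig

duplicate_chain(rig, "ring", "pinky", (0.02, 0.0, -0.01))

# The check from the answer above: pinky_1.L shares hand.L
# as its parent, just like ring_1.L.
assert rig["pinky_1.L"][0] == rig["ring_1.L"][0] == "hand.L"
```

In Blender or Maya the duplication itself is done with the app's own duplicate command; the point of the sketch is the rename-and-reparent bookkeeping and the final parent check.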
Pre-retarget is our internal rig exported directly from the system. It always has the same bone names and bone hierarchy, and its initial pose is always the T-pose. However, the placement of the bones (proportions, bone lengths) is adjusted to the rig selected at the run creation step. Use this rig if you want to do the retargeting yourself in 3rd-party apps (in that case, use it with the move.ai male/female/child rigs for the best user experience).
Retargeted contains the animation retargeted to the rig selected at the run creation step. You can apply that animation directly to that rig in the 3D DCC/game engine of your choice.
We don't support transform nodes (e.g. Maya's locators) in the .fbx: just bones (or joints, as they're called in Maya), meshes, and embedded textures.
There should be just one rig (tree of joints) in the .fbx, and as many mesh objects as you like attached to that rig. The mesh objects should be bound (skinned) to the rig.
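As a rough illustration of the "one rig" rule, here is a plain-Python check on a toy scene description. The node layout and type names are invented for the example and do not reflect the actual .fbx file structure:

```python
# Toy scene: each node maps to (node_type, parent_name).
scene = {
    "Hips":     ("joint", None),
    "Spine":    ("joint", "Hips"),
    "BodyMesh": ("mesh",  None),
    "HeadMesh": ("mesh",  None),
}

def count_rig_roots(scene):
    """Count joint trees: joints whose parent is not itself a joint."""
    return sum(
        1
        for node_type, parent in scene.values()
        if node_type == "joint"
        and (parent is None or scene[parent][0] != "joint")
    )

# A valid upload has exactly one tree of joints;
# any number of meshes is fine.
assert count_rig_roots(scene) == 1
```

A scene with two disconnected joint hierarchies would fail this check, which is the situation to avoid in the uploaded .fbx.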
Try to avoid uploading .fbx files with animation data stored in them. We accept clean .fbx rigs in their bind/rest pose; avoid keyframed transformations.
That's because no textures are provided in the .fbx. They should be embedded into the .fbx file.
No, we export only the animation, i.e. bones (or joints) of the rig.
What's the FPS of the output results? It's exactly the same as the FPS of the input videos. We'll add an option to upscale/downscale the FPS in the near future.
Can I delete uploaded rig(s) I don't need? Unfortunately, not yet. But we'll have this option in the near future.
Can I use the move.ai system just to retarget animation without reprocessing the take? Unfortunately, not yet. But we'll have this option in the near future.
What are the output formats? Currently, we support only .blend and .fbx formats. .USD, .GLTF, .BVH, .C3D will be added soon.
Do you support rigs of any height? We can guarantee results for rigs from 0.25m to 3m. Other heights should work too, but keep in mind that they can be either too small or too large for the web preview (out of the frame/too small). Also, very small rigs (~0.01m-0.10m) can sometimes behave unpredictably.