Troubleshooting & FAQs
Learn how to resolve issues across all Move AI products
Move Pro
- Move Pro 2.0 - Troubleshooting
- Move Pro - Shooting with multiple actors
- Move Pro - Finger Tracking Best Practice
- Move Pro - Exporting, Uploading and Mapping (own rig)
- What types of data stored in an .fbx file do you support?
- How do I retarget to a larger rig (e.g. a giant)?
- What's the difference between Pre-retarget and Retargeted .fbx?
- Your default skeleton appears to be missing some spine bones. Why is this?
- I don't have all 5 finger bones in my rig. What should I do?
- What is a good pixel score for a calibration?
- Do I need to map all bones?
- Do you deal with IK or FK controllers?
- How can I improve my finger-tracking quality?
- Can I capture hands without the body?
- Does every actor have to do a calibration?
- What lighting conditions can you deal with?
- How does your accuracy compare to an optical marker-based system?
- What is the optimal calibration protocol?
- What size capture area do I need and how many people can I capture?
- How well does your system deal with occlusion?
- Does the actor's clothing affect the motion capture quality?
- What is the importance of performing the T-pose?
- How can I check the calibration quality?
- Can I capture select people within a scene?
- Can the cameras be mounted at different levels?
- Do the cameras have to be stationary?
- How do I remove a camera from a session?
- I want to capture multiple actors. Does the calibration process need to be performed by each actor?
- What if somebody goes out of the capture volume?
- What are the output formats?
- What's the frame rate (FPS) of the output?
- How do I choose the hip bone?
- What's the difference between the rigs with the move.ai_ prefix and all others?
- How many rigs and objects can there be in the .fbx?
- How does retargeting work in your system?
- What if one of the cameras is bumped?
- Can the actor still be tracked if they're holding an object?
- Do you export meshes?
- Can I upload an already animated .fbx?
- Can I use the Move AI system just to retarget animation without reprocessing the take?
- Do you transform uploaded rigs in any way?
- Why is my rig pink in the preview?
- Do you support rigs of any height?
- Root Motion in Move Animations
- Move Pro FAQs Overview
Move Live
Move One
- Move One - How to place the camera to achieve optimal results
- Move One - How to position the actor in front of the camera
- Move One - How to shoot in landscape mode
- Move One - How to improve hand and finger tracking
- Transitioning to Move Platform
- Can I upload a rig to use for my animations?
- Can I generate animations from a still image?
- Can I process data using my mobile device?
- How can I get the best quality animation?
- How far away from the camera can I be?
- How many people can Move One capture at once?
- Why is the app tracking people in the background?
- What rig is the Move One data processed on?
- What frame rate will animations be in?
- Does processing animations with Move One affect Move Pro subscription minutes?
- Why can't I see my Move One takes/subscription credits?
- Are there any ongoing issues with the Move One servers?
- How can I make accurate lying-down animations?
- Does Move One have built-in finger tracking?
- Is it possible to animate videos that already exist?
- How do I delete my account?
- Can you tell me how to access the s2 model on Move One?
- What should the pose of the uploaded rig be?
- Can you generate animation data from prompts with your chatbot?