Dollars MONO is a real-time motion capture solution that uses a web camera or video files. It captures facial expressions, finger movements, and full-body motions. The software runs locally, offers flexible character binding, and integrates with major game engines. It supports VMC protocol and BVH export, and provides a lifetime license with no machine binding.
No pre-setup required with full support for various webcams and video files.
Runs entirely locally, allowing you to see results immediately without any waiting.
Captures facial expressions, finger movements, and full-body motion simultaneously with a single device.
Delivers 80% of the performance of traditional inertial motion capture systems at just 1% of the cost.
Seamlessly binds to any humanoid character without the need for additional skeletal modifications.
Dollars MoCap provides plugins for real-time streaming to popular game engines and DCC software such as Unity, Unreal Engine, and iClone, as well as VTuber applications, enabling instant animation and interactive experiences.
Certified support for the VMC (Virtual Motion Capture) protocol, enabling real-time motion and facial data transfer to common VTuber software (see the receiver sketch below).
Motion data can be exported in BVH format, ensuring compatibility with a wide range of animation and modeling software.
Pay once and use forever; the license is not bound to a specific machine.
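Because the streamed data follows the VMC protocol, any OSC-capable tool can consume it. The following is a minimal Python sketch (using the python-osc package) that listens for VMC bone and blendshape messages; the address patterns are standard VMC, but the host and port 39539 are assumptions based on the protocol's common default, so match them to the port configured in the software.

```python
# Minimal VMC (Virtual Motion Capture) receiver sketch.
# Assumes the capture software streams OSC to this machine on UDP port 39539
# (the common VMC default); adjust host/port to match your settings.
# Requires: pip install python-osc
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer


def on_bone(address, name, px, py, pz, qx, qy, qz, qw):
    # /VMC/Ext/Bone/Pos carries a bone name, position, and rotation quaternion.
    print(f"bone {name}: pos=({px:.3f}, {py:.3f}, {pz:.3f})")


def on_blend(address, name, value):
    # /VMC/Ext/Blend/Val carries one facial blendshape weight.
    print(f"blendshape {name} = {value:.2f}")


dispatcher = Dispatcher()
dispatcher.map("/VMC/Ext/Bone/Pos", on_bone)
dispatcher.map("/VMC/Ext/Blend/Val", on_blend)

server = BlockingOSCUDPServer(("0.0.0.0", 39539), dispatcher)
print("Listening for VMC data on UDP 39539 ...")
server.serve_forever()
```

The same dispatcher can map additional VMC addresses, such as /VMC/Ext/Blend/Apply, if the receiving application needs them.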
Single-Camera Motion and Facial Capture system.
Motion and Facial Capture Using a Depth Camera.
Full-body Motion Capture using VR Equipment.
Free Webcam Facial Capture.
NVIDIA-powered camera facial capture.
Use Live Link Face in iClone for facial capture.
Generate motion from text input.
A completely free Motion VFX software.
Dollars DEEP uses a depth camera to offer more stable and accurate waist and foot positioning, providing higher-quality real-time motion capture.
Supports the Microsoft Azure Kinect and Orbbec Femto Bolt depth cameras. The Intel RealSense series and Microsoft Kinect V2 are not currently supported and are not yet included in the support plan.
The system requirements for Dollars MONO also apply to Dollars DEEP, but a USB 3.0 port is required to connect the depth camera. Other cameras can be used, but enhanced waist and foot positioning is only available with depth cameras.
Reads tracker information from SteamVR, performs motion solving, and sends skeletal data in real time to Unity or Unreal Engine for animation.
Motion capture data can be recorded as BVH files in Dollars MoCap for editing in 3D software such as Maya, Blender, 3ds Max, and MotionBuilder.
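Because the recorded takes are plain-text BVH files, they can also be inspected or batch-processed outside a DCC tool. The following Python sketch reads the header of an exported take and reports its joints, frame count, and frame rate; the file name take_001.bvh is a placeholder.

```python
# Minimal BVH inspection sketch: lists the joints, frame count, and frame time
# of an exported take. "take_001.bvh" is a placeholder file name.
def inspect_bvh(path):
    joints, nframes, frame_time = [], 0, 0.0
    with open(path, "r") as f:
        for line in f:
            tokens = line.split()
            if not tokens:
                continue
            if tokens[0] in ("ROOT", "JOINT"):
                joints.append(tokens[1])       # joint name follows the keyword
            elif tokens[0] == "Frames:":
                nframes = int(tokens[1])       # number of motion frames
            elif tokens[:2] == ["Frame", "Time:"]:
                frame_time = float(tokens[2])  # seconds per frame
                break                          # per-frame channel data follows
    return joints, nframes, frame_time


joints, nframes, frame_time = inspect_bvh("take_001.bvh")
fps = 1.0 / frame_time if frame_time else 0.0
print(f"{len(joints)} joints, {nframes} frames at ~{fps:.1f} fps")
print("Joints:", ", ".join(joints))
```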
Comparison of MONO / DEEP / VIVA:
Relative hardware cost: MONO ★★★, DEEP ★★★, VIVA ★★☆
Environmental setup required: MONO ★★★, DEEP ★★★, VIVA ★★☆
Upper-body capture fidelity: MONO ☆☆☆, DEEP ★★☆, VIVA ★★★
Lower-body capture fidelity: MONO ☆☆☆, DEEP ★★☆, VIVA ★★★
Finger tracking: MONO ★★★, DEEP ★★★, VIVA ★★★
Facial capture: MONO ●, DEEP ●, VIVA ×
Video file input: MONO ●, DEEP ●, VIVA ×
Dollars EGAO provides a completely free facial capture solution using a webcam, suitable for video files as well.
The software enables facial capture with a monocular camera, focusing on user accessibility by using regular webcams.
Utilizes NVIDIA's AI technology for high-precision, real-time facial expression capture that balances performance and ease of use.
Outputs ARKit-compatible facial data (excluding TongueOut), covering most facial capture requirements.
Supports video files or camera feeds as input for capturing facial expressions.
Streams facial expression data in real time to software such as Unity, Unreal Engine, iClone, and Virt-A-Mate, and can also integrate with Blender (see the sketch below).
Requires an NVIDIA RTX-series or higher graphics card.
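For the Blender integration mentioned above, the facial data ultimately drives ARKit-named shape keys on the character mesh. The sketch below is a minimal Blender-Python illustration, not the Dollars MoCap plugin itself: it assumes a mesh object named "Face" that already has ARKit-style shape keys and applies one frame of dummy weights.

```python
# Minimal Blender-Python sketch: apply ARKit-style blendshape weights to a mesh.
# Assumes a mesh object named "Face" whose shape keys use ARKit names; the
# weights dict is dummy data standing in for one frame of captured values.
import bpy

weights = {
    "jawOpen": 0.35,
    "eyeBlinkLeft": 1.0,
    "eyeBlinkRight": 1.0,
    "mouthSmileLeft": 0.6,
}

face = bpy.data.objects["Face"]
key_blocks = face.data.shape_keys.key_blocks

for name, value in weights.items():
    block = key_blocks.get(name)
    if block is None:
        # Skip expressions the character (or the capture source) does not provide,
        # e.g. TongueOut when the data comes from Dollars NVIS.
        continue
    block.value = max(0.0, min(1.0, value))  # keep weights in the default 0..1 range
```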
Utilizes the Live Link Face app (developed by Epic Games for Unreal Engine) for facial capture. It is based on Apple's ARKit and can be downloaded for free from the App Store, providing real-time, high-precision facial capture.
Allows synchronous video recording with the front camera for post-production reference and editing.
Offers calibration to adjust the capture data to the individual performer and improve facial animation quality.
Comparison of EGAO / NVIS / LINK:
Capture quality: EGAO ★★, NVIS ★★, LINK ★★★
Input sources: EGAO and NVIS accept webcams and video files; LINK requires an iPhone.
Streaming targets: EGAO and NVIS stream to Unity, Unreal Engine, iClone, Virt-A-Mate, and Blender (with plugins); LINK streams to iClone.
ARKit expressions not supported: EGAO lacks TongueOut, CheekSquint, NoseSneer, and CheekPuff; NVIS lacks TongueOut; LINK supports all ARKit expressions.
Uses T2M-GPT to transform text into human motion, with the motion data received in real time by applications such as Unity, Unreal Engine, iClone, and Virt-A-Mate.
Enables recording of motion data as a BVH file, which can be edited in DCC software like Maya, Blender, and 3ds Max.
Integrates motion data from Dollars MoCap products such as Dollars MONO, DEEP, VIVA, and MOTS to drive visual effects.
Supports streaming to Unity via VMC, with a specific port setting used to connect and work with the motion data (see the test-sender sketch below).
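To verify the port setting for Unity (VMC) streaming, it can help to push a single test message to the receiving side. The sketch below does this with python-osc, assuming the Unity plugin listens for standard VMC messages; the host 127.0.0.1 and port 39539 are assumptions (39539 is the common VMC default), so substitute the values configured in the software and the Unity plugin.

```python
# Minimal VMC test sender: pushes one blendshape value to a VMC receiver
# (e.g. a Unity plugin) to verify the host/port configuration.
# 127.0.0.1 and 39539 are assumptions; use the port set on the receiving side.
# Requires: pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 39539)

# Set one facial blendshape, then tell the receiver to apply the pending values.
client.send_message("/VMC/Ext/Blend/Val", ["jawOpen", 0.5])
client.send_message("/VMC/Ext/Blend/Apply", [])
```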
Allows you to bring up UI options by pressing the ESC key after launching the software.
Trial account users must log in every time the program starts, whereas premium account users do not need an internet connection after a successful login.
On trial accounts, live streaming pauses for 10 seconds every 20 seconds; premium accounts stream without limits.
Trial accounts can record for 20 seconds each time the program starts, while premium accounts have unlimited recording capability.
Dollars MoCap offers cutting-edge motion capture solutions that are easy to use and integrate seamlessly into various platforms.
Earn up to 10% commission for every sale made through your referral link.
Affiliates can receive payments via PayPal for secure direct payments, or USDT for cryptocurrency payments, providing flexibility and speed.
Content creators, bloggers, educators, or industry professionals with an audience interested in motion capture can join the program.