[P] I open-sourced a synth framework for creating physics-simulated humanoids in Unity with MuJoCo -- train them with on-device RL and interact in VR
Train AI characters with on-device RL in Unity and physically interact with them in mixed reality on Quest.
Developer Arghya Sur has open-sourced 'Synth,' a comprehensive framework for creating and training physics-simulated humanoid characters in the Unity engine, with the long-term aim of building autonomous virtual beings. The system comprises three core Apache 2.0-licensed packages: 'synth-core' converts Daz Genesis 8 or Mixamo characters into fully articulated MuJoCo rigid-body simulations; 'synth-training' enables on-device reinforcement learning with the SAC algorithm via TorchSharp, running directly on Mac, Windows, or Quest hardware without a Python server; and 'synth-vr' adds mixed reality interaction on Meta Quest, using physics-based hand tracking and passthrough rendering to let users physically push and pull the AI characters in their own space.
The technical stack leverages Unity 6, a patched MuJoCo plugin for high-fidelity physics, and TorchSharp with an IL2CPP bridge to run neural network training natively on Quest's CPU. The workflow allows developers to import a character model, run a one-click wizard to create a physics-ready 'Synth,' and immediately begin on-device RL training with features like prioritized experience replay. The long-term vision is to create a foundation for embodied AI with integrated perception and reasoning, but the immediate release provides a solid, open-source infrastructure for researchers and developers to experiment with physically realistic humanoid AI that can learn and interact in real-time, immersive environments.
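To make the on-device training path more concrete, here is a minimal, hypothetical sketch of the kind of TorchSharp actor network an SAC agent might train on Quest's CPU. The class name, layer sizes, and observation/action dimensions are assumptions for illustration, not synth-training's actual API.

```csharp
using TorchSharp;
using static TorchSharp.torch;
using static TorchSharp.torch.nn;

// Illustrative sketch only: a small TorchSharp MLP of the kind an on-device SAC
// actor might use to map humanoid observations (joint angles, velocities, contacts)
// to normalized joint-torque actions. Dimensions and names are assumptions.
public sealed class ActorNet : Module<Tensor, Tensor>
{
    private readonly Module<Tensor, Tensor> net;

    public ActorNet(int obsDim, int actDim) : base(nameof(ActorNet))
    {
        net = Sequential(
            Linear(obsDim, 256), ReLU(),
            Linear(256, 256), ReLU(),
            Linear(256, actDim), Tanh()); // squash actions to [-1, 1]
        RegisterComponents();
    }

    public override Tensor forward(Tensor obs) => net.forward(obs);
}

// Usage sketch: inference and training run on the CPU, since Quest has no CUDA backend.
// var actor = new ActorNet(obsDim: 72, actDim: 21);
// using var action = actor.forward(randn(1, 72));
```

Because TorchSharp is a native .NET binding to libtorch, a network like this can be trained inside the Unity player itself, which is what removes the usual Python training server from the loop.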
- 'synth-core' package automates conversion of Daz/Mixamo characters into MuJoCo-based physics humanoids with configurable joints and mass.
- 'synth-training' uses TorchSharp for on-device SAC reinforcement learning directly in Unity on Quest (CPU), Mac, or Windows, eliminating Python servers; a replay-buffer sketch follows this list.
- 'synth-vr' enables mixed reality interaction on Meta Quest with physics-based hand tracking, letting users physically manipulate AI characters in their room.
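The summary mentions prioritized experience replay as one of synth-training's features. Below is a hedged, minimal sketch of a proportional prioritized replay buffer in plain C#; the class, method names, and sampling scheme are assumptions for illustration rather than the package's actual implementation, and the importance-sampling weight correction is omitted for brevity.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of a proportional prioritized replay buffer: transitions
// with larger TD errors are sampled more often during SAC updates.
public sealed class PrioritizedReplayBuffer
{
    public record Transition(float[] Obs, float[] Act, float Reward, float[] NextObs, bool Done);

    private readonly int capacity;
    private readonly List<Transition> data = new();
    private readonly List<float> priorities = new();
    private readonly Random rng = new();
    private int writeIndex;

    public PrioritizedReplayBuffer(int capacity) => this.capacity = capacity;

    public void Add(Transition t, float priority = 1f)
    {
        if (data.Count < capacity) { data.Add(t); priorities.Add(priority); }
        else { data[writeIndex] = t; priorities[writeIndex] = priority; }
        writeIndex = (writeIndex + 1) % capacity;
    }

    // Sample indices with probability proportional to priority^alpha (roulette-wheel selection).
    public int[] Sample(int batchSize, float alpha = 0.6f)
    {
        var weights = new float[data.Count];
        float total = 0f;
        for (int i = 0; i < data.Count; i++) { weights[i] = MathF.Pow(priorities[i], alpha); total += weights[i]; }

        var indices = new int[batchSize];
        for (int b = 0; b < batchSize; b++)
        {
            float r = (float)rng.NextDouble() * total, acc = 0f;
            indices[b] = data.Count - 1; // fallback for floating-point rounding
            for (int i = 0; i < data.Count; i++)
            {
                acc += weights[i];
                if (acc >= r) { indices[b] = i; break; }
            }
        }
        return indices;
    }

    // After a gradient step, refresh priorities with the new absolute TD errors.
    public void UpdatePriority(int index, float tdError) =>
        priorities[index] = MathF.Abs(tdError) + 1e-5f;

    public Transition this[int index] => data[index];
}
```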
Why It Matters
Democratizes creation of trainable, physically realistic AI humanoids for VR/AR, merging high-end simulation with accessible on-device machine learning.