Open Source

HY-World 2.0 released

One-click generation creates editable 3D worlds for Unity and Unreal Engine with physics support.

Deep Dive

HY-World 2.0 represents a significant leap in AI-powered 3D content creation, moving beyond static assets to generate fully interactive worlds. The system's core innovation is its one-click workflow that transforms simple text prompts or reference images into complete, navigable 3D environments. This eliminates the traditional, labor-intensive pipeline of modeling, texturing, and scene assembly, potentially reducing world-building time from weeks to minutes. The underlying unified world model handles both synthetic world generation and real-world scene reconstruction, so the same tool serves both creative and capture-based workflows.

For developers, the most practical feature is the pipeline-ready output. HY-World 2.0 exports in standard formats compatible with major game engines—Unity and Unreal Engine—including editable meshes, 3D Gaussian Splatting representations, and point clouds. This means artists and developers can import the AI-generated worlds directly into their existing workflows for further refinement. The interactive character mode adds another layer of utility, allowing for real-time exploration with physics-aware movement and collision support, which is essential for testing game mechanics and user experience before committing to full production.
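Point clouds like those mentioned above are commonly exchanged in the standard PLY format, so a quick sanity check before engine import is straightforward. The sketch below parses an ASCII PLY header to report vertex count and attributes; the property layout shown is the generic PLY convention, not a documented HY-World 2.0 schema.

```python
def parse_ply_header(lines):
    """Parse an ASCII PLY header; return (vertex_count, property_names)."""
    assert lines[0].strip() == "ply", "not a PLY file"
    count, props = 0, []
    for line in lines[1:]:
        parts = line.split()
        if parts[:2] == ["element", "vertex"]:
            count = int(parts[2])          # number of points in the cloud
        elif parts[0] == "property":
            props.append(parts[-1])        # per-vertex attribute name
        elif parts[0] == "end_header":
            break
    return count, props

# A tiny synthetic file standing in for an exported point cloud
# (illustrative data, not actual HY-World 2.0 output).
sample = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
end_header
0.0 0.0 0.0
1.0 2.0 3.0
""".splitlines()

count, props = parse_ply_header(sample)
print(count, props)  # 2 ['x', 'y', 'z']
```

Checking the attribute list up front catches exports that lack normals or colors before they reach the Unity or Unreal import pipeline.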

Key Points
  • One-click generation from text or images to interactive 3D worlds with physics
  • Exports pipeline-ready assets for Unity/Unreal in mesh, 3DGS, and point cloud formats
  • Unified model handles both synthetic world generation and real-world scene reconstruction
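To make the 3DGS export format above concrete, the sketch below models the per-splat record used in typical 3D Gaussian Splatting scenes: a position, anisotropic scale, orientation quaternion, opacity, and base color. The field names and flat little-endian layout follow generic 3DGS conventions and are assumptions, not a documented HY-World 2.0 schema.

```python
from dataclasses import dataclass
import struct

@dataclass
class GaussianSplat:
    position: tuple   # (x, y, z) world-space center of the Gaussian
    scale: tuple      # (sx, sy, sz) anisotropic extent per axis
    rotation: tuple   # (w, x, y, z) orientation quaternion
    opacity: float    # alpha used during splat blending
    color: tuple      # base RGB (0th-order spherical harmonic)

    def pack(self) -> bytes:
        """Serialize to a flat little-endian record of 14 float32s."""
        return struct.pack(
            "<14f",
            *self.position, *self.scale, *self.rotation,
            self.opacity, *self.color,
        )

splat = GaussianSplat(
    position=(0.0, 1.0, 2.0),
    scale=(0.1, 0.1, 0.1),
    rotation=(1.0, 0.0, 0.0, 0.0),  # identity orientation
    opacity=0.9,
    color=(0.5, 0.5, 0.5),
)
print(len(splat.pack()))  # 56 bytes: 14 floats * 4
```

Unlike a triangle mesh, a 3DGS scene is just an array of such records, which is why it imports as a single flat asset rather than geometry plus materials.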

Why It Matters

Dramatically accelerates 3D world creation for game dev, VR, and simulation, turning concept art into playable prototypes.