Robust Camera-to-Mocap Calibration and Verification for Large-Scale Multi-Camera Data Capture
A new system automatically detects calibration drift in fisheye-camera capture rigs.
Researchers at Meta, including Tianyi Liu, Christopher Twigg, Patrick Grady, Kevin Harris, Shangchen Han, and Kun He, have published a paper on arXiv titled 'Robust Camera-to-Mocap Calibration and Verification for Large-Scale Multi-Camera Data Capture.' The work addresses critical challenges in optical motion capture (mocap) systems used for ground-truth capture in AR/VR, SLAM, and robotics datasets. These systems require extrinsic calibration to align mocap coordinates to external camera frames, a step prone to errors from board-to-marker attachment variation, optimization initialization ambiguity, and session-to-session calibration drift. The problem is exacerbated for fisheye cameras, which have spatially non-uniform distortion that complicates both calibration and verification.
The proposed calibration system jointly estimates the camera extrinsics and the board-to-marker transform, using a staged solver to improve convergence reliability under ambiguous initialization. The verification component, named 'lollypop,' provides fast, operator-independent assessment through a measurement chain that is entirely independent of the calibration data. In experiments on a Meta Quest 3 headset with fisheye cameras, the calibration outperforms existing baselines, and lollypop reliably detects calibration degradation over time. The system has already been deployed in production data collection pipelines, indicating its practical utility for large-scale multi-camera setups.
- Joint estimation of camera extrinsics and board-to-marker transform improves calibration robustness.
- Staged solver handles optimization initialization ambiguity for better convergence.
- Lollypop verification tool independently detects calibration drift over time, tested on Meta Quest 3.
- System deployed in production pipelines for large-scale multi-camera data capture.
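To make the joint-estimation idea concrete, here is a minimal synthetic sketch of solving for a camera extrinsic and a board-to-marker transform together from paired pose observations. This is an illustration of the general technique, not the authors' implementation: all names, the 6-vector (rotation vector, translation) parameterization, and the use of SciPy's generic nonlinear least squares are assumptions for the example.

```python
# Hedged sketch: jointly refine the world-to-camera extrinsic and the
# marker-to-board transform from observed board poses in the camera frame.
# Illustrative only; not the paper's actual solver or parameterization.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def to_mat(p):
    """6-vector (rotation vector, translation) -> 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R.from_rotvec(p[:3]).as_matrix()
    T[:3, 3] = p[3:]
    return T

def residuals(x, T_wm_list, T_cb_obs):
    """x packs two 6-vectors: camera extrinsic and marker-to-board transform."""
    T_cw, T_mb = to_mat(x[:6]), to_mat(x[6:])
    res = []
    for T_wm, T_cb in zip(T_wm_list, T_cb_obs):
        T_pred = T_cw @ T_wm @ T_mb        # predicted board pose in camera frame
        dT = np.linalg.inv(T_cb) @ T_pred  # relative error transform
        res.append(R.from_matrix(dT[:3, :3]).as_rotvec())  # rotation residual
        res.append(dT[:3, 3])                              # translation residual
    return np.concatenate(res)

# Synthetic session: ground-truth transforms plus several mocap marker poses.
rng = np.random.default_rng(0)
x_true = rng.normal(scale=0.3, size=12)
T_wm_list = [to_mat(rng.normal(scale=0.5, size=6)) for _ in range(10)]
T_cb_obs = [to_mat(x_true[:6]) @ T_wm @ to_mat(x_true[6:]) for T_wm in T_wm_list]

# A staged scheme might first fix a nominal board-to-marker transform and solve
# only the extrinsic; here we show just the final joint refinement, started
# from a perturbed initialization.
x0 = x_true + rng.normal(scale=0.05, size=12)
sol = least_squares(residuals, x0, args=(T_wm_list, T_cb_obs))
print(np.abs(sol.x - x_true).max())  # recovery error after joint refinement
```

Because the marker pose varies across observations while the two unknown transforms stay fixed, the joint problem is generically well constrained, which is what the staged initialization exploits in practice.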
Why It Matters
Enables reliable, automated calibration and verification for AR/VR and robotics datasets, reducing manual checks and the risk of corrupted ground-truth data.