Proprioceptive-only State Estimation for Legged Robots with Set-Coverage Measurements of Learned Dynamics
A novel 'set-coverage' approach eliminates drift in robot state estimation without cameras or lidar.
A team from the University of Delaware and Johns Hopkins University has published a paper on arXiv titled 'Proprioceptive-only State Estimation for Legged Robots with Set-Coverage Measurements of Learned Dynamics.' The research tackles a core challenge in robotics: enabling legged machines like quadrupeds to accurately estimate their position and movement (state estimation) using only internal, proprioceptive sensors—joint encoders and IMUs—without relying on external cameras or lidar. This capability is crucial for operation in visually degraded conditions such as smoke, dust, or darkness, where traditional vision fails.
The key innovation is replacing the standard but fragile 'Gaussian noise assumption' used in filtering algorithms with a more robust 'set-coverage' characterization of measurement uncertainty. Previous methods that use learned models to infer dynamics from joint data often break down when training data are limited, leading to inconsistent estimates and catastrophic drift. The new framework systematically integrates these non-Gaussian, set-bounded measurements into a practical and computationally cheap filter.
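The paper's actual filter is not reproduced here, but the underlying idea of fusing set-bounded (rather than Gaussian) measurements can be illustrated with a minimal one-dimensional interval filter. This is a hypothetical sketch for intuition only, not the authors' algorithm: `interval_predict`, `interval_update`, and the specific bounds are all invented for this example.

```python
def interval_predict(lo, hi, u, w_bound):
    """Propagate a 1-D interval state through x' = x + u,
    where the process noise is only known to satisfy |w| <= w_bound."""
    return lo + u - w_bound, hi + u + w_bound

def interval_update(lo, hi, z_lo, z_hi):
    """Fuse the predicted interval with a set-bounded measurement [z_lo, z_hi]
    by intersection. If both sets truly contain the state (the 'coverage'
    property), the result is guaranteed to contain it too -- no Gaussian
    assumption is needed."""
    new_lo, new_hi = max(lo, z_lo), min(hi, z_hi)
    if new_lo > new_hi:
        # Empty intersection: the coverage assumption was violated somewhere.
        raise ValueError("inconsistent sets: coverage assumption violated")
    return new_lo, new_hi

# Example: state known to lie in [0, 1], commanded motion u = 1.0,
# process noise bounded by 0.2, then a set-bounded measurement [1.5, 3.0].
lo, hi = interval_predict(0.0, 1.0, u=1.0, w_bound=0.2)   # -> (0.8, 2.2)
lo, hi = interval_update(lo, hi, z_lo=1.5, z_hi=3.0)      # -> (1.5, 2.2)
print(lo, hi)
```

Unlike a Kalman update, the fused estimate here never claims more certainty than the sets support, which is the intuition behind why set-coverage measurements can avoid the overconfidence that drives drift in Gaussian filters.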
The method was validated in simulation and, critically, on two real-world quadrupedal robot datasets. Comparisons showed that while Gaussian-based baselines became inconsistent and drifted under real noise conditions, the proposed set-coverage approach maintained reliable, drift-free estimation. This represents a significant step toward resilient legged robot autonomy in challenging, unstructured environments where vision sensors cannot be relied upon.
- Uses only proprioceptive data (joint angles, IMU), making it immune to visual degradation like smoke or darkness.
- Replaces fragile Gaussian noise models with robust 'set-coverage' measurements of learned dynamics.
- Demonstrated on real quadruped robots, eliminating drift and maintaining consistency where prior methods fail.
Why It Matters
Enables search-and-rescue or inspection robots to navigate reliably through smoke, dust, and complete darkness where cameras are useless.