Wearable environmental sensing to forecast how legged systems will interact with upcoming terrain
This breakthrough could make robotic legs and prosthetics move like natural limbs.
Deep Dive
Researchers developed a wearable AI system that forecasts how a foot will land on upcoming terrain 250 milliseconds before impact. Using a shank-mounted RGB-D camera and a lightweight CNN-RNN model, it predicts the foot's center-of-pressure location to within 23.72 mm and the time of impact to within 17.73 ms. The model runs at 60 FPS on consumer laptops or edge devices, enabling real-time, anticipatory control for assistive robotic systems and prosthetics.
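To make the pipeline concrete, here is a minimal sketch of what such a CNN-RNN forecaster could look like: a small per-frame encoder over RGB-D input feeding a recurrent layer that regresses center-of-pressure coordinates and time-to-impact. The paper's actual architecture, layer sizes, window length, and class/variable names are not given in this summary, so everything below is an illustrative assumption, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TerrainContactForecaster(nn.Module):
    """Hypothetical lightweight CNN-RNN: maps a short window of RGB-D frames
    (4 channels) from a shank-mounted camera to a foot-contact forecast,
    i.e. center-of-pressure (x, y) and time until impact.
    All layer sizes are illustrative, not taken from the paper."""

    def __init__(self, hidden_size: int = 128):
        super().__init__()
        # Small per-frame CNN encoder (RGB + depth = 4 input channels).
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (B*T, 64, 1, 1)
            nn.Flatten(),             # -> (B*T, 64)
        )
        # GRU aggregates per-frame features over the input window.
        self.rnn = nn.GRU(input_size=64, hidden_size=hidden_size, batch_first=True)
        # Regression head: [cop_x, cop_y, time_to_impact].
        self.head = nn.Linear(hidden_size, 3)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (B, T, 4, H, W) window of RGB-D images.
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.view(b * t, c, h, w)).view(b, t, -1)
        _, last_hidden = self.rnn(feats)
        return self.head(last_hidden[-1])  # (B, 3)


# Example: a batch of 2 windows, each with 8 frames at 96x96 resolution.
model = TerrainContactForecaster()
pred = model(torch.randn(2, 8, 4, 96, 96))
print(pred.shape)  # torch.Size([2, 3]) -> (cop_x, cop_y, time_to_impact)
```

A compact encoder plus a single recurrent layer is the kind of design that can plausibly sustain 60 FPS on a laptop or edge device, which is the constraint the reported system targets.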
Why It Matters
This enables next-gen prosthetics and exoskeletons to anticipate terrain changes, making movement more natural and stable for users.