TinyDEVO: Deep Event-based Visual Odometry on Ultra-low-power Multi-core Microcontrollers
Researchers slash AI model size 11.5x to run visual navigation on ultra-low-power chips.
A team from ETH Zurich has developed TinyDEVO, a deep learning model that brings sophisticated visual odometry (VO), the technique that estimates a camera's motion from its own visual input, to ultra-low-power microcontrollers. Through a combination of neural network architectural optimizations and hyperparameter tuning, they dramatically compressed the leading event-based VO model, DEVO. The result is an 11.5x reduction in memory footprint (down to 63.8 MB) and a 29.7x cut in computational operations (to 5.2 billion MACs per frame), with only a modest increase in trajectory error.
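As a back-of-the-envelope check, the reported reduction factors imply the size of the original DEVO baseline, which the article does not quote directly. A minimal sketch of that arithmetic:

```python
# Back-compute the implied DEVO baseline from the figures reported above.
# Note: the baseline values are derived, not quoted in the article.
tiny_mem_mb = 63.8    # TinyDEVO memory footprint (MB)
mem_reduction = 11.5  # reported memory reduction factor
tiny_macs_g = 5.2     # TinyDEVO compute (billions of MACs per frame)
mac_reduction = 29.7  # reported compute reduction factor

baseline_mem_mb = tiny_mem_mb * mem_reduction
baseline_macs_g = tiny_macs_g * mac_reduction

print(f"Implied DEVO memory:  {baseline_mem_mb:.0f} MB")         # ~734 MB
print(f"Implied DEVO compute: {baseline_macs_g:.0f} GMAC/frame")  # ~154 GMAC
```

The implied ~734 MB baseline makes clear why the uncompressed model could never fit on a microcontroller-class device.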
This efficiency breakthrough allowed the team to deploy TinyDEVO on a commercially available, ultra-low-power 9-core RISC-V microcontroller. The system runs at approximately 1.2 frames per second while consuming a mere 86 milliwatts of power. This demonstrates, for the first time, the feasibility of running a complete event-based VO pipeline directly on a resource-constrained MCU, a feat previously impossible due to the massive memory and compute demands of state-of-the-art models.
The work is significant because it directly addresses a critical bottleneck in autonomous edge devices. Visual odometry is a core component for navigation in robots, drones, and augmented reality glasses, but traditional implementations are too power-hungry for long battery life. TinyDEVO's microcontroller-level efficiency opens the door to a new generation of tiny, intelligent, and energy-autonomous machines that can understand and navigate their environment without relying on cloud connectivity or large batteries.
- Achieves 11.5x memory reduction (63.8 MB) and 29.7x compute reduction (5.2B MACs/frame) vs. state-of-the-art DEVO model.
- Runs on a 9-core RISC-V MCU at 1.2 fps with ultra-low 86 mW average power consumption.
- Maintains functional performance with an average trajectory error of 27 cm, enabling practical navigation for micro-robotics.
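Combining the reported power and frame rate gives a rough sense of the system's energy budget; these derived figures are not stated in the article:

```python
# Derived efficiency figures from the reported deployment numbers.
power_w = 0.086         # 86 mW average power consumption
fps = 1.2               # frames per second on the 9-core RISC-V MCU
macs_per_frame = 5.2e9  # MACs per processed frame

energy_per_frame_mj = power_w / fps * 1e3      # millijoules per frame
throughput_gmacs = macs_per_frame * fps / 1e9  # sustained GMAC/s

print(f"~{energy_per_frame_mj:.1f} mJ per frame")
print(f"~{throughput_gmacs:.2f} GMAC/s sustained")
```

At roughly 72 mJ per frame, even a small coin-cell-class battery could sustain hours of continuous odometry, which is the practical point behind the "energy-autonomous" claim above.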
Why It Matters
Enables real-time AI navigation for tiny, battery-powered robots and wearables, moving intelligence to the extreme edge.