Saranga: MilliWatt Ultrasound for Navigation in Visually Degraded Environments on Palm-Sized Aerial Robots
A new bio-inspired system uses milliwatt ultrasound and deep learning to replace cameras and LiDAR.
A research team from the University of Maryland and other institutions has unveiled Saranga, a novel perception system designed to solve a critical limitation of palm-sized aerial robots. These agile, low-cost drones are ideal for navigating confined spaces but are severely constrained in payload and power, making traditional sensors such as cameras, LiDAR, and power-hungry radar impractical. Inspired by bat echolocation, Saranga uses a dual sonar array that consumes only milliwatts of power, a fraction of what those sensors require.
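The basic physics behind any pulse-echo sonar like this is time-of-flight ranging: emit an ultrasound pulse, time the returning echo, and convert the round-trip delay to distance. The sketch below illustrates that principle only; the constant and function names are illustrative assumptions, not Saranga's actual implementation.

```python
# Illustrative time-of-flight ranging for a pulse-echo ultrasound sensor.
# The speed of sound and the example delay are assumptions for illustration.
SPEED_OF_SOUND_M_S = 343.0  # dry air at ~20 °C

def echo_range_m(round_trip_time_s: float) -> float:
    """Distance to an obstacle from the round-trip echo delay.

    The sound travels out and back, so divide the total path by two.
    """
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

print(echo_range_m(0.01))  # a 10 ms round trip -> 1.715 m
```

The hard part, as the article explains next, is not this arithmetic but detecting the echo at all when it is buried under rotor noise.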
The core innovation tackles extremely weak ultrasound echoes buried in noise. The team combats a peak signal-to-noise ratio (PSNR) of just -4.9 dB with a two-pronged approach: a physical method that blocks propeller-induced noise, and a deep learning model trained to recover signal patterns at noise levels where classical methods fail. The network was trained on a synthetic data generation pipeline supplemented with limited real-world noise recordings, enabling effective generalization.
In real-world tests, Saranga enabled a palm-sized drone to navigate autonomously in visually degraded conditions, including dense fog, darkness, and snow, through a cluttered environment containing thin and transparent obstacles. All processing relied solely on onboard sensing and computation, demonstrating the system's practicality on resource-constrained platforms where GPS and vision are unavailable.
- Uses a dual sonar array consuming only milliwatts of power, making it viable for tiny drones with severe payload limits.
- Combats a -4.9 dB peak signal-to-noise ratio with physical noise blocking and a deep learning denoiser trained on synthetic and real data.
- Enabled real-world drone navigation in dense fog, darkness, and snow through cluttered spaces with thin, transparent obstacles.
Why It Matters
It unlocks reliable drone operations for search & rescue, infrastructure inspection, and military reconnaissance in conditions where cameras and LiDAR fail.