Compact single-shot ranging and near-far imaging using metasurfaces
A 15 mm-thick system captures three images at once, enabling ±1 mm depth sensing for edge devices.
A team of researchers has unveiled a novel imaging system that uses a metasurface—an engineered surface that manipulates light—to capture three distinct images in a single snapshot. The system simultaneously records two close-range images focused at 1.4 cm and 2.0 cm, forming a focal stack, alongside a third image at a much longer range of approximately 40 cm, all projected onto a single shared photosensor. This compact design achieves a total track length of just 15 mm, making it exceptionally thin and suitable for integration into small devices.
This single-shot capability enables passive depth sensing through a computationally efficient depth-from-defocus algorithm. By comparing the relative defocus between the two close-range images, the system calculates distance to within ±1 mm over a working range of 12 mm to 20 mm. The technology is specifically designed for edge computing platforms where size, power, and processing resources are constrained, opening the door to advanced applications in defense, robotics, and industrial inspection that require precise 3D perception in a tiny form factor.
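The depth-from-defocus idea can be illustrated with a toy simulation. The article does not publish the authors' algorithm, so the sketch below assumes a simple forward model: blur grows linearly with distance from each focal plane (the hypothetical `BLUR_PER_MM` constant), local sharpness is measured with Laplacian energy, and depth is recovered by matching the observed sharpness ratio between the two near-focus images against ratios predicted for candidate depths.

```python
import numpy as np

rng = np.random.default_rng(0)
texture = rng.standard_normal((64, 64))  # stand-in for a textured scene patch

def gaussian_blur(img, sigma):
    """Separable Gaussian blur; sigma models the defocus blur radius (px)."""
    if sigma < 1e-6:
        return img.copy()
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()
    out = np.apply_along_axis(np.convolve, 1, img, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, out, k, mode="same")

def focus_measure(img):
    """Laplacian energy: higher means sharper."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return float(np.mean(lap ** 2))

FOCUS_A_MM, FOCUS_B_MM = 14.0, 20.0  # the two near focal planes from the article
BLUR_PER_MM = 0.5                    # assumed blur growth rate (px per mm), illustrative

def simulate_pair(depth_mm):
    """Render the two focal-stack views of the texture at a given depth."""
    sigma_a = BLUR_PER_MM * abs(depth_mm - FOCUS_A_MM)
    sigma_b = BLUR_PER_MM * abs(depth_mm - FOCUS_B_MM)
    return gaussian_blur(texture, sigma_a), gaussian_blur(texture, sigma_b)

def sharpness_ratio(pair):
    a, b = pair
    return focus_measure(a) / focus_measure(b)

def depth_from_defocus(observed_ratio, candidates):
    """Pick the candidate depth whose predicted sharpness ratio matches best."""
    predicted = [sharpness_ratio(simulate_pair(d)) for d in candidates]
    return candidates[int(np.argmin([abs(p - observed_ratio) for p in predicted]))]

candidates = np.arange(12.0, 20.01, 0.5)     # the 12-20 mm working range
observed = sharpness_ratio(simulate_pair(16.0))  # scene actually at 16 mm
est = depth_from_defocus(observed, candidates)
print(f"estimated depth: {est:.1f} mm")
```

Because the same forward model generates both the observation and the candidate predictions, the lookup recovers the true depth exactly; a real system would instead calibrate the depth-to-ratio mapping against measured point-spread functions.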
- Captures three images (two near, one far) simultaneously on one sensor using a metasurface.
- Enables passive depth sensing with ±1 mm accuracy over 12-20 mm via a depth-from-defocus algorithm.
- Ultra-compact 15 mm total track length designed for integration into resource-constrained edge devices.
Why It Matters
Enables high-precision 3D sensing and imaging in ultra-thin devices for robotics, defense, and industrial automation.