PeriphAR: Fast and Accurate Real-World Object Selection with Peripheral Augmented Reality Displays
New technique leverages color cues in your side vision for 3D object selection, enabling more natural AR interactions.
A team of researchers from the University of Michigan, led by Yutong Ren, Arnav Reddy, and Michael Nebeling, has introduced PeriphAR, a novel visualization technique designed to solve a core problem in augmented reality: selecting objects in 3D space. Current gaze-based selection methods for wide-field-of-view displays rely on central overlays that block the user's view, which is impractical for always-on AR glasses. PeriphAR instead leverages the user's peripheral vision to provide subtle, non-intrusive feedback, allowing for faster and more natural interaction without requiring direct visual confirmation.
Through two user studies, the team found that peripheral vision is more sensitive to color changes than to shape changes, but that this sensitivity drops sharply at low contrast. To compensate, they developed and tested strategies for boosting the color contrast of a target object against its most similarly colored neighbor; users subjectively favored the contrast-maximization approach. As a proof of concept, they implemented PeriphAR in a functional end-to-end system that integrates real-world object detection, demonstrating its practical viability for future AR applications where speed and visual continuity are critical.
- Leverages peripheral vision for non-intrusive feedback in monocular AR displays, moving beyond central overlays.
- User studies found peripheral vision 70% more sensitive to color than to shape for preattentive target selection.
- Implements a contrast-maximization strategy for target color enhancement, integrated with real-world object detection in a working prototype.
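The paper's contrast-maximization idea, recoloring the target so it stands out from its most similarly colored neighbor, can be illustrated with a minimal sketch. This is a hypothetical simplification, not the authors' implementation: the function names (`color_distance`, `enhance_target_color`) and the use of plain Euclidean RGB distance are assumptions; a perceptual color space such as CIELAB would be more faithful.

```python
import math

def color_distance(a, b):
    """Euclidean distance between two RGB colors (a simplification of
    perceptual color difference)."""
    return math.dist(a, b)

def enhance_target_color(target, neighbors, candidates):
    """Pick the candidate color that maximizes contrast against the
    target's most similarly colored neighbor."""
    # The neighbor closest in color to the target is the hardest to
    # distinguish in peripheral vision, so contrast is measured against it.
    worst_distractor = min(neighbors, key=lambda n: color_distance(target, n))
    return max(candidates, key=lambda c: color_distance(c, worst_distractor))

# Example: a reddish target next to a very similar reddish neighbor.
target = (200, 60, 60)
neighbors = [(190, 70, 70), (30, 30, 220)]
palette = [(255, 0, 0), (0, 255, 0), (0, 0, 200), (255, 255, 0)]
print(enhance_target_color(target, neighbors, palette))  # → (0, 255, 0)
```

Here green wins because it is farthest from the near-duplicate red neighbor; the target would be re-rendered in that hue so the color change is detectable without foveating the object.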
Why It Matters
This research is a key step towards intuitive, always-on AR glasses where users can interact with their environment without disruptive visual interfaces.