MRReP: Mixed Reality-based Hand-drawn Reference Path Editing Interface for Mobile Robot Navigation
A new mixed reality interface replaces 2D maps, letting users sketch precise robot routes directly onto the physical floor.
A team of researchers from Japan, including Takumi Taki, Masato Kobayashi, and Yuki Uranishi, has published a paper on MRReP (Mixed Reality-based Hand-drawn Reference Path editing interface). This novel system tackles a core limitation in mobile robot navigation: conventional path planners optimize for geometric efficiency but lack a simple way for human operators to specify routes based on nuanced spatial intentions, such as maintaining social distance from pedestrians or avoiding specific zones.
MRReP addresses this by letting users don a mixed reality headset and draw a path directly on the physical floor with a hand gesture. The system captures this Hand-drawn Reference Path (HRP) as a sequence of points, and a custom planner then converts the gesture-based input into a viable global path for the robot's autonomous navigation system. In a within-subject experiment, MRReP was evaluated against a standard 2D map interface. The results showed clear advantages: users achieved greater path specification accuracy, reported higher usability scores, and experienced a lower perceived workload. The interface also enabled more stable and consistent path planning directly within the context of the real environment.
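The paper does not publish its planner's internals, but the first step of turning a raw hand-drawn stroke into a usable reference path can be sketched in a generic way: resampling the captured point sequence into evenly spaced waypoints along the stroke's arc length. The function below is an illustrative sketch under that assumption, not MRReP's actual algorithm; the name `resample_path` and the 2D tuple representation are hypothetical choices for this example.

```python
import math

def resample_path(points, spacing):
    """Resample a hand-drawn 2D polyline into waypoints spaced
    roughly `spacing` apart along the stroke's arc length.
    A generic preprocessing step; MRReP's real planner is not public."""
    if len(points) < 2:
        return list(points)
    # Cumulative arc length at each captured point.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    n = max(int(total // spacing), 1)
    waypoints = []
    for i in range(n + 1):
        target = min(i * spacing, total)
        # Find the stroke segment containing this arc-length target.
        j = 1
        while dists[j] < target:
            j += 1
        seg = dists[j] - dists[j - 1]
        t = 0.0 if seg == 0 else (target - dists[j - 1]) / seg
        (x0, y0), (x1, y1) = points[j - 1], points[j]
        # Linear interpolation within the segment.
        waypoints.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return waypoints
```

In a real deployment, the resulting waypoints would still need smoothing and feasibility checks against the robot's kinematics and the environment map before being handed to the navigation stack.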
The findings suggest that direct, in-situ path specification via mixed reality is a highly effective method for integrating human spatial reasoning and intention into robotic workflows. This moves beyond abstract map coordinates to a more intuitive, embodied interaction. The technology could transform how robots are deployed in dynamic, human-shared spaces like hospitals, warehouses, and offices, where optimal paths are about more than just the shortest distance.
- MRReP uses mixed reality to let users draw navigation paths for robots directly onto the physical floor with hand gestures.
- In testing, it outperformed a 2D interface, improving path accuracy and usability while reducing operator workload.
- The system translates hand-drawn paths into commands for the robot's navigation stack, embedding human spatial intent.
Why It Matters
It enables intuitive, human-centric control of robots in shared spaces, making complex navigation intentions easy to communicate and execute.