Robotics

OA-NBV: Occlusion-Aware Next-Best-View Planning for Human-Centered Active Perception on Mobile Robots

New robotics algorithm enables robots to strategically reposition themselves to see occluded humans, achieving over 90% success in trials.

Deep Dive

A research team led by Boxun Hu has developed OA-NBV (Occlusion-Aware Next-Best-View Planning), a novel algorithm that enables mobile robots to autonomously select optimal viewpoints to see around obstacles when their view of a human is blocked. Unlike traditional Next-Best-View (NBV) methods that focus on generic exploration or long-term coverage, OA-NBV specifically targets the immediate goal of obtaining a single usable observation of a partially occluded person. The system integrates perception and motion planning by scoring candidate viewpoints with a target-centric visibility model that accounts for three key factors: occlusion, target scale, and target completeness. Candidate viewpoints are further restricted to feasible robot poses that respect real-world motion constraints.
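The viewpoint-selection logic described above can be illustrated with a minimal sketch. Note this is an assumption-laden toy model, not the paper's implementation: the field names (`occlusion`, `scale`, `completeness`, `feasible`), the linear weighting, and the weight values are all hypothetical stand-ins for OA-NBV's actual target-centric visibility model.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    pose: tuple          # hypothetical robot pose (x, y, yaw)
    occlusion: float     # fraction of the target that is blocked, in [0, 1]
    scale: float         # normalized target size in the image, in [0, 1]
    completeness: float  # fraction of target keypoints inside the frame, in [0, 1]
    feasible: bool       # whether the pose satisfies the robot's motion constraints

def visibility_score(c: Candidate,
                     w_occ: float = 0.5,
                     w_scale: float = 0.25,
                     w_comp: float = 0.25) -> float:
    """Combine the three factors into one score (weights are illustrative)."""
    return w_occ * (1.0 - c.occlusion) + w_scale * c.scale + w_comp * c.completeness

def next_best_view(candidates):
    """Return the highest-scoring candidate among feasible poses only."""
    feasible = [c for c in candidates if c.feasible]
    return max(feasible, key=visibility_score) if feasible else None

cands = [
    Candidate((0.0, 0.0, 0.0), occlusion=0.7, scale=0.4, completeness=0.5, feasible=True),
    Candidate((1.0, 0.0, 0.3), occlusion=0.1, scale=0.5, completeness=0.9, feasible=True),
    Candidate((2.0, 1.0, 1.2), occlusion=0.0, scale=0.9, completeness=1.0, feasible=False),
]
best = next_best_view(cands)
# best is the second candidate: the third scores higher but is filtered out as infeasible
```

The key structural point the sketch captures is that feasibility acts as a hard constraint (a filter) while the visibility factors are traded off against each other in a single score, so a perfect view from an unreachable pose never wins.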

In both simulation and real-world trials, OA-NBV achieved a success rate above 90% in obtaining complete views of occluded humans, a significant improvement over baseline NBV methods, whose performance degraded sharply under occlusion. Beyond success rate, the algorithm substantially improved observation quality: compared to the strongest baseline, it increased normalized target area by at least 81% and keypoint visibility by at least 58% across various settings. The researchers designed OA-NBV as a drop-in view-selection module that can be integrated into diverse human-centered downstream tasks, including search, triage, and disaster response operations, where cluttered environments and partial visibility frequently degrade perception systems.

The development addresses a critical gap in robotics for human-centered operations, where robots need to mimic human behavior, such as stepping sideways or leaning, to recover informative observations when views are obstructed. By explicitly modeling occlusion and prioritizing immediate observation quality over long-term exploration, OA-NBV enables more effective human-robot interaction in complex, real-world environments where complete visibility cannot be guaranteed.

Key Points
  • Achieved over 90% success rate in obtaining complete views of occluded humans in both simulation and real-world trials
  • Increased normalized target area visibility by at least 81% and keypoint visibility by at least 58% compared to strongest baseline methods
  • Integrates perception and motion planning with a target-centric visibility model that accounts for occlusion, scale, and completeness while respecting robot motion constraints

Why It Matters

Enables robots to perform more effectively in search, rescue, and healthcare scenarios where seeing around obstacles is critical for human interaction.