Perceptive Hierarchical-Task MPC for Sequential Mobile Manipulation in Unstructured Semi-Static Environments
A new AI control framework uses Bayesian inference to help robots handle moved objects without pre-built maps.
A team from the University of Toronto and the Technical University of Munich has introduced a novel robotic control framework called Perceptive Hierarchical-Task Model Predictive Control (HTMPC). The core challenge it addresses is enabling robots to perform long sequential tasks, such as fetching tools and assembling parts, in real-world environments that are not static. Traditional planners rely on pre-computed maps and fail when objects are moved or new obstacles appear. This new system instead embeds a continuously updated 3D world model directly into its planning process.
To achieve this, the framework uses a Bayesian inference model to explicitly track and predict object-level changes, such as the removal or introduction of objects. This temporally accurate environmental representation is then fed into a lexicographic optimization scheme, which lets the robot prioritize and execute complex task sequences efficiently. Validated both in simulation and on a real robot, the approach demonstrated superior handling of 'phantom' obstacles and semi-static changes, completing tasks more efficiently and reactively than baseline methods.
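The object-level change tracking described above can be sketched as a binary Bayes filter over each object's existence probability. This is a minimal illustrative sketch: the class name and the sensor-model probabilities (`p_detect`, `p_false`) are assumptions for demonstration, not values from the paper.

```python
import math

def logodds(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

class ObjectPersistenceFilter:
    """Binary Bayes filter over whether a mapped object still exists.
    Sensor-model probabilities below are illustrative assumptions."""

    def __init__(self, p_init=0.5, p_detect=0.9, p_false=0.2):
        self.l = logodds(p_init)        # current belief, log-odds form
        self.l_hit = logodds(p_detect)  # evidence when the object is detected
        self.l_miss = logodds(p_false)  # evidence when it should be visible but is not

    def update(self, observed):
        # Standard log-odds Bayes update: add measurement evidence,
        # subtract the (uninformative) prior.
        self.l += (self.l_hit if observed else self.l_miss) - logodds(0.5)
        return self.probability()

    def probability(self):
        return 1.0 / (1.0 + math.exp(-self.l))
```

Repeated non-detections drive an object's existence probability toward zero, at which point a planner can infer removal and drop it from the world model; repeated detections do the opposite for newly introduced objects.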
- Uses Bayesian inference to model object changes, maintaining a real-time 3D map without pre-computation.
- Embeds this dynamic perception into a hierarchical MPC planner for efficient, reactive sequential task execution.
- Successfully validated on a real robot, handling moved objects and completing tasks without external infrastructure.
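The lexicographic prioritization behind the hierarchical planner can be illustrated with a deliberately simplified sampling-style selection, rather than the paper's optimization-based hierarchy: candidate controls are filtered through task costs in strict priority order, so a lower-priority task can only break ties among candidates that are already optimal for every higher-priority task. The function and the toy costs below are hypothetical.

```python
def lexicographic_choice(candidates, tasks, tol=1e-6):
    """Select the candidate minimizing task costs in strict priority order.
    `tasks` is ordered from highest to lowest priority; a lower-priority
    cost only breaks ties (within tol) left by higher-priority costs.
    A simplified sketch, not the paper's solver."""
    best = list(candidates)
    for cost in tasks:
        c_min = min(cost(u) for u in best)
        best = [u for u in best if cost(u) <= c_min + tol]
    return best[0]

# Toy 1-D example (hypothetical costs): avoid a collision region first,
# then get as close as possible to the goal at u = 0.
collision = lambda u: 1.0 if u < 0.5 else 0.0   # highest priority
goal = lambda u: abs(u)                          # lower priority
u_star = lexicographic_choice([0.0, 0.5, 1.0, 1.5], [collision, goal])
```

Here the safety task eliminates the otherwise goal-optimal control, and the goal task then picks the closest remaining candidate, which is the essence of the hierarchy: lower-level objectives never override higher-level ones.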
Why It Matters
This is a critical step toward deploying autonomous robots in dynamic, real-world settings like warehouses, hospitals, and homes.