Home Bot: Autonomous navigation with voice interaction using Nav2
A DIY home robot project is moving from basic obstacle avoidance to full SLAM navigation using the ROS 2 Nav2 stack.
A developer from K-Scale Robotics, Sampath Bommakanti, has shared progress on a viral DIY project called 'Home Bot,' an autonomous robot designed for voice-controlled navigation within a home environment. The project, built on an NVIDIA Jetson platform and using ROS 2, is taking a significant technical leap: it is moving beyond simple reactive obstacle avoidance to a full navigation stack. The goal is to integrate SLAM (Simultaneous Localization and Mapping) with the ROS 2 Nav2 stack, enabling the robot to build a map of its environment and navigate precisely to user-requested locations.
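At its core, Nav2's global planner searches the SLAM-built occupancy grid for a route to the goal. A toy, self-contained sketch of that idea (a minimal A* grid search in plain Python, not the actual Nav2 API) looks like:

```python
import heapq

def plan_path(grid, start, goal):
    """Toy A* search over an occupancy grid, 4-connected motion.
    grid[r][c] is True where an obstacle was mapped. This only
    illustrates the concept behind Nav2's global planner."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start, [start])]
    seen = {start}
    while frontier:
        _, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not grid[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                h = abs(goal[0] - nr) + abs(goal[1] - nc)  # Manhattan heuristic
                heapq.heappush(frontier, (len(path) + h, (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# A tiny "map": a wall with a single doorway in the middle row.
grid = [
    [False, False, False, False],
    [True,  True,  False, True ],
    [False, False, False, False],
]
path = plan_path(grid, (0, 0), (2, 0))
print(path)  # route through the doorway at (1, 2)
```

The real stack layers a lot on top of this (costmap inflation, continuous replanning, a local controller), but the map-then-search structure is the same.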
The core technical hurdle detailed in the post is achieving reliable odometry—the robot's ability to track its own movement—without using wheel encoders. The rover chassis lacks servo motors with built-in encoders, forcing a sensor-fusion approach. Bommakanti is soliciting community advice on two primary options: implementing Visual-Inertial Odometry (VIO) by fusing data from an HP60C depth sensor and an Inertial Measurement Unit (IMU), or using a dedicated 2D LiDAR scanner for localization and mapping. The choice between these computer vision and laser-based methods is critical for the Nav2 stack's performance in a dynamic home setting.
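Whichever sensor wins, the downstream math is the same dead-reckoning integration that wheel encoders would normally feed. A minimal sketch (plain Python, with the forward velocity and yaw rate assumed to come from VIO or LiDAR scan matching rather than encoders):

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Propagate a planar pose (x, y, theta) one timestep from a
    forward velocity v (m/s) and yaw rate omega (rad/s).
    Without encoders, v must be estimated by VIO or scan matching,
    which is why the sensor choice matters so much."""
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive 1 m forward, turn 90 degrees in place, drive 1 m again,
# integrating at 100 Hz.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_odometry(pose, 1.0, 0.0, 0.01)
for _ in range(100):
    pose = integrate_odometry(pose, 0.0, math.pi / 2, 0.01)
for _ in range(100):
    pose = integrate_odometry(pose, 1.0, 0.0, 0.01)
print(pose)  # roughly (1.0, 1.0, pi/2)
```

Because each step compounds the last, small per-step velocity errors accumulate into drift, which is what SLAM loop closure ultimately corrects.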
- The 'Home Bot' project is upgrading from basic collision avoidance to full autonomous navigation using the ROS 2 Nav2 stack and SLAM.
- A major design challenge is solving odometry without wheel encoders, weighing VIO (depth sensor + IMU) against 2D LiDAR.
- The system runs on an NVIDIA Jetson for edge AI, utilizing frameworks like YOLO for perception and is designed for voice interaction.
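If the VIO route is chosen, the conventional ROS 2 way to fuse the depth camera's visual odometry with the IMU is an EKF from the `robot_localization` package. A sketch of the parameter file, assuming a hypothetical VIO node publishing on `/vio/odometry` and an IMU driver on `/imu/data` (topic names and tuning are illustrative, not from the project):

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true           # planar robot: ignore z, roll, pitch
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom
    # Visual odometry (hypothetical topic from the depth camera)
    odom0: /vio/odometry
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az
    # IMU: trust yaw, yaw rate, and forward acceleration
    imu0: /imu/data
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]
```

Each `*_config` is `robot_localization`'s 15-element boolean matrix selecting which state variables that sensor contributes; the fused output would then feed Nav2 as the `odom` frame.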
Why It Matters
The project demonstrates the advanced navigation capabilities now accessible to DIY builders, pushing home robotics toward true autonomy.