Issues on Step 2: Start the Evaluation Container
Participants stuck at Step 2 as the evaluation container fails to find the critical 'aic_model' node, scoring zero on every trial.
A significant technical hurdle is stalling participants in the 'AI for Industry Challenge,' a competition focused on applying AI to industrial robotics. User J_Zhang reported being completely blocked at the second step of the 'Getting Started' guide, which involves launching an evaluation container. Despite successfully setting up Docker, Distrobox, Pixi, and the Nvidia Container Toolkit, the container fails during trial execution. The logs show that the Gazebo and RViz simulators launch correctly and the 'AIC engine' initializes, but then a critical error occurs: "No node with name 'aic_model' found." This missing node, likely the participant's AI model that should control the robot, causes all three evaluation trials to fail immediately with a score of 0, and the process hangs without exiting.
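A first diagnostic step would be confirming whether the node is actually absent from the ROS graph, rather than merely slow to register. The sketch below is a hypothetical helper, not part of the challenge tooling; the node name "/aic_model" (with the leading slash ROS 2 prints) is an assumption inferred from the error message, so check it against the real output of `ros2 node list` in your container.

```python
# Hypothetical diagnostic: check whether a node name appears in the
# output of `ros2 node list`. Not from the challenge tooling.
REQUIRED_NODE = "/aic_model"  # assumed name, inferred from the error log

def node_running(node_list_output: str, name: str = REQUIRED_NODE) -> bool:
    """Return True if `name` is one of the lines of `ros2 node list` output."""
    return name in node_list_output.splitlines()

# Usage inside the evaluation container (where the ROS 2 CLI is available):
#   import subprocess
#   out = subprocess.run(["ros2", "node", "list"],
#                        capture_output=True, text=True).stdout
#   print("aic_model running:", node_running(out))
```

If the node never appears while Gazebo and RViz do, that points at the model node's launch (or its GPU dependency) rather than at the simulators.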
The root cause is strongly suspected to be the participant's development environment: Windows Subsystem for Linux 2 (WSL2). The user notes they 'couldn't activate Nvidia GPU acceleration,' highlighting a known complexity where using NVIDIA GPUs under WSL2 requires installing the CUDA Toolkit inside the Linux distribution, not just the Windows host driver. This GPU access issue may be preventing the 'aic_model' node—which could rely on GPU-accelerated inference—from launching or being recognized by the ROS-based evaluation system. The problem is not isolated, as forum links show related topics on GPU usage and container startup failures, indicating a broader compatibility challenge for developers using WSL2 instead of native Linux for this robotics AI benchmark.
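A quick way to test the WSL2 hypothesis is to check whether the NVIDIA driver is reachable from the Linux side at all. The snippet below is a minimal sketch, assuming the usual WSL2 layout in which the Windows host driver exposes `nvidia-smi` under `/usr/lib/wsl/lib`; that path, and the check itself, are assumptions to verify against NVIDIA's WSL documentation for your setup.

```python
import shutil
import subprocess

# Assumed location where WSL2 surfaces the host driver's tools;
# verify against NVIDIA's CUDA-on-WSL documentation for your system.
WSL_DRIVER_DIR = "/usr/lib/wsl/lib"

def gpu_visible() -> bool:
    """Return True if nvidia-smi exists and the driver answers.

    A copy on PATH usually means tooling was installed inside the distro;
    a copy only under WSL_DRIVER_DIR comes from the Windows host driver.
    """
    exe = shutil.which("nvidia-smi") or shutil.which(
        "nvidia-smi", path=WSL_DRIVER_DIR
    )
    if exe is None:
        return False
    # Exit code 0 means the driver responded; nonzero means it is present
    # but not functional (a common symptom of incomplete WSL2 GPU setup).
    return subprocess.run([exe], capture_output=True).returncode == 0
```

If this returns False on the host, the 'aic_model' node has no chance of doing GPU inference inside the container either, and the fix belongs at the WSL2/CUDA layer rather than in the evaluation container.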
- Evaluation fails due to missing 'aic_model' ROS node, causing all 3 trials to score 0.
- Issue is strongly linked to WSL2 environment and lack of functional NVIDIA GPU acceleration.
- The blocking error prevents users from progressing to Step 3 to test example policies.
Why It Matters
Highlights a critical environment compatibility gap that could block many developers from participating in important AI robotics benchmarks.