Image & Video

Edge AI for Automotive Vulnerable Road User Safety: Deployable Detection via Knowledge Distillation

Tiny AI model beats its 4x larger teacher in precision after quantization.

Deep Dive

Researchers Akshay Karjol and Darrin M. Hanna from Oakland University have published a paper demonstrating that knowledge distillation (KD) is essential for deploying accurate, safety-critical object detection for Vulnerable Road Users (VRUs) on edge hardware. Their framework trains a compact YOLOv8-S student model (11.2 million parameters) to mimic a YOLOv8-L teacher model (43.7 million parameters), achieving 3.9x compression while preserving robustness to quantization. Evaluated on the full-scale BDD100K dataset (70,000 training images) with post-training quantization (PTQ) to INT8, the teacher model suffers catastrophic degradation (-23% mAP), while the KD student retains accuracy with only a -5.6% mAP loss.
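The paper applies distillation inside a YOLOv8 detection pipeline; as a minimal, framework-free sketch of the core KD mechanism only (Hinton-style soft targets on classification logits; the temperature value and the classification-only focus here are illustrative assumptions, not details from the paper), the student is trained to match the teacher's temperature-softened output distribution:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of class logits.
    # Higher temperature flattens the distribution, exposing the
    # teacher's relative confidence across classes ("dark knowledge").
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep comparable magnitude across T.
    # In practice this term is mixed with the ordinary detection loss.
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return (temperature ** 2) * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )
```

When the student's logits match the teacher's, the loss is zero; any mismatch in the softened distributions is penalized, which is how the teacher's calibration, not just its hard labels, gets transferred.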

Further analysis reveals that KD transfers precision calibration rather than raw detection capacity. At INT8, the KD student achieves 0.748 precision versus 0.653 for direct training, a 14.5% gain at equivalent recall, and produces 44% fewer false alarms than the collapsed teacher. Remarkably, the INT8 KD student exceeds the teacher's FP32 precision (0.748 vs. 0.718) with a model 3.9x smaller. These results make KD a practical prerequisite for edge deployment, enabling real-time pedestrian and cyclist detection in autonomous vehicles and driver-assistance systems without sacrificing accuracy.
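The divergent behavior under INT8 comes down to how PTQ maps a tensor's floating-point range onto 256 discrete levels. As an illustrative sketch (symmetric per-tensor quantization with a max-calibrated scale; the paper's actual PTQ toolchain is not specified here), the step size is set by the largest magnitude in the tensor, so a wider dynamic range forces coarser steps and larger rounding error:

```python
def quantize_int8(values):
    # Symmetric per-tensor post-training quantization: pick one scale
    # from the largest magnitude (assumes a nonzero range), then round
    # each value to the nearest of 255 signed INT8 levels.
    scale = max(abs(v) for v in values) / 127.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    # Recover approximate floats; error per element is at most ~scale/2.
    return [qi * scale for qi in q]
```

Under this scheme a tensor with a few large outliers stretches the scale and degrades resolution for all the small values, which is one standard explanation for why some FP32-accurate networks collapse after PTQ while others, like the KD student here, remain robust.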

Key Points
  • YOLOv8-S student (11.2M params) trained via KD matches YOLOv8-L teacher (43.7M params) with 3.9x compression.
  • KD student retains accuracy under INT8 quantization (-5.6% mAP) while teacher collapses (-23% mAP).
  • At INT8, KD student achieves 0.748 precision vs. 0.653 for direct training—14.5% gain with 44% fewer false alarms.

Why It Matters

Enables real-time, accurate pedestrian/cyclist detection on low-power car chips, improving safety without expensive hardware.