Event-Driven On-Sensor Locomotion Mode Recognition Using a Shank-Mounted IMU with Embedded Machine Learning for Exoskeleton Control
A new system runs a decision-tree model directly inside an IMU sensor, waking the main processor only when needed.
Researchers Mohammadsaleh Razmi and Iman Shojaei have published a paper detailing a novel system for real-time human activity recognition (HAR) designed specifically for lower-limb exoskeleton control. The key innovation is moving the machine learning inference task from a central microcontroller directly onto the inertial measurement unit (IMU) sensor itself. Using the embedded Machine Learning Core (MLC) of the STMicroelectronics LSM6DSV16X IMU, the system classifies locomotion modes—stance, level walking, and stair ascent—without streaming raw data, enabling an interrupt-driven, low-latency architecture crucial for responsive robotic assistance.
The technical implementation involves configuring a lightweight decision-tree model in ST MEMS Studio and deploying it for on-sensor execution. During operation, the IMU processes motion data internally and wakes the main microcontroller via an interrupt only when a new classification is ready. This approach eliminates the need for custom ML code on the host processor, drastically reduces communication overhead, and conserves battery power. The result is a more robust, power-efficient system that improves the exoskeleton controller's ability to distinguish between activities such as level walking and stair ascent, paving the way for more autonomous and adaptive wearable robotics.
- Uses the STMicroelectronics LSM6DSV16X IMU's embedded Machine Learning Core (MLC) to run inference directly on the sensor.
- Classifies three locomotion modes (stance, level walking, stair ascent) with a lightweight decision-tree model, requiring no custom ML code on the main microcontroller.
- Adopts an event-driven, interrupt-based architecture that keeps the host processor in a low-power state, reducing latency and energy consumption for exoskeleton control.
Why It Matters
Enables lower-power, more responsive exoskeletons by moving AI processing onto the sensor itself, reducing system latency and complexity.