Bias in Surface Electromyography Features across a Demographically Diverse Cohort
New research reveals that AI-powered prosthetic limbs may work better for some demographics than others because of signal bias.
A team of researchers from UC Davis and UC Berkeley has published a groundbreaking study revealing significant demographic biases in surface electromyography (sEMG) signals used for AI-powered prosthetic control. Analyzing data from 81 demographically diverse individuals performing discrete hand gestures, the researchers extracted 147 common sEMG features and found that 33% (49 features) showed statistically significant associations with demographic variables including age, sex, height, weight, skin properties, subcutaneous fat, and hair density. In other words, the muscle signals that AI systems use to decode movement intentions vary substantially depending on who is using the device.
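The article does not list which 147 features were extracted, but time-domain statistics such as root mean square, mean absolute value, waveform length, and zero crossings are among the most common sEMG features in the literature. A minimal sketch of computing them for one windowed channel, using hypothetical parameter names:

```python
import numpy as np

def time_domain_features(emg, threshold=0.01):
    """Compute four classic time-domain sEMG features for one channel.

    `emg` is a 1-D array holding a windowed, roughly zero-mean sEMG
    signal. These are illustrative examples of widely used features,
    not necessarily the 147 analyzed in the study.
    """
    rms = np.sqrt(np.mean(emg ** 2))        # root mean square
    mav = np.mean(np.abs(emg))              # mean absolute value
    wl = np.sum(np.abs(np.diff(emg)))       # waveform length
    # zero crossings, counted only when the step size exceeds a small
    # noise threshold
    zc = np.sum((emg[:-1] * emg[1:] < 0) &
                (np.abs(emg[:-1] - emg[1:]) > threshold))
    return {"rms": rms, "mav": mav, "wl": wl, "zc": int(zc)}

# Example: a 200-sample window of synthetic noise standing in for sEMG
rng = np.random.default_rng(0)
window = rng.normal(0.0, 0.1, 200)
feats = time_domain_features(window)
```

Features like these are what a demographic-association analysis would then take as its dependent variables, one model per feature.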
Using mixed-effects linear models and partial least squares analysis, the study demonstrates that current sEMG technology does not perform consistently across users: signal characteristics are highly idiosyncratic. This variability matters particularly for sEMG-based assistive devices and neural interfaces, where demographic biases could undermine broad and fair deployment. The findings suggest that, without addressing these biases, AI systems controlling prosthetic limbs, virtual reality interfaces, and household electronics may require laborious personalization and iterative tuning to perform reliably for different demographic groups.
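The study fits mixed-effects linear models with per-participant structure; as a deliberately simplified, numpy-only stand-in (synthetic data, ordinary least squares instead of mixed effects, and hypothetical covariate names), regressing one feature on demographic covariates and inspecting the slopes looks like:

```python
import numpy as np

# Hypothetical data: one sEMG feature value (e.g., RMS) per participant,
# with age and BMI as demographic covariates. The study itself uses
# mixed-effects models with participant-level random effects; this
# ordinary least-squares fit is only a simplified illustration of
# testing a feature for demographic association.
rng = np.random.default_rng(42)
n = 81  # matches the cohort size reported in the article
age = rng.uniform(18, 70, n)
bmi = rng.uniform(18, 35, n)

# Simulate a feature that genuinely shrinks with BMI (thicker
# subcutaneous fat attenuates the surface signal), plus noise.
feature = 1.0 - 0.02 * bmi + rng.normal(0.0, 0.05, n)

# Design matrix with an intercept column; solve least squares.
X = np.column_stack([np.ones(n), age, bmi])
coef, *_ = np.linalg.lstsq(X, feature, rcond=None)
intercept, b_age, b_bmi = coef
```

On data simulated this way, the fitted BMI slope comes out negative while the age slope stays near zero, mirroring how a feature with a real demographic association would stand out from one without.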
The research team, led by Aditi Agrawal and including experts from biomedical engineering and neuroscience, analyzed what they describe as "the largest demographically diverse sEMG dataset to date." Their work provides concrete evidence that factors such as body composition and skin properties significantly affect signal quality, which in turn degrades machine-learning-based gesture decoding accuracy. This represents a critical fairness issue in human-computer interaction, as biased signal processing could lead to assistive technologies that work better for some populations than others.
The study's methodology represents a significant advancement in how we understand and measure bias in biomedical AI systems. By identifying exactly which features show demographic associations, the research provides a roadmap for developing more equitable neural interfaces. The authors suggest their findings should guide future development of sEMG-based systems to ensure they work reliably across diverse populations without requiring extensive individual calibration, moving toward truly accessible assistive technologies.
- 33% of 147 common sEMG features show significant demographic bias affecting AI gesture decoding
- Study analyzed 81 diverse individuals, finding age, BMI, and skin properties alter signal quality
- Biases could make AI prosthetics require extensive personalization for different user demographics
Why It Matters
Ensuring AI-powered prosthetics work equally well for all users regardless of age, body type, or skin properties is crucial for equitable healthcare technology.