BiMoE: Brain-Inspired Experts for EEG-Dominant Affective State Recognition
A new brain-inspired AI framework interprets EEG signals with region-specific experts, improving emotion recognition accuracy.
A research team led by Hongyu Zhu has introduced BiMoE (Brain-Inspired Mixture of Experts), an AI framework designed to improve how brain-computer interfaces (BCIs) recognize human emotional states. The system addresses three shortcomings of current Multimodal Sentiment Analysis (MSA): it treats EEG signals as region-specific rather than homogeneous, produces interpretable neural representations instead of black-box features, and effectively fuses EEG data with complementary peripheral physiological signals (PPS). The key innovation is BiMoE's architecture, which partitions EEG inputs according to brain topology and assigns a specialized "expert" to each region.
Each expert within BiMoE employs a dual-stream encoder to extract both local and global spatiotemporal features from the brain signals. A dedicated expert module handles PPS data using multi-scale large-kernel convolutions. All components are dynamically integrated through an adaptive routing mechanism and a joint loss function, creating a cohesive system that mimics how different brain regions specialize in affective processing. The framework was rigorously evaluated under strict subject-independent conditions, ensuring its robustness for real-world BCI applications where the system encounters entirely new users.
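The core idea of region-wise experts combined by a routing mechanism can be illustrated with a minimal sketch. Everything below is hypothetical: the channel-to-region map, the toy one-number "expert," and the softmax gating are stand-ins for BiMoE's actual dual-stream encoders and adaptive router, which the article does not specify in detail.

```python
import math

# Hypothetical channel-to-region map; real partitions follow brain topology.
REGIONS = {
    "frontal":   [0, 1, 2, 3],
    "temporal":  [4, 5],
    "parietal":  [6, 7, 8],
    "occipital": [9, 10],
}

def expert(region_channels):
    """Toy 'expert': summarize a region's channels as mean absolute
    amplitude. BiMoE's real experts are dual-stream spatiotemporal
    encoders, not a single statistic."""
    flat = [abs(v) for ch in region_channels for v in ch]
    return sum(flat) / len(flat)

def softmax(scores):
    """Numerically stable softmax, used here as a stand-in router."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(eeg, gate_scores):
    """Run one expert per brain region, then combine expert outputs
    with softmax gating weights (the 'adaptive routing' idea)."""
    feats = [expert([eeg[c] for c in chans]) for chans in REGIONS.values()]
    weights = softmax(gate_scores)
    return sum(w * f for w, f in zip(weights, feats))
```

For example, with 11 channels of constant amplitude 1.0, every expert returns 1.0, so the gated combination is 1.0 for any gate scores, since the weights sum to one. In the real system the gate scores would themselves be predicted from the input, letting the router emphasize whichever regions are most informative for a given affective state.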
The results show consistent gains. On the benchmark DEAP and DREAMER datasets, BiMoE outperformed state-of-the-art baselines across affective dimensions, delivering average accuracy improvements of 0.87% to 5.19% in multimodal sentiment classification. Beyond the raw numbers, this represents a meaningful step toward more reliable, interpretable, and personalized emotion-aware computing, and the code has been made publicly available to accelerate further research in affective BCIs.
- BiMoE improves multimodal sentiment classification accuracy by 0.87% to 5.19% on DEAP/DREAMER datasets.
- The framework uses brain-topology-aware partitioning and dual-stream encoders for interpretable, region-specific EEG analysis.
- It dynamically fuses EEG experts with a dedicated PPS module via adaptive routing for robust subject-independent performance.
Why It Matters
Enables more accurate, interpretable brain-computer interfaces for mental health monitoring, neurofeedback, and human-computer interaction.