ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model
A new hybrid AI foundation model for ECG analysis runs 40% faster than multi-task baselines.
A team of researchers including Yuhao Xu, Xiaoda Wang, and Carl Yang has introduced ECG-MoE, a foundation model for electrocardiogram (ECG) analysis. The model addresses a critical gap in existing AI systems, which often fail to capture both the periodicity of the cardiac cycle and the diverse features required across clinical diagnostic tasks. By integrating multi-model temporal features with a cardiac period-aware expert module in a hybrid architecture, ECG-MoE sets a new benchmark for accuracy and efficiency in automated cardiac analysis.
The technical innovation lies in its dual-path Mixture-of-Experts (MoE) design, which models beat-level morphology and heart rhythm through separate, specialized pathways. A hierarchical fusion network combines the two paths and leverages Low-Rank Adaptation (LoRA) for efficient, task-specific inference without retraining the entire model. Evaluated across five public clinical tasks, ECG-MoE not only achieves state-of-the-art performance but does so with 40% faster inference than conventional multi-task baselines. This combination of accuracy and computational efficiency paves the way for integration into real-time clinical decision support systems and telehealth platforms, potentially transforming how cardiac conditions are screened and monitored.
- Uses a dual-path Mixture-of-Experts (MoE) to separately model beat morphology and rhythm for superior accuracy.
- Achieves state-of-the-art performance on five public clinical ECG tasks, demonstrating broad diagnostic capability.
- Runs with 40% faster inference than multi-task baselines due to efficient LoRA-based hierarchical fusion.
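To make the dual-path idea concrete, here is a minimal NumPy sketch of the general pattern the article describes: two gated expert pools (one per path), their outputs concatenated and passed through a fusion layer whose frozen weight carries a LoRA-style low-rank update. All dimensions, expert counts, and names are hypothetical illustrations, not the paper's actual architecture, and the gate is shown as a dense mixture rather than the sparse top-k routing typical of real MoE layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
d_in, d_hid, d_out, n_experts = 16, 32, 8, 4

def make_experts(n):
    # Each expert is a small two-layer MLP: (w1, w2) weight pair.
    return [(rng.normal(size=(d_in, d_hid)) * 0.1,
             rng.normal(size=(d_hid, d_out)) * 0.1) for _ in range(n)]

def expert_forward(x, w1, w2):
    # Single hidden-layer expert: linear -> ReLU -> linear.
    return np.maximum(x @ w1, 0.0) @ w2

def moe_path(x, experts, gate_w):
    # Softmax gate weights each expert's output; shown dense for
    # clarity (production MoE layers usually route to a top-k subset).
    logits = x @ gate_w
    g = np.exp(logits - logits.max(-1, keepdims=True))
    g /= g.sum(-1, keepdims=True)
    outs = np.stack([expert_forward(x, w1, w2) for w1, w2 in experts], -1)
    return (outs * g[:, None, :]).sum(-1)  # (batch, d_out)

# One expert pool per path: beat-level morphology and heart rhythm.
morph_experts = make_experts(n_experts)
rhythm_experts = make_experts(n_experts)
gate_m = rng.normal(size=(d_in, n_experts))
gate_r = rng.normal(size=(d_in, n_experts))

# Fusion layer: a frozen weight plus a LoRA-style low-rank update A @ B,
# so only A and B need training when adapting to a new clinical task.
w_fused = rng.normal(size=(2 * d_out, d_out)) * 0.1  # frozen backbone weight
rank = 2
lora_a = rng.normal(size=(2 * d_out, rank)) * 0.1    # trainable adapter
lora_b = np.zeros((rank, d_out))                     # zero-init: no-op at start

def ecg_moe_forward(x_morph, x_rhythm):
    h_m = moe_path(x_morph, morph_experts, gate_m)
    h_r = moe_path(x_rhythm, rhythm_experts, gate_r)
    h = np.concatenate([h_m, h_r], axis=-1)
    return h @ (w_fused + lora_a @ lora_b)

x = rng.normal(size=(3, d_in))  # 3 toy feature vectors per path
y = ecg_moe_forward(x, x)
print(y.shape)  # (3, 8)
```

The efficiency angle is visible in the fusion layer: swapping tasks only swaps the small `lora_a`/`lora_b` matrices, while the expert pools and `w_fused` stay fixed.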
Why It Matters
Enables faster, more accurate AI-assisted cardiac diagnosis, which can improve patient outcomes and streamline clinical workflows.