FedSPDnet: Geometry-Aware Federated Deep Learning with SPDnet
New method keeps SPDnet's Stiefel-constrained parameters on the manifold during aggregation, boosting F1 scores by 15%
Researchers Thibault Pautrel, Florent Bouchard, Ammar Mian, and Guillaume Ginolhac introduced FedSPDnet, a geometry-aware federated deep learning framework for SPDnet models. SPDnet operates on symmetric positive definite (SPD) matrices, which are common in signal processing, and its weight matrices are constrained to the Stiefel manifold (orthonormal columns). Standard federated learning averages parameters in Euclidean space, which violates these orthogonality constraints. FedSPDnet solves this with two efficient aggregation strategies: ProjAvg, which projects the arithmetic mean of client weights back onto the Stiefel manifold, and RLAvg, which approximates tangent-space averaging using retractions and liftings. Both methods are computationally efficient and independent of the optimizer, making them scalable for real-world applications.
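To make the ProjAvg idea concrete, here is a minimal NumPy sketch (not the authors' implementation): average the clients' Stiefel-constrained weight matrices in Euclidean space, then project the mean back onto the manifold via the orthogonal polar factor from an SVD, which gives the nearest point with orthonormal columns in Frobenius norm. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def proj_avg(client_weights):
    """Sketch of ProjAvg-style aggregation: take the plain arithmetic
    mean of the clients' Stiefel-constrained weights, then project it
    onto the Stiefel manifold using the orthogonal polar factor."""
    mean = np.mean(client_weights, axis=0)          # Euclidean mean (off-manifold)
    u, _, vt = np.linalg.svd(mean, full_matrices=False)
    return u @ vt                                    # nearest matrix with orthonormal columns

# Illustrative usage: random Stiefel points obtained via QR
rng = np.random.default_rng(0)
clients = [np.linalg.qr(rng.standard_normal((6, 3)))[0] for _ in range(4)]
w_agg = proj_avg(clients)
# w_agg.T @ w_agg is (numerically) the 3x3 identity, so the aggregate
# satisfies the orthogonality constraint that Euclidean averaging breaks.
```

The Euclidean mean of orthonormal matrices is generally not orthonormal itself, which is exactly the constraint violation the article describes; the SVD step restores it in one cheap, optimizer-independent operation.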
Simulations on EEG motor imagery benchmarks showed FedSPDnet outperforming a federated EEGnet baseline in F1 score and in robustness to federation size and partial client participation. It also transmits fewer parameters per communication round, reducing bandwidth and computation. This approach is particularly relevant for applications like brain-computer interfaces, medical imaging, and sensor networks where SPD matrices are common. By preserving geometric structure, FedSPDnet enables more accurate and efficient federated learning without sacrificing privacy or scalability.
- FedSPDnet uses ProjAvg and RLAvg to preserve Stiefel manifold structure, avoiding orthogonality violations from Euclidean averaging
- Outperforms federated EEGnet on EEG motor imagery benchmarks in F1 score and robustness, while sending fewer parameters per round
- Both aggregation strategies are optimizer-independent and computationally efficient for scalable federated learning
Why It Matters
FedSPDnet enables accurate, privacy-preserving federated learning for SPD matrix applications like brain-computer interfaces and medical imaging.