SMES: Towards Scalable Multi-Task Recommendation via Expert Sparsity
A new AI architecture is making massive recommendation models far more efficient.
Researchers from Kuaishou have unveiled SMES, a sparse Mixture-of-Experts (MoE) framework designed to scale large multi-task recommendation models efficiently. Deployed for over 400 million daily active users on Kuaishou's short-video platform, it addresses scaling bottlenecks in existing MoE-based recommenders. Online A/B tests report a 0.29% gain in GAUC and a 0.31% uplift in user watch time, improvements that are substantial at this scale and demonstrate measurable real-world impact.
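The "expert sparsity" idea behind frameworks like SMES is that each input is routed through only a few of many expert sub-networks via a top-k gate, so model capacity grows without a proportional increase in compute. The sketch below is a minimal, illustrative NumPy version of generic top-k sparse MoE routing, not Kuaishou's actual SMES implementation; all names, shapes, and the choice of linear experts are assumptions for demonstration.

```python
import numpy as np

def top_k_gating(logits, k):
    """Keep the k largest gate logits per row; softmax over the survivors.

    Experts outside the top-k receive exactly zero weight, so their
    forward passes could be skipped entirely in a real system.
    """
    idx = np.argsort(logits, axis=-1)[:, -k:]          # indices of top-k experts
    mask = np.zeros_like(logits, dtype=bool)
    np.put_along_axis(mask, idx, True, axis=-1)
    masked = np.where(mask, logits, -np.inf)           # suppress non-selected experts
    exp = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)       # sparse gate weights, rows sum to 1

def sparse_moe(x, expert_weights, gate_weights, k=2):
    """Route each input through its top-k experts and mix their outputs."""
    logits = x @ gate_weights                          # (batch, n_experts) gate scores
    gates = top_k_gating(logits, k)                    # zero outside the top-k
    # For clarity every expert runs here; a production system would
    # dispatch each row only to its k selected experts.
    outputs = np.stack([x @ w for w in expert_weights], axis=1)  # (batch, E, d_out)
    return np.einsum("be,bed->bd", gates, outputs)     # gate-weighted mixture

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                            # 4 items, 8 features (toy sizes)
experts = [rng.normal(size=(8, 3)) for _ in range(6)]  # 6 simple linear experts
gate_w = rng.normal(size=(8, 6))
y = sparse_moe(x, experts, gate_w, k=2)
print(y.shape)  # (4, 3)
```

With k=2 of 6 experts active per input, only a third of expert compute is needed per request, which is the efficiency lever that makes scaling up the total expert count affordable.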
Why It Matters
By activating only a small subset of experts per request, this approach lets large platforms grow the capacity of their recommendation models without a proportional increase in serving cost, directly improving engagement metrics such as watch time and, with them, revenue.