Research & Papers

MTFM: A Scalable and Alignment-free Foundation Model for Industrial Recommendation in Meituan

This new model could make personalized recommendations 10x cheaper and faster.

Deep Dive

Meituan researchers have published MTFM, a new transformer-based foundation model for industrial recommendation systems. It tackles a major industry pain point: the need to pre-align input data across different scenarios, which is traditionally resource-intensive. The model uses techniques such as heterogeneous tokenization and Grouped-Query Attention to significantly boost training throughput and reduce memory usage. Offline and online experiments show performance gains from scaling both model size and multi-scenario training data.
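Grouped-Query Attention (GQA) is the memory-saving piece here: instead of giving every query head its own key/value projections, several query heads share one KV head, shrinking the KV cache. A minimal single-batch NumPy sketch of the general GQA idea (not MTFM's actual implementation; all shapes and names below are illustrative assumptions):

```python
import numpy as np

def grouped_query_attention(x, wq, wk, wv, n_q_heads, n_kv_heads):
    """Grouped-Query Attention sketch: n_q_heads query heads share
    n_kv_heads key/value heads (n_q_heads % n_kv_heads == 0).
    x: (seq, d_model); wq: (d_model, n_q_heads*d_head);
    wk, wv: (d_model, n_kv_heads*d_head)."""
    seq = x.shape[0]
    d_head = wq.shape[1] // n_q_heads
    group = n_q_heads // n_kv_heads      # query heads per KV head
    q = (x @ wq).reshape(seq, n_q_heads, d_head)
    k = (x @ wk).reshape(seq, n_kv_heads, d_head)
    v = (x @ wv).reshape(seq, n_kv_heads, d_head)
    out = np.empty_like(q)
    for h in range(n_q_heads):
        kv = h // group                  # which shared KV head this query head reads
        scores = q[:, h] @ k[:, kv].T / np.sqrt(d_head)
        scores -= scores.max(axis=-1, keepdims=True)   # softmax stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=-1, keepdims=True)
        out[:, h] = attn @ v[:, kv]
    return out.reshape(seq, n_q_heads * d_head)
```

With `n_kv_heads == n_q_heads` this reduces to standard multi-head attention; with `n_kv_heads == 1` it becomes multi-query attention. The memory win is that K and V are materialized (and cached at inference) for only `n_kv_heads` heads rather than `n_q_heads`.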

Why It Matters

It could drastically lower the cost and complexity of running large-scale, multi-purpose recommendation engines for major platforms.