Meta Is Developing 4 New Chips to Power Its AI and Recommendation Systems
Meta's custom silicon roadmap includes four new chips, with the MTIA 300 already in production.
Meta has detailed its ambitious roadmap for custom AI silicon, announcing four new chips under its MTIA (Meta Training and Inference Accelerator) banner. Developed in partnership with Broadcom and built on the open-source RISC-V architecture, the chips are being fabricated by TSMC. The MTIA 300, already in production, is designed for training the recommendation algorithms that power Facebook and Instagram feeds. The other three (the MTIA 400, 450, and 500) are inference chips slated to ship between now and late 2027, with the MTIA 500 featuring innovations in low-precision data formats.
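The article doesn't specify which low-precision formats the MTIA 500 will support, but FP8 (such as the E4M3 layout used widely in AI inference) is the kind of format this phrase typically refers to: fewer bits per number means more throughput and less memory traffic, at the cost of precision. A minimal sketch of what rounding a value to an E4M3-style format looks like, simplified to skip subnormals and NaN handling:

```python
import math

def quantize_e4m3(x: float) -> float:
    """Round x to an FP8 E4M3-style value: 4 exponent bits, 3 mantissa bits.

    Simplified illustration: clamp to the format's max normal value (448)
    and round the mantissa to 3 stored bits. Real hardware also handles
    subnormals, special NaN encodings, and configurable rounding modes.
    """
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    mag = min(abs(x), 448.0)        # clamp to E4M3's max normal value
    m, e = math.frexp(mag)          # mag = m * 2**e, with m in [0.5, 1)
    # m carries the implicit leading 1 plus 3 stored mantissa bits -> 4 bits total.
    m = round(m * 16) / 16
    return sign * math.ldexp(m, e)

print(quantize_e4m3(0.3))     # 0.3125 -- nearest representable E4M3 value
print(quantize_e4m3(1000.0))  # 448.0  -- saturates at the format's maximum
```

The coarse spacing between representable values (0.3 becomes 0.3125) is the precision trade-off that inference accelerators accept in exchange for packing four times as many values into the same memory as FP32.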
This rapid, iterative development cycle is a strategic shift for a social media company, driven by the need to keep hardware aligned with fast-evolving AI models. Meta's VP of Engineering, YJ Song, said the approach relies on modular chiplets, letting each generation incorporate the latest workload insights. The announcement also serves to counter recent reports that Meta was scaling back its custom silicon ambitions. Still, the enormous cost and complexity mean Meta isn't going all-in: the company recently signed multibillion-dollar deals to buy chips from Nvidia and AMD and will rent hardware from Google, indicating a hybrid strategy for the foreseeable future.
- Meta's MTIA 300 chip is in production for training content ranking algorithms, with three more inference chips (400, 450, 500) shipping through 2027.
- The chips are developed with Broadcom on RISC-V architecture and fabricated by TSMC, using an iterative, modular design to adapt to AI workloads.
- Despite the custom silicon push, Meta's recent multibillion-dollar deals with Nvidia and AMD confirm a hybrid hardware procurement strategy.
Why It Matters
This move signals a major shift as tech giants vertically integrate to control AI infrastructure, reducing reliance on Nvidia while accelerating model development cycles.