Arm’s first CPU ever will plug into Meta’s AI data centers later this year
Arm claims its new AGI CPU delivers double the performance per watt of x86 chips, with Meta as its lead partner.
In a major strategic shift, UK-based Arm, historically a licensor of chip designs, has announced its first-ever in-house CPU. The new chip, dubbed the Arm AGI CPU, is specifically engineered for AI inference—the process of running trained AI models in production. Meta has been revealed as the lead partner and co-developer, with plans to integrate the CPUs into its data centers later this year. This move comes as Meta has reportedly faced challenges launching its own custom AI silicon.
The technical specifications reveal a powerhouse built for efficiency. The AGI CPU, based on Arm's Neoverse platform, can be configured with up to 136 cores per CPU and 64 CPUs per air-cooled server rack. Arm claims this design delivers double the performance per watt compared to traditional x86 CPUs while reducing memory bottlenecks. The partnership aims for "multiple generations" of data center CPUs, which Meta intends to use alongside hardware from other vendors like Nvidia and AMD.
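For context, the configuration figures above can be turned into a quick back-of-the-envelope calculation. The core and CPU counts come from Arm's announcement; the x86 baseline below is a hypothetical illustration of what "double the performance per watt" implies, not a figure from Arm or Meta:

```python
# Back-of-the-envelope check of the rack-level figures reported above.
# The 136-core and 64-CPU numbers come from the announcement; the x86
# baseline below is an illustrative assumption, not vendor data.

cores_per_cpu = 136          # maximum cores per AGI CPU (per the article)
cpus_per_rack = 64           # CPUs per air-cooled rack (per the article)

total_cores = cores_per_cpu * cpus_per_rack
print(f"Cores per rack: {total_cores}")   # 8704 cores in one air-cooled rack

# "Double the performance per watt" means matching x86 throughput at half
# the power. With a hypothetical x86 baseline of 1.0 unit of inference
# throughput per watt:
x86_perf_per_watt = 1.0      # assumed baseline, for illustration only
agi_perf_per_watt = 2.0 * x86_perf_per_watt

# Power needed for the same throughput shrinks by half:
power_ratio = x86_perf_per_watt / agi_perf_per_watt
print(f"Relative power for equal throughput: {power_ratio:.0%}")  # 50%
```

At data-center scale, that halving of power per unit of inference throughput is the headline economic argument, since electricity and cooling dominate operating costs for air-cooled racks.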
Beyond Meta, the announcement signals Arm's ambition to become a direct supplier in the competitive AI infrastructure market. Other announced customers include Cerebras, Cloudflare, and OpenAI. Arm's cloud AI head, Mohamed Awad, positioned the chip as an option for companies that cannot afford to develop their own custom processors. The deal's financial terms and volume were not disclosed, but the partnership marks a significant new front in the battle for AI chip supremacy, especially following recent legal tensions between Arm and other licensees like Qualcomm.
- Arm's first proprietary CPU, the AGI CPU, features up to 136 cores and claims 2x the performance per watt of x86 chips.
- Meta is the lead partner and co-developer, planning multi-generational deployment in its AI data centers starting later this year.
- The chip targets AI inference workloads, aiming to serve companies that cannot develop custom AI silicon in-house.
Why It Matters
This launch disrupts the AI chip market by giving cloud giants a powerful, efficient alternative to x86 processors and bespoke custom silicon, intensifying competition across AI infrastructure.