GLM-5 Officially Released
China's new open-source AI model just doubled in size, targeting complex engineering tasks.
Zhipu AI has officially released GLM-5, a massive open-source model targeting complex systems engineering and long-horizon agentic tasks. The parameter count scales from GLM-4.5's 355B to 744B (with 40B active per token), and the pre-training corpus grows from 23T to 28.5T tokens. A key upgrade is the integration of DeepSeek Sparse Attention (DSA), which significantly reduces deployment costs while preserving long-context capability. The model is available on Hugging Face and GitHub.
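The active/total split above implies that only a small slice of the model's weights fire on any given token, which is what keeps inference cost far below what 744B dense parameters would suggest. A minimal sketch of the arithmetic (the sparse-routing framing is an inference from the active/total split; the announcement does not detail the routing scheme):

```python
# Headline figures from the GLM-5 release announcement.
TOTAL_PARAMS_B = 744   # total parameters, in billions
ACTIVE_PARAMS_B = 40   # parameters active per token, in billions

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active per token: {active_fraction:.1%} of total weights")
# → Active per token: 5.4% of total weights
```

So despite more than doubling GLM-4.5's total size, per-token compute is governed by the 40B active parameters, not the full 744B.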
Why It Matters
A 744B-parameter open-weight release, with deployment costs cut by sparse attention, pushes the frontier of what openly available models can do on complex, real-world engineering tasks.