DeepSeek V4 Disrupts AI Market with Open-Weight Frontier Performance and Aggressive Pricing
Frontier AI performance now open-source with 1M token context and 90% lower costs.
DeepSeek (the Chinese AI lab spun out of the High-Flyer hedge fund) released DeepSeek V4, an open-weight LLM that rivals GPT-4o and Claude 3.7 Sonnet on benchmarks. It uses a Mixture of Experts architecture with roughly 37 billion active parameters per forward pass and supports a 1 million token context window. Weights are publicly available under a permissive commercial license. API pricing is dramatically lower than that of equivalent closed models, enabling enterprises to run, fine-tune, and deploy the model locally at scale.
- Frontier-level performance on par with GPT-4o and Claude 3.7 Sonnet, with open weights
- Mixture of Experts architecture: 685B total params, 37B active per inference, enabling low-cost deployment
- 1 million token context window, permissive commercial license, and dramatically lower API pricing than closed models
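The parameter figures above explain the low-cost deployment claim: in a Mixture of Experts model, per-token compute scales with the parameters activated per forward pass, not the total count. A back-of-the-envelope sketch using the reported numbers (the ~2 × active-parameters FLOPs-per-token rule of thumb for decoder-only transformers is a common approximation, not a figure from the release):

```python
# Back-of-the-envelope: why MoE inference is cheap relative to total model size.
# Parameter counts come from the release notes above; the FLOPs estimate
# (~2 * N_active per generated token) is a standard rough approximation.

TOTAL_PARAMS = 685e9   # 685B total parameters
ACTIVE_PARAMS = 37e9   # ~37B activated per forward pass

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
flops_per_token = 2 * ACTIVE_PARAMS  # rough decoder-only estimate

print(f"active fraction: {active_fraction:.1%}")                 # ~5.4%
print(f"~{flops_per_token / 1e9:.0f} GFLOPs per generated token")
```

Only about 5% of the network is exercised on any given token, which is why serving costs track the 37B active parameters rather than the 685B total.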
Why It Matters
Democratizes frontier AI by making it accessible, affordable, and controllable for enterprises and developers.