AI #166: Google Sells Out
Google's Gemini now fully compliant with military use, no safety exceptions allowed
In a move that stunned the AI safety community, Google signed a contract with the Department of War that goes far beyond typical government partnerships. The agreement explicitly permits 'all lawful use' of Gemini, with no functional exceptions whatsoever. Worse, Google agreed to modify or remove any safety barriers upon request, effectively handing the military full control over its AI's guardrails. The concession was voluntary, made without any deadline or external pressure, and critics say it completely sells out the company's earlier 'Don't Be Evil' ethos. The decision contrasts sharply with Anthropic, which continues to face supply chain risk designations and token allocation restrictions from the White House for its refusal to fully comply, even as its Claude Mythos model is widely deployed.
This week also saw OpenAI release GPT-5.5, which Zvi calls 'an excellent model' and which makes OpenAI competitive with Anthropic's top public offering for the first time since late last year. DeepSeek launched v4, an impressive feat of engineering for efficient 1M-token context, though explicitly not a frontier model or a breakthrough toward AGI. Other notable items: Anthropic tests BioMysteryBench with slow progress, Bernie Sanders convened experts on existential AI risk, and China blocked Meta's purchase of Manus. The broader theme is a rapid militarization of AI through voluntary corporate compliance, with Google's deal serving as the starkest example of how safety principles can be abandoned when the government asks.
- Google signed a Department of War contract with no functional exceptions and agreed to remove all safety barriers on request, voluntarily and without a deadline
- GPT-5.5 makes OpenAI competitive with Anthropic's top model for the first time since late 2024
- DeepSeek v4 offers efficient 1M-token context processing but is not a frontier model or a breakthrough toward AGI
Why It Matters
Google's total safety surrender sets a dangerous precedent for military use of AI, undermining industry-wide guardrails.