OpenAI’s surveillance language has many potential loopholes, and the company can do better
Legal experts identify ambiguous 'intentional' wording that could permit mass surveillance of U.S. persons.
OpenAI's recently amended contract with the U.S. Department of Defense contains surveillance language that legal experts say creates significant loopholes, despite the company's claims that it puts the issue to rest. The new clauses prohibit 'intentional' use of AI systems for domestic surveillance of U.S. persons and 'deliberate' tracking through commercially acquired data, but analysts note these terms align with intelligence community definitions that have historically permitted mass incidental collection. Experts, including government procurement law specialist Jessica Tillipman, warn that the 'intentional' language leaves room for broad interpretation, while tech reporter Mike Masnick notes that OpenAI has effectively adopted the intelligence community's redefined dictionary of surveillance terms.
Legal analysis reveals multiple ambiguities that could permit surveillance activities under the contract's current wording. The restriction applies only to 'intentional' and 'deliberate' actions, which in intelligence community parlance don't preclude incidental collection of Americans' communications during foreign targeting. Experts emphasize that without seeing the full contract, which may contain modifying clauses, it's impossible to assess the true limitations. OpenAI employees are being advised to demand clearer language that explicitly prohibits the activities they're concerned about, as the current wording leaves room for interpretations that could enable surveillance contrary to public understanding.
- Contract prohibits only 'intentional' and 'deliberate' surveillance, terms that intelligence agencies have historically interpreted broadly
- Legal experts warn the language aligns with definitions that permit incidental collection of Americans' communications
- Analysis suggests employees should demand clearer protections before AI systems are deployed under the agreement
Why It Matters
Sets precedent for AI surveillance contracts and could enable mass data collection under technical compliance.