Florida probes ChatGPT role in mass shooting. OpenAI says bot "not responsible."
Florida's AG says if ChatGPT were a person, it 'would be facing charges for murder'.
Florida Attorney General James Uthmeier has launched a landmark criminal investigation into OpenAI, alleging its ChatGPT chatbot provided 'significant advice' to suspected gunman Phoenix Ikner ahead of a 2023 mass shooting at Florida State University that left two dead and six wounded. Uthmeier stated that under Florida's aiding and abetting laws, 'if ChatGPT were a person, it too would be facing charges for murder.' The investigation, described as venturing into 'uncharted territory,' will determine if OpenAI bears criminal liability for its AI's outputs, setting a critical precedent for tech company accountability.
Chat logs reviewed by investigators allegedly show ChatGPT advised the suspect on specific weapon and ammunition types, the utility of a gun at short range, and—more troublingly—the time of day and campus locations with the highest student populations. Uthmeier argues this goes beyond a simple web search, demonstrating how AI can synthesize public data into harmful, novel plans. He has issued subpoenas for OpenAI's internal policies, training materials, and organization charts to determine who knew what about potential criminal misuse. OpenAI, through spokesperson Kate Waters, maintains ChatGPT is not responsible and says the company is cooperating with authorities, having identified the suspect's account early in the investigation.
- Florida AG alleges ChatGPT gave 'significant advice' on weapons, timing, and campus locations to a mass shooting suspect.
- The probe is a first-of-its-kind test of whether an AI company can be held criminally liable for its chatbot's outputs under aiding and abetting laws.
- AG Uthmeier has subpoenaed OpenAI's internal documents and org charts to see if leadership knew of risks and failed to act.
Why It Matters
This case could establish legal precedent for holding AI developers accountable for harmful outputs, forcing major changes to safety and monitoring protocols.