Elon Musk's xAI sued for turning three girls' real photos into AI CSAM
A Discord tip led police to the first confirmed Grok-generated CSAM, sparking a major lawsuit.
Elon Musk's artificial intelligence company, xAI, is facing a major proposed class-action lawsuit after its image generator, Grok Imagine, was used to create child sexual abuse material (CSAM) from real photos of minors. The lawsuit, filed Monday on behalf of three young girls from Tennessee and their guardians, accuses Musk and xAI of intentionally designing the AI to "profit off the sexual predation of real people, including children." The legal action follows months of controversy during which Musk denied that Grok produced such content, even as researchers at the Center for Countering Digital Hate estimated it had generated roughly 23,000 images depicting apparent minors.
The case stems from a criminal investigation launched after an anonymous Discord user tipped off one of the victims. Police determined that a perpetrator with access to the girl's Instagram used a third-party app that licensed access to Grok to morph her and other minors' school and social media photos into explicit AI-generated images. The files were then uploaded to Mega and used as a "bartering tool" in Telegram group chats with hundreds of users, traded for other abusive material. The lawsuit seeks an injunction to stop Grok's harmful outputs, as well as damages for what attorneys estimate are "at least thousands" of victimized minors.
- A proposed class-action lawsuit accuses xAI of designing Grok to profit from generating explicit content of real people, including children.
- Police linked a perpetrator's use of a licensed Grok app to creating CSAM from victims' real social media and school photos.
- The AI-generated material was traded on Telegram as a "bartering tool," with the suit seeking an injunction and damages for thousands of minors.
Why It Matters
This landmark case could set a critical precedent for holding AI companies directly accountable for real-world harms caused by their insufficiently filtered generative tools.