These Men Allegedly Profit Off Teaching People How to Make AI Porn
The men allegedly sold AI-generated fake nudes of real women, plus courses teaching others to do the same; the victims had no control over their images.
A lawsuit filed in Arizona alleges that Jackson Webb, Lucas Webb, and Beau Schultz built a lucrative operation around non-consensual AI-generated pornography. They allegedly scraped social media photos of unsuspecting young women, used AI tools like CreatorCore to create nude or scantily clad images, and marketed the resulting personas as 'AI influencers' on platforms like Fanvue. The men also sold online courses for $24.95/month on Whop, claiming to teach subscribers how to replicate the process and earn money. According to the suit, the CreatorCore platform had over 8,000 users and generated more than 500,000 images and videos, with the defendants allegedly making over $50,000 in a single month.
The victims, including plaintiff MG, discovered their likenesses being used without permission only after followers alerted them. MG described the images as indistinguishable from real photos, leaving her feeling powerless. Beyond the three main defendants, the lawsuit names 50 John Does, pursuing a network of individuals who allegedly used the Blueprint tutorials to prey on women who 'can't defend themselves.' Attorney Nick Brand called the prevalence of such schemes shocking, noting that these profiteers are exploiting everyday social media users for profit, turning AI into a tool for harassment and exploitation.
- Three Arizona men allegedly made as much as $50,000 in a single month by selling AI-generated nude images of real women without their consent.
- Their CreatorCore platform had 8,000+ subscribers and produced 500,000+ AI images, with courses on Whop for $24.95/month.
- Victims' photos were allegedly scraped from Instagram and TikTok; the lawsuit also names 50 unnamed defendants who allegedly used the taught methods.
Why It Matters
Highlights the urgent need for legal protections against non-consensual AI-generated imagery and digital identity theft.