ChatGPT Criminals: AI Scams Hit Dating Apps and Impersonate Lawyers
AI-generated fluency and fake credentials are making old scams more convincing and harder to detect.
OpenAI's internal threat intelligence findings, reported by Business Insider, reveal a troubling trend: criminals are systematically using ChatGPT to scale and polish traditional scams, particularly romance fraud and professional impersonation. The AI doesn't invent new crimes; it removes the traditional 'tells' that once flagged scams, such as awkward phrasing or inconsistent tone. This lets bad actors run many fluent, tailored conversations simultaneously, building trust faster and moving targets to private platforms like Telegram, where platform safety features no longer apply. That shift off-app is a critical hinge point: it gives scammers full control over the environment and over the escalation into financial demands.
In romance scams, actors use AI to create 'luxury' dating personas and supporting materials before pushing for payments via 'tasks.' In legal impersonation, clusters of accounts pose as law firms or even US law enforcement, using AI to generate fake credibility signals such as New York State Bar Association membership cards. The danger lies in manufacturing legitimacy just long enough to extract money or sensitive data from stressed individuals. The report stresses that while AI can write convincing words, verification remains the antidote: a quick video call, or an independent check of an attorney through a state bar directory, can instantly puncture an AI-polished facade. The rise of these tactics signals a new era of high-volume, low-friction social engineering that demands increased user vigilance.
- Scammers use ChatGPT to generate fluent messages and fake credentials, removing linguistic 'tells' that previously flagged fraud.
- The critical scam tactic is moving conversations from monitored apps to private platforms like Telegram to evade detection systems.
- OpenAI's report identified fake legal professionals using AI to create bogus bar association cards and law enforcement personas.
Why It Matters
AI is democratizing sophisticated social engineering, forcing professionals and consumers to adopt stricter verification habits to protect assets and data.