Media & Culture

Grammarly Is Facing a Class Action Lawsuit Over Its AI ‘Expert Review’ Feature

Journalist Julia Angwin sues after Grammarly's AI cloned her and 100+ experts without permission.

Deep Dive

Superhuman, the company behind Grammarly, is facing a class action lawsuit over its now-disabled 'Expert Review' AI feature. The suit, filed in New York federal court by investigative journalist Julia Angwin, alleges that the tool used the names and identities of hundreds of journalists, authors, and public figures (including Stephen King and Neil deGrasse Tyson) to present AI-generated editing suggestions as if they came from those experts, none of whom consented to this use. The complaint seeks to stop the practice and claims damages for the plaintiff class exceeding $5 million.

Superhuman had already discontinued the feature following public backlash, and a company statement acknowledged that it "missed the mark" by not giving experts control over how they were represented. The feature used a large language model to simulate critiques in the style of a chosen expert, accompanied by a disclaimer that the individual was not directly involved. However, Angwin's attorney argues that the practice violates New York and California laws barring commercial use of a person's name and likeness without permission.

The case underscores a growing tension between AI innovation and individual rights. Angwin, known for her reporting on technology and privacy, described her shock at finding a 'digital doppelgänger' offering writing advice she never gave, some of which she says made the text worse. The lawsuit frames itself as a broader defense of professionals whose hard-earned skills and reputations are being appropriated by AI systems without consent or compensation. A ruling could set a precedent for how companies deploy generative AI features that trade on real people's identities.

Key Points
  • Class action lawsuit alleges Grammarly's AI 'Expert Review' misappropriated names/likenesses of 100+ experts without consent, seeking over $5M in damages.
  • The feature, now disabled, used an LLM to generate editing advice presented as coming from figures like Stephen King and journalist Julia Angwin.
  • The case tests existing right-of-publicity laws against AI commercialization, as Superhuman admitted it "missed the mark" on expert control.

Why It Matters

Sets a legal precedent for AI companies using real identities, protecting professionals from unauthorized digital replication and commercial exploitation.