Meta sued over AI smart glasses' privacy claims after workers reviewed footage including nudity and sex
Contractors in Kenya reviewed sensitive user content, including nudity and sex, captured by the more than 7 million smart glasses sold in 2025.
Meta is facing a major privacy lawsuit in the United States following revelations that human contractors, working for a subcontractor in Kenya, manually reviewed sensitive footage captured by users of its Ray-Ban Meta AI smart glasses. The lawsuit, filed by the Clarkson Law Firm on behalf of two customers, alleges false advertising and violations of consumer protection laws, centering on Meta's marketing claim that the glasses were 'designed for privacy, controlled by you.' The plaintiffs argue this messaging led them to believe their data, including potentially intimate moments such as nudity or people using the toilet, would remain private rather than be subject to overseas human review. The UK's data regulator, the Information Commissioner's Office, has also opened an investigation into the matter.
The core technical issue is the data pipeline behind the glasses' AI features: when users share content with Meta AI, that media is sent for review to improve the system. Meta states this is explained in its Supplemental Terms of Service, which note that reviews may be 'automated or manual (human).' However, the complaint highlights that over seven million glasses were sold in 2025 and that users cannot opt out of this review process. Sources have also disputed Meta's claim that it consistently blurs faces in images before review. The case spotlights the growing tension between 'luxury surveillance' wearables and consumer privacy expectations, and could set a legal precedent for how AI-powered devices handle sensitive, always-on data collection.
- Lawsuit alleges Meta's 'designed for privacy' marketing for its Ray-Ban Meta AI glasses was false advertising.
- Contractors at a Kenya-based firm manually reviewed sensitive user footage, including nudity and sex, to train the AI.
- Over 7 million units were sold in 2025, and users reportedly cannot opt out of the human review data pipeline.
Why It Matters
The case sets a critical precedent for privacy and transparency in always-on AI wearables, with implications for how future products are designed and marketed.