Microsoft says Office bug exposed customers’ confidential emails to Copilot AI
A security flaw let Copilot summarize sensitive emails, bypassing data loss prevention policies.
Deep Dive
Microsoft confirmed a bug (tracked as CW1226324) in Microsoft 365 Copilot Chat that allowed the AI assistant to read and summarize customers' confidential emails for weeks, beginning in January. The flaw bypassed data loss prevention (DLP) policies designed to block sensitive information from being ingested by the AI. Microsoft began rolling out a fix in February. The incident follows the European Parliament's decision to block similar AI tools over data security concerns.
Why It Matters
The incident erodes trust in enterprise AI by showing that an AI assistant can silently circumvent the very controls organizations rely on to protect confidential business communications.