Microsoft Says Office Bug Exposed Customers’ Confidential Emails To Copilot AI

A Microsoft Copilot bug let AI access confidential emails for weeks. Here's what happened, who's affected, and how to protect your data.
Matilda
A critical bug in Microsoft 365 Copilot recently allowed the AI assistant to access and summarize confidential emails without user consent. The issue, active since January 2026, bypassed data loss prevention policies designed to protect sensitive information. Microsoft has confirmed the flaw and begun rolling out a fix, but organizations need to understand the scope and take immediate steps to secure their data. If you use Copilot Chat in Office apps, here's what you need to know about the exposure, the resolution, and how to verify your account's safety. Administrators should check their tenant logs for reference code CW1226324 to assess potential impact.

Credit: Rafael Henrique/SOPA Images/LightRocket / Getty Images

What the Microsoft Copilot Bug Did to Confidential Emails

The Microsoft Copilot bug specifically affected how the AI processed emails marked with confidential labels in Microsoft 365 environments. Instead of respecting data loss prevention rules, Copilot Chat could…