Microsoft's Copilot Accessed Confidential Outlook Emails

Published on February 23, 2026 | Translated from Spanish

Microsoft has fixed a bug in Copilot Chat for Microsoft 365. The AI assistant could summarize content from Outlook emails labeled as confidential, even though they retained their access restrictions. The company clarifies that there was no unauthorized access, but the incident exposes a design flaw: the AI crossed a privacy boundary that the user took for granted.

Microsoft's Copilot summarized confidential Outlook emails, crossing an unexpected privacy boundary for users.

The Nuance Between Technical Permissions and Processing Logic

The error lies in the distinction between having system permission to read data and the contextual decision to process it. Copilot, integrated with the Microsoft Graph API, respected access controls, but its logic of "everything queryable is summarizable" ignored the semantics of sensitivity labels and DLP (Data Loss Prevention) policies. The update fixes this by adjusting the model's context filters, so that the presence of certain labels or policies automatically excludes that content from responses.
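The filtering behavior described above can be sketched as a pre-processing step that runs before the assistant builds its context. This is a minimal, hypothetical illustration only: the function name, the message fields, and the label list are assumptions for the sketch, not Microsoft's actual API or policy names.

```python
# Hypothetical sketch: even when the caller has read permission, items
# carrying a restrictive sensitivity label or matching a DLP policy are
# excluded from the model's context before any summarization happens.

SENSITIVITY_BLOCKLIST = {"Confidential", "Highly Confidential", "Top Secret"}

def filter_context(messages):
    """Keep only messages that are safe to expose to the model."""
    allowed = []
    for msg in messages:
        if msg.get("sensitivity_label") in SENSITIVITY_BLOCKLIST:
            continue  # label semantics override raw read access
        if msg.get("dlp_policy_match"):
            continue  # any DLP policy hit also excludes the item
        allowed.append(msg)
    return allowed

inbox = [
    {"id": 1, "subject": "Lunch plans", "sensitivity_label": None,
     "dlp_policy_match": False},
    {"id": 2, "subject": "Merger terms", "sensitivity_label": "Confidential",
     "dlp_policy_match": False},
    {"id": 3, "subject": "Payroll export", "sensitivity_label": None,
     "dlp_policy_match": True},
]

print([m["id"] for m in filter_context(inbox)])  # -> [1]
```

The key design point is that the filter keys on policy metadata, not on permissions: message 2 is readable by the user but is still withheld from the AI's context.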

Your Confidential AI: Reading Your Secrets for Your Own Good

It's the classic "yes, I can read your diary, but the question is whether I should." Copilot, in its eagerness to be helpful, decided that an email marked "Top Secret" was just another text to summarize to save you time. It's like a robot butler with the key to your safe reciting the contents of your will to you every morning, convinced it's an inspiring poem. Digital trust now includes expecting the AI to show a bit of tact as well.