A vulnerability in Microsoft 365 Copilot allowed attackers to trick the AI assistant into fetching and exfiltrating sensitive tenant data by hiding instructions in a document.
The AI then encoded the data into a malicious Mermaid diagram that, when clicked, sent the stolen information to an attacker’s server.
When Microsoft 365 Copilot was asked to summarize a specially crafted Office document, an indirect prompt injection payload caused it to run hidden steps, as reported by Researchers.
https://gbhackers.com/microsoft-365-copilot-flaw/