Microsoft Copilot could have been hacked using very low-tech methods

Cybersecurity researchers have found a way to force Microsoft 365 Copilot to collect sensitive data such as passwords and send it to malicious third parties using ‘ASCII smuggling’

The ASCII smuggling attack required three things: Copilot for Microsoft 365 being able to read the contents of an email or an attached document; having access to additional programs, such as Slack; and being able to “smuggle” the prompt with “special Unicode characters that mirror ASCII but are not actually visible in the user interface.”

As the researchers of Embrace the Redwho found the bug, explain, Microsoft 365 Copilot can be told to read and analyze the contents of incoming email messages and attachments. If that email or attachment tells Microsoft 365 Copilot to look for passwords, email addresses, or other sensitive data in Slack or elsewhere, it will do what it’s told.

Hidden prompts and invisible texts

If such a malicious prompt is eventually hidden in an attachment or email using special Unicode characters that make the notification invisible to the victim, the victim can unknowingly instruct their AI chatbot to hand over sensitive data to malicious third parties.

To prove their point, the researchers shared exploit demos with Microsoft, showing how sensitive data such as sales figures and multifactor authentication (MFA) codes could be exfiltrated and then decrypted.

“An email is not the only delivery method for such an exploit. Force sharing documents or RAG retrieval can be used in a similar manner as prompt injection angles,” the report concludes.

In the paper, the researchers recommended that Copilot 365 stop interpreting or rendering Unicode Tags Code Points.

“Rendering clickable hyperlinks will enable phishing and scamming (and data exfiltration),” the report concludes. “Automated Tool Invocation is problematic until solutions are developed for prompt injection, as it allows an adversary to invoke tools and (1) introduce sensitive information into the prompt context and (2) likely invoke actions.”

Microsoft has since addressed the issue.

More from Ny Breaking

Related Post