Many security researchers have highlighted the vulnerabilities of chatbots. Michael Bargury, CEO of Zenity, has demonstrated five ways to manipulate the version of Copilot integrated into Microsoft 365 into performing unwanted tasks, such as stealing company data. The AI assistant can even be turned into an automatic phishing machine.
Copilot becomes a spear phishing machine
Copilot for Microsoft 365 lets you speed up many activities thanks to its access to company data, and offers a higher level of security than the free version. During the recent Black Hat conference in Las Vegas, Bargury presented five proof-of-concept attacks that allow the chatbot to be exploited for illicit purposes.
Using a tool called LOLCopilot, Copilot can be transformed into a spear phishing machine. All that is needed is a compromised corporate account. A cybercriminal could impersonate an employee and use the chatbot to generate emails in that employee's writing style, attaching links to infected sites or malware. This is possible because Copilot has access to files, emails, and messages exchanged within the company.
The researcher also showed how Copilot can be asked to reveal sensitive data, such as employee salaries. The chatbot can also be used to manipulate a financial transaction or obtain information about upcoming financial results. Following the report, Microsoft announced that it will fix all of the discovered security issues.