Article Details
Scrape Timestamp (UTC): 2024-06-04 14:03:41.009
Source: https://www.bleepingcomputer.com/news/security/four-steps-to-use-microsoft-copilot-genai-securely/
Original Article Text
Four Steps to Use Microsoft Copilot GenAI Securely

Article written by Hananel Livneh, Head of Product Marketing at Adaptive Shield.

Microsoft Copilot can supercharge productivity. The Microsoft 365 GenAI assistant integrates with Word, PowerPoint, Excel, Teams, and other applications within the Microsoft suite, performing the roles of analyst, copywriter, notetaker, and designer. It also produces high-quality content in a fraction of the time it would take you to do so. It's a dream come true for most employees.

However, those benefits come with a significant caveat for enterprises. Can Microsoft Copilot be trusted not to access or share confidential information? At the core of that question is this one: is Microsoft Copilot, or any other GenAI assistant, secure?

If you watched the Indy 500 over Memorial Day weekend, you already understand that only a skilled operator should be at the wheel of sensitive, dangerous equipment running at high speed. An inexperienced driver would not have been capable of safely passing Pato O'Ward at over 220 MPH, as Josef Newgarden did. While no one is at risk of physical harm when using GenAI, the principle is the same. Using hyper-productive tools to generate a continual stream of materials from a vast reservoir of corporate information has a dark side: the ease with which data can leak and fall into the wrong hands is remarkably high.

Data Access is Only a Query Away

Microsoft Copilot generates materials based on the data it can access within the Microsoft product suite. Data that was once hard to locate can now be correlated with hundreds of data points and is only a query away. If an employee doesn't realize the sensitivity of a response, or trusts Microsoft Copilot without carefully reading through the response, sensitive customer and competitor information can be shared with outsiders. To borrow once more from the Indy 500 analogy, users need proper guardrails in place to prevent GenAI-driven data leakage.

Copilot relies on existing Microsoft 365 access controls. If users have broad access to sensitive data, Copilot does as well, and can expose it (a permission-audit sketch follows this article text). Companies should also label sensitive files and folders to prevent Copilot from accessing them.

Any Data is Fair Game for GenAI

There is no question that Microsoft Copilot can improve employee performance. However, organizations that refrain from implementing a security structure around its usage do so at their peril. Copilot access should only be granted to employees who require it for their jobs, and those employees must be properly trained on the risks of sending materials to external users without reading them carefully.

It bears repeating that Copilot's access mirrors user access. Employees are often granted wide swaths of permissions, and no one anticipates that an employee will dig into long-lost files stored on a drive and use that information. GenAI, however, treats anything accessible as fair game. To truly prevent data leaks through Copilot, admins must be far more precise when defining user access and roles for files stored in corporate drives. Otherwise, you will always be at risk of leaking sensitive information.

To summarize, for companies to feel secure using Microsoft Copilot, they need to follow these guidelines:

1. Grant Copilot access only to employees who require it for their jobs.
2. Train those employees on the risks of sharing generated materials externally without careful review.
3. Label sensitive files and folders so Copilot cannot expose them.
4. Define user access and roles precisely for files stored in corporate drives.

I hope this article helps you better understand the importance of protecting your Microsoft Copilot deployment and allows you to use it safely.

Learn more about securing GenAI in SaaS.

Sponsored and written by Adaptive Shield.
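To make the access-control point concrete, here is a minimal sketch (not from the article) of how an admin might audit a drive for overly broad sharing before rolling out Copilot. It uses the documented Microsoft Graph endpoints for listing drive items and their permissions; the access token and drive ID are placeholders you would supply, and error handling is kept to a minimum.

    # Sketch: walk a Microsoft 365 drive via Microsoft Graph and flag items shared
    # through anonymous or organization-wide links -- access that Copilot inherits.
    # ACCESS_TOKEN and DRIVE_ID are placeholders, not values from the article.
    import requests

    GRAPH = "https://graph.microsoft.com/v1.0"
    ACCESS_TOKEN = "<token with Files.Read.All or Sites.Read.All>"  # placeholder
    DRIVE_ID = "<target drive id>"                                  # placeholder
    HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

    def list_children(item_id=None):
        # The drive root has a dedicated path; other folders are addressed by item id.
        if item_id is None:
            url = f"{GRAPH}/drives/{DRIVE_ID}/root/children"
        else:
            url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/children"
        while url:  # follow @odata.nextLink paging until the listing is exhausted
            resp = requests.get(url, headers=HEADERS, timeout=30)
            resp.raise_for_status()
            data = resp.json()
            yield from data.get("value", [])
            url = data.get("@odata.nextLink")

    def broad_permissions(item_id):
        # Permissions granted through anonymous or organization-wide sharing links are
        # the grants most likely to expose content to users who never needed it.
        url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/permissions"
        resp = requests.get(url, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        return [p for p in resp.json().get("value", [])
                if p.get("link", {}).get("scope") in ("anonymous", "organization")]

    def audit(item_id=None, path=""):
        for item in list_children(item_id):
            item_path = f"{path}/{item['name']}"
            flagged = broad_permissions(item["id"])
            if flagged:
                scopes = [p["link"]["scope"] for p in flagged]
                print(f"BROAD ACCESS: {item_path} -> {scopes}")
            if "folder" in item:  # recurse into subfolders
                audit(item["id"], item_path)

    if __name__ == "__main__":
        audit()

The flagged items are the ones whose permissions (or sensitivity labels) most likely need attention before Copilot can query them on behalf of any user in the tenant.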
Daily Brief Summary
Microsoft Copilot boosts employee productivity by integrating with Microsoft 365 tools like Word, PowerPoint, and Excel, acting as an analyst, copywriter, notetaker, and designer.
While Copilot enhances efficiency, there is a significant risk that it could unintentionally access and share sensitive corporate information.
Copilot generates content based on the data it can access within the Microsoft suite, potentially exposing sensitive data if not properly controlled.
Organizations must implement stringent access controls and label sensitive data to prevent unwanted data exposure through Copilot (a labeling sketch follows this list).
Employees with Copilot access should receive training on the risks of inadvertent data sharing and the importance of reviewing materials before sharing externally.
Admins need to rigorously define user access and roles for files on corporate drives to mitigate the risk of data leaks through GenAI use.
Enterprises should take careful measures to establish security around GenAI tools like Microsoft Copilot to maintain confidentiality and data integrity in their operations.
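As a companion sketch for the labeling recommendation above (not part of the original article), the snippet below applies a Microsoft Purview sensitivity label to a single file through Microsoft Graph's assignSensitivityLabel action. The drive ID, item ID, label GUID, and token are placeholders, and the action is a metered Graph API that must be enabled and licensed in the tenant, so treat this as an illustration of the approach rather than a drop-in tool.

    # Sketch: stamp a Purview sensitivity label on a drive item so that existing
    # data-protection policies apply to content Copilot can reach.
    # All IDs and the token below are placeholders (assumptions, not from the article).
    import requests

    GRAPH = "https://graph.microsoft.com/v1.0"
    ACCESS_TOKEN = "<token with Files.ReadWrite.All>"               # placeholder
    DRIVE_ID = "<target drive id>"                                  # placeholder
    ITEM_ID = "<file item id>"                                      # placeholder
    LABEL_ID = "<sensitivity label GUID from Microsoft Purview>"    # placeholder

    def label_item(drive_id, item_id, label_id):
        """Ask Microsoft Graph to assign a sensitivity label to a drive item."""
        url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/assignSensitivityLabel"
        body = {
            "sensitivityLabelId": label_id,
            "assignmentMethod": "standard",
            "justificationText": "Label applied by security review script",
        }
        resp = requests.post(url, json=body,
                             headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
                             timeout=30)
        resp.raise_for_status()
        # Graph typically processes labeling asynchronously; the Operation-Location
        # header, when present, points at a monitor URL for the long-running job.
        return resp.headers.get("Operation-Location")

    if __name__ == "__main__":
        print(label_item(DRIVE_ID, ITEM_ID, LABEL_ID))

In practice, admins usually drive labeling at scale through Purview auto-labeling policies rather than ad hoc scripts; the point here is simply that labels, like permissions, are controls Copilot inherits rather than something it manages itself.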