Some Information Protection Concerns to Note Before Deployment Begins
Copilot for Microsoft 365 redefines the way people work with the Office apps, or so Microsoft promises. The AI assistant also introduces new challenges for administrators in terms of safeguarding sensitive information. As administrators, our goal is to ensure that Copilot usage is safe, secure, and compliant with organizational standards.
Perhaps you’ve heard that Copilot for Microsoft 365 can access content that individual users have (at minimum) view access to. For example, if a user can view a SharePoint document, Copilot, working under that user’s context, can also process it. While this is true, it’s an oversimplification: with protected content, view access for users doesn’t always guarantee access for Copilot.
This article examines how Copilot for Microsoft 365 handles protected content, including how it interacts with Microsoft Purview sensitivity labels. Hopefully, this information will help administrators configure information protection controls and understand their impact in relation to Copilot for Microsoft 365.
What Does Copilot for Microsoft 365 Have Access To?
Copilot for Microsoft 365 gains access to content in Microsoft 365 via Microsoft Graph (Figure 1) and is designed to adhere to the information access restrictions within a tenant. This Microsoft article explains in depth how Copilot for Microsoft 365 works, including how it accesses content.

Administrators and end users control access to information in Microsoft 365 through permissions and privacy settings (SharePoint Online and OneDrive permissions, Microsoft 365 Group membership and privacy settings, i.e., Public or Private). Optionally, organizations can use encryption (Microsoft Information Rights Management (IRM), Microsoft Purview Message Encryption, S/MIME email encryption, and Microsoft Office password protection), but Copilot doesn’t support every method that an organization might use to protect data.
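To see what that access boundary means in practice, here is a minimal sketch (Python, using the msal and requests packages) that runs a Microsoft Graph search as a signed-in user. The hits are security-trimmed to that user’s permissions, which is the same boundary Copilot’s Graph retrieval operates within. The tenant ID, client ID, scopes, and query string are placeholder assumptions for an app registration you would create yourself; this is not a Copilot API.

```python
# Minimal sketch: query Microsoft Graph Search as a signed-in user.
# Results are security-trimmed to that user's permissions, the same
# access boundary Copilot for Microsoft 365 works within.
# Assumes an app registration (CLIENT_ID/TENANT_ID are placeholders) with
# delegated Files.Read.All and Sites.Read.All permissions granted.
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-client-id>"

app = msal.PublicClientApplication(
    CLIENT_ID, authority=f"https://login.microsoftonline.com/{TENANT_ID}"
)

# Device code flow keeps the example simple; any delegated flow works.
flow = app.initiate_device_flow(scopes=["Files.Read.All", "Sites.Read.All"])
print(flow["message"])  # follow the printed instructions to sign in
token = app.acquire_token_by_device_flow(flow)

search_request = {
    "requests": [
        {
            "entityTypes": ["driveItem"],
            "query": {"queryString": "project budget"},
        }
    ]
}

response = requests.post(
    "https://graph.microsoft.com/v1.0/search/query",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    json=search_request,
)
response.raise_for_status()

# Only items the signed-in user can already open appear in the hits.
for container in response.json().get("value", []):
    for hit_group in container.get("hitsContainers", []):
        for hit in hit_group.get("hits", []):
            resource = hit.get("resource", {})
            print(resource.get("name"), "-", hit.get("summary", "")[:80])
```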
Let’s explore the encryption methods that present challenges for Copilot for Microsoft 365…
Encrypting Content in Microsoft 365 with Copilot for Microsoft 365
To make sure that Copilot for Microsoft 365 and encryption work smoothly together, administrators should consider which methods are used to encrypt content in Microsoft 365 – some commonly used encryption controls are partially or completely unsupported by Copilot and will impact its ability to return content.
The Azure Rights Management service (Azure RMS) is fully supported by Copilot for Microsoft 365. While Microsoft Information Rights Management (IRM), Microsoft Purview Message Encryption (OME), and sensitivity labels all use Azure RMS, organizations should focus on OME and sensitivity labels as the basis for protecting sensitive material in tenants where Copilot is active.
S/MIME encryption and Double Key Encryption (DKE) are completely unsupported by Copilot for Microsoft 365: Copilot doesn’t handle S/MIME-encrypted email, and DKE-encrypted data is not accessible at rest to any Microsoft 365 service. Copilot won’t return content protected by S/MIME or DKE and can’t be used in apps while that protected content is open.
Microsoft Office password protection is only partially supported by Copilot for Microsoft 365. Password-protected documents can only be accessed by Copilot while they are open in the app. For example, users can open a password-protected Excel workbook and use Copilot within Excel, but they can’t open Microsoft Word and use Copilot to create a document from the data that’s open in Excel. Administrators can use GPOs or Intune to prevent users from password-protecting Office files.
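If you want a sense of how much password-protected Office content already sits in a file share before Copilot arrives, one quick heuristic is that modern password-protected Office files are stored as OLE compound files containing an EncryptionInfo stream, whereas normal .docx/.xlsx/.pptx files are plain ZIP archives. The sketch below uses the open-source olefile package on an assumed local folder path; legacy binary formats (.doc/.xls/.ppt) use different markers and are not covered here.

```python
# Minimal sketch: flag password-protected (encrypted) Office files in a folder.
# Modern .docx/.xlsx/.pptx files are ZIP archives; password-protected ones are
# wrapped in an OLE compound file that contains an "EncryptionInfo" stream.
# Requires the open-source "olefile" package (pip install olefile).
import pathlib
import olefile

MODERN_OFFICE_EXTENSIONS = {".docx", ".xlsx", ".pptx"}

def is_password_protected(path: pathlib.Path) -> bool:
    """Return True if the file looks like an encrypted modern Office document."""
    if not olefile.isOleFile(str(path)):
        return False  # plain OOXML (ZIP) files are not password-protected
    ole = olefile.OleFileIO(str(path))
    try:
        return ole.exists("EncryptionInfo")
    finally:
        ole.close()

# The folder path is a placeholder for whatever location you want to review.
for file in pathlib.Path("C:/Data/ToReview").rglob("*"):
    if file.suffix.lower() in MODERN_OFFICE_EXTENSIONS and is_password_protected(file):
        print(f"Password-protected: {file}")
```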
How Copilot for Microsoft 365 Works with Sensitivity Labels
Sensitivity labels support the classification and protection of Office and PDF content (read this article for a deeper dive). Protected documents carry metadata for the label that protects the document, including the usage rights granted to different users. When a user opens a document, the Information Protection service grants the usage rights assigned to that user based on the rights definitions in the label.
For Copilot to use information to ground a prompt – whether a user provides a reference document for explicit grounding, or Copilot finds the information via Graph queries – labels must grant the VIEW and EXTRACT usage rights to the user on whose behalf Copilot is processing a protected document.
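Conceptually, this is a simple gate, as the toy Python model below illustrates. The ProtectedDoc structure and the rights strings are hypothetical stand-ins for what a label grants a user; they are not part of any Microsoft SDK.

```python
# Illustrative only: a toy model of the usage-rights gate described above.
# ProtectedDoc and its rights strings are hypothetical, not a Microsoft SDK.
from dataclasses import dataclass

@dataclass
class ProtectedDoc:
    name: str
    usage_rights: set[str]  # rights the label grants to the current user

def copilot_can_ground_on(doc: ProtectedDoc) -> bool:
    """Copilot can use a protected item to ground a prompt only when the
    user holds both the VIEW and EXTRACT usage rights for it."""
    return {"VIEW", "EXTRACT"}.issubset(doc.usage_rights)

docs = [
    ProtectedDoc("Budget.xlsx", {"VIEW", "EXTRACT", "EDIT"}),
    ProtectedDoc("Board-minutes.docx", {"VIEW"}),
]
for doc in docs:
    verdict = "usable for grounding" if copilot_can_ground_on(doc) else "not usable for grounding"
    print(f"{doc.name}: {verdict}")
```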
In a couple of scenarios, Copilot can process content with only the VIEW usage right enabled:
- If a user applies restrictions to content with Information Rights Management, and the content also has a sensitivity label that doesn’t apply encryption, the content can be returned by Copilot in chat.
- In Copilot in Edge and Copilot in Windows, Copilot can reference encrypted content from the active browser tab in Edge (including within Office web apps). To restrict this completely, configure Endpoint DLP to block the “Copy to clipboard” activity for sensitive content. Even with this setting configured to “Block” or “Block with override”, copying isn’t blocked when the destination is within the same Microsoft 365 Office app. The result: users cannot leverage Copilot in Edge or Copilot in Windows to circumvent restrictions when they are only granted VIEW usage rights to a document, and Copilot for Microsoft 365 is not hindered.
Copilot uses sensitivity labels for the output it produces. In Copilot for Microsoft 365 chat conversations, sensitivity labels are displayed for citations and items listed in responses (Figures 2 & 3).


Several scenarios don’t support sensitivity labels:
- Copilot can’t create PowerPoint presentations from encrypted files or generate draft content in Word from encrypted files.
- In chat messages with Copilot for Microsoft 365, the Edit in Outlook option doesn’t appear when reference content has a sensitivity label applied.
- Sensitivity labels and encryption applied to data from external sources, including Power BI data, are not recognized by Microsoft 365 Copilot chat. If this poses a security risk in your organization, you can disconnect connections that use Graph API connectors and disable plugins for Copilot for Microsoft 365 (see the sketch after this list for one way to review those connections), although this will prevent you from extending Copilot for Microsoft 365, limiting the value your organization gets from Copilot.
- Sensitivity labels that protect Teams meetings and chat are not recognized by Copilot (this doesn’t include meeting attachments or chat files, which are stored in OneDrive or SharePoint).
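Before disconnecting anything, it helps to know which Graph connectors are actually registered in the tenant. The sketch below lists them through the Microsoft Graph external connections endpoint; it assumes an app registration with the ExternalConnection.Read.All application permission, and the tenant ID, client ID, and client secret are placeholders.

```python
# Minimal sketch: list the Microsoft Graph connectors (external connections)
# registered in the tenant, to review before disconnecting any of them.
# Assumes an app registration with the ExternalConnection.Read.All application
# permission; TENANT_ID, CLIENT_ID, and CLIENT_SECRET are placeholders.
import msal
import requests

TENANT_ID = "<your-tenant-id>"
CLIENT_ID = "<your-app-client-id>"
CLIENT_SECRET = "<your-app-client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

response = requests.get(
    "https://graph.microsoft.com/v1.0/external/connections",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
response.raise_for_status()

# Each entry is a Graph connector feeding external content into the tenant.
for connection in response.json().get("value", []):
    print(f"{connection['id']}: {connection.get('name')} (state: {connection.get('state')})")
```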
In Word, Copilot can use up to three reference documents to ground a prompt. While Copilot cannot generate draft content from encrypted files, it can generate content from labeled content that isn’t encrypted.
When you use Copilot for Microsoft 365 to create content based on data with a sensitivity label applied, the content you create inherits the sensitivity label from the source data. When multiple reference documents with different sensitivity labels are used, the sensitivity label with the highest priority is applied.
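The inheritance rule boils down to a simple maximum over label priority, as the illustrative Python snippet below shows. The label names and priority values are hypothetical examples; in Microsoft Purview, a label’s position in the list determines its priority, with higher priority intended for more restrictive labels.

```python
# Illustrative only: how label inheritance picks the highest-priority label
# when Copilot-generated content draws on multiple labeled reference documents.
# The label names and priority numbers are hypothetical examples.
from dataclasses import dataclass

@dataclass
class SensitivityLabel:
    name: str
    priority: int  # as ordered in the Purview compliance portal

def inherited_label(reference_labels: list[SensitivityLabel]) -> SensitivityLabel:
    """Return the label the generated content inherits: highest priority wins."""
    return max(reference_labels, key=lambda label: label.priority)

references = [
    SensitivityLabel("General", priority=1),
    SensitivityLabel("Confidential", priority=3),
    SensitivityLabel("Internal", priority=2),
]
print(inherited_label(references).name)  # prints "Confidential"
```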
This table outlines the results when Copilot applies protection through sensitivity label inheritance. Figures 4, 5, and 6 show this experience:



User-Defined Permissions are Not Recommended
Encryption settings for sensitivity labels restrict access to the content that the label is applied to. When configuring a sensitivity label to apply encryption, the options for permissions are:
- Assign permissions now (determine which users get permissions to content with the label as you create the label)
- Let users assign permissions (when users apply the label to content, they determine which users get permissions to that content)
When items are encrypted with “Let users assign permissions” (Figure 7), no one except the author can be guaranteed usage rights sufficient for Copilot access.

When users can assign permissions, they may not know or remember to enable the EXTRACT usage right. They might simply apply VIEW rights to content, or use predefined permissions that don’t include the EXTRACT right, like Do Not Forward. Because the user who applies encryption automatically gets all usage rights, Copilot for Microsoft 365 returns content to them as intended when they interact with it. That user won’t realize that the usage rights are misconfigured, while other users, whom they may have intended to grant access to, will be unable to interact with the content through Copilot for Microsoft 365.
Strike a Balance Between Protection and Usability
By understanding and managing Copilot’s capabilities in the realm of sensitivity labels and protected content, we can harness its full potential without compromising on information protection.
My recommendations for fellow administrators dealing with Copilot for Microsoft 365 are straightforward: use OME and sensitivity labels wherever encryption is required, keep up with the latest developments for Copilot for Microsoft 365, and adjust your protection game plan as things evolve.
Finally, don’t forget to loop in your users. A little education goes a long way toward making sure they know the importance of labeling data correctly, being cautious when Copilot references sensitive information, and understanding Copilot’s limitations with encrypted content.