
How Microsoft Copilot Meets GDPR: What Regulated Organisations Need to Know

This is a January 2026 update to an earlier version of this blog, first published in May 2025. A lot has changed in the last nine months, so this article has been fully revised to reflect the latest privacy status.

In 2025 the biggest AI data leak in most organisations wasn’t caused by a hacker; it was copy/paste. A report by LayerX Security shows employees routinely paste work content into public GenAI tools, often using unmanaged personal accounts that sit outside corporate controls. According to their report, 77% of employees pasted data into GenAI prompts, 82% of that activity came from unmanaged accounts, and file uploads were also routinely sensitive, with PII/PCI appearing in a large share of uploads.

The uncomfortable truth is that “we told staff not to” doesn’t scale, because the behaviour is driven by deadlines, not malice. Which is why the real question for firms isn’t “will people use LLMs?” but “which LLM experience keeps data inside our compliance boundary by design?”  

How Microsoft Ensures GDPR Compliance with Copilot 

As AI becomes more embedded in the workplace, organisations need confidence that data privacy is safeguarded. Microsoft’s approach with Copilot for Microsoft 365 is designed to align with the standards established across the Microsoft 365 ecosystem. Through its Enterprise Data Protection (EDP) model, Microsoft aims to address the data privacy challenges of deploying a large language model (LLM) based assistant in an enterprise setting. For enterprise customers, Copilot generally follows GDPR compliance principles similar to those applied in SharePoint, OneDrive and Outlook, with EDP applied when users are logged in with their Microsoft Entra ID. 

Alongside EDP, Microsoft also provides a European Union (EU) Data Boundary, which ensures that public sector and commercial customer data, including pseudonymised personal data and technical support logs, is typically stored and processed within the EU and EFTA regions by default. For UK organisations, data is usually processed in the home region where possible, and transfers to the United States are covered under the UK-US Data Bridge, for which Microsoft is a certified participant.

Copilot is hosted on Microsoft’s Azure OpenAI platform and operated by Microsoft personnel, separate from public OpenAI services. This separation is intended to ensure that customer data, prompts and outputs remain within Microsoft’s enterprise cloud environment, and prompts, data and responses are not used to train the broader models. For enterprise clients, this approach is designed so that proprietary and confidential data processed through Copilot stays within the customer’s environment and is handled in line with Microsoft’s security, privacy and compliance commitments.

How Does Microsoft Compare to ChatGPT, Claude and Gemini? 

Microsoft has built Copilot for Microsoft 365 for enterprise users on the same security and compliance foundations that support services such as SharePoint, OneDrive and Outlook. This helps organisations adopt AI in a way that aligns with familiar governance models and existing controls. Copilot benefits from Microsoft’s Enterprise Data Protection model and GDPR commitments, including the EU Data Boundary and the UK-US Data Bridge, which are designed to keep data processing within defined regions wherever possible and help support lawful international transfers. 

Public LLM services, such as ChatGPT, Claude and Gemini, offer their own privacy and residency options, although their data handling varies depending on the plan, deployment model and location. Some providers now offer regional hosting and stronger enterprise controls, while others may require additional documentation or configuration to help organisations meet their compliance requirements. In practice, this means organisations need to review several policy documents and assess how each platform’s data flows align with their regulatory obligations. As a rule though, organisations should assume that data is non-UK or non-EU resident unless they have verified otherwise. 

Compared with these services, Microsoft’s integrated approach provides clearer visibility across the wider Microsoft 365 environment, which can be particularly helpful for regulated industries that depend on established governance structures. A detailed, side-by-side comparison of GDPR considerations for Copilot, ChatGPT, Claude and Gemini is available here.

Microsoft X Anthropic

In January 2026, Microsoft significantly expanded Copilot's model choice by onboarding Anthropic—the company behind Claude—as an official Microsoft subprocessor. This means that Claude models used within Microsoft 365 Copilot, Copilot Studio, Researcher, and agents in Word, Excel, and PowerPoint are now governed by Microsoft's Product Terms and Data Protection Addendum (DPA), with Enterprise Data Protection commitments extending to Anthropic-processed content. For organisations, this simplifies vendor management: there's no need for a separate Anthropic commercial agreement when using Claude through Microsoft's ecosystem.

However, there's an important caveat for UK and EU organisations. Unlike Microsoft's core OpenAI-powered Copilot features, Anthropic models are currently excluded from the EU Data Boundary and UK in-country processing commitments. As a result, Microsoft has set the Anthropic toggle to off by default for tenants in the EU, EFTA, and UK. If your organisation has strict data residency requirements, you should leave this setting disabled until Microsoft extends regional processing guarantees to Anthropic models. Administrators can manage this via the Microsoft 365 admin center, and organisations that previously opted into Anthropic under the legacy terms will need to re-enable the new subprocessor toggle if they wish to continue using Claude.

A Trusted Partner for Data Privacy 

The research from security providers like LayerX serves as a cautionary tale of how users routinely handle company data, but Microsoft’s approach with Copilot for Microsoft 365 represents a good news story. It shows that AI can be implemented securely, with careful alignment to privacy standards, and provide significant value without the typical compliance concerns. For enterprises, adopting Microsoft Copilot means embracing innovation with the confidence that privacy and compliance are fully addressed, helping you leverage AI safely and effectively. 

At Dovetail we work with clients in regulated industries who have enhanced data privacy and client confidentiality requirements, and we've done the deep dive on the Microsoft Enterprise Data Protection model. To find out more and discover how Microsoft Copilot meets your data privacy needs, contact us using the form below.

Sources:  

https://go.layerxsecurity.com/the-layerx-enterprise-ai-saas-data-security-report-2025

https://www.theregister.com/2025/10/07/gen_ai_shadow_it_secrets/