Copilot for Microsoft 365 might boost productivity if you survive the compliance minefield

Loads of governance issues to worry about, and the chance it might spout utter garbage


Microsoft has published a Transparency Note for Copilot for Microsoft 365, warning enterprises to ensure user access rights are correctly managed before rolling out the technology.

Concerns over data governance have held up some Copilot projects as biz customers work out how best to integrate the service into their organizations.

The note makes it clear: "Copilot for Microsoft 365 only accesses data that an individual user has existing access to, based on, for example, existing Microsoft 365 role-based access controls."
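The principle the note describes can be illustrated with a small sketch. This is a hypothetical model, not Microsoft's implementation: the `Document` class, role sets, and `grounding_set` helper are all made up to show how responses would be grounded only in content the requesting user's roles already permit.

```python
# Hypothetical sketch (not Microsoft's code): Copilot responses are
# grounded only in content the requesting user can already access,
# e.g. via role-based access controls.
from dataclasses import dataclass, field

@dataclass
class Document:
    name: str
    allowed_roles: set = field(default_factory=set)

def grounding_set(user_roles: set, documents: list) -> list:
    """Return only the documents whose roles overlap the user's roles."""
    return [d for d in documents if d.allowed_roles & user_roles]

docs = [
    Document("q3-roadmap.docx", {"engineering", "management"}),
    Document("salaries.xlsx", {"hr"}),
]

# An engineer's session should never be grounded on the salary sheet.
visible = grounding_set({"engineering"}, docs)
```

The governance risk Berkowitz describes later in this piece arises precisely when the `allowed_roles` equivalents in a real tenant are misconfigured, so the filter lets through more than it should.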

Copilot for Microsoft 365 is an add-on for which Microsoft expects $30 per user per month, on an annual subscription. It combines large language models (LLMs) with data in Microsoft Graph and Microsoft 365 apps and services to summarize, predict, and generate content.

At first glance, the service is innocuous enough. It gets input from a user in an app, such as Word. That user prompt is then parsed to improve the odds of getting something useful out of the service and then sent to the LLM for processing. What comes out of the LLM is post-processed before being returned to the user.

According to Microsoft: "This post-processing includes other grounding calls to Microsoft Graph, responsible AI checks such as content classifiers, security, compliance and privacy checks, and command generation."
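The round trip Microsoft describes can be sketched as a pipeline. Everything here is illustrative: the function names, the stand-in "LLM", and the classifier are assumptions made to show the shape of the flow (prompt in, grounding, LLM call, compliance checks on the way out), not the product's actual API.

```python
# Hypothetical sketch of the request flow the Transparency Note describes:
# user prompt -> pre-processing (grounding) -> LLM -> post-processing checks.
# All names and the stub LLM are illustrative, not Microsoft's API.

def pre_process(prompt: str, graph_context: list) -> str:
    # Grounding: enrich the raw prompt with data the user may access.
    return prompt + "\n\nContext:\n" + "\n".join(graph_context)

def content_classifier_ok(text: str) -> bool:
    # Stand-in for responsible-AI, security, and compliance checks.
    return "BLOCKED" not in text

def copilot_round_trip(prompt: str, graph_context: list, llm) -> str:
    grounded = pre_process(prompt, graph_context)
    raw = llm(grounded)
    if not content_classifier_ok(raw):
        return "Response withheld by compliance checks."
    return raw

reply = copilot_round_trip(
    "Summarize the Q3 plan",
    ["Q3 plan: ship feature X"],
    llm=lambda p: "Summary: ship feature X in Q3",
)
```

The key point for administrators is that both the grounding step and the post-processing checks depend on tenant configuration being correct in the first place.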

In addition to ensuring user access is configured correctly, the Transparency Note warns organizations to consider legal and compliance issues when using the service, particularly in regulated industries.

"Microsoft is examining regulatory requirements that apply to Microsoft as a provider of the technology and addressing them within the product through a process of continuous improvement," the document states.

Then there's the recommendation to allow Copilot for Microsoft 365 to reference web content from Bing to improve "the quality, accuracy, and relevance" of its responses. Allowing Microsoft Graph to be extended with sources like CRM systems, external file repositories, and other organizational data is another recommendation that will require enterprises to take a long, hard look at governance.

Microsoft's Transparency Note for Copilot for Microsoft 365 is a useful document, highlighting that enterprises must consider the implications of deploying the service.

Last month, Jack Berkowitz, chief data officer of Securiti, told us of bigger corporations pausing Copilot deployments because the tool is accessing data and "aggressively summarizing information" that certain employees shouldn't have access to – salaries, for example.

"Now, maybe if you set up a totally clean Microsoft environment from day one, that would be alleviated," he said. "But nobody has that. People have implemented these systems over time, particularly really big companies. And you get these conflicting authorizations or conflicting access to data."

The much-touted productivity gains from the AI service need to be balanced by its risks – even Microsoft notes "users should always take caution and use their best judgment when using outputs from Copilot for Microsoft 365" – and worries over compliance and data governance must be addressed before unleashing the service on an organization. ®
