If you’re keeping track at home, this is typically the week I don’t send an issue. I like to give myself every fourth week off.
However, Microsoft rolled out some serious rebranding and Copilot changes that became quite an issue in the legal tech world over the weekend.
It’s a bit of a mess.
Strike that. It’s so much more than that.
If you want to catch up, Leonard French has a pretty good video describing what has happened and what questions lawyers are asking. (More on that later.)
From the tech side, you could also read this from Ed Bott:
The Microsoft 365 Copilot launch was a total disaster
I am not going to answer all the questions in this space. I don't have all the answers. I want to share what I know about Copilot, and I hope we can all have a more educated discussion. There are things that only Microsoft can answer, though, and there are things that only the lawyers can answer. What is clear, however, is that lawyers and Microsoft reps are not communicating clearly with each other.
The other clear thing is that Ed is right. This was a total disaster.
Let’s talk about the rebrand:
Microsoft renamed the Office app to Microsoft 365 Copilot. Once upon a time, that was the name of the paid version of Copilot. They also pushed out an update that gave everyone with an Entra account access to Copilot Chat (not the paid version), with features that used to be available only in the paid version.
Note: As far as I know, this has not been made available for government or education M365 tenant users, who are usually behind.
At the same time, they pushed Copilot integration out to all consumer versions of M365. Non-Entra customers now have Copilot built into their Office apps, and the price has increased. In some of those apps, it cannot be disabled.
If you thought this would cause massive confusion, like I did, we underestimated it. It’s worse.
Let’s talk about what I know:
For business customers, we are focused on two versions of Copilot. For simplicity's sake, I will call them paid (the $30 license) and free (Copilot Chat, which everyone with an Entra account can use).
Paid Copilot can access whatever data the user can access within the M365 environment, unless you have created a DLP policy that blocks sensitive information based on a sensitivity label. If your license doesn't include that Purview functionality, it accesses whatever the user can access, along with internet data.
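To make that label-based blocking concrete, here is a minimal sketch in Python (with hypothetical names; this is not how Purview is actually implemented) of what exclusion by sensitivity label means in principle: documents carrying a blocked label never reach the AI's grounding context, while everything else the user can open does.

```python
# Conceptual sketch only: not Purview's actual implementation. It illustrates
# the idea of label-based exclusion: documents carrying a blocked sensitivity
# label never become source material for the AI.

from dataclasses import dataclass

@dataclass
class Document:
    name: str
    sensitivity_label: str  # e.g. "General", "Confidential", "Highly Confidential"
    text: str

# Hypothetical policy: labels a DLP rule excludes from Copilot processing.
BLOCKED_LABELS = {"Confidential", "Highly Confidential"}

def grounding_context(user_accessible_docs: list[Document]) -> list[Document]:
    """Return only the documents the AI is allowed to draw on."""
    return [d for d in user_accessible_docs if d.sensitivity_label not in BLOCKED_LABELS]

docs = [
    Document("ClientA-settlement.docx", "Highly Confidential", "..."),
    Document("Firm-style-guide.docx", "General", "..."),
]

for doc in grounding_context(docs):
    print(doc.name)  # only Firm-style-guide.docx gets through
```

The filter is mechanical: it trusts whatever label a person or an auto-labeling rule has put on the document, which is why the labeling itself remains a human responsibility.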
Copilot Chat (the free version) cannot access internal data. It can only respond to prompts using internet data (think of this as the public LLM). Enterprise Data Protection covers prompts and responses. However, even Microsoft cautions against including confidential information in these prompts. This may be because these prompts are stored in the M365 environment in a hidden mailbox folder and thus may be accessible to IT admins. It may also be because Microsoft knows there is a potential for some data leakage. It’s not clear.
If you aren't logged in and you use Bing Chat, your prompts are effectively public. Just say no.
For consumers and, if we're honest, likely thousands of small businesses, the consumer version of Microsoft 365 (formerly Office 365) is now also named Microsoft 365 Copilot and comes with Copilot built in. Even if you never wanted it, it exists and is "looking over your shoulder."
This is a problem for people in industries like law, where you may work on documents for different clients who all expect confidentiality. The worry goes like this: I'm working on Document A for Client A, and it contains confidential information. Later, when Copilot authors part of Document B for a different client, it must not pull in confidential text from Document A.
Finally, there is also Windows Copilot, which may look at the data stored on your computer and do what the consumer version of Copilot does, but across everything. Admittedly, I'm not very familiar with that version of Copilot. (I use a Mac personally, and my work has disabled Windows Copilot.)
Understanding how Copilot works is essential to addressing the Document A and Document B situation described above. But it's not just Copilot. Any AI tool with access to your documents will act this way, and any AI tool plugged into a law firm's document management system will likely work this way.
In the scenario above, the risk is a user asking Copilot to write a contract, for example. The AI looks at similar contracts it has access to and starts using those as the model for the new contract. Some contract details might be confidential, but the AI cannot know that. The AI doesn’t know anything. It’s doing math, choosing each word it creates based on mathematical algorithms and language analysis. Expecting AI to make that decision is asking it to practice law. Think about it: do you ask your IT department to determine what documents are confidential, or do you, as a lawyer, make that decision? AI is no different. It’s less qualified than your IT team. You don’t want it to decide for you.
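To make the "it's doing math" point concrete, here is a deliberately oversimplified Python sketch with made-up numbers: a model picks each next word by sampling from a probability distribution, and nothing in that distribution records whether a candidate word came from a confidential document.

```python
# Toy illustration only: real models work over tens of thousands of tokens and
# billions of parameters, but the principle is the same. The probabilities
# below are invented for the example.

import random

# Hypothetical distribution over the next word after
# "The settlement amount payable to the ..."
next_word_probs = {
    "party": 0.40,
    "client": 0.25,
    "plaintiff": 0.20,
    "Acme Holdings": 0.15,  # hypothetical: a name absorbed from another client's document
}

def pick_next_word(probs: dict[str, float]) -> str:
    """Sample one word, weighted by probability. No notion of confidentiality here."""
    words, weights = zip(*probs.items())
    return random.choices(words, weights=weights, k=1)[0]

print("The settlement amount payable to the", pick_next_word(next_word_probs))
```

If a confidential name or figure is among the material the model can draw on, it is just another candidate with a weight; there is no flag that says "don't use this one."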
Even in an Enterprise environment with Purview, the technology can only block the use of an entire document. It can't distinguish which parts of a document may be used. It depends on someone defining that document as sensitive or giving it rules (think regex patterns for credit card numbers, etc.).
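For a sense of what those rules look like, here is an illustrative Python sketch of credit-card detection: a regex that matches things shaped like card numbers, plus a Luhn checksum to weed out false positives. Purview's built-in sensitive information types are more sophisticated than this, but the nature of the check is the same: pattern matching, not legal judgment.

```python
# Illustrative only: a simplified version of the kind of rule a DLP policy
# relies on to flag credit card numbers in text.

import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum over the digits in the matched string."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def contains_card_number(text: str) -> bool:
    return any(luhn_valid(m.group()) for m in CARD_PATTERN.finditer(text))

print(contains_card_number("Card on file: 4111 1111 1111 1111"))  # True (a standard test number)
print(contains_card_number("Invoice total: 1234 5678"))           # False
```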
Is there a similar feature for consumer versions of M365 that blocks documents from Copilot? I honestly don’t know.
If the new contract is sent outside your firm with confidential information included, that is a massive failure. But I would argue that it’s a human failure in the middle of AI usage.
Leonard points out that this scenario is how lawyers lose their license to practice. That's true, but there's a reason that standard exists: the lawyer is responsible for overseeing the work done on their behalf. If an admin types the contract and includes confidential information, the lawyer is responsible.
Why would we consider Copilot any different? I’m not a lawyer, but I don’t know why that standard will change for AI.
In my humble opinion, once we clear away the confusion, there are only a couple of questions:
Microsoft claims that data never leaves your environment, i.e., Microsoft can’t access it and isn’t using it. Only Microsoft can tell us if that is true, and you can believe them or choose not to. Understandably, many people need to know this and have it confirmed. The documentation promises not to use it to train public models but is vague about who can see it. That is a problem. I’ve not seen any evidence that it leaks, but I can’t see everything.
If you do not want an AI tool with access to your documents to use text from existing documents for other client documents, you have some options:
Don’t ask it to write content. Full stop. Disable it in Word if you want.
If you’re in a business environment with paid Copilot, consider using agents grounded in specific data. (For example, create a SharePoint site for each client and use SharePoint agents to create content from that site.)
Use it, but carefully review every bit of text before it leaves your environment. This sounds hard, but I’ll point it out again. Lawyers are expected to do this, aren’t they? I do it for technical documentation all the time. I wouldn’t put something out to my users without vetting the content. You shouldn’t either.
I know that suddenly seeing Copilot on your computer and inside your documents is startling. You may not have asked for this, and you may not have vetted it, but here it is, seemingly reading along as you work on confidential documents. Microsoft, frankly, did a crap job of this rollout. Based on some of the responses these folks are getting from support, they’re doing a crap job there too.
Mostly, adding it to the Office apps when people don’t know what it does was a bad move. No one will take your word that it’s not snooping because it is snooping to a certain degree. That’s what AI does!
On the other hand, to the lawyers and other people who are worried about confidentiality: can I ask you to take a breath? Don't use Copilot until you understand how it works. Even if it's there, don't ask it to create text. If having it there bothers you, look into disabling it, and for apps like OneNote, where you can't disable it yet, get on Microsoft to enable that quickly. That's inexcusable.
Overall, don’t be afraid. Remember that you always control what content goes into your documents. Just because Copilot wrote it doesn’t mean you must accept that as the final say.
Be the human in charge of the AI.
I’ll be back with more of our regular content soon. If this is the kind of thing you find helpful, please share it with your friends and consider becoming a subscriber.