More Changes Coming That Impact Loop and Copilot Retention
I wrote about retention of Copilot and Loop - then I saw new updates that already make what I wrote outdated!
Such is the world of M365. A planned week off from the newsletter turned into this quick update, because changes have already landed that require a follow-up to what I wrote a week ago.
First, a couple of weeks ago I discussed how to run an eDiscovery search that locates Loop files by targeting the .loop file extension.
It turns out that you will soon need to add another extension.
Message Center item MC1085134 says:
Microsoft Copilot Pages will introduce a new .page file extension, identical in functionality to the .loop extension but with a different name and icon. Rollout starts mid-July 2025 and completes by early August 2025. No admin action is required, but user notification is recommended.
I assume this will be a way to identify Copilot Pages as distinct from other Loop pages. It even comes with a new icon.
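In practice, that means the extension-based search I described will need to cover both extensions once the rollout completes. A sketch of the keyword query, assuming the new .page files are indexed under the same `filetype` managed property that .loop files use today (worth verifying once .page files appear in your tenant):

```
filetype:loop OR filetype:page
```

This would go in the keyword/KQL condition of a Purview eDiscovery search in place of a query targeting filetype:loop alone.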
Another new feature has me rethinking what a retention policy for Copilot interactions should look like. In last week's newsletter, I mentioned that many organizations might want a relatively short retention period for interactions, treating them like Teams chat: as ephemeral communication.
Then I noticed that Copilot Memory had begun rolling out to the public, and I wondered if a short retention and auto-deletion policy would limit what Copilot would remember about our users.
Introducing Copilot Memory: A More Productive and Personalized AI for the Way You Work.
Copilot Memory: Copilot picks up on important details from your conversations, like “I prefer Python for data science” or “I’m working on Project Alpha,” and remembers them. Users can also tell Copilot "please remember I like using bullet points in my writing" and Copilot will log this as a memory.
Custom Instructions: You can explicitly tell Copilot how you want it to behave—like “Keep my emails concise” or “Use a formal tone”—and it will apply those preferences automatically in future interactions.
I also saw this bit:
Admins can disable memory across the organization or for specific users, and memory data is discoverable through Microsoft Purview eDiscovery when needed.
My assumption was that this data would be subject to the retention policy applied to Copilot interactions, and my testing has confirmed it: the "Memory" interactions, such as specifying my location and role at work, are stored in the same manner as other interactions. A search of my mailbox returns these items as interactions because that is exactly what they are. I prompted Copilot to remember details about me, and it responded by adding them as memory items.
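For anyone who wants to reproduce that check, a hedged sketch of the mailbox search condition. My understanding (verify against current Purview documentation, as the item class naming may change) is that Copilot interactions are stored in user mailboxes with an item class beginning with IPM.SkypeTeams.Message.Copilot:

```
itemclass:IPM.SkypeTeams.Message.Copilot*
```

Running a search scoped to the user's mailbox with a condition like this should return Copilot interactions, including the memory-related ones described above.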
Assuming that I’ve set a short retention policy due to the temporary nature of AI interactions, I am now presented with an interesting question. How long should Copilot remember things about users by default? Do I want to stop auto-deleting Copilot interactions so that it always remembers facts about users until the user explicitly removes them from its memory? If that sounds good to you, I’d also ask, do you want to store every interaction forever? That sounds like a data protection and eDiscovery nightmare.
Where should we find a balance? How many teams in your organization may need to have input on that decision?
It shouldn’t be taken lightly.
Next week, I’ll be back with another deep dive, this time into a change to the eDiscovery Exports UI. I’m not sure how long it has been in place; I only realized an option was no longer available when I saw someone asking about it on Reddit. I had overlooked it, as I hadn’t been very focused on exports lately.
Paid subscribers will receive all the details from my ongoing testing, but even free subscribers will get a heads-up on what is going on and how it works. Be sure to subscribe and share it with your peers!