Microsoft put the same disclaimer on Copilot that a psychic uses to avoid getting…

For all the complaints people make about AI replacing human skills, there’s another side to it: The rise of AI has also forced humans to develop new skills, specifically the ability to sort useful AI output from incorrect, hallucinated garbage. Over the past couple of years, many of us have gotten pretty good at this, and have learned to make the most of the many limitations we experience with so many AI agents. While the companies behind these projects are similarly aware of the limitations we’re up against, one of them seems to be overcompensating a bit in the legal department, as Copilot users notice some concerning language in Microsoft’s terms of service.

Anyone using AI for anything even remotely serious should know by now to sanity-check the program’s output — AI will confidently share mistakes as truths, and users need to be vigilant about not taking its output at face value. Correspondingly, all the major players make disclaimers to this effect, trying to promote their products’ benefits while also acknowledging their limitations. Google’s Gemini overview is a good example of this, explaining how Gemini does what it does, while also drawing attention to places where it still needs improvement.

And then there’s Microsoft. Like many of the other companies doing AI, it likes to advertise all the important tasks Copilot can aid you with, like coming up with new strategies for your business:

On its face, there’s nothing unusual there — pretty par for the course with your modern AI platform. We just hope that the business customers Microsoft’s going after with spots like this aren’t reading the full Copilot terms of use (via Tom’s Hardware). Because if they did, they’d see this concerning disclaimer:

That “only” is doing a whole lot of work there. Microsoft continues:

Now, that bit actually sounds relatively in line with the disclaimers we’ve seen from other AI firms: Be careful, and verify the correctness of any AI output before acting on it. On its own, we wouldn’t think twice about it popping up in Copilot’s terms.

But for whatever reason, Microsoft was compelled to go back and add in what’s basically an “LOL JK” to the entire document.

Honestly, this is probably just a lawyer feeling the need to overcorrect and cover Microsoft’s liabilities — they just happened to go much too far in the process, inviting ridicule. If only someone had asked Copilot to scan over the terms and spot any potential embarrassments — except, as we now know, that’s not what it’s made for!
