Recently, several people noticed that Microsoft's Copilot Terms of Use page for its AI tool, which has become increasingly embedded in the company's Windows 11 operating system and Microsoft 365 Office suite, included several notable disclaimers.

"Copilot is for entertainment purposes only," one reads, adding that "it can make mistakes," "may not work as intended," and that users should not "rely on Copilot for important advice." To top it all off, there's a final "Use Copilot at your own risk." Naturally, this makes it sound like Copilot should not be used in any formal or business-facing capacity, which runs counter to the company's push to get its vast customer base to use Copilot for everything from search to summarizing documents, and even when firing up Paint or Notepad.
To make matters worse, the disclosure also states that Microsoft makes "no guarantees" that Copilot will operate as intended, making the tool seem more experimental than practical. Again, this contradicts Microsoft's big Copilot marketing push across all its software offerings, as well as the arrival of Copilot+ PCs, which include dedicated NPUs to run local Copilot AI tools.
Naturally, this revelation sparked widespread criticism on social media, with users calling into question the usefulness of Copilot and adding fuel to the idea that Microsoft's AI focus has been detrimental to the stability of its platforms, such as Windows 11.
In the days since the revelation began making the rounds, a Microsoft spokesperson told PCMag that the company plans to change the disclaimer soon. "The 'entertainment purposes' phrasing is legacy language from when Copilot originally launched as a search companion service in Bing," the statement says. "As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update."
It'll be interesting to see how the Copilot Terms of Use page is updated, since AI tools, by their very nature, are prone to errors and hallucinations. Odds are, Microsoft will soften the language to be less alarming; it's unlikely, however, to do a 180 and declare that Copilot is infallible and can be fully trusted.