Even as Microsoft actively encourages corporate customers to adopt Copilot (an AI assistant that automates tasks, drafts documents, generates images, and analyzes data through a chat interface), the company is facing a wave of controversy over the tool's terms of use.
In what is believed to be the most recent update, from October 2025, Microsoft describes Copilot as being "for entertainment purposes only".
The terms also state that the system may malfunction or behave unexpectedly and should not be relied on for important decisions, and that users assume the risks of using it themselves.
In response to the public backlash, a Microsoft representative said this is "old language" that will be updated to accurately reflect how Copilot is used today.
Microsoft is not alone: many major AI companies have issued similar warnings.
OpenAI advises users not to treat AI as their sole source of truth, while xAI stresses that its outputs are not "absolute truth".
Experts see such disclaimers as a way for companies to limit legal liability while reminding users that AI can still generate misinformation.