2024-08-19
Source: Huanqiu.com
[Global Network Technology Comprehensive Report] On August 19, in response to recent controversy over the accuracy of its AI tools, Microsoft took new steps to remind users to treat its AI services with caution. The company updated its services agreement to make clear that its AI tools should be regarded as assistive tools, not substitutes for professional advice.
The revised terms take effect at the end of next month. Microsoft singled out its health chatbots in particular, warning users that over-reliance on a chatbot's advice carries risk.
Microsoft made clear that AI cannot replace the role of professionals. On the limitations of its assistive AI, the company explained: "AI services are not designed, intended, or to be used as substitutes for professional advice." It also emphasized that its health chatbots are "not designed or intended as substitutes for professional medical advice," nor should they be used "in the diagnosis, cure, mitigation, prevention, or treatment of any disease or other conditions."
In addition, the agreement reiterates that Copilot AI experiences are subject to Bing's terms of use, and explicitly states that these experiences may not be used to extract data through crawling or other collection methods unless Microsoft expressly permits it.