
Microsoft has five new AI policies

17 August 2023


Terms and conditions change

Software King of the World Microsoft is bringing in five changes to its terms and conditions to cover AI.

The five new policies will come into effect on September 30 and will prevent users from reverse engineering the services to discover any underlying components of the models, algorithms, and systems.

Users will be barred from extracting data unless explicitly permitted to do so, which rules out using web scraping, web harvesting, or web data extraction methods to pull data from the AI services.

There are also limits on the use of data from the AI services: it cannot be used to create, train, or improve (directly or indirectly) any other AI service.

Microsoft will process and store users' inputs to the service, along with its outputs, for the purpose of monitoring and preventing abusive or harmful uses or outputs of the service.

Vole said that users are solely responsible for responding to any third-party claims regarding their use of the AI services in compliance with applicable laws, including, but not limited to, copyright infringement or other claims relating to content output during their use of the AI services.

A Microsoft spokesperson declined to comment on how long the company plans to store user inputs into its software, saying only: "We regularly update our terms of service to reflect our products and services better. Our most recent update to the Microsoft Services Agreement includes adding language to reflect artificial intelligence in our services and its appropriate use by customers."

Microsoft has previously said, however, that it doesn't save conversations or use that data to train its AI models for its Bing Enterprise Chat mode. Its policies are less clear for Microsoft 365 Copilot: although it doesn't appear to use customer data or prompts for training, it does store information.
