Microsoft might be saving your Bing Chat conversations

Yusuf Mehdi, Microsoft Corporate Vice President of Modern Life, Search, and Devices, speaks during a keynote address announcing ChatGPT integration for Bing at Microsoft in Redmond, Washington, on February 7, 2023. - Microsoft's long-struggling Bing search engine will integrate the powerful capabilities of language-based artificial intelligence, CEO Satya Nadella said, declaring what he called a new era for online search.

Uh-oh — Microsoft might be storing information from your Bing chats.

This is probably totally fine, as long as you've never chatted about anything you wouldn't want anyone else reading, never assumed your Bing chats would be deleted, and never thought you had more privacy than you actually do.

Microsoft has updated its terms of service with new AI policies. Introduced on July 30 and going into effect on Sept. 30, the policy states: "As part of providing the AI services, Microsoft will process and store your inputs to the service as well as output from the service, for purposes of monitoring for and preventing abusive or harmful uses or outputs of the service."

According to The Register's reading of the new "AI Services" clause in Microsoft's terms of service, Microsoft can store your conversations with Bing if you're not an enterprise user, and we don't know for how long.

Microsoft did not immediately respond to Mashable's request for comment, and a Microsoft spokesperson declined to tell The Register how long it will store user inputs.

"We regularly update our terms of service to better reflect our products and services," a representative said in a statement to the Register. "Our most recent update to the Microsoft Services Agreement includes the addition of language to reflect artificial intelligence in our services and its appropriate use by customers."

Beyond storing data, the new AI Services clause includes four additional policies. Users cannot use the AI services to "discover any underlying components of the models, algorithms, and systems." Users are not allowed to extract data from the AI services. Users cannot use the AI services to "create, train, or improve (directly or indirectly) any other AI service." And finally, users are "solely responsible for responding to any third-party claims regarding Your use of the AI services in compliance with applicable laws (including, but not limited to, copyright infringement or other claims relating to content output during Your use of the AI services)."

So maybe be a bit more careful while using Bing Chat, or switch to Bing Chat Enterprise; Microsoft said in July that it doesn't save those conversations.
