Microsoft announced it was placing new limits on its Bing chatbot after a week in which users reported extremely disturbing conversations with the new AI tool. How disturbing? The chatbot expressed a desire to steal nuclear access codes and told one reporter it loved him. Repeatedly.
“Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session. A turn is a conversation exchange which contains both a user question and a reply from Bing,” the company said in a blog post on Friday.