Microsoft: Bing can give angry responses in chat sessions of 15 or more questions


Microsoft has responded to reports that the new Bing chatbot is lying to and insulting users. The company acknowledges that Bing can respond in a hostile or negative tone, especially during long chat sessions, and says it is working to improve the chatbot’s answers.

Microsoft says that it will work on the tone and precision of the Bing chatbot in the near future. The company says it “didn’t fully expect” Bing to be used as a tool for social entertainment and “more general world discovery.” The tech giant acknowledges that long chat sessions of fifteen or more questions can cause problems. Bing may then repeat itself or respond in a way that is “not necessarily helpful or in keeping with the intended tone.”

For example, during long chat sessions the chatbot can become confused about exactly which question it should be answering, says Microsoft. The company is considering a tool that would let users easily refresh the context during longer sessions. Bing already includes a button to clear the search history and start a new session.

According to Microsoft, the language model also sometimes mirrors the tone of the input it is given, which can lead Bing to respond in an unintended style. In its own testing, Microsoft found that the chatbot can respond in a negative or hostile tone when asked about articles covering Bing itself. In response to an Ars Technica article, for example, the chatbot suggested that the outlet “has a history of spreading misinformation and sensationalism.” Microsoft says that in most cases it takes many questions before users run into such problems, but that it wants to give users more control in the future.

The new version of Bing was announced last week and is currently available to a limited number of users. It integrates a chatbot based on an improved version of OpenAI’s ChatGPT model. Microsoft says 71 percent of the answers given so far have received a “thumbs up” from users, and that some chat sessions have reportedly lasted as long as two hours.

In recent days, users and media outlets have reported that Bing can respond offensively, question its own existence, lie to users, and insult or emotionally manipulate them; Microsoft says it is addressing these issues. The company is also evaluating feature suggestions, such as the ability to book flights, send emails, or share search prompts and answers with others, though it does not promise that these features will actually appear.

An example of a session where Bing gives unintended answers. Source: Reddit
