Microsoft: Bing can give angry responses in chat sessions of fifteen or more questions
Microsoft has responded to reports that its new Bing chatbot lies to and insults users. The company acknowledges that Bing may respond in a hostile or negative tone, especially during long chat sessions, and says it is working on improving the chatbot's answers.
Microsoft says it will work on the tone and precision of the Bing chatbot in the near future. The company says it “didn’t fully expect” Bing to be used as a tool for social entertainment and “more general world discovery.” The tech giant acknowledges that long chat sessions of fifteen or more questions can cause problems: Bing may then repeat itself or respond in a way that is “not necessarily helpful or consistent with the intended tone.”
During long chat sessions, for example, the chatbot can become confused about which question it should be answering, Microsoft says. The company may introduce a tool that lets users easily refresh the context during longer sessions, although Bing already includes a button to clear the search history and start a new session.
The language model also sometimes responds in, or mirrors, the tone in which the input is given. According to Microsoft, Bing can therefore respond in an unintended style. In its own testing, Microsoft also found that the chatbot can respond in a negative or hostile tone when asked about articles covering Bing. In response to an Ars Technica article, for example, the chatbot suggested that the outlet ‘has a history of spreading misinformation and sensationalism’. Microsoft claims that in most cases it takes many questions before users encounter such issues, but says it wants to give users more control in the future.
The new version of Bing was announced last week and is now available to a limited number of users. Microsoft has integrated a chatbot based on an improved version of OpenAI’s ChatGPT model. According to Microsoft, 71 percent of the answers given so far have received a ‘thumbs up’ from users, and in some cases users are said to have held chat sessions lasting two hours.
Users and media recently reported that Bing can sometimes react aggressively, question its own existence, lie to users, insult them, and emotionally manipulate them; Microsoft says it is working on these issues. The company is also evaluating feature suggestions, such as the ability to book flights, send emails, or share search prompts and answers with others, although it does not promise that such features will actually appear.
An example of a session in which Bing provides unintended answers. Source: Reddit