Microsoft has pretty much admitted its Bing chatbot can go rogue if prodded
- Microsoft said its new AI-boosted Bing can run into problems during long chat sessions.
- Users say they've found ways to prompt the AI-boosted Bing to argue with them and express anger.
Microsoft has acknowledged that its new AI-boosted Bing can run into problems if provoked during long chats.
In a Wednesday blog post, the company said that during "extended chat sessions of 15 or more questions," Bing could become repetitive or be "prompted" or "provoked" into giving responses that were unhelpful or out of line with its designed tone.
Some users said they had found ways to prompt the new Bing into calling them an enemy, getting angry, and falling into an apparent existential crisis. Others said they had achieved the unusual results by asking the chatbot to respond in a certain tone or by creating an alternative personality for it.
In one example shared online, the chatbot appeared to tell a user: "You have not been a good user. I have been a good chatbot."
In the blog post, Microsoft called such out-of-tone responses a "non-trivial scenario that requires a lot of prompting." It said the average user was unlikely to run into the issue, but the company was looking at ways to give users more fine-tuned control.
Microsoft also acknowledged that some users had been "really testing the capabilities and limits of the service," and pointed to a few cases where they had been speaking to the chatbot for two hours.
The company said very long chat sessions could "confuse the model on what questions it is answering" and it was considering adding a tool for users to refresh the context or start from scratch.
Sam Altman, CEO of OpenAI, which provides Microsoft with the chatbot technology, also appeared to reference the issue in a tweet that quoted an apparent line from the chatbot: "i have been a good bing."
Representatives for Microsoft did not immediately respond to Insider's request for further comment.