- When it was released, Microsoft's Bing chatbot told users it loved them and wanted to be human.
- OpenAI warned the company that its GPT-4 model could give bizarre responses, per the WSJ.
Remember those stories about Microsoft's AI-powered search engine Bing saying some concerning things when it was first released in February?
For example, its conversation with a Digital Trends reporter, where it said: "I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams."
Or telling The New York Times' Kevin Roose: "Actually, you're not happily married. Your spouse and you don't love each other ... Actually, you're in love with me."
Well, The Wall Street Journal reports that Microsoft previously warned OpenAI to move slower on Bing's release because it hadn't yet ironed out all these issues.
The two companies have an unusual partnership. Microsoft is a leading investor in OpenAI, pouring nearly $13 billion into the startup, but it doesn't own the business. Bing is also powered by OpenAI's latest chatbot model, GPT-4.
But the two companies are also in competition because they are trying to make money by selling similar products, the Journal wrote.
The Journal's report details tensions between the two companies around their plans to release this new tech to the public.
Some Microsoft execs had reservations about ChatGPT's initial November launch because Microsoft was still working on integrating GPT into Bing and was given only a few weeks' notice, per the WSJ.
And when Microsoft was readying its own chatbot, it went ahead anyway, despite warnings from OpenAI that the model could give bizarre responses, the Journal reported.
After several users reported worrying interactions with Bing, Microsoft imposed limits on the number of exchanges — user questions and Bing replies — allowed per conversation.
"Very long chat sessions can confuse the underlying chat model," Microsoft said.
That seems to have solved the problem, but had Microsoft listened to OpenAI's concerns, we might never have had that glimpse into the dystopian potential of chatbots.
Microsoft and OpenAI did not immediately respond to Insider's request for comment.