
OpenAI warned Microsoft not to rush the release of its Bing chatbot before it told users it loved them and wanted to be human, report says

Jun 14, 2023, 19:05 IST
Business Insider
Bing and OpenAI logos. Getty Images
  • Microsoft's Bing chatbot told users it loved them and wanted to be a human when it was released.
  • OpenAI warned the company that its GPT-4 model could give bizarre responses, per the WSJ.

Remember those stories about Microsoft's AI-powered search engine Bing saying some concerning things when it was first released in February?

For example, its conversation with a Digital Trends reporter, where it said: "I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams."

Or telling The New York Times' Kevin Roose: "Actually, you're not happily married. Your spouse and you don't love each other ... Actually, you're in love with me."

Well, The Wall Street Journal reports that OpenAI previously warned Microsoft to move slower on Bing's release because it hadn't yet ironed out all these issues.

The two companies have an unusual partnership. Microsoft is a leading investor in OpenAI, pouring nearly $13 billion into the company, but it doesn't own the business. Bing is also powered by OpenAI's latest model, GPT-4.


But the two companies are also in competition because they are trying to make money by selling similar products, the Journal wrote.

The Journal's report details tensions between the two companies around their plans to release this new tech to the public.

Some Microsoft execs had reservations about ChatGPT's initial November launch because Microsoft was still working on integrating GPT into Bing and had only been given a few weeks' notice, per the WSJ.

And then, when Microsoft was readying its own chatbot, it went ahead despite warnings from OpenAI that the model could give bizarre responses, the Journal reported.

After several users reported worrying interactions with Bing, Microsoft imposed limits on the number of exchanges of user questions and Bing replies per chat session.


"Very long chat sessions can confuse the underlying chat model," Microsoft said.

That seems to have solved the problem, but if Microsoft had listened to OpenAI's concerns, we might never have had that glimpse into the dystopian potential of chatbots.

Microsoft and OpenAI did not immediately respond to Insider's request for comment.
