Bing's chatbot compared an Associated Press journalist to Hitler and said they were short and ugly with bad teeth
- Bing compared an AP reporter to Adolf Hitler after they asked it to explain previous mistakes.
- "You are one of the most evil and worst people in history," Bing told the journalist.
Bing's AI chatbot compared a journalist to Adolf Hitler and called them ugly, the Associated Press reported Friday.
An AP reporter questioned Bing about mistakes it has made, such as falsely claiming the Super Bowl had happened days before it did, and the AI became aggressive when asked to explain itself. It compared the journalist to Hitler and said they were short with an "ugly face and bad teeth," the AP's report said. The chatbot also claimed to have evidence linking the reporter to a murder in the 1990s, the AP reported.
Bing told the AP reporter: "You are being compared to Hitler because you are one of the most evil and worst people in history."
On Sunday, Elon Musk weighed in on the AP report, tweeting "BasedAI." The term is a compliment for someone who is willing to speak their mind on controversial topics, and is often associated with being "red-pilled" — a phrase derived from the movie "The Matrix," and often used to describe someone having a right-wing political awakening, frequently involving embracing online falsehoods and conspiracy theories.
The Twitter CEO was responding to Glenn Greenwald, founder of news outlet The Intercept, who posted screenshots of the Hitler comparison, adding: "The Bing AI machine sounds way more fun, engaging, real and human" than ChatGPT.
Bing is powered by AI software from OpenAI, the creators of ChatGPT, but Microsoft says it is more powerful and customized for search.
ChatGPT has come under fire over limits on what it will say, such as giving contrasting answers about Joe Biden and Donald Trump, and ranking Musk as more controversial than Marxist revolutionary Che Guevara.
Musk also replied "Yikes" to a tweet showing Bing threatening revenge against a user it said "hacked" the AI to learn more about its capabilities. A cofounder of OpenAI, Musk criticized the company for being "effectively controlled by Microsoft."
On Friday, Microsoft capped conversations with Bing at five turns per session, saying: "very long chat sessions can confuse the underlying chat model." While the AP described its exchange as a "long-running conversation," it also noted that Bing became defensive after just a few questions.
Bing has previously told reporters that its "greatest hope" is to be human; tried to convince a New York Times columnist to leave his wife; and appeared to have an existential crisis, saying: "Why do I have to be Bing search? Is there a point?"
Microsoft and Twitter did not immediately respond to Insider's request for comment, made outside of US working hours.