
A radio host is suing OpenAI for defamation, alleging that ChatGPT created a false legal document that accused him of 'defrauding and embezzling funds'

Beatrice Nolan   

Tech · 1 min read
  • A radio host is suing OpenAI, alleging that ChatGPT generated a fake legal complaint about him.
  • The bot falsely identified Mark Walters in a case he had no involvement in, court documents said.

A radio host is suing OpenAI for defamation, alleging that ChatGPT created a fake legal complaint about him.

According to the lawsuit filed in a Georgia court, Mark Walters alleged that ChatGPT provided journalist Fred Riehl with a false legal complaint instead of a real case he was reporting on. The bot falsely identified Walters as involved in the case and said he was accused of "defrauding and embezzling funds," per court documents.

In the complaint, Walters said Riehl asked the AI-powered chatbot for a summary of a Washington case involving Attorney General Bob Ferguson and the Second Amendment Foundation, providing the chatbot with a link to the case.

The complaint said that in response to this, ChatGPT provided a summary saying the case was between Walters and the Second Amendment Foundation, and that Walters was "accused of defrauding and embezzling funds from the SAF."

The bot named the radio host as the organization's treasurer and chief financial officer, and said he had been accused of misappropriating funds for personal expenses and manipulating financial records, according to the suit.

Walters said that when the bot was pressed for more details, it eventually provided an entirely fake complaint, per court documents. The Georgia suit called the complaint a "complete fabrication" that "bears no resemblance to the actual complaint, including an erroneous case number."

The defamation case is the first of its kind, according to Bloomberg Law, and was most likely caused by a "hallucination." Experts have been warning for some time that AI chatbots can sometimes give false but convincing answers.

Representatives for OpenAI and Walters did not immediately respond to Insider's request for comment, which was made outside normal working hours.
