The lawyer who used ChatGPT's fake legal cases in court said he was 'duped' by the AI, but a judge questioned how he didn't spot the 'legal gibberish'
- A lawyer used ChatGPT to research legal cases for an affidavit backing his client's lawsuit.
- The AI hallucinated six fake cases, per a federal judge, which the lawyer included in the filing.
The lawyer who used ChatGPT to help write up an affidavit — and didn't realize the AI had completely made up fake legal cases to cite — said he was "duped" by the tool during a sanctions hearing before a New York judge on Thursday, Inner City Press reported.
But the judge in the hearing pressed the lawyer, questioning how the lawyer missed the fakes and saying ChatGPT's fabricated ramblings were "legal gibberish," journalist Matthew Russell Lee reported for his nonprofit outlet.
"Chat GPT wasn't supplementing your research - it was your research, right?" US District Court Judge P. Kevin Castel asked lawyer Steven Schwartz of personal injury law firm Levidow, Levidow & Oberman, according to Inner City Press.
The affidavit used by Schwartz and colleague Peter LoDuca was for a lawsuit from a man who alleged he was hurt by a serving cart on a flight. The court filing included six court cases that were "bogus judicial decisions with bogus quotes and bogus internal citations," Castel wrote in a previous court order.
A receptionist who picked up the phone at Levidow, Levidow & Oberman late Thursday said the lawyers were out of the office. They couldn't be immediately reached for comment.
At the sanctions hearing over the mishap, Castel pressed Schwartz, Inner City Press reported.
"You say you verify cases," Castel said, according to Inner City Press.
"I, I, I thought there were cases that could not be found on Google," Schwartz replied, according to the outlet.
"Six cases, none found on Google. This non-existent case, Varghese, the excerpt you had was inconsistent, even on the first page," Castel said, the outlet reported. "Can we agree that's legal gibberish?"
Schwartz responded that he thought ChatGPT's output was "excerpts," Inner City Press reported.
LoDuca is also facing sanctions, though he said in court that he didn't do the research used in the affidavit.
"I have worked with Mr. Schwartz for 27 years," LoDuca said in court, Inner City Press reported. "I should have been more skeptical. I can't go back and change what was done. This will never happen again."
The hearing adjourned without Castel making a decision on possible sanctions for the lawyers.