Saturday, June 24, 2023

Hallucinations

NEW YORK, June 22 (Reuters)

New York lawyers sanctioned for using fake ChatGPT cases in legal brief

A U.S. judge on Thursday imposed sanctions on two New York lawyers who submitted a legal brief that included six fictitious case citations generated by an artificial intelligence chatbot, ChatGPT.

U.S. District Judge P. Kevin Castel in Manhattan ordered lawyers Steven Schwartz, Peter LoDuca and their law firm Levidow, Levidow & Oberman to pay a $5,000 fine in total.

The judge found the lawyers acted in bad faith and made "acts of conscious avoidance and false and misleading statements to the court."

Levidow, Levidow & Oberman said in a statement on Thursday that its lawyers "respectfully" disagreed with the court that they acted in bad faith.

"We made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth," the firm's statement said.

Hallucinations in chatbots, sometimes also called confabulations or delusions. A curious thing about them is that nobody really seems to understand how they occur, Large Language Models being, to all intents and purposes, opaque. We should all be both more intrigued and more worried by them.
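One way to make the mystery a little more concrete (a toy sketch only, not how ChatGPT or any real system is actually built): a language model composes text by repeatedly picking a next word from a probability distribution, and nothing in that process checks the result against the world. The short Python example below invents its own candidate phrases and scores; the point is simply that the sampler rewards plausibility, not truth, which is one common framing of why confident-sounding fabrications can appear.

import math
import random

# Purely illustrative sketch, not any real chatbot's implementation.
# An autoregressive model picks each continuation by sampling from a
# probability distribution over candidates; nothing here verifies facts.

def softmax(scores):
    # Turn raw scores into probabilities that sum to 1.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)

prompt = "The leading authority on this point is"
# Hypothetical scores: a genuine case name and invented ones can look
# equally "plausible" to the sampler. All values below are made up.
candidates = ["Smith v. Jones", "Brown v. Board of Education", "Acme Corp. v. Doe"]
scores = [1.9, 1.2, 1.7]

probabilities = softmax(scores)
continuation = random.choices(candidates, weights=probabilities, k=1)[0]
print(prompt, continuation)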
