This article was produced in collaboration with Court Watch, an independent outlet that unearths overlooked court records. Subscribe to them here.
After a judge called out a law firm for citing fake cases in court documents, the attorneys admitted that an AI tool had “hallucinated” the cases.
In a court order filed last week, Wyoming District Judge Kelly Rankin demanded the attorneys explain why they shouldn’t be sanctioned or disciplined for citing made-up information, including eight non-existent cases.
The lawsuit, first filed in 2023, is against Walmart and Jetson Electronic Bikes, Inc., which makes hoverboards sold at Walmart. The plaintiffs, a woman, her husband, and four minor children, claim a Jetson hoverboard’s lithium-ion battery malfunctioned and caught fire while the family was sleeping, burning their house down and severely injuring several family members.
In a motion in limine filed in January (a pretrial motion in which attorneys ask the court to exclude specific evidence or arguments from trial), the plaintiffs’ attorneys cited multiple cases that don’t exist. “The cases are not identifiable by their Westlaw cite, and the Court cannot locate the District of Wyoming cases by their case name in its local Electronic Court Filing System,” Rankin wrote. She demanded each of the attorneys “provide a thorough explanation for how the motion and fake cases were generated,” and “explain their role in drafting or supervising the motion.”
Four days later, the law firm responded in a court filing: “Our internal artificial intelligence platform ‘hallucinated’ the cases in question while assisting our attorney in drafting the motion in limine,” the firm wrote. “This matter comes with great embarrassment and has prompted discussion and action regarding the training, implementation, and future use of artificial intelligence within our firm. This serves as a cautionary tale for our firm and all firms, as we enter this new age of artificial intelligence.”
Lawyers increasingly use AI tools for research and for analyzing documents, but this isn’t the first time AI-drafted filings have gotten lawyers in trouble. In 2022, a man sued Avianca Airlines, alleging he was injured by a metal serving cart during a flight. His lawyers cited non-existent cases, and instead of admitting the error and apologizing immediately, they doubled down and defended the filings. They were eventually fined $5,000, with the judge writing that they “abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question.”
And in 2024, disbarred former celebrity attorney Michael Cohen gave his own lawyer, David Schwartz, fake case citations generated by Google Bard. Cohen and Schwartz weren’t fined, but the judge who declined to discipline them did call the episode “embarrassing” for them.