AI Hallucination Cases

This database tracks legal decisions[1] in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. It does not track the (necessarily wider) universe of all fake citations or use of AI in court filings.

[1] I.e., all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. Notably, this does not cover mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material. As an exception, the database also covers some judicial decisions where AI use was alleged but not confirmed; this is a judgment call on my part.

While seeking to be exhaustive (59 cases identified so far), this database is a work in progress and will expand as new examples emerge. It has been featured in news media, and indeed in several decisions dealing with hallucinated material.[2]

[2] Examples of media coverage include:
- M. Hiltzik, "AI 'hallucinations' are a growing problem for the legal profession" (LA Times, 22 May 2025)
- E. Volokh, "AI Hallucination Cases," from Courts All Over the World (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "Il génère des plaidoiries par IA, et en recense 160 ayant « halluciné » depuis 2023" ["He generates court submissions with AI, and has catalogued 160 that have 'hallucinated' since 2023"] (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you know of a case that should be included, feel free to contact me. (Readers may also be interested in this project regarding AI use in academic papers.)

Based on this database, I have developed an automated reference checker that also detects hallucinations: PelAIkan. Check the Report(s) icons in the database for examples, and reach out to me for a demo!

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.


Each entry lists: Case | Court / Jurisdiction | Date | Party Using AI | AI Tool, followed by the Nature of Hallucination, the Outcome / Sanction (including any Monetary Penalty), and Details and Report(s) where available.

Lozano González v. Roberge | Housing Administrative Tribunal (Canada) | 1 May 2025 | Pro Se Litigant | ChatGPT
Nature of Hallucination: False Quotes – Legal Norm (1)

The landlord sought to repossess a rental property, claiming, on the basis of a misinterpretation of articles of Quebec's Civil Code, that the lease renewal was suspended. He had used ChatGPT to translate those articles, which produced a completely different meaning. The Tribunal found the repossession request invalid, as it was based on a date prior to the end of the lease. It rejected the claim of abuse, accepting the landlord's sincere belief in his misinterpretation, influenced by the AI translation, and noting his language barrier and residence in Mexico. The Tribunal advised the landlord to seek reliable legal advice in the future.

Simpson v. Hung Long Enterprises Inc. | B.C. Civil Resolution Tribunal (Canada) | 25 April 2025 | Pro Se Litigant | Unidentified
Nature of Hallucination: Fabricated – Case Law (4); Misrepresented – Legal Norm (1)
Outcome / Sanction: Other side compensated for time spent through costs order (500 CAD)

"Ms. Simpson referred to a non-existent CRT case to support a patently incorrect legal position. She also referred to three Supreme Court of Canada cases that do not exist. Her submissions go on to explain in detail what legal principles those non-existent cases stand for. Despite these deficiencies, the submissions are written in a convincingly legal tone. Simply put, they read like a lawyer wrote them even though the underlying legal analysis is often wrong. These are all common features of submissions generated by artificial intelligence." [...]

"25. I agree with Hung Long that there are two extraordinary circumstances here that justify compensation for its time. The first is Ms. Simpson’s use of artificial intelligence. It takes little time to have a large language model create lengthy submissions with many case citations. It takes considerably more effort for the other party to wade through those submissions to determine which cases are real, and for those that are, whether they actually say what Ms. Simpson purported they did. Hung Long’s owner clearly struggled to understand Ms. Simpson’s submissions, and his legal research to try to understand them was an utter waste of his time. I reiterate my point above that Ms. Simpson’s submissions cited a non-existent case in support of a legal position that is the precise opposite of the existing law. This underscores the impact on Hung Long. How can a self-represented party respond to a seemingly convincing legal argument that is based on a case it is impossible to find?

26. I am mindful that Ms. Simpson is not a lawyer and that legal research is challenging. That said, she is responsible for the information she provides the CRT. I find it manifestly unfair that the burden of Ms. Simpson’s use of artificial intelligence should fall to Hung Long’s owner, who tried his best to understand submissions that were not capable of being understood. While I accept that Ms. Simpson did not knowingly provide fake cases or misleading submissions, she was reckless about their accuracy."

SQBox Solutions Ltd. v. Oak | BC Civil Resolution Tribunal (Canada) | 31 March 2025 | Pro Se Litigant | Implied
Nature of Hallucination: Fabricated – Case Law (1); False Quotes – Legal Norm (2); Misrepresented – Case Law (4)
Outcome / Sanction: Litigant lost on merits

"By relying on inaccurate and false AI submissions, Mr. Oak hurts his own case. I understand that Mr. Oak himself might not be aware that the submissions are misleading, but they are his submissions and he is responsible for them. "

Source: Steve Finlay
AQ v. BT | Civil Resolution Tribunal (Canada) | 28 March 2025 | Pro Se Litigant | Implied
Nature of Hallucination: Fabricated – Case Law (2), Legal Norm (1); Misrepresented – Case Law (1), Legal Norm (1)
Outcome / Sanction: Arguments ignored

Geismayr v. The Owners, Strata Plan KAS 1970 | Civil Resolution Tribunal (Canada) | 14 February 2025 | Pro Se Litigant | Copilot
Nature of Hallucination: Fabricated – Case Law (9); Misrepresented – Case Law (1)
Outcome / Sanction: Citations ignored

Duarte v. City of Richmond | British Columbia Human Rights Tribunal (Canada) | 18 December 2024 | Pro Se Litigant | Implied
Nature of Hallucination: Fabricated – Case Law (1)
Outcome / Sanction: Warning

Nathan Duarte, a pro se litigant, filed a complaint against the City of Richmond alleging discrimination based on political beliefs. During the proceedings, Duarte cited three cases to support his claim that union affiliation is a protected characteristic. However, neither the City nor the Tribunal could locate these cases, leading to the suspicion that they were fabricated, possibly by a generative AI tool. The Tribunal held:

"While it is not necessary for me to determine if Mr. Duarte intended to mislead the Tribunal, I cannot rely on these “authorities” he cites in his submission. At the very least, Mr. Duarte has not followed the Tribunal’s Practice Direction for Legal Authorities, which requires parties, if possible, to provide a neutral citation so other participants can access a copy of the authority without cost. Still, I am compelled to issue a caution to parties who engage the assistance of generative AI technology while preparing submissions to the Tribunal, in case that is what occurred here. AI tools may have benefits. However, such applications have been known to create information, including case law, which is not derived from real or legitimate sources. It is therefore incumbent on those using AI tools to critically assess the information that it produces, including verifying the case citations for accuracy using legitimate sources. Failure to do so can have serious consequences. For lawyers, such errors have led to disciplinary action by the Law Society: see for example, Zhang v Chen, 2024 BCSC 285. Deliberate attempts to mislead the Tribunal, or even careless submission of fabricated information, could also form the basis for an award of costs under s. 37(4) of the Code. The integrity of the Tribunal’s process, and the justice system more broadly, requires parties to exercise diligence in ensuring that their engagement with artificial intelligence does not supersede their own judgement and credibility."

Monster Energy Company v. Pacific Smoke International Inc. | Canadian Intellectual Property Office (Canada) | 20 November 2024 | Lawyer
Nature of Hallucination: Fabricated – Case Law (1)
Outcome / Sanction: Fabricated citation disregarded

In a trademark opposition between Monster Energy Company and Pacific Smoke International Inc., the Applicant, Pacific Smoke, cited a non-existent case, 'Hennes & Mauritz AB v M & S Meat Shops Inc, 2012 TMOB 7', in support of its argument. The Board identified this as an AI hallucination, disregarded the citation, and reminded the Applicant of the seriousness of relying on false citations, whether accidental or AI-generated.

Industria de Diseño Textil, S.A. v. Sara Ghassai | Canadian Intellectual Property Office (Canada) | 12 August 2024 | Lawyer | Implied
Nature of Hallucination: Fabricated – Case Law (1)
Outcome / Sanction: Warning

Zhang v. Chen | BC Supreme Court (Canada) | 20 February 2024 | Lawyer | ChatGPT
Nature of Hallucination: Fabricated – Case Law (2)
Outcome / Sanction: Claimant awarded costs

"[29] Citing fake cases in court filings and other materials handed up to the court isan abuse of process and is tantamount to making a false statement to the court.Unchecked, it can lead to a miscarriage of justice."