AI Hallucination Cases

This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. Covered are all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. Notably, this does not include mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material. The database does not track the (necessarily wider) universe of all fake citations or uses of AI in court filings.

While seeking to be exhaustive (3 cases identified so far), this database is a work in progress and will expand as new examples emerge. It has been featured in news media, and indeed in several decisions dealing with hallucinated material. Examples of media coverage include:
- M. Hiltzik, "AI 'Hallucinations' Are a Growing Problem for the Legal Profession" (LA Times, 22 May 2025)
- E. Volokh, "'AI Hallucination Cases,' from Courts All Over the World" (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "He Generates Court Pleadings with AI, and Has Catalogued 160 That Have 'Hallucinated' Since 2023" (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you know of a case that should be included, feel free to contact me. (Readers may also be interested in this project regarding AI use in academic papers.)

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.

| Case | Court / Jurisdiction | Date | Party Using AI | AI Tool | Nature of Hallucination | Outcome / Sanction | Monetary Penalty | Details |
|---|---|---|---|---|---|---|---|---|
| KMG Wires Private Limited v. The National Faceless Assessment Centre et al. | Bombay High Court (India) | 6 October 2025 | Expert | Implied | Fabricated Case Law (1) | Assessment quashed and set aside | — | The Assessing Officer relied on three judicial decisions that the court found to be non-existent. The court observed that such citations appeared to have been fetched (implicitly) from AI, admonished that quasi-judicial officers must verify AI-generated results before relying on them, held that the Assessment Order violated principles of natural justice, and remanded the matter. |
| Greenopolis Welfare Association (GWA) v. Narender Singh et al. | Delhi High Court (India) | 25 September 2025 | Lawyer | Unidentified | Fabricated Case Law (1); False Quotes: Case Law (1) | Petition withdrawn | — | — |
| Buckeye Trust v. PCIT | (India) | 30 December 2024 | Judge | Implied | Misrepresented: Case Law (2), Legal Norm (2); Outdated Advice: Repealed Law (1) | Judgment retracted and case re-heard | — | Seemingly, the judge reproduced hallucinated authorities invoked by one counsel; the judgment was later reportedly withdrawn. |