AI Hallucination Cases

This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. It does not track the (necessarily wider) universe of all fake citations or use of AI in court filings.

More precisely, the database covers all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. It does not cover mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material. As an exception, the database also includes some judicial decisions where AI use was alleged but not confirmed; this is a judgment call on my part.

While it seeks to be exhaustive (31 cases identified so far), the database is a work in progress and will expand as new examples emerge. It has been featured in news media, and indeed in several decisions dealing with hallucinated material. Examples of media coverage include:
- M. Hiltzik, AI 'hallucinations' are a growing problem for the legal profession (LA Times, 22 May 2025)
- E. Volokh, "AI Hallucination Cases," from Courts All Over the World (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "Il génère des plaidoiries par IA, et en recense 160 ayant « halluciné » depuis 2023" (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you know of a case that should be included, feel free to contact me. (Readers may also be interested in this project regarding AI use in academic papers.)

Based on this database, I have developed an automated reference checker that also detects hallucinations: PelAIkan. Check the Reports icon in the database for examples, and reach out to me for a demo!

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.

Case Court / Jurisdiction Date ▼ Party Using AI AI Tool Nature of Hallucination Outcome / Sanction Monetary Penalty Details Report(s)
United States v. Clint Travis Rodgers W.D. Missouri (Central Division) (USA) 10 February 2026 Lawyer Implied
Misrepresented Case Law (4)
Admonishment
Nelson L. Bruce v. The United States D. South Carolina (USA) 9 February 2026 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Legal Norm (1)
Warning
Nonnie Berg v. United Airlines, Inc. (3) D. Colorado (USA) 6 February 2026 Pro Se Litigant Unidentified
Fabricated Case Law (2)
Misrepresented Case Law (1)
Filing restriction
Source: Jesse Schaefer
Raul Gonzales Davila v. Roblen United States District Court, D. Connecticut (USA) 6 February 2026 Lawyer Unidentified
Fabricated Case Law (1)
False Quotes Case Law (1)
CLE
Nonnie Berg v. United Airlines, Inc. (2) D. Colorado (USA) 28 January 2026 Pro Se Litigant Implied
Fabricated Case Law (2)
Warning

The court identified citations to seemingly nonexistent cases in the plaintiff's filings (e.g., "Hernandez" and "United States v. Miller"), noted prior warnings about AI-generated or unvetted citations, denied the motions, and warned it may recommend dismissal or similar sanctions if the conduct continues.

United States v. Michael Shane DeBaere (2) W.D. Virginia (USA) 23 January 2026 Pro Se Litigant Implied
Fabricated Case Law (1)
False Quotes Case Law (2)
Misrepresented Case Law (1)
Warning
McNeal v. United Food and Commercial Workers Local 555; Safeway, Inc. D. Oregon (USA) 22 January 2026 Pro Se Litigant Implied
Fabricated Case Law (1)
Warning
United States v. Juliet Payseur and 20-22 McGregor Avenue, LLC D. New Jersey (USA) 7 January 2026 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Misrepresented Case Law (2)
Warning
Report
Motion to Strike
Source: Jesse Schaefer
Gerou v. George, Whitten, and United States E.D. Wisconsin (USA) 18 December 2025 Pro Se Litigant Implied
Fabricated Case Law (3)
False Quotes Case Law (1)
Misrepresented Case Law (3)
Warning
Angelica E. Cruz et al. v. United States of America C.D. California (USA) 16 December 2025 Lawyer Implied
Fabricated Case Law (1)
Order to Show Cause
United States v. Brian Boehm M.D. Pennsylvania (USA) 3 December 2025 Pro Se Litigant Implied
False Quotes Case Law (2)
Misrepresented Case Law (1)
Ordered disclosure of AI use and affidavit certifying accuracy of citations for future filings

"12) The AI tool possibly used was sophisticated enough to include pinpoint citations to precedential Third Circuit authority.
13) Fortunately for Boehm, the cases cited in his motion are very real decisions by the Third Circuit Court of Appeals.
14) Unfortunately for Boehm, these cases are misrepresented in his motion and his motion also contains false quotations from these opinions."

Linda Oliver v. Christian Dribusch United States District Court, Northern District of New York (USA) 21 November 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Warning
Cotto v. United States D. Colorado (USA) 17 November 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Motion for reconsideration denied; court identified the cited case/citation as nonexistent/miscited and rejected reliance on it.
Source: Jesse Schaefer
Nathan Strong v. The United States Court of Federal Claims (USA) 13 November 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
False Quotes Case Law (2)
Warning
United States v. Thomas Czartorski, et al. W.D. Kentucky (USA) 10 November 2025 Lawyer ChatGPT
Fabricated Case Law (3)
Misrepresented Case Law (2)
Order to Show Cause

In his response, Counsel acknowledged that he first researched relevant cases, and then "entered the cases into ChatGPT and requested that it highlight favorable arguments contained in the list of cases."

Lareina A. Sauls v. Pierce County, et al. W.D. Washington (USA) 30 October 2025 Pro Se Litigant Implied
Misrepresented Case Law (1)
Warning
Nonnie Berg v. United Airlines, Inc. (1) D. Colorado (USA) 30 October 2025 Pro Se Litigant Implied
Fabricated Case Law (5), Exhibits or Submissions (1)
Warning

In an earlier Report and Recommendations, the court found that significant portions of the plaintiff's filings, copied from an AI program, included citations to cases that could not be identified in Westlaw, as well as an apparent AI-generated medical report; the court struck the filings and instructed compliance with Rule 11 and practice standards.

McCaster v. United States Court of Federal Claims (USA) 23 October 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Admonishment
Source: David Timm
Serafin v. United States Department of State, et al. E.D. Missouri (USA) 16 October 2025 Pro Se Litigant Implied
Fabricated Case Law (3)
Misrepresented Case Law (2)
Warning
United States v. Glennie Antonio McGee S.D. Alabama (USA) 10 October 2025 Lawyer Ghostwriter Legal
Fabricated Case Law (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Outdated Advice – Overturned Case Law (1)
Public reprimand, referral, and order to notify jurisdictions; monetary sanction of USD 5,000

Following a show cause order, Counsel admitted to having used the tool together with Google Search, and explained that, although he was aware of the issues with AI models like ChatGPT, he did not expect this tool to suffer from the same problems.

The Court found Attorney James A. Johnson used Ghostwriter Legal to draft a motion that contained multiple fabricated case citations, misstated or false quotations attributed to authorities, and cited precedent that had been reversed by the Supreme Court. The Court found the conduct tantamount to bad faith and imposed sanctions under its inherent authority. Sanctions include an order to file this order, not under seal, "in any case in any court wherein he appears as counsel for twelve (12) months after the date of this order."

David R. Pete v. United States Department of Justice, et al. E.D. Texas (USA) 10 October 2025 Pro Se Litigant Unidentified
Fabricated Case Law (2)
Magistrate Judge's recommendation adopted; in forma pauperis denied; plaintiff ordered to pay $405 filing fee within 10 days or the case will be dismissed.
Jackson v. United States DHS D. Nevada (USA) 1 October 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Warning
United States v. Malik D. Maryland (USA) 19 September 2025 Pro Se Litigant Implied
Fabricated Legal Norm (1)
False Quotes Doctrinal Work (1)
Warning
Source: Jesse Schaefer
United States v. Michael Shane DeBaere (1) W.D. Virginia (USA) 27 August 2025 Pro Se Litigant Implied
Fabricated Case Law (3), Doctrinal Work (1)
Warning
Parra v. United States Court of Federal Claims (USA) 27 June 2025 Pro Se Litigant Unidentified
Fabricated Case Law (2)
Warning

Plaintiff Ravel Ferrera Parra, proceeding pro se, filed a lawsuit against the United States alleging financial harm due to misconduct by various judicial and governmental entities. The court dismissed the case for lack of jurisdiction, as the claims were not within the court's purview.

The court noted that Plaintiff's filings appeared to be assisted by AI, as evidenced by the rapid filing of responses, tell-tale language ("Would you like additional affidavits, supporting exhibits, or further refinements before submission?"), and the inclusion of fabricated case citations. "While Plaintiff's use of AI, by itself, does not violate this Court's Rules, Plaintiff's citation to fake cases does."

The court further pointed out that:

"“It is no secret that generative AI programs are known to ‘hallucinate’ nonexistent cases.” Sanders, 176 Fed. Cl. at 169 (citation omitted). That appears to have happened here. When searching the Federal Claims Reporter for “Tucker v. United States, 71 Fed. Cl. 326 (2006),” Plaintiff’s citation brings the Court to the third page of Grapevine Imports, Ltd. v. United States, 71 Fed. Cl. 324, 326 (2006), a real tax case from this Court. Similarly, the AI used by Plaintiff in Sanders v. United States, 176 Fed. Cl. 163, 169 (2025) also made up a citation to a case called Tucker v. United States. Perhaps both AI programs hallucinated this case name based on the Tucker Act, this Court’s jurisdictional statute. Regardless, here, as in Sanders, the citation to a case called Tucker v. United States does not exist."

The court warned Plaintiff about the risks of using AI-generated content without verification but did not impose sanctions.

Sister City Logistics, Inc. v. John Fitzgerald United States District Court for the Southern District of Georgia (USA) 27 June 2025 Pro Se Litigant
Fabricated Case Law (1)
False Quotes Case Law (1)
Warning

The court observed that the original motion to remand filed pro se by the plaintiff contained non-existent case law and falsified quotations. Although the court did not impose sanctions in this instance, it warned that future use of fake legal authority would result in a show cause order, including against the Counsel who later joined the case.

Twist It Up, Inc. v. Annie International, Inc. United States District Court, Central District of California (USA) 27 June 2025 Lawyer Implied
Fabricated Case Law (2)
Misrepresented Case Law (1)
Monetary Sanction 500 USD

Penalty decided in an order from the next day.

Schoene v. Oregon Department of Human Services United States District Court for the District of Oregon (USA) 25 June 2025 Pro Se Litigant Implied
Fabricated Case Law (5)
Warning

"Before addressing the merits of Schoene’s motion, the Court notes that Schoene cited several cases in her reply brief to support her motion to amend, including Butler v. Oregon, 218 Or. App. 114 (2008), Curry v. Actavis, Inc., 2017 LEXIS 139126 (D. Or. Aug. 30, 2017), Estate of Riddell v. City of Portland, 194 Or. App. 227 (2004), Hampton v. City of Oregon City, 251 Or. App. 206 (2012), and State v. Burris, 107 Or. App. 542 (1991). These cases, however, do not exist. Schoene’s false citations appear to be hallmarks of an artificial intelligence (“AI”) tool, such as ChatGPT. It is now well known that AI tools “hallucinate” fake cases. See Kruse v. Karlen, 692 S.W.3d 43, 52 (Mo. Ct. App. 2024) (noting, in February 2024, that the issue of fictitious cases being submitted to courts had gained “national attention”).6 In addition, the Court notes that a basic internet search seeking guidance on whether it is advisable to use AI tools to conduct legal research or draft legal briefs will explain that any legal authorities or legal analysis generated by AI needs to be verified. The Court cautions Schoene that she must verify the accuracy of any future citations she may include in briefing before this Court and other courts"

Sanders v. United States Court of Federal Claims (USA) 31 March 2025 Pro Se Litigant Implied
Fabricated Case Law (4)
Misrepresented Case Law (1), Legal Norm (1)
Warning

AI Use

The plaintiff did not admit to using AI, but the court inferred likely use due to the submission of fabricated citations matching the structure and behavior typical of generative AI hallucinations. The decision referenced public concerns about AI misuse and cited specific examples of federal cases where similar misconduct occurred.

Hallucination Details

Plaintiff cited:

  • Tucker v. United States, 24 Cl. Ct. 536 (1991) – does not exist
  • Fargo v. United States, 184 F.3d 1096 (Fed. Cir. 1999) – fabricated citation pointing to an unrelated Ninth Circuit case
  • Bristol Bay Native Corporation v. United States, 87 Fed. Cl. 122 (2009) – fictional
  • Quantum Construction, Inc. v. United States, 54 Fed. Cl. 432 (2002) – nonexistent
  • Hunt Building Co., LLC v. United States, 61 Fed. Cl. 243 (2004) – real case misused; contains no mention of unjust enrichment

Ruling/Sanction

The court granted the government’s motion to dismiss for lack of subject matter jurisdiction under Rule 12(b)(1). Although the court found a clear Rule 11 violation, it opted not to sanction the plaintiff, citing the evolving context of AI use and the absence of bad faith. A formal warning was issued, with notice that future hallucinated filings may trigger sanctions.

Key Judicial Reasoning

Judge Roumel noted that plaintiff’s attempt to rely on fictional case law was a misuse of judicial resources and a disservice to her own advocacy. The court cited multiple precedents addressing hallucinated citations and AI misuse, stating clearly that while leeway is granted to pro se litigants, the line is crossed when filings rely on fictitious law.

United States v. Hayes E.D. California (USA) 17 January 2025 Federal Defender Unidentified One fake case citation with fabricated quotation Formal Sanction Imposed + Written Reprimand

AI Use

Defense counsel Andrew Francisco submitted filings quoting and relying on a fabricated case (United States v. Harris, 761 F. Supp. 409 (D.D.C. 1991)) and a nonexistent quotation. Although Francisco claimed he had not used AI, the court found the fabrication bore the hallmarks of an AI hallucination and rejected his explanations as implausible.

Hallucination Details

Francisco cited and quoted from a wholly fictitious United States v. Harris case, which neither existed at the cited location nor contained the quoted material. Upon confrontation, Francisco incorrectly tried to shift the source to United States v. Broussard, but that case also did not contain the quoted text. Searches in Westlaw and Lexis confirmed the quotation existed nowhere.

Ruling/Sanction

The Court formally sanctioned Francisco for degrading the integrity of the court and violating professional responsibility rules. Although monetary sanctions were not immediately imposed, the misconduct was recorded and would be taken into account in future disciplinary proceedings if warranted.

Key Judicial Reasoning

The court emphasized that submitting fake legal authorities undermines judicial credibility, wastes opposing parties' resources, and abuses the adversarial system. Persistent refusal to candidly admit errors aggravated the misconduct. The Court explicitly cited Mata v. Avianca and other AI hallucination cases as precedent for sanctioning such behavior, finding Francisco’s case especially egregious due to repeated bad faith evasions after being given opportunities to correct the record.

Source: Volokh
Thomas v. Commissioner of Internal Revenue United States Tax Court (USA) 23 October 2024 Lawyer, Paralegal Implied
Misrepresented Case Law (3)
Pretrial Memorandum stricken

The lawyer for the petitioner admitted to not reviewing the memorandum, which was prepared by a paralegal. The court deemed the Pretrial Memorandum stricken but did not impose a monetary penalty, considering the economic situation of the petitioner and the lawyer's service to a client who might otherwise be unrepresented. It was also pertinent that the law being stated was accurate (even if the citations were wrong).