AI Hallucination Cases

This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. It does not track the (necessarily wider) universe of all fake citations or uses of AI in court filings.

More precisely, the database covers all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. It does not cover mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material. As an exception, it also includes some judicial decisions where AI use was alleged but not confirmed; this is a judgment call on my part.

While seeking to be exhaustive (1,312 cases identified so far), this database is a work in progress and will expand as new examples emerge. It has been featured in news media, and indeed in several decisions dealing with hallucinated material. Examples of media coverage include:
- M. Hiltzik, "AI 'hallucinations' are a growing problem for the legal profession" (LA Times, 22 May 2025)
- E. Volokh, "'AI Hallucination Cases,' from Courts All Over the World" (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "Il génère des plaidoiries par IA, et en recense 160 ayant « halluciné » depuis 2023" (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you have any questions about the database, a FAQ is available here.
And if you know of a case that should be included, feel free to contact me. (Readers may also be interested in this project regarding AI use in academic papers.)

Based on this database, I have developed an automated reference checker that also detects hallucinations: PelAIkan. Check the Reports column in the database for examples, and reach out to me for a demo!

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.

Case Court / Jurisdiction Date ▼ Party Using AI AI Tool Nature of Hallucination Outcome / Sanction Monetary Penalty Details Report(s)
NCR v KKB Court of King’s Bench of Alberta (Canada) 9 July 2025 Pro Se Litigant Implied
Misrepresented Case Law (1)
The court disregarded the fabricated citations and did not impose costs on the self-represented litigant.
In re Turner Iowa Attorney Disciplinary Board (USA) 9 July 2025 Lawyer Implied
Fabricated Case Law (1)
Motion stricken
Dhuruvasangary v. Toronto Standard Condominium Corporation No. 1532 ONCAT (Canada) 9 July 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Misrepresented Case Law (1)
VGH 3 S 1012/25; VG 2 K 1899/25 VGH Baden-Württemberg (Germany) 8 July 2025 Lawyer Implied
Fabricated Case Law (1)

This case was described and reviewed by Dean A. Blohm here.

Coomer v. Lindell/MyPillow, Inc. (1) D. Colorado (USA) 7 July 2025 Lawyer Co-Pilot, Westlaw’s AI, Gemini, Grok, Claude, ChatGPT, Perplexity
Fabricated Case Law (4)
False Quotes Case Law (2)
Misrepresented Case Law (5)
Monetary Sanctions 6000 USD

Prior Order to Show Cause available here.

After reviewing - and dismissing - the factual allegations made by Counsel, and noting that they had submitted errata in parallel cases (dealing with other fabricated citations), the court swiftly concluded that they "have violated Rule 11 because they were not reasonable in certifying that the claims, defenses, and other legal contentions contained in Defendants’ Opposition to Motion in Limine [Doc. 283] were warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law or for establishing new law."

Both counsel were sanctioned with a 3,000 USD fine each, payable to the court.

Smith v. Gamble Ohio CA (USA) 7 July 2025 Pro Se Litigant Implied Fabricated citation(s) Sanctions granted against Father (TBD) 1 USD
Source: Robert Freund
Wright Brothers Aero, Inc. B-423326.2 GAO (USA) 7 July 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Warning
Various Leaseholders v Assethold Limited Property Chamber (UK) 7 July 2025 Lawyer Implied
Fabricated Case Law (4)
Misrepresented Case Law (7)
In re Eugene Ezra Perkins D. Oregon (Bankruptcy) (USA) 7 July 2025 Pro Se Litigant Unidentified
Fabricated Case Law (2)
Misrepresented Case Law (2)
Case dismissed; motion to alter or amend denied
AQ v. BW Civil Resolution Tribunal (Canada) 4 July 2025 Pro Se Litigant Implied
False Quotes Legal Norm (1)
Monetary Sanction 1000 CAD

In AQ v. BW, the applicant AQ claimed damages for the non-consensual sharing of an intimate image by the respondent BW. Both parties were self-represented. The tribunal found that BW shared an intimate image of AQ without consent, violating the Intimate Images Protection Act (IIPA). BW attempted to defend their actions by citing a fabricated version of CRTA section 92, which was identified as a hallucination likely generated by artificial intelligence. The tribunal held:

"16. I have considered my obligation to give sufficient reasons. I do not consider that obligation to include responding to arguments concocted by artificial intelligence that have no basis in law. I accept that artificial intelligence can be a useful tool to help people find the right language to present their arguments, if used properly. However, people who blindly use artificial intelligence often end up bombarding the CRT with endless legal arguments. They cannot reasonably expect the CRT to address them all. So, while I have reviewed all the parties’ materials and considered all their arguments, I have decided against addressing many of the issues they raise. If I do not address a particular argument in this decision, it is because the argument lacks any merit, is about something plainly irrelevant, or both."

The tribunal dismissed BW's defenses as baseless and awarded AQ $5,000 in damages and an additional $1,000 for time spent due to BW's submission of irrelevant evidence. The tribunal emphasized that arguments concocted by AI without legal basis would not be addressed.

Source: Steve Finlay
Sharita Hill v. State of Oklahoma W.D. Oklahoma (USA) 3 July 2025 Lawyer Implied
Fabricated Case Law (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Warning

"Further, these inaccuracies signal that Plaintiff's counsel may have used AI to assist in the drafting of Plaintiff's Response (or otherwise counsel produced exceptionally sloppy work). In this regard, this Court's Chambers Rules include “Disclosure and Certification Requirements” for use of “Generative Artificial Intelligence” and expressly provide that an attorney or party must disclose in any document to be filed with the Court “that AI was used and the specific AI tool that was used” and to “certify in the document that the person has checked the accuracy of any portion of the document drafted by generative AI, including all citations and legal authority.” See id.6 No such disclosure and certification has been made in this case. The Court's Rules further provide that an attorney will be responsible for the contents of any documents prepared with generative AI, in accordance with Rule 11 of the Federal Rules of Civil Procedure, and that the failure to make the disclosure and certification “may result in the imposition of sanctions.”"

Matter of Sewell Properties Trust Colorado Court of Appeals (USA) 3 July 2025 Pro Se Litigant Implied
Misrepresented Case Law (2), Legal Norm (2)
Warning

The court noted that:

"both Lehr-Guthrie's and McDonald's briefs are replete with errors in their citations to case authority, such as repeated citation errors, references to nonexistent quotes, and incorrect statements about the cases (for instance, as noted above, the two cases McDonald cited for a proposition relating to the duty of impartiality don't even reference that duty). This suggests to us that the briefs may have been drafted with the use of generative artificial intelligence (GAI). “[U]sing a GAI tool to draft a legal document can pose serious risks if the user does not thoroughly review the tool's output.” Al-Hamim v. Star Hearthstone, LLC, 2024 COA 128, ¶ 32. Self-represented litigants must be particularly careful, as they “may not understand that a GAI tool may confidently respond to a query regarding a legal topic ‘even if the answer contains errors, hallucinations, falsehoods, or biases.’ ” Id. (citation omitted).4 We advise the parties that errors caused by GAI in future filings may result in sanctions. See id. at ¶ 41. "

The court warned that future errors caused by AI could result in sanctions.

Tyrone Walker v. Juliane Pierre Massachusetts CA (USA) 3 July 2025 Lawyer Implied Fabricated citations Struck from the record

"In a prior order, we struck portions of Walker's brief that included citations to nonexistent cases. We note that the arguments raised would not have changed the outcome of the appeal in any event. "

Pulserate Investments v. Andrew Zuze and Others Supreme Court (Zimbabwe) 3 July 2025 Lawyer Unidentified 12 fabricated citations N/A

Counsel in charge apologised to the Court in a letter (available here), explaining that he had failed to supervise the work of his subordinates.

Plonit et al. v. The Administrator General in the Tel Aviv District et al. Family Court in Petah Tikva (Israel) 3 July 2025 Lawyer Unidentified
Fabricated Case Law (5), Doctrinal Work (6)
Monetary Penalty 7000 ILS
Muhammad v. Gap Inc. S.D. Ohio (USA) 3 July 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (3)
False Quotes Case Law (4)
OSC

"Compounding the problem, what generative AI lacks in precision, it more than makes up for in speed. Litigants who simply file the material that AI tools generate, without carefully reviewing it first for accuracy, have the potential to swamp courts with what appear at first glance to be legal arguments built on law and precedent, but which are in fact nothing of the sort. And not only are these problems in their own right, but they also heighten the two concerns the Court highlighted above—that defendants will be forced to spend more time and incur more costs parsing through copious baseless filings to defend an action, and that Courts will waste precious time doing the same in ruling on motions and moving matters along."

(Plaintiff acknowledged use of ChatGPT in a subsequent filing)

Plaintiff was eventually designated a vexatious litigant and his case was dismissed with prejudice.

Source: Jesse Schaefer
Murray v. State of Victoria Federal Court (Australia) 2 July 2025 Lawyer Google Scholar (allegedly)
Fabricated Case Law (1)
Order of costs to other party; no professional referral 1 AUD

"14    Here, the applicant's solicitor’s use of AI in the preparation of two court documents has given rise to cost, inconvenience and delay to the parties and has compromised the effectiveness of the administration of justice. But I do not consider the use of AI in this case means that it is appropriate to refer the solicitors’ conduct to the Victorian Legal Services Board. Here an inexperienced junior solicitor was given the task of preparing document citations for an amended pleading, and did so while working remotely and without access to the documents to be cited. In attempting to cite the relevant documents she used an (apparently AI-assisted) research tool which she considered had produced accurate citations when she previously used it. And as soon as Massar Briggs Law was told of the false citations the problem was addressed. The junior solicitor and the principal solicitor have apologised or expressed their regret to the other parties and the Court, and there was no suggestion that they were not genuine in doing so.

15    The junior solicitor took insufficient care in using Google Scholar as the source of document citations in court documents, and in failing to check the citations against the physical and electronic copies of the cited documents that were held at Massar Briggs Law’s office. The error was centrally one of failing to check and verify the output of the search tool, which was contributed to by the inexperience of the junior solicitor and the failure of Mr Briggs to have systems in place to ensure that her work was appropriately supervised and checked. To censure those errors it is sufficient that these reasons be published."

Rafi Najib v MSS Security Pty Limited Fair Work Commission (Australia) 2 July 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Misrepresented Case Law (1), Legal Norm (1)
Application dismissed
Source: Jay Iyer
Angela and Theodore Chagnon v. Holly Nelson Chancery Court of Wyoming (USA) 2 July 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Legal Norm (1)
Order to show cause issued; potential striking of motion

Defendant Holly Nelson, appearing pro se, filed a motion to dismiss that included a fabricated case citation, Finch v. Smith, which does not exist. The court inferred that Nelson used AI to draft the motion without verifying the accuracy of the citations. The court issued an order to show cause, requiring Nelson to justify why her filing does not violate Rule 11, or alternatively, to withdraw her motion. If she fails to do so, the court intends to strike her motion entirely.

Source: Robert Freund
ATSum 0010525-47.2025.5.03.0037 Regional Labour Court (Brazil) 2 July 2025 Lawyer Unidentified
Fabricated Case Law (3)
Monetary Sanctions 1

Counsel for the claimant admitted to using an AI tool to draft the initial legal document without verifying its content. This resulted in the creation of non-existent judicial precedents to support the claimant's case. The court found this to be a serious violation of procedural loyalty and good faith, as it attempted to deceive the court and the opposing party. Consequently, the claimant was fined 5% of the updated value of the case for litigating in bad faith, although the lawyer's apologies were partially accepted, preventing a higher fine.

Bucher v. Appeals Committee Administrative Court (Israel) 2 July 2025 Lawyer Implied
Fabricated Case Law (1)
Misrepresented Case Law (1)
Monetary penalty imposed
Cologne District Court, 312 F 130/25 Cologne District Court (Family Court) (Germany) 2 July 2025 Lawyer Implied
Fabricated Case Law (1), Doctrinal Work (4)
Court admonished the attorney and warned that knowingly disseminating untruths may violate BRAO §43a(3)
Doe v. Noem D.C. DC (USA) 1 July 2025 Lawyer ChatGPT One fabricated authority Order to Show Cause

The fake citation, in this brief, was to Moms Against Poverty v. Dep’t of State, 2022 WL 17951329, at *3. The case docket can be found here.

Counsel later confirmed having used ChatGPT and apologised.

Shahid v. Esaam Georgia CA (USA) 30 June 2025 Judge, Lawyer Unidentified
Fabricated Case Law (14)
Misrepresented Case Law (4), Legal Norm (1)
Case remanded; monetary penalty 2500 USD

" After the trial court entered a final judgment and decree of divorce, Nimat Shahid (“Wife”) filed a petition to reopen the case and set aside the final judgment, arguing that service by publication was improper. The trial court denied the motion, using an order that relied upon non-existent case law."

"We are troubled by the citation of bogus cases in the trial court's order. As the reviewing court, we make no findings of fact as to how this impropriety occurred, observing only that the order purports to have been prepared by Husband's attorney, Diana Lynch. We further note that Lynch had cited the two fictitious cases that made it into the trial court's order in Husband's response to the petition to reopen, and she cited additional fake cases both in that Response and in the Appellee's Brief filed in this Court. "

Northbound Processing v. South African Diamond Regulator High Court (South Africa) 30 June 2025 Lawyer Legal Genius
Fabricated Case Law (2)
Misrepresented Case Law (2)
Referral to the Legal Practice Council for investigation

"[92] In Mavundla, the court emphasised the trite duty of legal practitioners not to mislead the court, whether through negligence or intent. This includes the duty to present an honest account of the law, which means (inter alia) not presenting fictitious or non-existent cases.24 In my view, it matters not that such cases were not presented orally, but were contained in written heads of argument. Written heads are as important a memorial of counsel’s argument as oral argument and, for purely practical reasons, are often more heavily relied upon by judges.

[...]

[95] In this case, counsel’s explanations bear out their submission that there was no deliberate attempt to mislead the court in relation to the use of incorrect case citations in the heads of argument. Their apologies are acknowledged. As is clear from Mavundla, however, even negligence in this context may have grave repercussions particularly to the administration of justice and, in appropriate circumstances, could constitute serious professional misconduct.

[96] As a consequence, it is appropriate to make the same order as in Mavundla, namely that the conduct of the applicant’s legal practitioners is referred to the Legal Practice Council for investigation."

Crespo v. Tesla, Inc. S.D. Florida (USA) 30 June 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
False Quotes Case Law (1)
Plaintiff required to apologize and pay attorney's fees 921 USD

In Crespo v. Tesla, Inc., the pro se plaintiff, Leonardo Crespo, submitted discovery motions containing fabricated case citations and a false quote, which were identified as potentially generated by AI. The court ordered Crespo to show cause for these submissions, and he admitted to using AI in his filings. The court acknowledged Crespo's candor and imposed sanctions requiring him to apologize to the defendant's counsel and pay the reasonable attorney's fees incurred by the defendant in addressing the fake citations.

(In a subsequent ruling, the court averred that the reasonable fees amount was 921 USD.)

Case No. 525309-08-22 Jerusalem Enforcement and Collection Authority (Israel) 30 June 2025 Lawyer Implied
Fabricated Legal Norm (1)
Misrepresented Legal Norm (1)
Warning
LaPointe v. Chief Animal Welfare Inspector Ontario's Animal Care Review Board (Canada) 30 June 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (1), Exhibits or Submissions (1)
Misrepresented Case Law (1)
AI-generated materials excluded from evidence
Benjamin Gamble v. Ho-Chunk Nation Election Board Ho-Chunk Nation Trial Court (USA) 30 June 2025 Pro Se Litigant Implied
Fabricated Legal Norm (3)
Page v. Long Melbourne County Court (Australia) 27 June 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Case Law (2)
Litigant lost on merits

"Generative AI can be beguiling, particularly when the task of representing yourself seems overwhelming. However, a litigant runs the risk that their case will be damaged, rather than helped, if they choose to use AI without taking the time to understand what it produces, and to confirm that it is both legally and factually accurate. "

Parra v. United States Court of Federal Claims (USA) 27 June 2025 Pro Se Litigant Unidentified
Fabricated Case Law (2)
Warning

Plaintiff Ravel Ferrera Parra, proceeding pro se, filed a lawsuit against the United States alleging financial harm due to misconduct by various judicial and governmental entities. The court dismissed the case for lack of jurisdiction, as the claims were not within the court's purview.

The court noted that Plaintiff's filings appeared to be assisted by AI, as evidenced by the rapid filing of responses, tell-tale language ("Would you like additional affidavits, supporting exhibits, or further refinements before submission?"), and the inclusion of fabricated case citations. "While Plaintiff’s use of AI, by itself, does not violate this Court’s Rules, Plaintiff’s citation to fake cases does."

The court further pointed out that:

"“It is no secret that generative AI programs are known to ‘hallucinate’ nonexistent cases.” Sanders, 176 Fed. Cl. at 169 (citation omitted). That appears to have happened here. When searching the Federal Claims Reporter for “Tucker v. United States, 71 Fed. Cl. 326 (2006),” Plaintiff’s citation brings the Court to the third page of Grapevine Imports, Ltd. v. United States, 71 Fed. Cl. 324, 326 (2006), a real tax case from this Court. Similarly, the AI used by Plaintiff in Sanders v. United States, 176 Fed. Cl. 163, 169 (2025) also made up a citation to a case called Tucker v. United States. Perhaps both AI programs hallucinated this case name based on the Tucker Act, this Court’s jurisdictional statute. Regardless, here, as in Sanders, the citation to a case called Tucker v. United States does not exist."

The court warned Plaintiff about the risks of using AI-generated content without verification but did not impose sanctions.

Sister City Logistics, Inc. v. John Fitzgerald United States District Court for the Southern District of Georgia (USA) 27 June 2025 Pro Se Litigant
Fabricated Case Law (1)
False Quotes Case Law (1)
Warning

The court observed that the original motion to remand filed pro se by the plaintiff contained non-existent case law and falsified quotations. Although the court did not impose sanctions in this instance, it warned that future use of fake legal authority would result in a show cause order, including against the Counsel who later joined the case.

Assessment and Training Solutions Consulting B-423398 GAO (USA) 27 June 2025 Pro Se Litigant Implied Fabricated citations, misrepresented precedents Warning
Twist It Up, Inc. v. Annie International, Inc. United States District Court, Central District of California (USA) 27 June 2025 Lawyer Implied
Fabricated Case Law (2)
Misrepresented Case Law (1)
Monetary Sanction 500 USD

Penalty decided in an order from the next day.

Enviro Plus Duct Cleaning v. Department of Public Works Canadian ITT (Canada) 26 June 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Legal Norm (1)
Source: Courtready
Auto Test Ltd. v. Ministry of Transport Tel Aviv-Yafo District Court (Israel) 25 June 2025 Lawyer Implied
Fabricated Case Law (1)
Misrepresented Case Law (1)
Motion for Costs denied

"Regarding the petitioner, this is a case of improper conduct, to say the least, on the part of its counsel (who apologized for it), who made use of artificial intelligence in the petition and in the supplementary argument, in which many non-existent and/or erroneous judgments were inserted and embedded. In accordance with the Supreme Court's ruling, there would have been grounds, as a result, for dismissing the petition outright, but I did not do so due to the conduct of the state, as detailed above, and due to the importance of publishing the tender. However, in this case, there is no place to award costs in favor of the petitioner, due to this improper conduct (as, beyond that, the petition also requested many remedies, some of which are not within the jurisdiction of this court)."

(Translation by Gemini 2.5.)

Schoene v. Oregon Department of Human Services United States District Court for the District of Oregon (USA) 25 June 2025 Pro Se Litigant Implied
Fabricated Case Law (5)
Warning

"Before addressing the merits of Schoene’s motion, the Court notes that Schoene cited several cases in her reply brief to support her motion to amend, including Butler v. Oregon, 218 Or. App. 114 (2008), Curry v. Actavis, Inc., 2017 LEXIS 139126 (D. Or. Aug. 30, 2017), Estate of Riddell v. City of Portland, 194 Or. App. 227 (2004), Hampton v. City of Oregon City, 251 Or. App. 206 (2012), and State v. Burris, 107 Or. App. 542 (1991). These cases, however, do not exist. Schoene’s false citations appear to be hallmarks of an artificial intelligence (“AI”) tool, such as ChatGPT. It is now well known that AI tools “hallucinate” fake cases. See Kruse v. Karlen, 692 S.W.3d 43, 52 (Mo. Ct. App. 2024) (noting, in February 2024, that the issue of fictitious cases being submitted to courts had gained “national attention”).6 In addition, the Court notes that a basic internet search seeking guidance on whether it is advisable to use AI tools to conduct legal research or draft legal briefs will explain that any legal authorities or legal analysis generated by AI needs to be verified. The Court cautions Schoene that she must verify the accuracy of any future citations she may include in briefing before this Court and other courts"

Dastou v. Holmes Massachusetts (USA) 25 June 2025 Lawyer ChatGPT Fabricated citations and false quotes CLE Course obligation; endorsement of decision not to bill client
Romero v. Goldman Sachs Bank USA S.D.N.Y. (USA) 25 June 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Warning
Hussein v. Canada Ottawa (Canada) 24 June 2025 Lawyer Visto.Ai
Fabricated Case Law (1)
Monetary Sanction 100 CAD

In the original order, the court held:

"[38] Applicants’ counsel provided further correspondence advising, for the first time, of his reliance on Visto.ai described as a professional legal research platform designed specifically for Canadian immigration and refugee law practitioners. He also indicated that he did not independently verify the citations as they were understood to reflect well established and widely accepted principles of law. In other words, the undeclared and unverified artificial intelligence had no impact, and the substantive legal argument was unaffected and supported by other cases.

[39] I do not accept that this is permissible. The use of generative artificial intelligence is increasingly common and a perfectly valid tool for counsel to use; however, in this Court, its use must be declared and as a matter of both practice, good sense and professionalism, its output must be verified by a human. The Court cannot be expected to spend time hunting for cases which do not exist or considering erroneous propositions of law.

[40] In fact, the two case hallucinations were not the full extent of the failure of the artificial intelligence product used. It also hallucinated the proper test for the admission on judicial review of evidence not before the decision-maker and cited, as authority, a case which had no bearing on the issue at all. To be clear, this was not a situation of a stray case with a variation of the established test but, rather, an approach similar to the test for new evidence on appeal. As noted above, the case relied upon in support of the wrong test (Cepeda-Gutierrez) has nothing to do with the issue. I note in passing that the case comprises 29 paragraphs and would take only a few minutes to review.

[41] In addition, counsel’s reliance on artificial intelligence was not revealed until after the issuance of four Directions. I find that this amounts to an attempt to mislead the Court and to conceal the reliance by describing the hallucinated authorities as “mis-cited”. Had the initial request for a Book of Authorities resulted in the explanation in the last letter, I may have been more sympathetic. As matters stand, I am concerned that counsel does not recognize the seriousness of the issue."

In the final order, the court added:

"While the use of generative AI is not the responsibility of the responding party, it was not appropriate for the Respondent to not make any response to the Court’s four directions and Order. Indeed, assuming that the Respondent noticed the hallucinated cases on receipt of the written argument, it should have brought this to the attention of the Court.

[...]

Given that Applicant’s counsel was not remunerated for his services in the file, which included the motion on which the offending factum was filed and a motion for a stay of removal and, in addition, that I am also of the view that the Respondent’s lack of action exacerbated matters and it should not benefit as a result, I am ordering a modest amount of $100 to be payable by Applicant’s counsel personally."

Mintvest Capital, LTD v. NYDIG Trust Company, et al. D.C. Puerto Rico (USA) 23 June 2025 Lawyer Claude
Fabricated Case Law (3)
False Quotes Case Law (5)
Misrepresented Case Law (3)
Order to pay opposing counsel's fees 1 USD

Plaintiff's counsel in the case of Mintvest Capital, LTD v. NYDIG Trust Company, et al., was found to have included numerous non-existent cases, false quotations, and misrepresented precedents in their filings. The errors were attributed to the use of the AI tool 'Claude' without proper verification. The court recommended sanctions under Rule 11, requiring the attorney to pay the defendants' attorney fees related to the faulty submissions. The court emphasized the need for attorneys to ensure the accuracy of citations, especially when using AI tools, to maintain professional standards.

Malone & Anor v Laois County Council & Ors High Court (Ireland) 23 June 2025 Pro Se Litigant Implied
Fabricated Legal Norm (1)
False Quotes Case Law (1)
Misrepresented Exhibits or Submissions (1), Legal Norm (1)
Warning

Referring to Ayinde, the judge held that "The principle is essentially the same - though I hasten to say that I would not push the analogy too far as to a factual comparison of the present case with that case and the error in the present case is not of the order of the misconduct in that case. However, appreciable judicial time was wasted on the issue - not least trying to find the source of the quotation. And it does illustrate:

  • The vital importance of precision and accuracy in written submissions. That duty lies on lay litigants as much as on lawyers.
  • That text in submissions formatted so as to convey that it is a direct and verbatim quotation from an identified source must be exactly that. Of course, it is permissible to edit the text (for example to exclude irrelevant content or by underlining for emphasis) but, if so, that it has been done must be apparent on the face of the document.
  • That opposing parties are entitled to written submissions in good time to check them.

43. All that said, in a substantive sense, the issue is not vital to this case. The underlying proposition for which Mr Malone contends - that domestic courts must implement EU law - is uncontroversial. Not least for that reason, and in light also of the manner in which Mr Malone generally presented his case at the hearing, I am inclined to accept that there was no attempt or intention to mislead and accept also that Mr Malone has apologized for the error. It does not affect the outcome of the present motions."

Iskenderian v. Southeastern Hawai'i (USA) 23 June 2025 Lawyer Fabricated citations N/A

After the other side pointed out that all the authorities cited were fictitious, Counsel admitted it in a brief. The court seemingly did not react.

Bottrill v Graham & Anor (No 2) District Court of New South Wales (Australia) 20 June 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
The second defendant's Notice of Motion for summary dismissal of the plaintiff’s claim was dismissed, with costs reserved to the trial judge.

"When the parties came before the court on 22 May 2025, there had been little time for the plaintiff, the first defendant and the court to examine the second defendant’s written submissions served late on the night before. It was nevertheless immediately apparent that the second defendant sought to rely upon authority and court rules which were not merely misstated but, in some circumstances, imaginary. I am satisfied that all of the judgments and rules referred to in the submissions of 21 May 2025 were misstated, non-existent, or both, and that Gen AI had been used to prepare these submissions.

An example was the citation of a decision of the Supreme Court of New South Wales described as “Wu v Wilks” (I will not provide the citation given in full, as there is a risk of it being picked up as genuine by other Gen AI: Luck v Secretary, Services Australia [2025] FCAFC 26 at [14]). There is no decision with this name, either in the Supreme Court of New South Wales or in any other jurisdictions. The caselaw citation given for “Wu v Wilks” belonged to a judgment on wholly unrelated material and the principles of law for which it was cited. All of the citations suffered similar problems.

I drew these issues to the attention of the second defendant and enquired whether she had used Gen AI in the preparation of her submissions and, if so, whether she was aware of the Practice Note. She acknowledged that she had done so but said this was because she had very little time to provide submissions in reply and was deeply distressed by these proceedings"

Pro Health Solutions Ltd v ProHealth Inc Intellectual Property Office (UK) 20 June 2025 Pro Se Litigant, Lawyer ChatGPT
False Quotes Case Law (3)
Misrepresented Case Law (6), Doctrinal Work (2)
Warning; No costs awarded for the appeal since both sides seemingly erred

The claimant used ChatGPT to assist in drafting his grounds of appeal and skeleton argument. The documents included fabricated citations and misrepresented case summaries. The claimant admitted to using ChatGPT and apologized for the errors.

Compounding matters, the court suspected that the respondent had also used AI: the cases cited in counsel's skeleton argument, though extant, did not support any of the propositions made, and counsel was unable to explain how they got there.

Reilly v. Conn. Interlocal Risk Mgmt. Agency D. Connecticut (USA) 20 June 2025 Pro Se Litigant Implied
False Quotes Case Law (1)
Misrepresented Case Law (1)
Warning

"Artificial intelligence may ultimately prove a helpful tool to assist pro se litigants in bringing meritorious cases to the courts. In that way, artificial intelligence has the potential to contribute to the cause of justice. However, accessing any beneficial use of artificial intelligence requires carefully understanding its limitations. For example, if merely asked to write an opposition to an opposing party’s motion or brief, or to respond to a court order, an artificial intelligence program is likely to generate such a response, regardless of whether the response actually has an arguable basis in the law. Where the court or opposing party was correct on the law, the program will very likely generate a response or brief that includes a false statement of the law. And because artificial intelligence synthesizes many sources with varying degrees of trustworthiness, reliance on artificial intelligence without independent verification renders litigants unable to represent to the Court that the information in their filings is truthful."

Monfared v. Minister of Citizenship and Immigration IRB (Canada) 20 June 2025 Lawyer ChatGPT
Fabricated Case Law (1)
Misrepresented Exhibits or Submissions (1)
J.R.V. v. N.L.V. SC British Columbia (Canada) 19 June 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Costs to the claimant in the amount of CAD 200

In J.R.V. v. N.L.V., the respondent, appearing in person, used a generative AI tool to prepare parts of her written argument. This resulted in citations to non-existent cases, known as 'hallucinations.' The claimant sought costs for the effort of researching and responding to these false citations. The court acknowledged the issue but noted that the respondent was not represented by counsel and was unaware of the AI's capacity to generate false citations. Moreover, the claimant was himself wrong about the alleged non-existence of some citations. The court ordered the respondent to pay CAD 200 in costs to the claimant.

UB v Secretary of State for the Home Department Upper Tribunal (Immigration and Asylum Chamber) (UK) 18 June 2025 Lawyer Unidentified
Fabricated Case Law (1)
Misrepresented Case Law (1)
In re Marriage of Isom and Kareem Illinois CA (USA) 16 June 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Case Law (1)
The appeal was denied, and the trial court's decision was affirmed.