AI Hallucination Cases

This database tracks legal decisions[1] in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. It does not track the (necessarily wider) universe of all fake citations or uses of AI in court filings.

[1] I.e., all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. Notably, this does not cover mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material.

While seeking to be exhaustive (35 cases identified so far), it is a work in progress and will expand as new examples emerge. This database has been featured in news media, and indeed in several decisions dealing with hallucinated material.[2]

[2] Examples of media coverage include:
- M. Hiltzik, AI 'hallucinations' are a growing problem for the legal profession (LA Times, 22 May 2025)
- E. Volokh, "AI Hallucination Cases," from Courts All Over the World (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "Il génère des plaidoiries par IA, et en recense 160 ayant « halluciné » depuis 2023" ["He generates pleadings with AI, and has catalogued 160 that have 'hallucinated' since 2023"] (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you know of a case that should be included, feel free to contact me.[3]

[3] Readers may also be interested in this project regarding AI use in academic papers.

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.


Case | Court / Jurisdiction | Date | Party Using AI | AI Tool | Nature of Hallucination | Outcome / Sanction | Monetary Penalty | Details
X.L. v. Z.L. et al Ontario SCJ (Canada) 16 October 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Misrepresented Case Law (6)
No reliance on authorities submitted; AI use to be a factor in costs submissions
Ren v. Area 09 BCPAAB (Canada) 7 October 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Misrepresented Doctrinal Work (1)
Breach of Board's Code of Conduct
Specter Aviation Limited v. Laprade CS Québec (Canada) 1 October 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Monetary sanction for procedural misconduct 5000 CAD

Mr. Laprade filed a contestation containing multiple citations to non-existent authorities generated with the assistance of artificial intelligence. The Court found these to be fabricated (so-called "hallucinated") citations, constituting a significant breach (manquement important) in the conduct of the proceeding under art. 342 C.p.c., and imposed a $5,000 sanction.

Reddy v Saroya CA Alberta (Canada) 26 September 2025 Lawyer Implied
Fabricated Case Law (1)

The appellant's original factum contained references to seven cases that could not be located (six of them purportedly decisions of that Court). The respondent flagged the issue; the appellant's counsel ultimately acknowledged that the factum had been drafted by a contractor and that a large language model may have been used. The Court allowed an amended factum and reserved costs, warning that use of an LLM without verification may attract costs consequences, contempt proceedings, or referral to the Law Society.

Stile Carpentry Ltd. v. 2004424 Ontario CA Ontario (Canada) 23 September 2025 Pro Se Litigant Implied
Fabricated Case Law (1), Exhibits or Submissions (1), Legal Norm (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Régie du bâtiment du Québec c. 9308-2469 Québec inc. Régie du bâtiment du Québec (Canada) 11 September 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (1), Legal Norm (1)
Disregarded AI-generated arguments
Re X Corp. BC Civil Resolution Tribunal (Canada) 4 September 2025 Pro Se Litigant Unidentified
False Quotes Case Law (1)
Misrepresented Case Law (1)
Claim for compensation dismissed due to false and misleading AI-assisted submissions
Source: Steve Finlay
Lockwood v. ICBC BC Civil Resolution Tribunal (Canada) 3 September 2025 Pro Se Litigant Implied
Fabricated Legal Norm (2)
Argument ignored
Alana Kotler v Ontario Secondary School Teachers’ Federation; Toronto District School Board (Intervenor) Ontario Labour Relations Board (ON LRB) (Canada) 29 August 2025 Pro Se Litigant Implied
Fabricated Case Law (14)
Misrepresented Case Law (1)
Application dismissed for failing to make out a prima facie violation; Board will not consider cases that do not exist or cannot be located.
The applicant relied on numerous case citations that the Board and OSSTF could not locate as cited; one located decision did not support the proposition relied upon. Applicant acknowledged possible citation errors and was asked to provide copies but objected. The Board refused to rely on unlocatable authorities and dismissed the application.
Myers v. Tarion Warranty Corporation Ontario (Canada) 28 August 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
False Quotes Case Law (1)
Misrepresented Case Law (1)
Arguments ignored
Maxwell v. WestJet Airlines Ltd. Civil Resolution Tribunal (Canada) 15 August 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (1)
Misrepresented Exhibits or Submissions (1)
Outdated Advice Repealed Law (1)
Argument given no weight
Source: Steve Finlay
Musselman v. Vanderstelt CA British Columbia (Canada) 8 August 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Weighed in deciding to grant security for trial costs
Halton (Regional Municipality) v. Rewa et al. Ontario SCJ (Canada) 1 August 2025 Pro Se Litigant Unidentified
Fabricated Case Law (3)
Misrepresented Case Law (1)
Motion adjourned, opposing party's costs to be compensated 1 CAD
Pennytech Inc v Superior Building Group Limited Ontario Landlord and Tenant Board (Canada) 21 July 2025 Lawyer Implied
Fabricated Case Law (5), Legal Norm (1)
Misrepresented Exhibits or Submissions (1)
Warning
Moradi v. British Columbia (Human Rights Tribunal) Supreme Court of British Columbia (Canada) 18 July 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (3)
Misconduct taken into account in allocating costs
Blaser v. Campbell Civil Resolution Tribunal (Canada) 15 July 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Warning
Source: Steve Finlay
Lloyd’s Register Canada v. Munchang Choi Federal Court of Canada (Canada) 10 July 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Misrepresented Case Law (1)
Motion Record removed from Court file; costs awarded to Applicant 500 CAD

The Respondent, a self-represented litigant, used generative AI tools for drafting and preliminary research, leading to the citation of a non-existent case, 'Fontaine v Canada, 2004 FC 1777', in his Motion Record. The Court found this to be a fabricated citation, and the (allegedly) intended citation pointed to an irrelevant case.

The court further pointed out that the Respondent had already been caught fabricating citations in a previous proceeding. Despite acknowledging his use of AI, the Respondent had also failed to provide the declaration on this point required by the AI Practice Direction. The Court ordered the removal of the Motion Record from the file and awarded costs of $500 to the Applicant.

NCR v KKB Court of King’s Bench of Alberta (Canada) 9 July 2025 Pro Se Litigant Implied
Misrepresented Case Law (1)
The court disregarded the fabricated citations and did not impose costs on the self-represented litigant.
AQ v. BW Civil Resolution Tribunal (Canada) 4 July 2025 Pro Se Litigant Implied
False Quotes Legal Norm (1)
Monetary Sanction 1000 CAD

In AQ v. BW, the applicant AQ claimed damages for the non-consensual sharing of an intimate image by the respondent BW. Both parties were self-represented. The tribunal found that BW had shared an intimate image of AQ without consent, violating the Intimate Images Protection Act (IIPA). BW attempted to defend their actions by citing a fabricated version of CRTA section 92, which was identified as a hallucination likely generated by artificial intelligence. The tribunal member held:

"16. I have considered my obligation to give sufficient reasons. I do not consider that obligation to include responding to arguments concocted by artificial intelligence that have no basis in law. I accept that artificial intelligence can be a useful tool to help people find the right language to present their arguments, if used properly. However, people who blindly use artificial intelligence often end up bombarding the CRT with endless legal arguments. They cannot reasonably expect the CRT to address them all. So, while I have reviewed all the parties’ materials and considered all their arguments, I have decided against addressing many of the issues they raise. If I do not address a particular argument in this decision, it is because the argument lacks any merit, is about something plainly irrelevant, or both."

The tribunal dismissed BW's defenses as baseless and awarded AQ $5,000 in damages and an additional $1,000 for time spent due to BW's submission of irrelevant evidence. The tribunal emphasized that arguments concocted by AI without legal basis would not be addressed.

Source: Steve Finlay
LaPointe v. Chief Animal Welfare Inspector Ontario's Animal Care Review Board (Canada) 30 June 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (1), Exhibits or Submissions (1)
Misrepresented Case Law (1)
AI-generated materials excluded from evidence
Hussein v. Canada Ottawa (Canada) 24 June 2025 Lawyer Visto.Ai
Fabricated Case Law (1)
Monetary Sanction 100 CAD

In the original order, the court held:

"[38] Applicants’ counsel provided further correspondence advising, for the first time, of his reliance on Visto.ai described as a professional legal research platform designed specifically for Canadian immigration and refugee law practitioners. He also indicated that he did not independently verify the citations as they were understood to reflect well established and widely accepted principles of law. In other words, the undeclared and unverified artificial intelligence had no impact, and the substantive legal argument was unaffected and supported by other cases.

[39] I do not accept that this is permissible. The use of generative artificial intelligence is increasingly common and a perfectly valid tool for counsel to use; however, in this Court, its use must be declared and as a matter of both practice, good sense and professionalism, its output must be verified by a human. The Court cannot be expected to spend time hunting for cases which do not exist or considering erroneous propositions of law.

[40] In fact, the two case hallucinations were not the full extent of the failure of the artificial intelligence product used. It also hallucinated the proper test for the admission on judicial review of evidence not before the decision-maker and cited, as authority, a case which had no bearing on the issue at all. To be clear, this was not a situation of a stray case with a variation of the established test but, rather, an approach similar to the test for new evidence on appeal. As noted above, the case relied upon in support of the wrong test (Cepeda-Gutierrez) has nothing to do with the issue. I note in passing that the case comprises 29 paragraphs and would take only a few minutes to review.

[41] In addition, counsel’s reliance on artificial intelligence was not revealed until after the issuance of four Directions. I find that this amounts to an attempt to mislead the Court and to conceal the reliance by describing the hallucinated authorities as “mis-cited”. Had the initial request for a Book of Authorities resulted in the explanation in the last letter, I may have been more sympathetic. As matters stand, I am concerned that counsel does not recognize the seriousness of the issue."

In the final order, the court added:

"While the use of generative AI is not the responsibility of the responding party, it was not appropriate for the Respondent to not make any response to the Court’s four directions and Order. Indeed, assuming that the Respondent noticed the hallucinated cases on receipt of the written argument, it should have brought this to the attention of the Court.

[...]

Given that Applicant’s counsel was not remunerated for his services in the file, which included the motion on which the offending factum was filed and a motion for a stay of removal and, in addition, that I am also of the view that the Respondent’s lack of action exacerbated matters and it should not benefit as a result, I am ordering a modest amount of $100 to be payable by Applicant’s counsel personally."

J.R.V. v. N.L.V. Supreme Court of British Columbia (Canada) 19 June 2025 Pro Se Litigant Unidentified
Fabricated Case Law (1)
Costs to the claimant in the amount of $200. 200 CAD

In J.R.V. v. N.L.V., the respondent, appearing in person, used a generative AI tool to prepare parts of her written argument. This resulted in the inclusion of citations to non-existent cases, known as 'hallucinations'. The claimant sought costs due to the need to research and respond to these false citations. The court acknowledged the issue but noted that the respondent was not represented by counsel and was unaware that AI tools can generate false citations. Moreover, the claimant was in turn mistaken as to the supposed non-existence of some of the citations. The court ordered the respondent to pay $200 in costs to the claimant.

Attorney General v. $32,000 in Canadian Currency Ontario SCJ (Canada) 16 June 2025 Pro Se Litigant Implied
Fabricated Case Law (2)
Warning

"[49] Mr. Ohenhen submitted a statement of legal argument to the court in support of his arguments. In those documents, he referred to at least two non-existent or fake precedent court cases, one ostensibly from the Court of Appeal for Ontario and another ostensibly from the British Columbia Court of Appeal. In reviewing his materials after argument, I tried to access these cases and was unable to find them. I asked the parties to provide them to me.

[50] Mr. Ohenhen responded with a “clarification”, providing different citations to different cases. I asked for an explanation as to where the original citations came from, and specifically, whether they were generated by artificial intelligence. I have received no response to that query.

[51] While Mr. Ohenhen is not a lawyer with articulated professional responsibilities to the court, every person who submits authorities to the court has an obligation to ensure that those authorities exist. Simple CanLII searches would have revealed to Mr. Ohenhen that these were fictitious citations. Putting fictitious citations before the court misleads the court. It is unacceptable. Whether the cases are put forward by a lawyer or self-represented party, the adverse effect on the administration of justice is the same.

[52] Mr. Ohenhen’s failure to provide a direct and forthright answer to the court’s questions is equally concerning.

[53] Court processes are not voluntary suggestions, to be complied with if convenient or helpful to one’s case. The proper administration of justice requires parties to respect the rules and proceed in a forthright manner. That has not happened here.

[54] I have not attached any consequences to this conduct in this case. However, should such conduct be repeated in any court proceedings, Mr. Ohenhen should expect consequences. Other self-represented litigants should be aware that serious consequences from such conduct may well flow."

Zahariev v. Zaharieva Supreme Court of British Columbia (Canada) 9 June 2025 Pro Se Litigant Implied
Fabricated Case Law (3)
Misrepresented Case Law (1)
Ko v. Li Ontario SCJ (Canada) 28 May 2025 Lawyer ChatGPT
Fabricated Case Law (3)
Plaintiff’s application dismissed; no costs imposed; court warns against future use of generative AI without verification

(The court issued a separate order to show cause.)

Hallucination Details

The applicant’s factum included citations to:

  • Alam v. Shah, which linked to an unrelated case (Gatoto v. 5GC Inc.)
  • DaCosta v. DaCosta, which returned a 404 error
  • Johnson v. Lanka, cited for a proposition directly contradicted by the decision
  • Meschino Estate v. Meschino, which actually linked to a wrongful dismissal case (Antonacci v. Great Atlantic & Pacific Co.)

The judge noted these citations bore “hallmarks of an AI response” and described the conduct as possibly involving “hallucinations” from generative AI. The court ordered counsel to appear to explain whether she knowingly relied on AI and failed to verify the content. No clarification or correction was received from counsel after the hearing.

Ruling/Sanction

At the end of the show cause proceedings, Justice Myers noted that, due to the media reports about this case, the goals of any further contempt proceedings were already met, including: "maintaining the dignity of the court and the fairness of civil justice system, promoting honourable behaviour by counsel before the court, denouncing serious misconduct, deterring similar future misconduct by the legal profession, the public generally, and by Ms. Lee specifically, and rehabilitation".

The judge therefore declined to impose a fine or to continue the contempt proceedings, on the condition that counsel undertake Continuing Professional Development courses (as she said she would) and not bill her client for the related work (made easier by the fact that she had so far been acting pro bono).

R. v. Chand Ontario (Canada) 26 May 2025 Lawyer Implied
Misrepresented Case Law (1)
Warning and Directions for Remainder of case
Lozano González v. Roberge Housing Administrative Tribunal (Canada) 1 May 2025 Pro Se Litigant ChatGPT
False Quotes Legal Norm (1)

The landlord sought to repossess a rental property, claiming the lease renewal was suspended, based on a misinterpretation of articles of Quebec's Civil Code. He had used ChatGPT to translate these articles, which produced a translation with a completely different meaning. The Tribunal found the repossession request invalid as it was based on a date prior to the lease's end. The Tribunal rejected the claim of abuse, accepting the landlord's sincere belief in his misinterpretation, influenced by the AI translation, and noting his language barrier and residence in Mexico. The Tribunal advised the landlord to seek reliable legal advice in the future.

Simpson v. Hung Long Enterprises Inc. B.C. Civil Resolution Tribunal (Canada) 25 April 2025 Pro Se Litigant Unidentified
Fabricated Case Law (4)
Misrepresented Legal Norm (1)
Other side compensated for time spent through costs order (500 CAD)

"Ms. Simpson referred to a non-existent CRT case to support a patently incorrect legal position. She also referred to three Supreme Court of Canada cases that do not exist. Her submissions go on to explain in detail what legal principles those non-existent cases stand for. Despite these deficiencies, the submissions are written in a convincingly legal tone. Simply put, they read like a lawyer wrote them even though the underlying legal analysis is often wrong. These are all common features of submissions generated by artificial intelligence." [...]

"25. I agree with Hung Long that there are two extraordinary circumstances here that justify compensation for its time. The first is Ms. Simpson’s use of artificial intelligence. It takes little time to have a large language model create lengthy submissions with many case citations. It takes considerably more effort for the other party to wade through those submissions to determine which cases are real, and for those that are, whether they actually say what Ms. Simpson purported they did. Hung Long’s owner clearly struggled to understand Ms. Simpson’s submissions, and his legal research to try to understand them was an utter waste of his time. I reiterate my point above that Ms. Simpson’s submissions cited a non-existent case in support of a legal position that is the precise opposite of the existing law. This underscores the impact on Hung Long. How can a self-represented party respond to a seemingly convincing legal argument that is based on a case it is impossible to find?

26. I am mindful that Ms. Simpson is not a lawyer and that legal research is challenging. That said, she is responsible for the information she provides the CRT. I find it manifestly unfair that the burden of Ms. Simpson’s use of artificial intelligence should fall to Hung Long’s owner, who tried his best to understand submissions that were not capable of being understood. While I accept that Ms. Simpson did not knowingly provide fake cases or misleading submissions, she was reckless about their accuracy."

SQBox Solutions Ltd. v. Oak BC Civil Resolution Tribunal (Canada) 31 March 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
False Quotes Legal Norm (2)
Misrepresented Case Law (4)
Litigant lost on merits

"By relying on inaccurate and false AI submissions, Mr. Oak hurts his own case. I understand that Mr. Oak himself might not be aware that the submissions are misleading, but they are his submissions and he is responsible for them."

Source: Steve Finlay
AQ v. BT CRT (Canada) 28 March 2025 Pro Se Litigant Implied
Fabricated Case Law (2), Legal Norm (1)
Misrepresented Case Law (1), Legal Norm (1)
Arguments ignored
Geismayr v. The Owners, Strata Plan KAS 1970 Civil Resolution Tribunal (Canada) 14 February 2025 Pro Se Litigant Copilot
Fabricated Case Law (9)
Misrepresented Case Law (1)
Citations ignored
Duarte v. City of Richmond British Columbia Human Rights Tribunal (Canada) 18 December 2024 Pro Se Litigant Implied
Fabricated Case Law (1)
Warning

Nathan Duarte, a pro se litigant, filed a complaint against the City of Richmond alleging discrimination based on political beliefs. During the proceedings, Duarte cited three cases to support his claim that union affiliation is a protected characteristic. However, neither the City nor the Tribunal could locate these cases, leading to the suspicion that they were fabricated, possibly by a generative AI tool. The court held:

"While it is not necessary for me to determine if Mr. Duarte intended to mislead the Tribunal, I cannot rely on these “authorities” he cites in his submission. At the very least, Mr. Duarte has not followed the Tribunal’s Practice Direction for Legal Authorities, which requires parties, if possible, to provide a neutral citation so other participants can access a copy of the authority without cost. Still, I am compelled to issue a caution to parties who engage the assistance of generative AI technology while preparing submissions to the Tribunal, in case that is what occurred here. AI tools may have benefits. However, such applications have been known to create information, including case law, which is not derived from real or legitimate sources. It is therefore incumbent on those using AI tools to critically assess the information that it produces, including verifying the case citations for accuracy using legitimate sources. Failure to do so can have serious consequences. For lawyers, such errors have led to disciplinary action by the Law Society: see for example, Zhang v Chen, 2024 BCSC 285. Deliberate attempts to mislead the Tribunal, or even careless submission of fabricated information, could also form the basis for an award of costs under s. 37(4) of the Code. The integrity of the Tribunal’s process, and the justice system more broadly, requires parties to exercise diligence in ensuring that their engagement with artificial intelligence does not supersede their own judgement and credibility."

Monster Energy Company v. Pacific Smoke International Inc. Canadian Intellectual Property Office (Canada) 20 November 2024 Lawyer
Fabricated Case Law (1)
The fabricated citation was disregarded by the court.

In a trademark opposition case between Monster Energy Company and Pacific Smoke International Inc., the Applicant, Pacific Smoke, cited a non-existent case, 'Hennes & Mauritz AB v M & S Meat Shops Inc, 2012 TMOB 7', in support of its argument. This was identified as an AI hallucination by the court. The court disregarded this citation and reminded the Applicant of the seriousness of relying on false citations, whether accidental or AI-generated.

Industria de Diseño Textil, S.A. v. Sara Ghassai Canadian Intellectual Property Office (Canada) 12 August 2024 Lawyer Implied
Fabricated Case Law (1)
Warning
Zhang v. Chen BC Supreme Court (Canada) 20 February 2024 Lawyer ChatGPT
Fabricated Case Law (2)
Claimant awarded costs

"[29] Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice."