AI Hallucination Cases

This database tracks legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments. More precisely, it covers all documents where the use of AI, whether established or merely alleged, is addressed in more than a passing reference by the court or tribunal. It does not track the (necessarily wider) universe of all fake citations or uses of AI in court filings.

Notably, the database does not cover mere allegations of hallucinations, but only cases where the court or tribunal has explicitly found (or implied) that a party relied on hallucinated content or material. As an exception, it also covers some judicial decisions where AI use was alleged but not confirmed; this is a judgment call on my part.

While seeking to be exhaustive (60 cases identified so far), it is a work in progress and will expand as new examples emerge. This database has been featured in news media, and indeed in several decisions dealing with hallucinated material. Examples of media coverage include:
- M. Hiltzik, "AI 'hallucinations' are a growing problem for the legal profession" (LA Times, 22 May 2025)
- E. Volokh, "AI Hallucination Cases," from Courts All Over the World (Volokh Conspiracy, 18 May 2025)
- J.-M. Manach, "He generates AI pleadings, and has catalogued 160 that have 'hallucinated' since 2023" (Next, 1 July 2025)
- J. Koebler & J. Roscoe, "18 Lawyers Caught Using AI Explain Why They Did It" (404 Media, 30 September 2025)

If you know of a case that should be included, feel free to contact me. (Readers may also be interested in this project regarding AI use in academic papers.)

Based on this database, I have developed an automated reference checker that also detects hallucinations: PelAIkan. Check the Reports column in the database for examples, and reach out to me for a demo!

For weekly takes on cases like these, and what they mean for legal practice, subscribe to Artificial Authority.


Case Court / Jurisdiction Date ▼ Party Using AI AI Tool Nature of Hallucination Outcome / Sanction Monetary Penalty Details Report(s)
QWYN and Commissioner of Taxation Administrative Review Tribunal of Australia (Australia) 5 February 2025 Lawyer Copilot
False Quotes Doctrinal Work (1)
The Tribunal affirmed the decision under review, rejecting the applicant's submissions based on the AI-generated content.

"The Applicant engaged the Copilot [Microsoft’s Artificial Intelligence product] in a range of probing questions pertaining to superannuation and taxation matters, upon which in part, it returned the following responses:

The Explanatory Memorandum to the Taxation Laws Amendment (Superannuation) Bill 1992, which introduced the new regime taxing superannuation benefits, states in paragraph 2.20 that “the Bill will provide a tax rebate of 15 per cent for disability superannuation pensions. This will apply to all disability pensions, irrespective of whether they are paid from a taxed or an untaxed source. The rebate recognises that disability pensions are paid as compensation for the loss of earning capacity and are not merely a form of retirement income.”

  1. I have examined the Explanatory Memorandum to the Taxation Laws Amendment (Superannuation) Bill 1992. I was unable to locate any paragraph in that document in the same or similar terms to the paragraph generated by Copilot. It did not contain a paragraph 2.20.
  2. It has been noted by others that AI bots are prone to hallucinations.[35] That appears to be what has happened here. It is my assessment that submitting unverified material generated by AI, is not consistent with a party’s duty to use their best endeavours to assist the Tribunal to achieve its statutory objectives. To expect the Tribunal to read and consider material which a party does not know is authentic impedes the Tribunal’s attempts to provide a mechanism of review that ensures that applications are resolved as quickly and with as little expense as a proper consideration of the issues permits.
  3. Nothing in the remainder of the applicant’s submissions altered my view that the untaxed element of the benefit should be taxed under Subdivision 301-B."
Valu v. Minister for Immigration and Multicultural Affairs (Australia) 31 January 2025 Lawyer ChatGPT
Fabricated Case Law (17)
False Quotes Exhibits or Submissions (8)
Referral to Legal Services Commissioner

AI Use

Counsel used ChatGPT to generate a summary of cases for a submission, which included fictitious Federal Court decisions and invented quotes from a Tribunal ruling. He inserted this output into the brief without verifying the sources. Counsel later admitted this under affidavit, citing time pressure, health issues, and unfamiliarity with AI's risks. He noted that guidance from the NSW Supreme Court was only published after the filing.

Hallucination Details

The 25 October 2024 submission cited at least 16 completely fabricated decisions (e.g. Murray v Luton [2001] FCA 1245, Bavinton v MIMA [2017] FCA 712) and included supposed excerpts from the AAT’s ruling that did not appear in the actual decision. The Court and Minister’s counsel were unable to verify any of the cited cases or quotes.

Ruling/Sanction

Judge Skaros ordered referral to the OLSC under the Legal Profession Uniform Law (NSW) 2014, noting breaches of rules 19.1 and 22.5 of the Australian Solicitors’ Conduct Rules. The Court accepted Counsel’s apology and health-related mitigation but found that the conduct fell short of professional standards and posed systemic risks given increasing AI use in legal practice.

Key Judicial Reasoning

While acknowledging that Counsel corrected the record and showed contrition, the Court found that the damage—including wasted judicial resources and delay to proceedings—had already occurred. The ex parte email submitting corrected materials, without notifying opposing counsel, further compounded the breach. Given the public interest in safeguarding the integrity of litigation amidst growing AI integration, referral to the OLSC was deemed necessary, even without naming Counsel in the judgment.

Hanna v Flinders University South Australia (Australia) 29 January 2025 Pro Se Litigant Implied
Fabricated Case Law (1)
Body by Michael Pty Ltd and Industry Innovation and Science Australia Administrative Review Tribunal (Australia) 24 January 2025 Pro Se Litigant ChatGPT
Fabricated Case Law (1)
False Quotes Doctrinal Work (1)
Misrepresented Legal Norm (4)
Fake references withdrawn before the hearing

"Nevertheless, due to that withdrawal being requested prior to the hearing, I have not considered those paragraphs, these reasons for decision do not take account of those paragraphs and I merely make some general comments below applicable to all parties that appear before the Tribunal.

The use of Chat GPT is problematic for the Tribunal. It perhaps goes without saying that it is not acceptable for a party to attempt to mislead the Tribunal by citing case law that is non-existent or citing legal conclusions that do not follow, whether that attempt is deliberate or otherwise. All parties should be aware that the Tribunal checks and considers all cases and conclusions referred to in both parties’ submissions in any event. This matter would have inevitably been discovered, and adverse inferences may have been drawn. To ensure no such adverse inferences are drawn, parties are encouraged to use publicly available databases to search for case law and not to seek to rely on artificial intelligence."

Candice Dias v Angle Auto Finance Fair Work Commission (Australia) 20 January 2025 Pro Se Litigant Implied
Fabricated Case Law (3)
Misrepresented Case Law (1)
Kaur v RMIT SC Victoria (CA) (Australia) 11 November 2024 Pro Se Litigant Implied
Fabricated Case Law (1)
In re Dayal (Australia) 27 August 2024 Lawyer LEAP
Fabricated Case Law (1)
Referral to the Victorian Legal Services Board and Commissioner for potential disciplinary review; no punitive order issued by the court itself; apology accepted.

Counsel admitted the list of authorities and accompanying summaries were generated by an AI research module embedded in his legal practice software. He stated he did not verify the content before submitting it. The judge found that neither Counsel nor any other legal practitioner at his firm had checked the validity of the generated output.

The court accepted Counsel’s unconditional apology, noted remedial steps, and acknowledged his cooperation and candour. However, it nonetheless referred the matter to the Office of the Victorian Legal Services Board and Commissioner under s 30 of the Legal Profession Uniform Law Application Act 2014 (Vic) for independent assessment. The referral was explicitly framed as non-punitive and in the public interest.

In September 2025, the Board sanctioned Counsel, barring him from acting as a principal lawyer or operating his own practice, and placing him under two years of supervision (see here).

Lakaev v McConkey Supreme Court of Tasmania (Australia) 12 July 2024 Pro Se Litigant Implied
Fabricated Case Law (1)
Misrepresented Case Law (1)
Appeal dismissed for want of prosecution

The appellant's submissions included a misleading reference to a High Court case, De L v Director-General, NSW Department of Community Services, misrepresenting it as concerning false testimony (which was not the case's subject matter), and a fabricated reference to Hewitt v Omari [2015] NSWCA 175, a case that does not exist. The appeal was dismissed in view of the lack of progress and the potential prejudice to the respondent.

Finch v The Heat Group Family Court (Australia) 27 February 2024 Pro Se Litigant Implied
Fabricated Case Law (2)
Misrepresented Case Law (1)

The Applicant (unrepresented) provided a list of 24 authorities claimed to show instances where MinterEllison had been restrained. The Court's associate and the judge found that the list contained fabricated or misdescribed citations; the judge characterised the provision of those authorities as an egregious instance of misleading the court but did not impose professional sanctions. The restraint application was dismissed on the merits.

Nash v. Director of Public Prosecutions Supreme Court of Western Australia - Court of Appeal (Australia) 8 May 2023 Pro Se Litigant Implied Fictitious authorities Appeal dismissed

"Mr Nash is unrepresented. He prepared the appellant's case himself, although it appears that he may have had some assistance with later submissions (including, perhaps, from an artificial intelligence program such as Chat GPT). Neither form of submission made coherent submissions as to why the trial judge's decision was affected by material error or otherwise gave rise to a miscarriage of justice. Nor did the material sought to be adduced by Mr Nash as additional evidence on the appeal disclose any miscarriage of justice.

[...]. There is otherwise no jurisdictional basis to transfer criminal proceedings under State law in this Court to the court of another State. The authorities cited by Mr Nash in support of such jurisdiction do not exist; they are fictitious."

The court dismissed the appeal, finding no merit in the grounds presented, and refused to admit additional evidence. No professional sanctions or monetary penalties were imposed as Nash was a pro se litigant.

Source: Jay Iyer