ChatGPT in the Courtroom: Mishap Reinforces Need for AI Fact Checking 


Robert Mata’s lawsuit against Avianca, a Colombian airline, has made headlines over the last week as Mata v. Avianca quickly became ChatGPT v. Avianca. The lawsuit has caught the attention of news outlets, not for the substance of the case but for a lawyer’s failed use of ChatGPT (to learn more about how Caravel leverages AI, sign up for our upcoming webinar here). 

Lawyers at Levidow, Levidow & Oberman PC, a respected New York City law firm, filed a lawsuit for their client, Robert Mata, against Avianca. The lawsuit was sparked by an incident on an Avianca flight in which Mata was allegedly injured by a metal serving cart that struck his knee. However, the story really begins when Mata's legal counsel, or rather, ChatGPT, compiled the case law that was presented to the court in support of his claim. 

The plaintiff’s brief cited cases including Miller v. United Airlines, Petersen v. Iran Air and Varghese v. China Southern Airlines, all of which appeared to be legitimate decisions relevant to Mata’s case. Trouble came, however, when the defense, following standard practice, investigated the cited cases and came up blank. Having done the due diligence that the opposing counsel had not, they began to question whether the cited cases existed at all. 

In response to this challenge, the plaintiff’s lawyers opted to return to their source, asking ChatGPT whether the cases presented in the initial brief were legitimate. It answered yes. They then asked it to provide excerpts of the cases, which it did. The material ChatGPT produced was then submitted to the court, which had also tried and failed to find these cases.  

It was not until the court formally requested an explanation from the plaintiff’s legal team that they admitted to their use of ChatGPT. This admission confirmed that the cases had been fabricated, as the defense had suggested. Notably, they were fabricated by ChatGPT rather than by the lawyers themselves, though the lawyers failed to fact-check the information the AI provided. The revelation has since prompted the judge to schedule a hearing to discuss the lawyers’ actions and potential sanctions.  

So, how did ChatGPT get things so wrong? GPT is not foolproof and experiences hallucinations. Hallucinations are instances in which GPT, and generative AI models like it, make up events, facts or sources. These hallucinations are neither predictable nor easy to spot, which is why it is of the utmost importance that GPT and other generative AI outputs are verified for accuracy.  

See our Director of Legal Innovation, Monica Goyal, discuss hallucinations and AI tools more broadly for Canadian Lawyer Magazine here. 

The story has raised critical ethical questions around the use of AI in law and beyond. Are lawyers who use generative AI, such as ChatGPT, fully responsible for its potential errors? Should they be using ChatGPT at all? How do we determine when it’s appropriate to source answers from ChatGPT? These questions and others like them have loomed over this cautionary tale, but as a firm that continues to explore innovative technologies, we have seen firsthand that generative AI can be useful to lawyers. Like many legal tools, it needs to be used strategically and responsibly to reap the benefits. 

Since the end of 2022, a group of Caravel’s lawyers have had access to Harvey, an AI platform built on OpenAI’s models specifically for legal work. As the legal professionals at our firm continue to help refine Harvey’s services, they share AI best practices with the rest of our team and community. Tools such as Harvey or ChatGPT come with a few standard rules which legal professionals should always follow.  

  • The prompts provided to the AI tool should be as specific as possible.
  • They should only ever be used for a first draft of legal documents; those drafts should then be reviewed and edited by a lawyer.
  • Their outputs should be fact-checked diligently by a person, not by AI.
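The fact-checking rule, in particular, is the one the Avianca lawyers skipped: every citation an AI produces should be confirmed against a trusted, independent source before it reaches a court. As a minimal sketch of that idea (the citation list and the "verified database" below are hypothetical stand-ins for a real legal research service, not an actual case-law lookup):

```python
# Hypothetical sketch: flag AI-generated citations that cannot be confirmed
# against an independent, trusted source. In practice the "database" would be
# a real legal research service, not a hard-coded set.

VERIFIED_CASES = {
    # Air France v. Saks is a real, verifiable Supreme Court decision.
    "Air France v. Saks, 470 U.S. 392 (1985)",
}

def unverified_citations(citations):
    """Return the citations that could not be confirmed in the verified source."""
    return [c for c in citations if c not in VERIFIED_CASES]

brief_citations = [
    "Air France v. Saks, 470 U.S. 392 (1985)",
    "Varghese v. China Southern Airlines",  # one of the fabricated cases
]

# Anything this returns needs a human to track down the actual decision
# before the brief is filed.
print(unverified_citations(brief_citations))
```

The point of the sketch is the direction of verification: the AI’s output is checked against the authoritative source, never the other way around, and never by asking the AI itself, as the lawyers in Mata v. Avianca did.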

When these precautions are taken, generative AI can be useful in legal work, reducing billable time and keeping client work moving.  

If you’re interested in hearing Monica Goyal speak more on AI in law, Caravel will be hosting a free webinar on June 20th. You can register for the event here! 

