Peter Kendall
Partner
Cayman Islands
Feb 10, 2025
Key takeaways
In his judgment, Asif J echoed the sentiments of Harber v HMRC [2023] UKFTT 1007 (TC), in which the English tribunal stressed that reliance on fictitious AI-generated authorities could cause a great deal of harm to the reputation and the efficient running of the judicial system. That judgment, in turn, referred to the well-known New York case of Mata v Avianca, Inc. (2023), in which Judge Kevin Castel sanctioned two attorneys for submitting a brief drafted by ChatGPT that had referred to non-existent case law authorities.
Whilst the Court confirmed that there is nothing inherently wrong with using technology (including AI tools) to make the conduct of legal disputes more efficient and their resolution speedier, Asif J stated that: "it is vital that anyone who uses such an AI tool verifies that the material generated is correct and ensures that it does not contain hallucinations". In other words, users must verify that any statutes, procedural rules and case law authorities referred to actually exist and say what they are asserted to say, and that principles of law are accurately stated.
Asif J commented that anyone using AI tools must take personal responsibility for the accuracy of the material produced and be prepared to face personal consequences (which could include wasted costs orders) if the work product they put forward to the Court is not accurate. This is because failing to take such obvious precautions gives rise to many harms, including:
• wasting the time of the opponents and the Court;
• wasting public funds and causing the opponent to incur unnecessary costs;
• delaying the determination of other cases;
• failing to put forward other correct lines of argument;
• tarnishing the reputation of judges to whom non-existent judgments are attributed; and
• impacting the reputation of the Courts and legal profession more generally.
As the use of AI tools in the conduct of litigation increases, Asif J stated that it is vital that all counsel involved in the conduct of cases before the Court are alive to the risk that material generated by AI may include errors and hallucinations. With respect to the obligations of attorneys, Asif J cautioned that:
"Attorneys who rely on such material must check it carefully before presenting it to the court. But equally, opponents should be astute to challenge material that appears to be erroneous, as was the case here. As officers of the Court, in my view, an attorney’s duty to assist the Court includes the duty to point out when their opponent is at risk of misleading the Court, including by reference to non-existent law or cases."