The use of AI in legal proceedings: the Grand Court weighs in

More recently, in Bradley v Frye-Chaikin, the Grand Court had cause to consider the use of AI tools to produce legal submissions. In doing so, it reviewed:
- The US decision of Mata v Avianca, Inc, in which Judge Kevin Castel had sanctioned two attorneys for submitting a brief drafted by ChatGPT that had referred to non-existent case law authorities.
- The subsequent English decision of Harber v HMRC, in which Judge Redston, perhaps unsurprisingly, criticised the applicant for submitting a document that referred to nine purported authorities that did not exist.
The Honourable Justice Asif KC held that there is “nothing inherently wrong” with using technology to improve the efficiency of proceedings. However, insofar as the Cayman Islands is concerned, he echoed the sentiments of Judge Redston, emphasising that reliance on AI without taking the proper precautionary steps to ensure accuracy could cause a great deal of harm, including:
- wasting the Court’s and the opponent’s time;
- wasting public funds and causing the opponent to incur unnecessary costs;
- delaying the determination of other cases;
- failing to put forward other, correct lines of argument;
- tarnishing the reputation of judges to whom non-existent judgments are attributed; and
- impacting the reputation of the Courts and the legal profession more generally.
The Judge accordingly warned: “As the use of AI tools in the conduct of litigation increases, it is vital that all counsel involved in the conduct of cases before the courts are alive to the risk that material generated by AI may include errors and hallucinations. Attorneys who rely on such material must check it carefully before presenting it to the Court.”
Perhaps more notably, the Judge also held that opponents “should be astute to challenge material that appears to be erroneous, as was the case here. As officers of the Court, in my view, an attorney’s duty to assist the Court includes the duty to point out when their opponent is at risk of misleading the Court, by reference to non-existent law or authorities.”
This decision is a welcome reminder of the Court’s commitment to embracing technology to enhance service delivery whilst maintaining public trust and confidence. In the words of Sir Geoffrey, “We will adopt and utilise AI for the things it does well, but we are a long way from being made redundant”.
This blog post was also written by Diana DeMercado, a member of our articled clerk programme.