May 15, 2026

By Alric Lindsay

In a Cayman Islands Court of Appeal judgment dated September 18, 2025, it was noted that Samuel Johnson, the appellant, used generative AI to produce the skeleton arguments in which he contended that he did not need to provide a Cayman Islands address in court documents, as required by the Grand Court Rules (GCR) O6 r5.  The Court of Appeal rejected Johnson’s argument and warned him that contempt of court proceedings or a criminal investigation could have been commenced against him for using AI to fabricate his skeleton argument with citations to non-existent cases.

The background of the case is that Johnson wished to pursue proceedings against the Cayman Islands Health Services Authority (CIHSA).

He prepared what he described as an ex parte originating summons and a statement of claim, which he wished to serve on the CIHSA and which gave his address for service as being in Greater Georgetown, Guyana.

The Registry drew his attention to the Grand Court Rules (GCR) O6 r5, which requires him to provide the address of a place within the jurisdiction of the Cayman Islands “at or to which documents for the plaintiff may be delivered or sent”.  Johnson refused to provide an address in compliance with O6 r5.

Concerning this, the judgment said:

He has not explained in any detail why he is not willing/able to make arrangements to provide a physical address for the delivery of documents within the Cayman Islands. His argument appears to be that the requirement is unnecessary, certainly at this stage. He claims that he has no local contacts and cannot reasonably be expected to retain a local agent solely for formal service. He also has concerns about the integrity of any agent whom he might appoint. He has sought to nominate the physical address of the court but the Registry has (rightly) declined to accept the nomination.

It is clear that the purpose of the rule in GCR O6 r5 is to secure a fixed physical address in the Cayman Islands at which any plaintiff can be contacted and served with documents throughout any litigation without any dispute as to the service and without any complication in effecting service through post or electronic communication. This is highly desirable as a point of certainty in an increasingly mobile and ephemeral world. Fixing the physical address of the plaintiff is an advantage to the defendant and to the court and provides certainty of access to the plaintiff in the Cayman Islands throughout any court process.

The judgment concluded:

We regard the requirement for a physical address in the Cayman Islands as an important procedural and practical protection for those persons who have to defend a private law claim brought against them in the Cayman Islands as well as one that is essential for the court to retain fixed and certain mechanisms for communication with the plaintiff.

No sufficient evidence has been produced on this issue, no doubt because Mr Johnson’s position is based on principle rather than evidence. It follows that we do not consider that at present either the overriding objective or the powers of the court under GCR O2 r1 would require the exercise of any case management discretion in favour of Mr Johnson.

Regarding the use of AI, the judgment said:

One final matter needs to be stated. Mr Johnson cited two cases in his skeleton argument dated 28 July 2025; Brown v Board of Governors (1999) JLR 217 (Jamaica CA) and Cukurova Finance International Ltd v Alfa Telecom Turkey Ltd [2012] CICA (Cayman Islands Court of Appeal). He was asked to provide copies of these cases because the court were unable to trace them but was unable to do so. He initially said that the “references were inserted with assistance as [he does] not have access to a proper legal database”. Shortly before the hearing Mr Johnson explained that he had used AI to generate his skeleton argument and confirmed that one of the cases might not exist. He asked us to disregard the references.

The judgment added:

Thus it is now clear that the references were generated by the unacknowledged use of generative AI. The cases are fabricated. In R (oao Ayinde) v London Borough of Haringey [2025] EWHC 1383 (Admin), the English Divisional Court explained the problems that may arise from the use of generative artificial ‘intelligence’ in preparing documents to be used in court at [6]-[9]:

“In the context of legal research, the risks of using artificial intelligence are now well known. Freely available generative artificial intelligence tools, trained on a large language model such as ChatGPT are not capable of conducting reliable legal research. Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect. The responses may make confident assertions that are simply untrue. They may cite sources that do not exist. They may purport to quote passages from a genuine source that do not appear in that source. Those who use artificial intelligence to conduct legal research notwithstanding these risks have a professional duty therefore to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work (to advise clients or before a court, for example). Authoritative sources include the Government’s database of legislation, the National Archives database of court judgments, the official Law Reports published by the Incorporated Council of Law Reporting for England and Wales and the databases of reputable legal publishers. This duty rests on lawyers who use artificial intelligence to conduct research themselves or rely on the work of others who have done so. This is no different from the responsibility of a lawyer who relies on the work of a trainee solicitor or a pupil barrister for example, or on information obtained from an internet search. We would go further however. There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused.”

The judgment continued:

In our judgment these duties apply with little modification to self-representing litigants. Litigants-in-person, even those less experienced than Mr Johnson, have a duty not to mislead the courts (see Barton v Wright Hassell LLP [2018] UKSC 12 at [18]). This duty includes a duty not to mislead by citing fabricated authorities. This is particularly the case where, as here, the self-representing person is appearing without notice and where a duty of candour to the court arises in any event. We consider Mr Johnson breached his duty to the court. The breach is more egregious because he did not immediately acknowledge his use of generative AI in preparing documents for the court. However, given his frank if belated acknowledgement that he had used AI and that some of the caselaw cited might not in reality exist, we do not consider it is necessary for the court to take any action against him.

The judgment concluded:

This judgment must though act as a warning of the risks of using artificial intelligence for legal research, and the potential consequences if a person puts fabricated citations before the court. These consequences may include initiating proceedings for contempt or referral for criminal investigation. There will almost always be costs consequences and the case advanced by the self-representing litigant may be liable to be stayed or struck out.

For the future, it must be made clear that all statements, submissions, skeletons or other documents from parties to court proceedings which use generative AI and that are to be submitted to the courts must identify the use of AI and the party concerned must take personal responsibility for checking that none of the references are fabricated.

Case reference:

CICA CIVIL APPEAL No. 5 of 2025 (formerly G 145 of 2024)

Neutral Citation Number: [2025] CICA (Civ) 15