Clients at Risk: AI Landing Litigants and Legal Counsel in Hot Water

Artificial intelligence (“AI”) is no longer knocking at the door of the legal profession — it’s in the courtroom and it is letting you down.

AI tools are becoming increasingly accessible, and some clients and lawyers are turning to them to save time and legal fees. While well-intentioned, this DIY approach is increasingly backfiring. Many of these AI-generated submissions are riddled with inaccuracies, irrelevant or fabricated case citations, and poor legal reasoning. Rather than streamlining the process, lawyers are now spending additional time untangling misinformation, correcting errors, and redoing work entirely. Ironically, what begins as a cost-saving shortcut often ends up inflating legal fees and undermining the client’s position in court.

In a recent case, Wikely v Kea Investments Ltd [2024] NZCA 609, the New Zealand Court of Appeal had to address a legal memorandum that relied on a phantom precedent produced by AI. Its non-existence came to light, distressingly, in the courtroom, leaving the litigant without authority for his argument. But this is not just a local anomaly; the problem is growing abroad. In one United Kingdom case, the court discovered that legal counsel had relied on generative AI tools to produce their legal submissions, which cited as many as 18 non-existent cases. Closer to home, an Australian immigration lawyer used ChatGPT to prepare legal submissions complete with quoted judgments that never existed.

Though AI offers speed and convenience, its use in legal practice can come at a high cost, and in some instances that cost may be your case. AI lacks the human intuition that legal professionals develop over years of experience. These tools search through sources on the internet to generate an answer and are evidently unable to differentiate between real legal cases and those that do not exist. AI also lacks the nuance required to navigate complex issues, and this shortcoming is landing both legal counsel and self-represented litigants in hot water.

Lawyers and judges are seeing a rise in clients and litigants turning to AI tools like ChatGPT for quick and affordable legal guidance; however, they are often doing so without considering the privacy risks. AI is not governed by the same confidentiality rules that bind licensed legal professionals. Unlike AI, lawyers in New Zealand and abroad are bound by the Conduct and Client Care Rules, which provide that lawyers must abide by confidentiality requirements, ensuring that lawyer-client privilege is maintained.

If you are dealing with a legal issue, or have a question that you would like clarity on, then give us a call and we will arrange an initial consultation with one of our team.

Please contact us at admin@harristate.co.nz, or call our reception at 07 578 0059 to see how we can assist you.

Disclaimer: This article is general in nature and should not be treated as professional advice. It is recommended that you consult your advisor. No liability is assumed by Harris Tate Limited for any losses suffered by any person relying directly or indirectly upon the article above.
