
Regulating artificial intelligence that thinks like a lawyer

28 Jul 2022

| Author: Diana Clement

Should bots providing legal services be regulated in the same way as lawyers who give regulated legal advice? The question is being pondered by members of the ADLS Technology & Law committee, and the issue of legal services driven by artificial intelligence (AI) will be a key part of a submission the committee is preparing for the independent panel reviewing the regulation and structure of the legal profession. It’s an important issue because AI is already seeping into day-to-day law in New Zealand, says committee member Arran Hunt.

The technology is already replacing juniors by stealth, though some lawyers are failing to see the writing on the wall. Hunt, a partner at Stace Hammond, cites an arbitration his firm was involved in where other lawyers complained that his bundle of documents was too thin. “We pointed out that they [had] the same document 10 times,” Hunt says. He knew this because a Stace Hammond junior had entered all the documentation for the arbitration into LawFlow, which was intelligent enough to identify duplicates, as well as carrying out other tasks such as tagging privileged information.

In this instance, LawFlow identified that the same document was attached to multiple emails that had gone back and forth between the parties. “[With LawFlow], we’re providing one document rather than the same document 10 times,” Hunt says. “Then you hit a button and it creates all the documentation required for court in the New Zealand court formatting standards.”
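The deduplication Hunt describes is, at its core, a standard technique: fingerprint each file’s contents and group identical fingerprints. The sketch below is an illustration of that general approach, not LawFlow’s actual implementation; the function name and sample documents are invented for the example.

```python
# Illustrative sketch of content-based duplicate detection (not LawFlow's
# actual method): hash each file's bytes, then group files whose digests
# match, so the same document attached to ten emails is bundled only once.
import hashlib
from collections import defaultdict

def find_duplicates(documents: dict[str, bytes]) -> dict[str, list[str]]:
    """Map each content hash to the names of the documents sharing it."""
    groups = defaultdict(list)
    for name, content in documents.items():
        digest = hashlib.sha256(content).hexdigest()
        groups[digest].append(name)
    # Keep only hashes seen more than once -- those are the duplicates.
    return {h: names for h, names in groups.items() if len(names) > 1}

docs = {
    "email1_attachment.pdf": b"Deed of settlement v3",
    "email2_attachment.pdf": b"Deed of settlement v3",
    "email3_attachment.pdf": b"Witness statement",
}
print(find_duplicates(docs))
```

Because the comparison is on file contents rather than file names, the same document forwarded under different names is still caught.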

Likewise, practice management software such as Actionstep is starting to use AI and automation. The software’s document automation pulls live legal data into preconfigured templates, with Smart Logic allowing conditional addition/removal of text blocks.
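Conditional document assembly of the kind described can be pictured as a template made of blocks, each guarded by a condition on the matter’s data. The toy renderer below illustrates the general idea only; Smart Logic is Actionstep’s feature name, but this code is not its API, and the block texts and field names are invented.

```python
# Minimal sketch of conditional document automation: each template block
# pairs a condition on the matter data with a text fragment; blocks whose
# condition holds are filled in from the live data and joined together.
def render(template_blocks, matter):
    """Assemble a document from the blocks whose conditions the matter satisfies."""
    parts = []
    for condition, text in template_blocks:
        if condition(matter):
            parts.append(text.format(**matter))
    return "\n".join(parts)

blocks = [
    (lambda m: True, "This agreement is between {client} and {other_party}."),
    (lambda m: m["has_guarantor"], "The obligations are guaranteed by {guarantor}."),
]
matter = {"client": "A Ltd", "other_party": "B Ltd", "has_guarantor": False}
print(render(blocks, matter))
```

With `has_guarantor` set to `False`, the guarantee clause is simply omitted from the output; flipping the flag (and supplying a guarantor) adds it back in.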


Regulating technology

At its most recent meeting, the ADLS committee discussed the regulation of legal advice and reserved areas of work, and how technology may already fall into these categories.

There is no way technology that falls into that category could be banned, says committee convenor Lloyd Gallagher. But the New Zealand Law Society needs to consider technology and its role in providing advice. “While the horse has bolted on the regulation of technology, so to speak, it has not bolted on whether the companies running it should be pulled into legal advice regulation under the Lawyers and Conveyancers Act,” Gallagher says. “The review into the Act gives a unique opportunity to ask [whether] regulation should include providers that fall into areas that are considered to be providing legal services, which often entails legal advice. In my opinion, and that of others on the committee, it should.”

Some legal work is already done by non-lawyers who do not represent themselves to be lawyers. An example is employment law advocates. Legal advice in non-reserved areas provided by AI could be viewed similarly. “AI technology, being simple database-result AI, and possibly full AI as it develops in the future, should be pulled into the ambit, with the owner being the responsible party for advice,” Gallagher says.

AI will eventually be capable of giving legal advice rather than just writing contracts. Hunt cites the DALL·E 2 AI project as an example of just how far AI is advancing. DALL·E 2 can take the Mona Lisa and generate the rest of her body, extending the painting in the style of Leonardo da Vinci. Although this has nothing to do with the law, the point, Hunt says, is the growing intelligence of AI technology.

He is in no doubt that AI has now surpassed the “Turing Test”, a test of a machine’s ability to demonstrate intelligent behaviour equivalent to, or indistinguishable from, that of a human. At this point, he says, AI can understand things.


Reserved work

The question then arises as to whether AI itself should also be included in the definition of regulated services if advice is being given and, if so, what that should look like, Gallagher says. “Certainly, at this time anyway, AI cannot provide reserved areas of work as it cannot yet make an argument. But it will be there one day.

“AI can currently provide legal advice in other areas of the law, as has been seen with simple contracts and some rules-as-code technology being developed in Australia. Is it appropriate to run unchecked when consumers carry the ultimate risk?

“What happens if it is programmed wrong and gives bad advice? Who is the responsible person if it does? How can the industry be seen to be maintaining a high standard of professional services if AI remains unregulated? And finally, how are we protected from a provider that may not fully understand the law or base the advice on international rules?

“If non-lawyers are running AI legal systems and remain free from review, the risk to consumers increases while leaving unregulated firms and traders unchecked.

“Ultimately, someone is controlling that technology so there must be some review to keep it safe. Computers would be practising and someone will be controlling that technology. There’s a whole range of complexities in there,” Gallagher says.

“The committee’s focus is on how best we can balance this in the marketplace for practising lawyers as well as technology systems while protecting consumers. And how can we put this before the review panel to explain the issues and point them to some questions to which they might find some solutions or answers?

“With the complexities in this area, there’s no one solution that is going to fit everything. So, how do we balance that in the regulation to maintain quality of service without punishing lawyers in a free, open market?” Consumers need to be protected, Gallagher says, but that needs to be balanced with maintaining a competitive marketplace and not placing unreasonable burdens on lawyers to pay fees that unregulated parties don’t face for the same advice.

“Right now, there’s nothing stopping you from starting up a company and doing it. And that puts the consumer at risk because we don’t know whether you are following appropriate rules and practice standards, asking the right questions, providing good advice or even whether that advice is from an international legal perspective, which doesn’t apply in New Zealand.” There are already examples of bots providing legal advice and some of it going horribly wrong, he says.


Patch protection?

Auckland University associate professor Alex Sims says she does not see an argument in favour of expanding lawyers’ powers to prevent non-practising lawyers and/or AI from providing advice.

“If it was expanded, that would reduce access to justice,” says Sims, a blockchain researcher, academic and futurist. “This looks very much like patch protection from lawyers, dressed up as concern for consumers.” If, however, AI was providing regulated services as things currently stand, then there should be a level playing field, she says, with a responsible person liable for that advice.

Gallagher’s personal opinion is that the practising certificate has passed its use-by date. “I think it is creating a burden on solicitors with a high cost for practising areas, which is unnecessary,” he says.

The committee discussed creating a separate role for membership and an oversight system for practising and non-practising lawyers and legal services such as AI. With such a system, complaints could be made and investigations carried out against anyone acting in a legal services position. “This creates a level of balance that brings AI and computer-related systems underneath that umbrella and creates that consumer protection without burdensome requirements on lawyers and non-lawyers that would effectively destroy the competitive side of the market,” Gallagher says.

He suggests AI services provided by non-lawyers be pulled into a membership database and regulated. “What that means is that you, as the business owner, say ‘well hang on a minute, I need to make sure that I have someone on staff or I have the requisite knowledge to make sure that my technology is performing according to the standards that are supposed to be upheld’.” Gallagher cites Canada and the UK as examples for New Zealand to consider.

“Canada, while they haven’t gone down the AI or computer path, is already in a position where it pulls legal service providers such as arbitrators, mediators etc into the requirement to be included in a membership (organisation).” That ensures complaints can be heard. Here in New Zealand, that might be resurrecting the role of regional law societies as the membership bodies, with the New Zealand Law Society being the regulator. “If that was the case in New Zealand, fees could be apportioned based on qualifications and whether the person or technology was practising in reserved areas of work or not. That would bring back that regulation balance, which is out of balance now,” Gallagher says.



Increasingly, forms of AI will begin to disintermediate larger full-service firms, Sims says. But some lawyers are asleep at the wheel.

Small companies offering specialised services using technology will inevitably spring up, eating the lunches of larger firms using the partnership model. They’re fighting a rearguard battle, she says. It’s similar to large banks that find they’re being disintermediated by small start-ups such as online bank Revolut in the UK, or the buy-now, pay-later companies that are eating into consumer lending in New Zealand. “Yes, law is still needed as a profession,” Sims says.

“But we won’t need as many people to do it. We have seen the same thing in other professions.” The partnership model is particularly problematic for law firms that need to invest in technology to keep pace with the changes, she says. While some are managing to hold the technology off, it’s only a matter of time before the small trickle turns into a flood.

AI doesn’t discriminate about where in the legal industry it hits: fewer juniors will be needed, but equally, so will fewer judges.

Professionals can’t always see this. Sims cites author Richard Susskind, who writes and speaks about the future of justice and has had lawyers and other professionals comment after his talks that while they understand how AI is eating into other professions’ work, they can’t see it affecting theirs. “People come down afterwards and say, ‘yes, yes, yes, AI is going to change that profession, but not mine. I’m special.’ Everyone else can see it, but not the people themselves.” ■
