Rapid Uptake of AI in Litigation
The rapid development of AI capabilities has been enthusiastically taken up by lawyers preparing pleadings and submissions to courts. But the use of AI in these contexts has been far from perfect. So much so that Chief Justice Stephen Gageler of the High Court of Australia has warned that this growth poses an "existential threat" to the justice system. With an increasing number of cases disrupted by the use of artificial intelligence, Australian courts are responding. Most courts in Australia have now developed, or are developing, guidance on the use of AI by legal practitioners. The New South Wales courts have paved the way by publishing practice notes, and other jurisdictions are following suit with guidelines of their own.
Courts Respond to Emerging Risks
On 29 April 2025, Justice Needham of the Federal Court of Australia released a notice to the profession regarding the use of AI in the Federal Court. Pending the release of practice notes and guidelines following an extensive consultation process, parties and their solicitors are on notice that they continue to be responsible for the material that is tendered to the Court.
Ongoing Duties Despite AI Use
Practitioners are not prohibited from using AI when preparing court documents. However, if the use of AI causes a practitioner to fail to discharge their obligations to the client or the Court under the relevant court rules or solicitor conduct rules, the consequences may be severe. Hallucinations, in the form of citations to wrong or fictitious authorities, are a recurring danger for lawyers using AI in document preparation.
When AI Hallucinations Lead to Consequences
In a July 2025 decision, Justice Murphy of the Federal Court of Australia ordered indemnity costs against solicitors for the use of AI to prepare citations in court documents in respect of a native title claim. In addressing the use of AI in court proceedings, the Court noted that “Whilst the use of AI in the legal profession is growing, practitioners must be aware of its limitations. It is critical that legal practitioners use proper safeguards to verify the accuracy of the work produced. Any use of AI must be consistent with the overriding duty of legal practitioners as officers of the Court and their fundamental obligation to uphold, promote, and facilitate the administration of justice.”
Victorian Court Referrals and Professional Sanctions
In Victoria, Judge Humphreys of the Federal Circuit and Family Court of Australia referred the conduct of a solicitor to the Office of the Victorian Legal Services Board and Commissioner for tendering to the court a list and summary of legal authorities that did not exist. In deciding the matter, the Court observed that “The use of technology is an integral part of efficient modern legal practice. At the frontier of technological advances in legal practice and the conduct of litigation is the use of AI. Whilst the use of AI tools offer opportunities for legal practitioners, it also comes with significant risks.” In its reasons in that matter, Dayal, the Court considered the guidelines issued by both the Supreme Court of Victoria and the County Court of Victoria, reiterating that generative AI does not relieve the responsible legal practitioner of the need to exercise independent judgement and professional skill in finalising any document provided to the court. In August 2025, the Victorian Legal Services Board varied Mr Dayal’s practising certificate so that he could no longer practise as a principal or handle trust moneys, and required him to undertake supervised legal practice for a period of two years.
Clearer National Guidelines on the Horizon
The Federal Court of Australia’s consultation process is well advanced. Clearer direction is anticipated once guidelines or practice notes are published.
No Excuse for Errors: Practitioners Remain Responsible
Lawyers are often under work, cost and time pressures, and senior lawyers frequently rely on their junior lawyers for the preparation of work. But the risks of AI hallucination are now well known, and there is no excuse. We remain responsible to our clients and to the courts for the content of documents produced with the assistance of AI, and the consequences of failing to check accuracy are severe.
This publication has been prepared for general guidance on matters of interest only and does not constitute professional legal advice. You should not act upon the information contained in this publication without obtaining specific professional legal advice. No representation or warranty (express or implied) is given as to the accuracy or completeness of the information contained in this publication and, to the extent permitted by law, Cowell Clarke does not accept or assume any liability, responsibility or duty of care for any consequences of you or anyone else acting, or refraining to act, in reliance on the information contained in this publication or for any decision based on it.