On Friday 6th June 2025, Dame Victoria Sharp (President of the King’s Bench Division) and Mr Justice Johnson handed down judgment in R (Ayinde) v London Borough of Haringey.
The case arose out of the well-publicised wasted costs judgment of Mr Justice Ritchie, in which severe criticism was levelled at a very junior barrister and Haringey Law Centre for their reliance on five fake authorities in underlying judicial review proceedings.
In short, during the course of her second-six pupillage, the junior barrister in question drafted and signed a High Court pleading in which she purported to rely upon five authorities that did not exist. This resulted in Mr Justice Ritchie not only awarding wasted costs against her (and Haringey Law Centre) but also referring each to their respective regulators.
In addition to these measures, the High Court, exercising its “Hamid” jurisdiction, called the parties involved to attend court to consider what further steps should be taken against them, including the possible initiation of criminal contempt proceedings.
At this hearing, Andrew represented Haringey Law Centre and two of its employees.
In an important judgment, Dame Victoria Sharp set out the approach the Court will take in cases where lawyers rely upon fake authorities created by Artificial Intelligence. Given the increasing use of AI within all areas of the law, this case is essential reading for barristers and solicitors alike.
The barrister in question denied that the fake cases she relied upon had been created by AI. However, the Court did not accept her account (which it found to be incoherent) and held, on the evidence before it, that there were only two possible explanations:
- she had deliberately included fake citations in her written work; or alternatively
- she did in fact use AI to produce her list of cases and, in her denials, provided the court with untruthful evidence.
Either way, the Court found that the threshold for initiating contempt proceedings against her had been met. Although the Court – as a matter of discretion – decided not to initiate contempt proceedings against her, it re-referred the matter to the BSB to investigate several matters of the utmost seriousness, including:
- the truthfulness of her evidence, both before the President and Mr Justice Ritchie;
- how such fake authorities came to be placed before the court; and
- perhaps most importantly, whether those responsible for supervising her pupillage complied with the relevant regulatory requirements placed upon them in respect of her supervision, work allocation and competence to undertake the level of work she was carrying out.
In respect of the Law Centre and its employees, the Court found that Ms Hussain (a paralegal employed by the Law Centre) was not at fault in any way and that she “acted appropriately throughout”. This was an important finding, as Ms Hussain had been criticised by Mr Justice Ritchie in the wasted costs judgment. Indeed, it was only as a result of Mr Ayinde (the Law Centre’s lay client) agreeing to waive privilege that Andrew was able to take the Court through the relevant correspondence to show the complete picture of Ms Hussain’s (perfectly appropriate) conduct.
As to Ms Hussain’s supervisor, Mr Amadigwe: again having regard to the documents that were not before Mr Justice Ritchie, the Court held that there was “no question of initiating contempt proceedings against him” and no reason to suspect that he had deliberately caused false material to be put before the Court. Despite this, the matter was re-referred to the SRA to investigate not only the adequacy of his response (once the matter had been raised by the other side in the underlying proceedings) but also what steps had been taken to check the competence and experience of counsel.
In a sobering judgment, the Court made clear several points which will be of interest to all practitioners:
- Lawyers who use AI to conduct legal research have a professional duty to check the accuracy of such research by reference to authoritative sources before using it in the course of their work.
- This duty rests both on lawyers who use artificial intelligence to conduct research themselves and on those who rely on the work of others who have done so:
“this is no different from the responsibility of a lawyer who relies on the work of a trainee solicitor or pupil barrister for example, or on information obtained from an Internet search.”
- The misuse of AI has serious implications for the administration of justice and public confidence in the justice system. Accordingly, those with individual leadership responsibilities (including Heads of Chambers and Managing Partners) are expected to take practical and effective measures to ensure that:
“every individual currently providing legal services … understands and complies with their professional and ethical obligations and their duties to the court if using artificial intelligence”;
- In future Hamid hearings, the profession can expect the court to inquire whether the above management responsibilities have been fulfilled;
- The risks posed to the administration of justice if fake material is placed before a court are such that, save in exceptional circumstances, mere admonishment by the court is unlikely to be a sufficient response. This suggests that contempt proceedings may well be initiated more readily in future cases; and
- It will be no excuse for solicitors simply to point to the involvement and primary responsibility of counsel. In this matter, the Court accepted that the Law Centre was an over-stretched charity with limited resources, carrying out important work for vulnerable members of society. However, it also noted:
“It could be said, however, that in those circumstances, it is all the more important that professional standards are maintained, and they instruct those who adhere to them…”
Reliance on Artificial Intelligence will only increase in the future. This case shows the perils of over-reliance on such an inconsistent tool. It is clear from this judgment that, moving forward, the Court will be less forgiving not only of individual practitioners misusing AI, but also of their chambers or firms if adequate guidance and training have not been provided.