Latest news with #HaringeyLawCentre


The Guardian
4 days ago
High court tells UK lawyers to 'urgently' stop misuse of AI in legal work
The high court has told senior lawyers to take urgent action to prevent the misuse of artificial intelligence, after dozens of fake case-law citations were put before the courts that were either completely fictitious or contained made-up passages.

Lawyers are increasingly using AI systems to help them build legal arguments, but two cases this year were blighted by made-up case-law citations that were either confirmed or suspected to have been generated by AI.

In an £89m damages case against the Qatar National Bank, the claimants made 45 case-law citations, 18 of which turned out to be fictitious, with quotes in many of the others also bogus. The claimant admitted using publicly available AI tools, and his solicitor accepted that he had cited the sham authorities.

When Haringey Law Centre challenged the London borough of Haringey over its alleged failure to provide its client with temporary accommodation, its lawyer cited phantom case law five times. Suspicions were raised when the solicitor defending the council repeatedly queried why they could not find any trace of the supposed authorities. The matter resulted in an action for wasted legal costs, and a court found the law centre and its lawyer, a pupil barrister, negligent. The barrister denied using AI in that case, but said she might have done so inadvertently while using Google or Safari in preparation for a separate case in which she also cited phantom authorities. In that case, she said, she may have taken account of AI summaries without realising what they were.

In a regulatory ruling responding to the cases on Friday, Dame Victoria Sharp, the president of the King's Bench Division, said there were 'serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused', and that lawyers misusing AI could face sanctions ranging from public admonishment to contempt of court proceedings and referral to the police.

She called on the Bar Council and the Law Society to consider steps to curb the problem 'as a matter of urgency', and told heads of barristers' chambers and managing partners of solicitors' firms to ensure all lawyers know their professional and ethical duties when using AI.

'Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect,' she wrote. 'The responses may make confident assertions that are simply untrue. They may cite sources that do not exist. They may purport to quote passages from a genuine source that do not appear in that source.'

Ian Jeffery, the chief executive of the Law Society of England and Wales, said the ruling 'lays bare the dangers of using AI in legal work'. 'Artificial intelligence tools are increasingly used to support legal service delivery,' he added. 'However, the real risk of incorrect outputs produced by generative AI requires lawyers to check, review and ensure the accuracy of their work.'

The cases are not the first to have been blighted by AI-created hallucinations. In a UK tax tribunal in 2023, an appellant who claimed to have been helped by 'a friend in a solicitor's office' provided nine bogus historical tribunal decisions as supposed precedents. She admitted it was 'possible' she had used ChatGPT, but said it surely made no difference, as there must be other cases that made her point.

The appellants in a €5.8m (£4.9m) Danish case this year narrowly avoided contempt proceedings when they relied on a made-up ruling that the judge spotted. And a 2023 case in the US district court for the southern district of New York descended into chaos when a lawyer was challenged to produce the seven apparently fictitious cases they had cited. The lawyer simply asked ChatGPT to summarise the cases it had already made up; the result, the judge said, was 'gibberish', and he fined the two lawyers and their firm $5,000.


Metro
09-05-2025
Judge scolds barrister for using 'made-up cases' in her court arguments
A High Court judge has condemned a team of lawyers for basing arguments on five cases which turned out to be 'made-up'. Barrister Sarah Forey was instructed by solicitors at Haringey Law Centre to act for a homeless man who was claiming priority housing from Haringey Council in London.

Ms Forey cited a number of cases – examples of previous legal rulings used to support an argument – in written submissions to the High Court. Lawyers for the council said they could not find five of the cases, suggesting the only explanation was that Ms Forey had used artificial intelligence (AI) tools.

They asked for clarification from Haringey Law Centre, which dismissed the issue as 'cosmetic errors' and said any problems were 'easily explained'. Haringey Law Centre lawyer Sunnelah Hussain suggested the council's lawyers had raised the matter 'as technicalities to avoid undertaking really serious legal research'.

The presiding judge, Mr Justice Ritchie, blasted Haringey Law Centre's response as 'grossly unprofessional'. In his ruling on the case, he said the solicitors and Ms Forey had shown 'appalling professional misbehaviour'. He said he was unable to reach a finding on whether they had used AI 'because Ms Forey was not sworn in and was not cross-examined', but he accused them of 'misleading the Court' by submitting 'fake cases' and then trying to 'finesse them into being "minor citation errors"'.

The judge also dismissed Ms Forey's claim that the error arose from filing and photocopying mistakes, saying: 'I do not accept that it is possible to photocopy a non-existent case and tabulate it.'

He said the team had presented a 'reasonable and fair' case, suggesting they would have had a strong chance of winning had they not used the fake cases. Mr Justice Ritchie continued: 'The submission was a good one. The medical evidence was strong. The ground was potentially good. Why put a fake case in? On the balance of probabilities, I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading.'

The judge ordered his ruling to be sent to the Bar Standards Board and the Solicitors Regulation Authority, saying Ms Forey and Haringey Law Centre should self-report to the watchdogs. The case was settled in favour of the homeless man, Frederick Ayinde, but the judge ordered Ms Forey's team to pay wasted court costs.