UK judge warns of risk to justice after lawyers cited fake AI-generated cases in court

Washington Post · 12 hours ago

LONDON — Lawyers have cited fake cases generated by artificial intelligence in court proceedings in England, a judge has said — warning that attorneys could be prosecuted if they don't check the accuracy of their research.
High Court justice Victoria Sharp said the misuse of AI has 'serious implications for the administration of justice and public confidence in the justice system.'


Related Articles

Hunt for Father Accused of Killing 3 Daughters Expands in Washington Forest

New York Times · 2 hours ago

The National Guard is joining the search in Washington State for a man the police say killed his three young daughters. More than a hundred law enforcement officers are combing parts of central and northern Washington State for the man, Travis Decker, who the police say has wilderness skills that could help him survive for weeks on his own.

The police say Mr. Decker kidnapped the three girls — Olivia Decker, 5; Evelyn, 8; and Paityn, 9 — on May 30. They were found dead on Monday near a campsite roughly 70 miles east of Seattle. Each child had a plastic bag over her head, according to an affidavit from the police in Wenatchee, Wash., where the girls lived with their mother.

Large swaths of wilderness around where the girls' bodies were found have been closed to the public as the manhunt has intensified. The U.S. Forest Service closed some trails, roads and campsites in the Okanogan-Wenatchee National Forest. On Thursday, the National Park Service closed parts of the Lake Chelan National Recreation Area, a part of the North Cascades National Park Service Complex to the north of the national forest.

On Friday, Gov. Bob Ferguson pledged to support the investigation by sending National Guard resources and emergency funding.

Lawyers could face 'severe' penalties for fake AI-generated citations, UK court warns

Yahoo · 4 hours ago

The High Court of England and Wales says lawyers need to take stronger steps to prevent the misuse of artificial intelligence in their work. In a ruling tying together two recent cases, Judge Victoria Sharp wrote that generative AI tools like ChatGPT 'are not capable of conducting reliable legal research.'

'Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect,' Judge Sharp wrote. 'The responses may make confident assertions that are simply untrue.'

That doesn't mean lawyers cannot use AI in their research, but she said they have a professional duty 'to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work.'

Judge Sharp said the growing number of cases in which lawyers (including, on the U.S. side, lawyers representing major AI platforms) have cited what appear to be AI-generated falsehoods shows that 'more needs to be done to ensure that the guidance is followed and lawyers comply with their duties to the court.' She said her ruling will be forwarded to professional bodies, including the Bar Council and the Law Society.

In one of the cases in question, a lawyer representing a man seeking damages against two banks submitted a filing with 45 citations — 18 of those cases did not exist, while many others 'did not contain the quotations that were attributed to them, did not support the propositions for which they were cited, and did not have any relevance to the subject matter of the application,' Judge Sharp said.

In the other, a lawyer representing a man who had been evicted from his London home wrote a court filing citing five cases that did not appear to exist. (The lawyer denied using AI, though she said the citations may have come from AI-generated summaries that appeared in 'Google or Safari.')

Judge Sharp said that while the court decided not to initiate contempt proceedings, that is 'not a precedent.' 'Lawyers who do not comply with their professional obligations in this respect risk severe sanction,' she added. Both lawyers were either referred or referred themselves to professional regulators. Judge Sharp noted that when lawyers do not meet their duties to the court, the court's powers range from 'public admonition' to the imposition of costs, contempt proceedings, or even 'referral to the police.'

This article originally appeared on TechCrunch.
