Latest news with #JudgeVictoriaSharp

TechCrunch
9 hours ago
- Business
- TechCrunch
Lawyers could face 'severe' penalties for fake AI-generated citations, UK court warns
The High Court of England and Wales says lawyers need to take stronger steps to prevent the misuse of artificial intelligence in their work.

In a ruling tying together two recent cases, Judge Victoria Sharp wrote that generative AI tools like ChatGPT 'are not capable of conducting reliable legal research.'

'Such tools can produce apparently coherent and plausible responses to prompts, but those coherent and plausible responses may turn out to be entirely incorrect,' Judge Sharp wrote. 'The responses may make confident assertions that are simply untrue.'

That doesn't mean lawyers cannot use AI in their research, but she said they have a professional duty 'to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work.'

Judge Sharp said the growing number of cases in which lawyers (including, on the U.S. side, lawyers representing major AI platforms) have cited what appear to be AI-generated falsehoods suggests that 'more needs to be done to ensure that the guidance is followed and lawyers comply with their duties to the court,' and she said her ruling will be forwarded to professional bodies including the Bar Council and the Law Society.

In one of the cases in question, a lawyer representing a man seeking damages against two banks submitted a filing with 45 citations, 18 of which referred to cases that did not exist; many of the others 'did not contain the quotations that were attributed to them, did not support the propositions for which they were cited, and did not have any relevance to the subject matter of the application,' Judge Sharp said.

In the other, a lawyer representing a man who had been evicted from his London home wrote a court filing citing five cases that did not appear to exist. (The lawyer denied using AI, though she said the citations may have come from AI-generated summaries that appeared in 'Google or Safari.')

Judge Sharp said that while the court decided not to initiate contempt proceedings, that is 'not a precedent.' 'Lawyers who do not comply with their professional obligations in this respect risk severe sanction,' she added.

Both lawyers were either referred or referred themselves to professional regulators. Judge Sharp noted that when lawyers do not meet their duties to the court, the court's powers range from 'public admonition' to the imposition of costs, contempt proceedings, or even 'referral to the police.'


Reuters
2 days ago
- Reuters
Lawyers face sanctions for citing fake cases with AI, warns UK judge
LONDON, June 6 (Reuters) - Lawyers who use artificial intelligence to cite non-existent cases can be held in contempt of court or even face criminal charges, London's High Court warned on Friday, in the latest example of generative AI leading lawyers astray.

A senior judge lambasted lawyers in two cases who apparently used AI tools when preparing written arguments that referred to fake case law, and called on regulators and industry leaders to ensure lawyers know their ethical obligations.

"There are serious implications for the administration of justice and public confidence in the justice system if artificial intelligence is misused," Judge Victoria Sharp said in a written ruling. "In those circumstances, practical and effective measures must now be taken by those within the legal profession with individual leadership responsibilities ... and by those with the responsibility for regulating the provision of legal services."

The ruling comes after lawyers around the world have been forced to explain themselves for relying on false authorities, since ChatGPT and other generative AI tools became widely available more than two years ago.

Sharp warned in her ruling that lawyers who refer to non-existent cases will be in breach of their duty not to mislead the court, which could also amount to contempt of court. She added that "in the most egregious cases, deliberately placing false material before the court with the intention of interfering with the administration of justice amounts to the common law criminal offence of perverting the course of justice".

Sharp noted that legal regulators and the judiciary had issued guidance about the use of AI by lawyers, but said that "guidance on its own is insufficient to address the misuse of artificial intelligence".