
Alabama paid a law firm millions to defend its prisons. It used AI and turned in fake citations

The Guardian

24 May 2025

In less than a year and a half, Frankie Johnson, a man incarcerated at the William E Donaldson prison outside Birmingham, Alabama, says he was stabbed around 20 times. In December 2019, Johnson says, he was stabbed 'at least nine times' in his housing unit. In March 2020, an officer handcuffed him to a desk following a group therapy meeting and left the unit, after which another prisoner came in and stabbed him five times. In November of the same year, Johnson says, he was handcuffed by an officer and brought to the prison yard, where another prisoner attacked him with an ice pick, stabbing him 'five to six times' as two correctional officers looked on. According to Johnson, one of the officers had actually encouraged his attacker to carry out the assault in retaliation for a previous argument between Johnson and the officer.

In 2021, Johnson filed a lawsuit against Alabama prison officials for failing to keep him safe, citing rampant violence, understaffing, overcrowding and pervasive corruption in Alabama prisons. To defend the case, the Alabama attorney general's office turned to a law firm that for years has been paid millions of dollars by the state to defend its troubled prison system: Butler Snow. State officials have praised Butler Snow for its experience in defending prison cases, and specifically William Lunsford, head of the constitutional and civil rights litigation practice group at the firm.

But now the firm is facing sanctions from the federal judge overseeing Johnson's case after an attorney at the firm, working with Lunsford, cited cases generated by artificial intelligence that turned out not to exist. It is one of a growing number of instances in which attorneys around the country have faced consequences for including false, AI-generated information in official legal filings.
A database attempting to track the prevalence of the cases has identified 106 instances around the globe in which courts have found 'AI hallucinations' in court documents. Last year, an attorney was suspended from practicing law in the federal middle district of Florida for one year after a committee found he had cited fabricated AI-generated cases. In California earlier this month, a federal judge ordered a firm to pay more than $30,000 in legal fees after it included false AI-generated research in a brief.

At a hearing in Birmingham on Wednesday in Johnson's case, the US district judge Anna Manasco said that she was considering a wide range of sanctions against Butler Snow, including fines, mandated continuing legal education, referrals to licensing organizations and temporary suspensions, after the attorney, Matthew Reeves, used ChatGPT to add false citations to filings related to ongoing deposition and discovery disputes in the case. She suggested that the disciplinary actions meted out around the country so far have not gone far enough. The current case is 'proof positive that those sanctions were insufficient', she told the lawyers. 'If they were, we wouldn't be here.'

During the hearing, attorneys with Butler Snow were effusively apologetic and said they would accept whatever sanctions Manasco determined were appropriate. They also pointed to a firm policy that requires attorneys to seek approval when using AI for legal research. Reeves attempted to take full responsibility. 'I was aware of the limitations on use [of AI], and in these two instances I did not comply with policy,' Reeves said. 'I would hope your honor would not punish my colleagues.'

Attorneys with Butler Snow were appointed by the Alabama attorney general's office and are being paid by the state to defend Jefferson Dunn, the former commissioner of the Alabama department of corrections, in the case.
Lunsford, who holds the contract with the state for the case, said that he had begun a review of prior filings to make sure there weren't more instances of false citations. 'This is very fresh and raw,' Lunsford told Manasco. 'The firm's response to this is not complete yet.' Manasco said that she would allow Butler Snow to file a motion within 10 days to explain its process for addressing the problem before she made a decision regarding sanctions.

The fake AI citations came to light in relation to a scheduling dispute in the case. Attorneys with Butler Snow had contacted Johnson's attorneys to set up a deposition of Johnson, who is still in prison. Johnson's attorneys objected to the proposed dates, pointing to outstanding documents that they felt they were entitled to before Johnson was deposed. But in a court filing on 7 May, Butler Snow countered that case law required Johnson to be deposed expeditiously. 'The Eleventh Circuit and district courts routinely authorize incarcerated depositions when proper notice is given and the deposition is relevant to claims or defenses, notwithstanding other discovery disputes,' they wrote. The attorneys listed four cases ostensibly backing up their assertion. It turned out they were all made up.

While some of the cited cases resembled citations for real cases, none of them were relevant to the issue before the court. For instance, one was for a 2021 case entitled Kelley v City of Birmingham, but according to lawyers for Johnson, 'the sole existing case styled as Kelley v. City of Birmingham that Plaintiff's counsel could identify was decided by the Alabama Court of Appeals in 1939 regarding the resolution of a speeding ticket'. Earlier this week, lawyers for Johnson filed a motion pointing out the fabrications and suggested they were the product of 'generative artificial intelligence'.
They also found another apparently fabricated citation in a prior filing related to a dispute over discovery. The very next day, Manasco scheduled a hearing to determine whether the Butler Snow attorneys should be sanctioned. 'In the light of the seriousness of the accusation, the court has conducted independent searches for each allegedly fabricated citation, to no avail,' she wrote.

In a declaration to the court, Reeves said that he had been reviewing filings drafted by a more junior colleague and wanted to include citations for what he 'believed to be well-established points of law'. 'I knew generally about ChatGPT,' Reeves wrote, adding that he ran a search for the supporting case law he needed for the motions, which 'immediately identified purportedly applicable citations for those points of law'. But in his 'haste to finalize the motions and get them filed', he 'failed to verify the case citations returned by ChatGPT through independent review in Westlaw or Pacer before including them'. 'I sincerely regret this lapse in diligence and judgment,' Reeves wrote. 'I take full responsibility.'

Cases in which false AI content makes its way into legal filings appear to be increasing in frequency, said Damien Charlotin, a Paris-based legal researcher and academic who is attempting to track them. 'I'm seeing an acceleration,' he said. 'There are so many cases from the past few weeks and months compared to before.' So far, though, the response by courts to the problem has been remarkably lenient, Charlotin said. The more serious sanctions, including large fines and suspensions, tend to come when lawyers fail to take responsibility for their mistakes. 'I don't expect it to last,' Charlotin said. 'I think at some point everyone will be on notice.'
In addition to the Johnson case, Lunsford and Butler Snow hold contracts to work on several expansive civil rights cases against the Alabama department of corrections, including one brought by the United States Department of Justice under Donald Trump in 2020 that identifies many of the same wide-ranging systemic issues Johnson pointed to in his suit and alleges that the conditions violate the eighth amendment's prohibition on cruel and unusual punishment. The contract for that case alone was at one point worth nearly $15m over two years. Some Alabama lawmakers have questioned how much the state is spending on the firm to defend the cases.

But the mistake this week does not appear, so far, to have shaken the attorney general's confidence in Lunsford or Butler Snow. On Wednesday, Manasco asked a lawyer with the attorney general's office, who was present at the hearing, whether they would stick with Butler Snow. 'Mr Lunsford remains the attorney general's counsel of choice,' he responded.

Trouble with AI 'hallucinations' spreads to big law firms

Reuters

23 May 2025

May 23 (Reuters) - Another large law firm was forced to explain itself to a judge this week for submitting a court filing with made-up citations generated by an artificial intelligence chatbot. Attorneys from the Mississippi-founded law firm Butler Snow apologized to U.S. District Judge Anna Manasco in Alabama after they inadvertently included case citations generated by ChatGPT in two court filings. Butler Snow partner Matthew Reeves said in a Monday filing that he regretted his "lapse in diligence and judgment" in failing to verify the citations.

The 400-lawyer firm, which did not immediately respond to a request for comment, is defending former Alabama Department of Corrections Commissioner Jeff Dunn in an inmate's lawsuit alleging he was repeatedly attacked in prison. Dunn has denied wrongdoing. The judge has not yet said whether she will impose sanctions over the filings. Jamila Mensah of Norton Rose Fulbright, one of the lawyers representing plaintiff Frankie Johnson, declined to comment.

AI-generated fictions, known as "hallucinations," have cropped up in court filings and landed attorneys in hot water ever since ChatGPT and other generative AI programs became widely available more than two years ago. Courts have sanctioned and admonished attorneys around the country for violating professional rules that require them to vet their work however it is produced. Many of the cases have involved small law firms or self-represented litigants. But examples of big firms or big companies grappling with AI hallucinations are growing.

Last week a lawyer at law firm Latham & Watkins, which is defending AI company Anthropic in a copyright lawsuit related to music lyrics, apologized to a California federal judge after submitting an expert report that cited an article title invented by AI. Lawyers for the music publishers suing Anthropic have asked the judge to exclude the report. The judge has not yet ruled on the request.
Earlier this month, a court-appointed special master imposed sanctions and ordered law firm K&L Gates and a smaller firm, Ellis George, to pay $31,100 for what he called a "collective debacle" in which they included inaccurate case citations and quotations stemming from the use of AI in a filing. Lawyers from the two firms are representing former Los Angeles County District Attorney Jackie Lacey in a dispute with insurance giant State Farm. A spokesperson for K&L Gates and a lawyer at Ellis George did not immediately respond to requests for comment. State Farm and its lawyers at Sheppard Mullin did not immediately respond to similar requests.

The special master, retired judge Michael Wilner, wrote in the order that he had been "affirmatively misled" by the filing. "I read their brief, was persuaded (or at least intrigued) by the authorities that they cited, and looked up the decisions to learn more about them – only to find that they didn't exist," he wrote. "That's scary."

Daniel Linna, a senior lecturer and director of law and technology initiatives at Northwestern's law and engineering schools, said the inclusion of AI-generated fabrications in court briefs is in part a result of a lack of education and training. Despite the perils, he said, AI also has the potential to increase the quality of legal briefs and improve access to justice.
