
Latest news with #Pinto

South African lawyers call for rules for AI use in court

Time of India

24-07-2025

  • Business
  • Time of India

South African lawyers call for rules for AI use in court

A search with ChatGPT, an artificial intelligence (AI) chatbot, was intended to facilitate the work of a team of lawyers and find supplementary case examples for their arguments in a dispute before the High Court in Pietermaritzburg, in the coastal province of KwaZulu-Natal. The AI tool did just that, and the legal representatives submitted a notice of appeal in which they cited several authorities and case studies highlighted by the tool. But when the judge conducted an independent search using ChatGPT to verify one of the citations, he found to his utter amazement that many of the cited cases were not included in any recognized legal database. The court ultimately ruled against the plaintiff, stating in the written judgment: "The court has gained the impression that the lawyers placed false trust in the veracity of AI-generated legal research and, out of laziness, failed to check this research."

Facts plucked out of thin air

Tayla Pinto, a lawyer specializing in AI, data protection and IT law in Cape Town, sees a growing threat to the profession. "When asked how this happened and where the citations came from, the legal counsel admitted to using generative AI," Pinto told DW. "This shows that the problem of lawyers not knowing how to use generative AI responsibly and ethically is growing."

According to Pinto, there are three cases in South Africa in which the legal advisers involved used AI to create their court documents. In June, there was a similar misapplication of AI in the case brought by mining company Northbound Processing against the South African Diamond and Precious Metals Regulatory Authority. The same happened in a 2023 defamation trial and in the Pietermaritzburg High Court case, which caused a stir in court in 2024 and is now being reviewed by the Legal Practice Council and the provincial bar association.

AI use must be 'ethical, responsible and consistent'

The Pietermaritzburg case was brought by Philani Godfrey Mavundla, who was suspended as mayor of the Umvoti municipality in KwaZulu-Natal. At first instance, he even prevailed against the responsible regional authority. However, the latter lodged an appeal — and his lawyers apparently blindly relied on the truthfulness of the case studies provided by the AI before the High Court.

This is not a technological problem, said lawyer Pinto. "We've always used technology in the form of calculators, spell and grammar checkers and so on. Now it's becoming a man-made problem," she said. "Given the way and pace at which AI is developing, if we are to use AI, we must ensure that we do so in a way that is ethical, responsible and consistent with the duties we have undertaken as a legal profession."

The court dismissed Mavundla's application to appeal the community leadership case on the grounds of a low prospect of success, and criticized the pleading of the case as flawed and unprofessional. The judge ordered Mavundla's law firm to pay the costs of additional court appearances. With this order, the court expressed its disapproval of the law firm's conduct in submitting unverified and fictitious legal evidence. A copy of the judgment was sent to the Legal Practice Council in KwaZulu-Natal for investigation and possible disciplinary action against the lawyers involved.

Abuse of AI-generated content 'creates mistrust among judges'

Very few formal complaints have been lodged, although a number of matters are now starting to be referred to the Legal Practice Council (LPC), confirmed Kabelo Letebele, spokesperson for the LPC in Johannesburg. The LPC continues to monitor developments and trends around artificial intelligence, he said. "At this stage the LPC holds the view that there is not yet a need for a new ethical rule and that our existing rules, regulations and code of conduct are adequate to deal with complaints that regulate the usage of AI — even though the debate on this continues within the LPC," he told DW.

According to Letebele, legal practitioners are cautioned against blindly citing case law picked up using AI tools, as inaccuracies will be deemed negligence and potentially misleading to the court. He stressed that the LPC Law Library is available to legal practitioners at no cost, and that practitioners can use it to verify and find the latest case law and legal research required when preparing legal matters. In addition, awareness webinars are conducted for legal practitioners to highlight specific issues and explain how they can avoid contravening the LPC's rules, regulations and code of conduct.

Judges, prosecutors and court officials need to be aware that briefs and arguments can now contain not only human errors but also AI errors. "Judges rely heavily on the submissions of lawyers during court hearings, especially on legal aspects," said Mbekezeli Benjamin, a human rights lawyer and speaker at Judges Matter, which advocates for more transparency and accountability. Benjamin said he was concerned lawyers were relying too heavily on AI, whose susceptibility to error could mislead the court. "This significantly weakens the judicial process because, unfortunately, it creates mistrust among judges regarding the accuracy of the statements made by lawyers in their arguments," he said.

'Clear guidelines,' reviewed code of conduct needed for AI use

Lawyer Tayla Pinto sees no need for specific regulation of the use of AI for judicial research, but does see a need for special attention to the review of references submitted using AI and compliance with ethical standards. However, Benjamin said warnings within the legal profession to review the output of AI tools were not sufficient. "The Chamber should issue clear guidelines, including an amendment to the Code of Conduct, to regulate how AI should be used in judicial proceedings. But also make it clear that excessive reliance without reviewing AI content constitutes professional misconduct," he said. Benjamin also called for a revision of the profession's code of conduct so that the inappropriate use of artificial intelligence can be punished as a breach of duty with a hefty fine, or even exclusion or removal from the register of legal professionals. The South African Law Society has warned that even the inadvertent submission of false information can ruin a career.

South African lawyers call for rules for AI use in court – DW – 07/23/2025

DW

23-07-2025

  • DW

South African lawyers call for rules for AI use in court – DW – 07/23/2025

The use of artificial intelligence proved disadvantageous for a legal team in South Africa when an AI tool fabricated case studies. Lawyers are now demanding clear guidelines and adherence to ethical standards.

A search with ChatGPT, an artificial intelligence (AI) chatbot, was intended to facilitate the work of a team of lawyers and find supplementary case examples for their arguments in a dispute before the High Court in Pietermaritzburg, in the coastal province of KwaZulu-Natal. The AI tool did just that, and the legal representatives submitted a notice of appeal in which they cited several authorities and case studies highlighted by the tool. But when the judge conducted an independent search using ChatGPT to verify one of the citations, he found to his utter amazement that many of the cited cases were not included in any recognized legal database. The court ultimately ruled against the plaintiff, stating in the written judgment: "The court has gained the impression that the lawyers placed false trust in the veracity of AI-generated legal research and, out of laziness, failed to check this research."

Tayla Pinto, a lawyer specializing in AI, data protection and IT law in Cape Town, sees a growing threat to the profession. "When asked how this happened and where the citations came from, the legal counsel admitted to using generative AI," Pinto told DW. "This shows that the problem of lawyers not knowing how to use generative AI responsibly and ethically is growing."

According to Pinto, there are three cases in South Africa in which the legal advisers involved used AI to create their court documents. In June, there was a similar misapplication of AI in the case brought by mining company Northbound Processing against the South African Diamond and Precious Metals Regulatory Authority. The same happened in a 2023 defamation trial and in the Pietermaritzburg High Court case, which caused a stir in court in 2024 and is now being reviewed by the Legal Practice Council and the provincial bar association.

The Pietermaritzburg case was brought by Philani Godfrey Mavundla, who was suspended as mayor of the Umvoti municipality in KwaZulu-Natal. At first instance, he even prevailed against the responsible regional authority. However, the latter lodged an appeal — and his lawyers apparently blindly relied on the truthfulness of the case studies provided by the AI before the High Court.

This is not a technological problem, said lawyer Pinto. "We've always used technology in the form of calculators, spell and grammar checkers and so on. Now it's becoming a man-made problem," she said. "Given the way and pace at which AI is developing, if we are to use AI, we must ensure that we do so in a way that is ethical, responsible and consistent with the duties we have undertaken as a legal profession."

The court dismissed Mavundla's application to appeal the community leadership case on the grounds of a low prospect of success, and criticized the pleading of the case as flawed and unprofessional. The judge ordered Mavundla's law firm to pay the costs of additional court appearances. With this order, the court expressed its disapproval of the law firm's conduct in submitting unverified and fictitious legal evidence. A copy of the judgment was sent to the Legal Practice Council in KwaZulu-Natal for investigation and possible disciplinary action against the lawyers involved.

Very few formal complaints have been lodged, although a number of matters are now starting to be referred to the Legal Practice Council (LPC), confirmed Kabelo Letebele, spokesperson for the LPC in Johannesburg. The LPC continues to monitor developments and trends around artificial intelligence, he said. "At this stage the LPC holds the view that there is not yet a need for a new ethical rule and that our existing rules, regulations and code of conduct are adequate to deal with complaints that regulate the usage of AI — even though the debate on this continues within the LPC," he told DW.

According to Letebele, legal practitioners are cautioned against blindly citing case law picked up using AI tools, as inaccuracies will be deemed negligence and potentially misleading to the court. He stressed that the LPC Law Library is available to legal practitioners at no cost, and that practitioners can use it to verify and find the latest case law and legal research required when preparing legal matters. In addition, awareness webinars are conducted for legal practitioners to highlight specific issues and explain how they can avoid contravening the LPC's rules, regulations and code of conduct.

Judges, prosecutors and court officials need to be aware that briefs and arguments can now contain not only human errors but also AI errors. "Judges rely heavily on the submissions of lawyers during court hearings, especially on legal aspects," said Mbekezeli Benjamin, a human rights lawyer and speaker at Judges Matter, which advocates for more transparency and accountability. Benjamin said he was concerned lawyers were relying too heavily on AI, whose susceptibility to error could mislead the court. "This significantly weakens the judicial process because, unfortunately, it creates mistrust among judges regarding the accuracy of the statements made by lawyers in their arguments," he said.

Lawyer Tayla Pinto sees no need for specific regulation of the use of AI for judicial research, but does see a need for special attention to the review of references submitted using AI and compliance with ethical standards. However, Benjamin said warnings within the legal profession to review the output of AI tools were not sufficient. "The Chamber should issue clear guidelines, including an amendment to the Code of Conduct, to regulate how AI should be used in judicial proceedings. But also make it clear that excessive reliance without reviewing AI content constitutes professional misconduct," he said. Benjamin also called for a revision of the profession's code of conduct so that the inappropriate use of artificial intelligence can be punished as a breach of duty with a hefty fine, or even exclusion or removal from the register of legal professionals. The South African Law Society has warned that even the inadvertent submission of false information can ruin a career.

When AI makes mistakes in the courtroom – DW – 07/23/2025

DW

23-07-2025

  • DW

When AI makes mistakes in the courtroom – DW – 07/23/2025

The use of artificial intelligence proved disadvantageous for a legal team in South Africa when an AI tool fabricated case studies. Lawyers are now demanding clear guidelines and adherence to ethical standards.

Artificial intelligence (AI) tricked lawyers in a South African court in the coastal province of KwaZulu-Natal, with serious consequences: the AI invented facts. The search with ChatGPT, an AI chatbot, was intended to facilitate the work of a team of lawyers and find supplementary case examples for their arguments in a dispute before the High Court in Pietermaritzburg. The AI did just that. The legal representatives submitted a notice of appeal in which they cited several authorities and case studies. The judge conducted an independent search using ChatGPT to verify one of the citations. To his utter amazement, many of the cited cases were not included in any recognized legal database. The court ultimately ruled against the plaintiff, stating in the written judgment: "The court has gained the impression that the lawyers placed false trust in the veracity of AI-generated legal research and, out of laziness, failed to check this research."

Tayla Pinto, a lawyer specializing in AI, data protection and IT law, sees a growing threat to the profession: "When asked how this happened and where the citations came from, the legal counsel admitted to using generative AI," Pinto told DW. "This shows that the problem of lawyers not knowing how to use generative AI responsibly and ethically is growing."

According to Pinto, there are three cases in South Africa in which the legal advisors involved used AI to create their court documents. In June, there was a similar misapplication of AI in the case brought by mining company Northbound Processing against the South African Diamond and Precious Metals Regulatory Authority. The same happened in a 2023 defamation trial and in the Pietermaritzburg High Court case, which caused a stir in court in 2024 and is now being reviewed by the Legal Practice Council and the provincial bar association.

The case was brought by Philani Godfrey Mavundla, who was suspended as mayor of the Umvoti municipality in KwaZulu-Natal. At first instance, he even prevailed against the responsible regional authority. However, the latter lodged an appeal - and his lawyers apparently blindly relied on the truthfulness of the case studies provided by the AI before the High Court.

This is not a technological problem, says lawyer Pinto. "We've always used technology in the form of calculators, spell and grammar checkers and so on. Now it's becoming a man-made problem." She adds: "Given the way and pace at which AI is developing, if we are to use AI, we must ensure that we do so in a way that is ethical, responsible and consistent with the duties we have undertaken as a legal profession."

The court dismissed Mavundla's application to appeal the community leadership case on the grounds of a low prospect of success and criticized the pleading of the case as flawed and unprofessional. The judge ordered Mavundla's law firm to pay the costs of additional court appearances. With this order, the court expressed its disapproval of the law firm's conduct in submitting unverified and fictitious legal evidence. A copy of the judgment was sent to the Legal Practice Council in KwaZulu-Natal for investigation and possible disciplinary action against the lawyers involved.

Very few formal complaints have been lodged, although a number of matters are now starting to be referred to the Legal Practice Council (LPC), confirms Kabelo Letebele, spokesperson for the LPC in Johannesburg. The LPC continues to monitor developments and trends around artificial intelligence, he says. "At this stage the LPC holds the view that there is not yet a need for a new ethical rule and that our existing rules, regulations and code of conduct are adequate to deal with complaints that regulate the usage of AI - even though the debate on this continues within the LPC," Letebele tells DW.

According to Letebele, legal practitioners are cautioned against blindly citing case law picked up using AI tools, as inaccuracies will be deemed negligence and potentially misleading to the court. He stresses that the LPC Law Library is available to legal practitioners at no cost, and that practitioners can use it to verify and find the latest case law and legal research required when preparing legal matters. In addition, awareness webinars are conducted for legal practitioners to highlight issues the LPC is picking up and to explain how they can avoid contravening the LPC's rules, regulations and code of conduct.

Judges, prosecutors and court officials need to be aware that briefs and arguments can now contain not only human errors, but also AI errors. "Judges rely heavily on the submissions of lawyers during court hearings, especially on legal aspects," says Mbekezeli Benjamin, a human rights lawyer and speaker at Judges Matter, in a DW interview. The organization advocates for more transparency and accountability. Benjamin expresses great concern that lawyers are relying too heavily on AI, whose susceptibility to error could mislead the court. "This significantly weakens the judicial process because, unfortunately, it creates mistrust among judges regarding the accuracy of the statements made by lawyers in their arguments," he says.

Lawyer Tayla Pinto sees no need for specific regulation of the use of AI for judicial research, but does see a need for special attention to the review of references submitted using AI and compliance with ethical standards. However, Benjamin says warnings within the legal profession to review the output of AI tools are not sufficient. "The Chamber should issue clear guidelines, including an amendment to the Code of Conduct, to regulate how AI should be used in judicial proceedings. But also make it clear that excessive reliance without reviewing AI content constitutes professional misconduct." Benjamin also calls for a revision of the profession's code of conduct so that the inappropriate use of artificial intelligence can be punished as a breach of duty with a hefty fine or even exclusion or removal from the register of legal professionals. The South African Law Society also warns that even the inadvertent submission of false information can ruin a career.

Freida Pinto to lead series 'Unaccustomed Earth'

Time of India

18-07-2025

  • Entertainment
  • Time of India

Freida Pinto to lead series 'Unaccustomed Earth'

Actress Freida Pinto is all set to star in the upcoming series 'Unaccustomed Earth,' an adaptation of the Jhumpa Lahiri short story collection. Pinto will play Parul Chaudhury in the series, which was originally announced in April. As previously reported, Netflix has commissioned eight episodes.

The official tagline for the show states that it is "an epic, soapy, and culturally vibrant drama about a tight-knit Indian American community navigating love, desire, and belonging." It continues: "Rich with nuance, passion, and unforgettable characters, 'Unaccustomed Earth' invites you into the elite and insular Indian-American community of Cambridge. When a star-crossed romance between a devoted wife and her long lost love comes to light, a scandalous affair is born and new battle lines are drawn in this intensely interconnected immigrant community."

Pinto is known for her breakout role in Danny Boyle's Oscar-winning film "Slumdog Millionaire," which marked her cinema debut. She was later seen in films like "Rise of the Planet of the Apes," "Knight of Cups," "Hillbilly Elegy," and "Mr. Malcolm's List." In television, Pinto is known for her roles in "Guerilla," "Surface," and "The Path."

John Wells will co-write the adaptation with Madhuri Shekar. Nisha Ganatra, who had originally optioned the book and developed the series, will also executive produce. Lahiri will executive produce as well, along with Erica Saleh, Erin Jontow of JWP, and Celia Costas. Warner Bros. Television, where Wells is under an overall deal, will produce.

This is the latest Lahiri book to be adapted for the screen. Her novel "The Namesake" was made into a film in 2006 starring Kal Penn, Tabu, and the late Irrfan Khan. "The Namesake" was directed by Mira Nair and told the story of immigrant Bengali parents Ashima and Ashoke, who try to adjust to life in America, while their son Gogol tries to find his identity and choose between the two worlds.

Freida Pinto to lead Netflix's 'Unaccustomed Earth' adaptation

UPI

17-07-2025

  • Entertainment
  • UPI

Freida Pinto to lead Netflix's 'Unaccustomed Earth' adaptation

Freida Pinto, seen at the 2023 New York Women in Film and Television 43rd Annual Muse Awards, will star in "Unaccustomed Earth." File Photo by John Angelillo/UPI

July 17 (UPI) -- Netflix announced Thursday that Freida Pinto will lead an adaptation of Unaccustomed Earth on the streaming service. The series is based on the Jhumpa Lahiri short story collection. Pinto plays Parul Chaudhury in the show, which is set in an Indian-American community in Cambridge, Mass. A married woman has an affair with her forbidden lover, creating drama within an ensemble of characters in the community.

Madhuri Shekar adapts the stories as writer and showrunner. Shekar executive produces with John Wells, Ritesh Batra, Nisha Ganatra, Erica Saleh, Erin Jontow, Celia Costas and Lahiri. The series will be eight episodes. Batra will direct the first two.

Pinto made her feature film debut in the Oscar-winning Slumdog Millionaire. She has since appeared in Rise of the Planet of the Apes, Immortals and the series The Path, Guerilla and Surface.
