An AI-generated shooting victim forgave his killer in a U.S. court. Could it happen in Canada?

For years, Stacey Wales tried to brainstorm what she would say at the sentencing of her brother's killer.
'I wanted to yell,' Wales told the Star in an interview on Friday. 'I would have these thoughts bubble up, while I was driving or in the shower, often of anger or frustration, and just read them into my phone.'
In 2021, Wales' brother, Christopher Pelkey, was fatally shot while at a red light in Chandler, Arizona. His killer, Gabriel Horcasitas, first faced a jury in 2023, but the case ended in a mistrial. After a retrial in March, he was found guilty of manslaughter.
When it came time for Wales to put pen to paper, all she could hear was Pelkey's voice. So, she began to write in his words. It worked.
Then, with the help of her husband, who has experience using generative artificial intelligence, Wales set off to create a video of her brother's likeness, reading the statement in his own voice.
The video was the last of 10 statements read out at the May 1 sentencing hearing.
'To Gabriel Horcasitas, the man who shot me, it is a shame we encountered each other that day in those circumstances,' Pelkey's facsimile, wearing a grey baseball cap, said in court. 'In another life, we probably could have been friends.'
'I believe in forgiveness, and a God who forgives. I always have and I still do.'
It wasn't a perfect likeness. The recreation of Pelkey jolts unnaturally throughout the nearly four-minute video. But it seemed to leave a favourable impression on Maricopa County Superior Court Judge Todd Lang, who described it as 'genuine.'
'I loved that AI,' Lang said. 'Thank you for that. And as angry as you are, and justifiably angry as the family is, I heard the forgiveness.'
Horcasitas received a sentence of just over 10.5 years in prison.
The case joins a growing list of U.S. court proceedings in which parties have reached for generative artificial intelligence.
In a high-profile example from 2023, Michael Cohen, a former lawyer for President Trump, claimed he'd unwittingly sent his attorney fake AI-generated legal citations. More recently, a plaintiff in a New York court tried to employ an AI-generated avatar to argue on his behalf — an attempt that was quickly swatted down by the judge.
For Ryan Fritsch, policy counsel of the Law Commission of Ontario, the rise in use 'speaks to the interest and enthusiasm out there for new forms of efficiencies in the criminal justice system.'
'There are some considerable promises,' Fritsch told the Star on Friday. 'But at the same time, concerns should arise if there are not sufficient rules, guardrails or governance models in place.'
As it stands, the use of AI in the criminal justice system is more commonly found in policing, often controversially, with services across the country employing technology such as facial recognition systems and automatic licence plate readers.
In Canadian courts, AI has been less prevalent – though Fritsch says he's starting to see upticks in its use. Just this week, the conduct of an Ontario lawyer was called into question after a judge suspected ChatGPT had been used to craft a factum submitted in civil proceedings. She has since been ordered to attend a hearing with the judge to explain the discrepancies.
Where it's becoming most common, he says, is in cases where people are self-represented.
'Right now, what we're mostly seeing is an increasing number of self- and un-represented people relying on generalist AI tools like ChatGPT to make their case for them,' he said. 'And the consequence is that they're actually spending more time disavowing the errors than reaping any benefits.'
There are currently no laws specific to the use of artificial intelligence in the Canadian justice system. In the absence of that framework, whether AI-generated material is permitted into a legal case often falls on the individual judge or justice.
As a result, some individual courts, police services and legal associations have started to come up with policies. Toronto police, for example, were the first service in Canada to introduce their own AI policy, in 2022.
A patchwork of policies, however, can open the court up to unnecessary litigation, says Fritsch, and worsen backlogs and delays.
'Without a framework, there's going to be a lot of struggle for courts, cops and Crowns to interpret how our existing laws, and our civil rights, are going to apply to the use of AI,' Fritsch said. 'And there's going to be a lot of varying opinions on that.'
Amending laws to regulate AI will take time, plus there's the 'long lag' problem: court cases come months or years after new technology develops, Fritsch said. 'There could be years of misuse in the meantime,' he added.
One of the most significant concerns for Fritsch is whether AI technologies can effectively understand and uphold Canadian standards of law.
'We know that AI is prone to bias,' Fritsch said. 'So if it's going to be used, we really need to make sure we're interpreting its use through the lens of the Charter of Rights and Freedoms and procedural fairness.'
For example, in the U.S., algorithms have long been used to assess risk in bail and release decisions, but Fritsch says they've been known to miss the mark.
'What we've seen from a couple of cases in the U.S. is some really, really harsh recommendations about people who are facing first offences, or who are doing time for minor offences.'
As a result, the need for human oversight remains, whether through the due diligence of staff or the discretion of a judge.
For most, the criminal justice system is unfamiliar, and navigating its nuances can be a daunting task. For older citizens or otherwise vulnerable populations, AI, if used properly and transparently, 'could actually increase access and justice for a lot of people,' Fritsch said.
The most common argument for the use of AI in the public sector is efficiency, says Shion Guha, assistant professor at the University of Toronto's Faculty of Information – something the courts are not known for.
'A lot of public sector agencies are basically looking towards generative AI as a way to reduce administrative overhead,' Guha told the Star Friday. 'The idea is that this will increase human efficiency and reduce costs.'
Those promises, he says, have not been properly vetted, though.
'There hasn't been any formal, finished research on whether or not this evaluative statement is true.'
So could an AI-generated victim statement be admitted in a Canadian court? In the absence of laws governing AI use, it's hard to say — it would come down to the presiding judge or justice, says Fritsch.
In the Arizona case, he said, the judge likely admitted the video on the basis it served as an expression of the family's feelings, not as a statement from Pelkey.
'I think the court, in their generosity, likely admitted it as almost a courtesy, and it might not be given a whole lot of weight.'
While Wales wrote the script for her brother's video, Fritsch pointed out that AI could also be used to generate the statements read out by a person's likeness, further complicating the issue.
'AI that can be trained on the sum total of all the comments a person may have made on social media or in emails or texts over years, and then used to simulate the person,' Fritsch said.
'There's no doubt it would not be admitted for the truth of its contents — because it's all made up — but might it be allowed for, say, compassionate reasons only, and with no bearing on the sentencing?' he asked. 'Who knows?'
