
The victim delivered a searing impact statement. Just one thing felt off

Irish Times

15 hours ago


It was a routine enough tableau: a judge, sitting at the bench, watching the victim of a violent attack address a courtroom via video as they forgave their attacker and asked for leniency. The judge held the fate of the perpetrator, already found guilty and awaiting sentencing, in his hands. As the video statement ended, the judge commented that he 'loved' it, that he 'heard the forgiveness'. It was a moving moment. The only issue was that the victim had been dead for three and a half years.

The video was an AI-generated victim impact statement from a murdered man, Christopher Pelkey. This use of synthetically generated video and audio of a murder victim in an Arizona court last month felt like another 'puffer jacket pope' moment. The viral AI-generated image of Pope Francis in a white Balenciaga-style down jacket fooled millions and catapulted image generation tools into the cultural mainstream. Now, along with popes in puffer jackets, we have another watershed moment in 'ghostbots'.

Unlike the people it depicts, the 'digital afterlife industry', as it is more formally known, is alive and kicking. Companies with names such as HereAfter AI and You Only Virtual allow users to create digital archives of themselves so that the people they leave behind can interact with 'them' once they are gone. These apps market themselves to the living or bypass the person being digitally cloned altogether. The bereaved are now offered the promise of 'regenerating' their deceased relatives and friends. People out there are, at this moment, interacting with virtual renderings of their mothers and spouses on apps with names such as Re:memory and Replika. They don't need the participation or consent of the deceased.

The video used to reanimate Christopher Pelkey was created using widely available tools and a few simple reference points – a YouTube interview and his obituary photo, according to The New York Times.
This gives the generated footage the feel of a decent cheapfake rather than a sophisticated deepfake. Watching it, you find yourself in the so-called 'uncanny valley', that feeling you get when interacting with a bot, when your brain knows something is not quite right. This person is too serene, too poreless, too ethereal as they stare into your eyes and talk about their own death.

Pelkey's sister wrote the script, imagining the message she believed her brother would have wanted to deliver. This includes the synthetic version of Pelkey addressing 'his' killer: 'It is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God, who forgives. I always have and I still do.'

I do not doubt that the Pelkey family had good intentions. They had a point they wanted to make, saw a tool that let them make it, and were permitted to do so by the court. They also likely believe they know what their lost loved one would have wanted. But should anyone really have the power to put words in the mouth and voice of the deceased?

We often fret about AI image and video generation tools being used to mislead us, to trick us as voters or targets of scams. But deception and manipulation are not the same thing. In that Arizona courtroom there was no intention to deceive: no one thought this was the actual murder victim speaking. Yet that does not diminish its emotional impact. If we can have the murdered plead for peace, does that mean we could also have AI ghosts asking for vengeance, retribution or war?

Political actors have embraced generative AI, with its ability to cheaply make persuasive, memorable content. Despite fears that it would be used for disinformation, most public use cases are non-deceptive 'soft fakes'.
An attack ad against Donald Trump, for example, featured audio of a synthetic version of his voice saying out loud something he had only written in a tweet. But the real political AI innovation is happening in India, where last year candidates created videos of themselves speaking in languages they do not know, and even generated digital 'endorsements' from long-dead figures. One candidate had the voice of his father, who died from Covid in 2020, tell voters: 'Though I died, my soul is still with all of you ... I can assure you that my son, Vijay, will work for the betterment of Kanniyakumari.' Vijay won.

People have long tried to speak for the dead, often to further their own ends. AI turbocharges this into a kind of morbid ventriloquism, rendered in high definition and delivered with reverential sincerity. But the danger isn't that we mistake these digital ghosts for the real thing; it's that we know what they are and still acquiesce to being emotionally manipulated by them. Maybe we all now need to consider writing a new kind of DNR into our wills: Do Not Regenerate.
