
Killer's creepy final message to family of victim he burned to death moments before 26-minute lethal injection execution
A KILLER who burned an elderly store clerk to death gave a chilling message to the victim's family moments before he was executed.
Matthew Lee Johnson's 26-minute death by lethal injection came exactly 13 years after he set great-grandma Nancy Harris, 76, alight during a robbery.
The clerk suffered severe burns and died several days later following the 2012 convenience store robbery in Dallas.
Depraved Johnson, 49, was pronounced dead at 6:53pm on Tuesday.
His eventual death came about 26 minutes after officials injected a cocktail of drugs into his arm at the state penitentiary in Huntsville.
Leading up to the execution, Johnson begged the victim's family for forgiveness.
Turning to a window a few feet away, where Harris' family were watching the execution, the killer pleaded with the grieving relatives.
After begging for forgiveness, he bizarrely stated that he wanted to see the slain Harris again.
Johnson said: "As I look at each one of you, I can see her on that day.
"I please ask for your forgiveness. I never meant to hurt her."
He stated: "I pray that she's the first person I see when I open my eyes and I spend eternity with."
The death row inmate also asked his wife and daughters to forgive him.
The dad said: "I made wrong choices, I've made wrong decisions, and now I pay the consequences."
Since the heinous attack was partially caught on camera, Harris was able to describe Johnson before she succumbed to her injuries.
During his 2013 trial, Johnson admitted to the brutal burning.
He expressed regret for the murder and branded himself "the lowest scum of the earth".
The murderer claimed he was high on crack when he set the victim on fire and was therefore not aware of his actions.
He said at the time: 'I hurt an innocent woman. I took a human being's life. I was the cause of that.
Lethal injection controversy in South Carolina
By Patrick Harrington, foreign news reporter
THE three most recent executions in South Carolina were by lethal injection, and the cases have sparked controversy.
It took around 20 minutes before each of the three men was officially declared dead.
Complicating the situation is a law passed in 2023 which restricts the public release of much information about executions.
It requires that the identities of execution team members remain secret and forbids the publication of information about how the state purchases the drugs.
This follows a growing number of pharmaceutical companies refusing to sell their drugs to be used in executions.
The American Civil Liberties Union (ACLU) filed a lawsuit challenging the state law in January.
It wrote: "This ban not only further departs from the state's history of making execution-related information publicly available but criminalizes the disclosure of this information by anyone for any reason.
"It thus silences the scientists, doctors, journalists, former correctional officials, lawyers, and citizens who have scrutinized the safety, efficacy, morality, and legality of South Carolina's use of lethal injection."
The state has released only one of two available autopsies from the recent executions, and Brad Sigmon's lawyers say it shows an unusual amount of fluid in the man's lungs.
'It was not my intentions to — to kill her or to hurt her, but I did.'
The killer's legal team previously argued that their client had a drug addiction and had been sexually abused as a child.
Harris was survived by four sons, 11 grandchildren and seven great-grandchildren.
It comes after a serial killer once linked to the OJ Simpson murder case sent a message to Donald Trump in the final moments before he was executed.
Glen Rogers, who claimed to have killed up to 70 people, was put to death by lethal injection on May 15 at Florida State Prison.
The 62-year-old, dubbed the Casanova Killer due to his charm and good looks, was executed for the 1995 slaying of a woman in a Tampa motel.