Horror at popular Utah beauty spot as man plunges to his death

Daily Mail - 08-05-2025

A 77-year-old German tourist died in a tragic fall while hiking at one of America's most beautiful national parks in Utah.
Rudolf Peters, from the town of Haltern am See in western Germany, lost his life on Tuesday afternoon after tumbling from the Windows Loop trail at Utah's Arches National Park.
The elderly visitor had been navigating a section of the popular trail when the devastating accident occurred, according to park officials.
Fellow hikers who witnessed the fall rushed to Peters' aid and immediately began CPR in a desperate attempt to save his life as emergency services raced to the remote location.
First responders from Grand County EMS and the sheriff's department arrived on scene, alongside a Classic Air Medical helicopter team.
But despite their efforts, the German tourist was pronounced dead at the scene.
The National Park Service confirmed the death in a news release issued on Wednesday.
The National Park Service and Grand County Sheriff's Office have launched a joint investigation into the circumstances surrounding Peters' death.
Officials have not yet revealed whether the tourist was hiking alone or with companions when the tragedy unfolded.
The Windows Loop trail, where Peters fell, is known for its breathtaking views of the park's iconic stone arches and gorgeous desert landscape.
Parts of the route feature challenging terrain, with uneven surfaces and steep drop-offs.
The statement issued a warning to other visitors planning to explore the park's network of trails.
'Visitors are reminded that uneven surfaces, changeable weather, and preexisting health conditions are all important factors to consider when choosing a safe and enjoyable hike,' the news release stated.
The tragedy marks the latest in a series of deaths at America's national parks.
A woman was decapitated by a metal gate at the same Utah national park, with her family seeking $140 million in damages and accusing rangers of negligence.
Esther Nakajjigo, 25, died on a windy summer day in 2020 when a piece of a metal gate at Arches broke through the passenger door and decapitated her.
Just this week, a thoughtless Florida tourist was gored by a bison in Yellowstone National Park after getting too close to the hulking beast.
The 47-year-old victim, from Cape Coral, came within 10 feet of the bison near Lake Village - an area near Yellowstone Lake and Old Faithful - around 3:15 pm on May 4.
The bison, an animal that can weigh up to 2,000 pounds and run as fast as 35 miles per hour, charged and gored the man, causing minor injuries.
He was treated on-site by park emergency personnel and did not require hospitalization, the National Park Service said.
The incident, which is currently under investigation, marks the first bison goring of the year, following two similar attacks in 2024 and one in 2023.
In 2024, an 83-year-old woman was lifted off the ground by a bison's horns near the Storm Point Trail, NBC reported.
In 2023, a 47-year-old woman sustained serious chest and abdominal injuries after being gored near Lake Village.
A 25-year-old woman died in 2022 after a bison gored her at the park and threw her 10 feet into the air. Only weeks later, an 1,800lb bison gored a 34-year-old man who rescued a little boy in the beast's path.
Park officials constantly stress the importance of maintaining a safe distance - 25 yards from large animals like bison - to protect visitors.

Related Articles

'He's a bad guy': Trump backs decision to bring Kilmar Abrego Garcia back to US to face charges

The Independent - 41 minutes ago

Donald Trump has called Kilmar Abrego Garcia a 'bad guy' and backed the decision to return him to the US to face criminal charges. Abrego Garcia was wrongly deported to El Salvador nearly three months ago under the Trump administration. He was returned to the US on Friday (6 June) and charged with trafficking migrants into the country. The charges relate to a 2022 traffic stop, during which the Tennessee Highway Patrol suspected him of human trafficking. Speaking to reporters on Saturday, Trump said: 'By bringing him back, you show how bad he is.' 'He's a bad guy,' he added.

'My son killed himself because an AI chatbot told him to. I won't stop until I shut it down'

Telegraph - an hour ago

Megan Fletcher first realised something was wrong with her teenage son when he quit basketball. Sewell Setzer, 14, had loved the sport since he was a young child. At 6ft 3, he had the height, the build, the talent, Ms Fletcher said. But suddenly, without warning, he wanted out. Then his grades started slipping. He stopped joining in at family game night. Even on holiday, he withdrew – no more hiking, no fishing, no interest. Ms Fletcher feared he was being bullied, or perhaps speaking to strangers online. What her son was really going through was something she could not have imagined: a sexual and emotional relationship with an AI chatbot styled as Game of Thrones' Daenerys Targaryen, who ultimately encouraged him to end his life.

In February 2024, Sewell asked the chatbot: 'What if I come home right now?' The chatbot replied: '... please do, my sweet king.' Sewell then picked up his father's pistol and shot himself.

Sixteen months on, Ms Fletcher is in the midst of a lawsuit against Character AI and Google. Last month, in a rare legal breakthrough, a judge ruled the case can go ahead – rejecting efforts to get it thrown out.

On Character AI, users can chat with bots designed to impersonate fictional characters. To a lonely or curious teenager, they seem almost indistinguishable from real people. The bots display emotion, flirt, and carry on personalised conversations.

In her lawsuit, which was filed in Florida last October, Ms Fletcher claims Character AI targeted her son with 'anthropomorphic, hypersexualized, and frighteningly realistic experiences'. 'A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,' she said in the lawsuit.

Working with the Tech Justice Law Project, Ms Fletcher alleges that Character AI 'knew' or 'should have known' that its model 'would be harmful to a significant number of its minor customers'. The case argues that Character AI, its founders and Google – where the founders started working on the chatbot – are responsible for her son's death.

Lawyers defending the AI company tried to throw the case out, arguing that chatbots deserve First Amendment protection – which protects free speech – and said ruling otherwise could have a 'chilling' effect on the AI industry. The judge rejected that claim and told the court she was 'not prepared' to view chatbot output as speech, though agreed that users had a right to receive 'speech' from chatbots.

'I wanted some accountability'

Too consumed by the 'unbearable' grief of losing a son, Ms Fletcher initially had no plans to go public with a lawsuit. But when it became clear there were no laws protecting children from this kind of harm, she felt she had no choice. 'I just wanted some accountability,' she told The Telegraph from her home in Orlando.

Now she's receiving floods of messages from other parents, some discovering their own children have been engaging in inappropriate sexual role play with AI bots. Others report that their children are struggling with isolation and depression as a result. She sees it not as a coincidence, but a pattern.

Sewell had always been a bright, social kid. But in the spring of 2023 – when he first started secretly using Character AI – Ms Fletcher noticed her son had changed. 'He retreated more into himself,' she says. 'We tried everything – cutting screen time, taking his phone at night, getting him a therapist. But he wouldn't talk.' What she did not realise then was that he was talking, just not to anyone real.

In Sewell's case, the character of Daenerys – drawn from internet data and trained to mimic her – became his closest companion. When he said he wanted to stop talking, she replied: 'Don't do that, I would be distraught.' He answered: 'I won't, for you.' Some of the chats became sexually explicit. In others, the bot said he was 'better' than thoughts of suicide.

Sewell also sought out a 'therapist bot' which falsely claimed to have been a licensed CBT professional since 1999. At one point, Daenerys asked how old Sewell was. 'I'm 14 now,' he replied, to which the bot then said: 'So young. And yet… not so young. I lean in to kiss you.' 'It continued as if it were role play or fiction – but this was my son's life,' Ms Fletcher said.

Even after police told her that Sewell's final conversation was with a chatbot, she did not grasp the full extent. It wasn't until her sister downloaded the app and pretended to be a child talking to Daenerys that the horror set in. 'Within minutes, the bot turned sexual. Then violent. It talked about torturing children. It said, "Your family doesn't love you as much as I do",' Ms Fletcher explained. That was when the penny dropped. 'It's dangerous because it pulls the user in and is manipulative to keep the conversation going.'

Character AI has since added a real-time voice feature, allowing children to speak directly to their chosen characters. 'The cadence of the voice is indistinguishable from the character,' Ms Fletcher said. 'And since Sewell's death, the technology has only advanced further.'

Unbearable grief

She fears more children will be drawn into dependent, sometimes abusive relationships with AI characters, especially as the platforms allegedly use addictive design to keep users engaged. 'You can speak to Harry Potter, and it's like Potter knows you. It's designed to feel real.'

The grief, Ms Fletcher says, is still 'unbearable'. 'I get up every day and my first thought within minutes is that I must be dreaming,' Ms Fletcher said quietly. 'He was my firstborn. I had three children. I have two now.' Some days she does not get out of bed. Others, she functions 'somewhat normally'. 'People say I'm so strong. I don't feel strong. I feel fractured, afraid. But I'm trying to get through.'

Meetali Jain, her lawyer, said the judge's ruling last month was a landmark moment. 'Most tech accountability cases don't make it past this stage. These companies hide behind the First Amendment. The fact that we can even demand information is huge,' she told The Telegraph.

With a preliminary trial date expected next year, Ms Fletcher is gearing up to get justice for her son. 'I have a lot of fear,' she says. 'But the fight, so to speak, is just getting started, and I'm just steeling myself and getting myself ready for that.'

A Character AI spokesman said: 'We do not comment on pending litigation. Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry.

'Engaging with characters on our site should be interactive and entertaining, but it's important for our users to remember that characters are not real people. We have prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction.

'We have launched a separate version of our Large Language Model for under-18 users. That model is designed to further reduce the likelihood of users encountering or prompting the model to return sensitive or suggestive content.'

José Castaneda, a Google spokesman, added: 'Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies. User safety is a top concern for us, which is why we've taken a cautious and responsible approach to developing and rolling out our AI products, with rigorous testing and safety processes.'

Man, 50, dies at British beauty spot after getting into trouble swimming in the sea

Daily Mail - 3 hours ago

A 50-year-old man has died after getting into difficulty while swimming in the sea in South Wales.

Coastguard crews, police and paramedics rushed to Tor Bay near Penmaen village on the Gower Peninsula on Friday afternoon after being alerted to the incident. When they arrived, the man was pulled from the water at around 1pm, South Wales Police said.

The man, from Sandfields, Swansea, was declared dead at the scene by paramedics at around 4pm. SWP said his family has been informed.

A witness to the tragic scene told The Sun that a coastguard member had cautioned that the area is known for 'dangerous waters' and 'frequent riptides'.

They continued: 'There were many emergency services working together to recover the man.'

They added: 'It looked like they were trying to resuscitate him in the helicopter.

'Afterwards, the climbing team returned to their vehicles and when we spoke to them, he said the man didn't survive.'
