
Missing TikToker Hannah Moody, 31, posted tragic final video before she vanished as her body is found two weeks later
A TIKTOKER shared a tragic final video just weeks before her body was discovered near a hiking trail.
Hannah Moody, 31, was a social media influencer with a combined total of 48,000 followers across her TikTok, Instagram, and YouTube channel.
The body of TikToker Hannah Moody, 31, was found near a hiking trail in Scottsdale, Arizona
Credit: Instagram/itshanrose
Just before her death, she posted a video on Instagram, talking about changes in her life and her faith in God
Credit: Instagram/itshanrose
She made lifestyle content centered around her faith and love of hiking.
"You were never meant to go through this life alone," Moody said in the nearly two-minute clip.
"Knowing that I'm not alone, knowing that I don't have to carry this weight alone of trying to change and striving to be the certain way or striving to make my life a certain way.
"The only way I was actually able to change my life, change my story, was with the help of Jesus."
Moody captioned the video with: "He wants to walk with you every step of the way."
The clip showed Moody walking along a trail as she spoke about changes in her life, such as finding a new job and the stress that came with it.
"There can always be something positive in every situation, if you just look for it," read the caption.
However, the influencer was reported missing on May 21 after her loved ones said they hadn't heard from her or been able to reach her since she went off on her hike in Scottsdale, Arizona, that day.
In a statement, the police department said authorities were at the trail where Moody was last seen, and her car was found in the parking lot.
"Officers began search efforts on foot, with drones and assistance from a Phoenix Police Department helicopter," read the release.
"Search efforts continued for Hannah for approximately four and a half hours until around 11:30 p.m., when the search was called off for the evening."
A team of more than 20 officers on foot and bikes searched the trail with the help of the Maricopa County Sheriff's Department.
The sheriff's air unit found Moody's body about 600 feet from the Gateway Trailhead of the McDowell Sonoran Preserve in Scottsdale around noon.
Before her death, Moody reportedly lived in California.
"Scottsdale detectives and crime scene personnel will now conduct a thorough investigation to piece together what happened to Hannah and how she died," said the sheriff's department.
"Our investigation will be in cooperation with the Maricopa County Office of the Medical Examiner, which will ultimately determine the cause of death."
Temperatures in the area where her body was found typically climb as high as 100 degrees by noon, as they did on the day she was discovered.
An investigation is underway, but authorities said that Moody's body didn't show any signs of trauma or foul play.
Her last TikTok, posted in April, shared a similar message about her religion
Credit: TikTok/Itshanrose31
An investigation into Moody's death is underway
Credit: Instagram/itshanrose

Related Articles


Extra.ie
3 hours ago
Teenage girl dies after eating poisoned cupcake delivered to her home with 'love note'
A girl in Brazil died after eating a poisoned cupcake that was sent to her with a love note.

Ana Neves, 17, ate the cupcakes in the city of Itapeceria, over 300 miles north of São Paulo. They were delivered with a note reading 'a treat for the most beautiful girl I've ever seen.'

The cupcakes, however, were laced with arsenic, and Ana fell ill around an hour after eating them. Despite being discharged from hospital, her condition worsened, and she died of a cardiac arrest on Sunday afternoon (June 1).

São Paulo Civil Police arrested an unnamed 17-year-old girl who confessed to sending the cupcakes to Ana, saying that she just wanted to 'scare' her. She said that she bought the poison on the internet for 65 Brazilian Reals (approx €10), and sent a courier to deliver the cupcakes.

The suspect also said that she bought the cupcakes from Menina Trufa, a local pastry shop, but the owner of the shop said that no couriers who work for her made the delivery.

'The product left the store and no one knows where it went,' Josielie Franca, who owns the pastry shop, said in a statement on Instagram. 'It was a delivery boy from an app. All of our deliveries arrive via our delivery boys, who use our pink bag, with our logo.

'This delivery was not made by our delivery boys.'

Ana's school, João Baptista de Oliveira state school, paid tribute to her on social media, writing: 'With a huge weight, our classroom says goodbye to a beautiful star.

'No words will be enough to console, but we hope that love and beautiful memories will comfort, little by little, the hurting hearts. Ana was, and will always be, part of our history.
We will carry with us the good times, the lessons learned and the affection that she left behind.'


The Irish Sun
3 hours ago
Love Island's Harry accused of dumping stunning girl he was dating just days before cast reveal saying ‘I'm off to Bali'
LOVE Island star Harry Cooksley has been accused of dumping a stunning girl he was dating just days before the cast reveal - and telling her he was 'off to Bali.'

The midfielder, 30, is said to be known as the Surrey Zidane by fans of his side, and is hoping to find love on the upcoming series of Love Island.

Heartbroken by the revelation that he'd be starring on the show, the girl called him a 'liar' and explained that she thought he was flying out to Bali, not to the villa in Mallorca.

She told The Sun: 'I was dating Harry until the moment he sat on the flight going to Love Island.

'He told me he was going to Bali and I genuinely thought we had a future together, but no. He is a huge manipulator and I'm not the only girl he's done this to.'

She went on to explain that other girls had spoken to her, claiming they, too, were dating Harry. She added: 'Turns out he was dating a few people at once.'

She continued: 'He called me for three hours before he jetted off to Love Island, and told me he was going to Bali to build a life for our family. It's so scary how he lied to me, and I feel so broken. I've been completely blindsided by him.

'I also got a call from another girl after it came to light that he was going on the show, and she said he'd been dating her for seven months, at the same time that he was dating me.

'I was even planning on meeting him out in Bali, because it was such a secure conversation and he never ended what we had, despite knowing he was going on Love Island.'
In a heartbreaking final revelation she added: 'I've never cried so much and felt so blindsided and lied to like this from another human being.'

When approached for comment, ITV responded to the claims, saying: 'All of our Islanders are single and looking for love.'

Ahead of Harry's arrival on the hit show, a source said: 'Harry's got the full package - a great, athletic bod and great chat.

'Love Island's ladies love a footballer and The Surrey Zidane will be sure to impress as he plays away.'

When asked if he has a claim to fame, the Guildford native said: 'I'm the body double for Declan Rice.

'So when he does a shoot, any body close-ups will actually be me.

'You'll never see my face, but you'll see my shoulder or chest, that kind of thing.'

The girl claims that other women have come forward saying that they were also dating Harry
Credit: Instagram/@harrycooksley8
Harry is the body double for Declan Rice
Credit: Instagram/@harrycooksley8


RTÉ News
3 hours ago
All you need to know about voice spoofing and audio deepfakes
Analysis: Biometric fraud like voice spoofing and audio deepfakes is increasingly part of broader social engineering attacks by scammers and criminals

Voice spoofing involves using digital technology such as artificial intelligence to mimic someone's voice so accurately that it can deceive both humans and automated speaker verification systems. With recent rapid advances in AI, creating these fake voices, often called "audio deepfakes", has become alarmingly easy. Today, with just a few seconds of recorded speech from platforms like podcasts, YouTube or TikTok, machine learning models can generate highly realistic synthetic voices that mimic real individuals. It is a type of biometric fraud and often part of broader social engineering attacks.

How does voice spoofing work?

AI-powered tools analyse the unique patterns of a person's speech, such as tone, pitch and rhythm, and use this data to produce synthetic speech that closely resembles the original voice. The technology has become so advanced that distinguishing between a real voice and a fake one is increasingly challenging.

From RTÉ Radio 1's The Business, BBC File on 4 reporter Paul Connolly on how criminals are now using AI-generated voices to scam people out of their money

The process typically begins with an attacker collecting voice clips from online sources like social media or videos. Specialised AI models, such as VGGish or YAMNet, analyse these voice samples to extract important acoustic patterns and turn them into digital fingerprints called embeddings. These embeddings are then fed into voice generation systems such as Tacotron, WaveNet or FastSpeech, which produce new speech mimicking the original voice. The resulting fake voice can be used in phone calls or apps to impersonate someone in real time.

How is this going to impact us in the real world?
Financial scams are a growing problem. We've all had a (fairly ridiculous) phone call where a robot voice purporting to be from a company tries to get information or money, but more sophisticated versions have worked. In the UK, fraudsters used AI-generated voices to impersonate financial advisors, leading to a multi-million euro scam targeting crypto investors. In the US, the FBI has warned about scammers using AI to mimic senior US officials' voices, deceiving individuals into sharing confidential information. There have even been cases where scammers cloned the voices of loved ones, calling individuals and pretending to be in distress to extract money. These incidents highlight the disturbing reality that even the sound of someone's voice can no longer be trusted.

From CNN, can Donie O'Sullivan's parents tell the difference between the real Donie's voice and AI-Donie's voice?

Celebrities, politicians and influencers are particularly at risk because their voices are widely available online. The more audio content (voice data) that is publicly available, the easier it is for AI tools to replicate a voice. This is a basic principle of AI: more data means better performance. However, it's not just public figures who are at risk. If you've ever posted a video or audio clip on platforms like Facebook, Instagram or YouTube, your voice could potentially be cloned.

What are the difficulties in detecting voice spoofing?

Detecting synthetic voices is a complex task. Traditional security and speaker verification systems often rely on voice recognition for authentication, but AI-generated voices have become sophisticated enough to deceive them. Some of the core technical challenges include:

Spectro-temporal similarity: Fake voices closely mimic both the pitch and timing patterns of natural speech.

Data imbalance: Real-world datasets typically contain fewer examples of spoofed voices, making it harder for AI to recognise these cases.
Generalisation: Many detection models struggle when faced with spoofing methods they weren't specifically trained on.

How to protect yourself

While the threat is real, there are steps you can take to safeguard against voice spoofing:

Be sceptical: If you receive an unexpected call requesting sensitive information or money, verify the caller's identity through another channel.

Use safe words: Establish a unique code word with family and close contacts that can be used to confirm identities during emergencies.

Limit voice sharing: Be cautious about sharing voice recordings online, especially on public platforms.

Stay informed: Keep abreast of the latest scams and educate yourself on how to recognise potential threats.

Voice spoofing poses a growing threat as AI continues to advance, making it easier than ever to mimic someone's voice convincingly. Whether you're a public figure or an everyday social media user, you could become a victim of voice cloning.

From RTÉ Radio 1's Ray D'Arcy Show, AI expert Henry Ajder talks about how deepfakes are damaging online trust and what some platforms are doing to rebuild it

How our research work is helping

Our recent research proposes an effective approach to detecting voice spoofing using a hybrid deep learning architecture called VGGish-LSTM. We used VGGish, a pre-trained model developed by Google, to extract robust acoustic embeddings from audio data. These embeddings capture detailed features that are often not noticeable to human listeners but are critical in distinguishing synthetic voices. The extracted features are then analysed by a Long Short-Term Memory (LSTM) network, a type of artificial neural network designed to detect long-term patterns and dependencies in sequential data. These networks excel at identifying variations in speech rhythm, tone and pitch that could indicate synthetic or manipulated speech.
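As a rough illustration of the pipeline just described, the sketch below runs a single-layer LSTM over a sequence of per-frame embeddings and puts a logistic classification head on the final hidden state. The weights are random and untrained, and the toy input merely stands in for real VGGish embeddings (which are 128-dimensional per frame); the gate packing and layer sizes are assumptions made for this sketch, not details of the published model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_final_state(frames, Wx, Wh, b):
    """Run a single-layer LSTM over per-frame embeddings; return the final hidden state.

    frames: (T, d) array, e.g. one 128-dim VGGish embedding per ~1s of audio.
    Wx: (d, 4h), Wh: (h, 4h), b: (4h,), gates packed as [input, forget, cell, output].
    """
    h = np.zeros(Wh.shape[0])
    c = np.zeros(Wh.shape[0])
    for x in frames:
        z = x @ Wx + h @ Wh + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # update cell memory
        h = sigmoid(o) * np.tanh(c)                   # expose new hidden state
    return h

def spoof_probability(frames, Wx, Wh, b, w_out, b_out):
    # Logistic classification head on the clip-level summary vector
    return sigmoid(lstm_final_state(frames, Wx, Wh, b) @ w_out + b_out)

# Toy demo with random (untrained) weights, just to show the shapes
rng = np.random.default_rng(0)
d, hdim, T = 128, 16, 10                  # embedding size, hidden size, frame count
Wx = rng.normal(0, 0.1, (d, 4 * hdim))
Wh = rng.normal(0, 0.1, (hdim, 4 * hdim))
b = np.zeros(4 * hdim)
w_out = rng.normal(0, 0.1, hdim)
clip = rng.normal(0, 1, (T, d))           # stand-in for real VGGish embeddings
p = spoof_probability(clip, Wx, Wh, b, w_out, 0.0)
print(float(p))                           # a probability between 0 and 1
```

In a trained system the same forward pass runs with learned weights; the point here is only the shape of the computation: a variable-length clip is summarised into one fixed-size vector, which is then classified as bonafide or spoofed.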
Evaluated on the widely used ASVspoof 2019 dataset, our model achieved an accuracy of over 90%. This performance demonstrates the model's ability to detect spoofing effectively, and it could be applied in real-world scenarios such as banking authentication, call centre security or smart home voice verification. With ongoing research into detection technologies, such as the VGGish-LSTM model described here, we can continue developing robust defences against voice spoofing scams. For users, the advice remains to stay vigilant, limit how much voice data you share online and adopt simple safety practices.
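For context on what a headline figure like "over 90% accuracy" means: accuracy is simply the fraction of test clips scored on the correct side of a decision threshold. The scores below are toy values for illustration, not outputs of the model discussed here.

```python
import numpy as np

def accuracy(scores, labels, threshold=0.5):
    # labels: 1 = spoofed, 0 = bonafide; a score >= threshold predicts "spoofed"
    preds = (np.asarray(scores, dtype=float) >= threshold).astype(int)
    return float((preds == np.asarray(labels)).mean())

# Four toy clips: the detector gets three of the four right
print(accuracy([0.92, 0.08, 0.77, 0.41], [1, 0, 1, 1]))  # → 0.75
```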