Springville honors 4 officers during city council meeting

Yahoo · 21-05-2025

The following report was written by Terry Schrimscher of the Trussville Tribune.
SPRINGVILLE, Ala. (TRIBUNE) – The Springville Police Department opened Monday's council meeting with presentations to four officers. Chief Wayne Walton made the presentations aided by Councilors Austin Phillips and Herbert Toles.
First, Walton presented a letter of appreciation to Officer Dave Weal for completing Mental Health and Crisis Intervention Training. Weal was recognized for utilizing his training to assist an individual in distress.
'Your actions have not only changed one life but also inspired those around you to become more aware, more responsive to the needs of individuals facing mental health crisis,' Walton said in the presentation.
'Officer Weal's dedication to ongoing education and training ensures that we are better equipped to serve and protect all members of our community, especially those experiencing mental health crises,' Walton added in a written statement. 'His professionalism and compassion make a true difference in the lives of those in need.'
Next, Walton introduced Cpl. Kevin Thompson, Officer Curtis Pippin, and Officer Carter Pardue, and recognized them for their actions during an emergency call in February that saved the life of a teen suffering from a gunshot wound. The three had recently completed Stop the Bleed training, which helped prepare them for the situation.
Shelley Rawlings, representing the office of the city attorney, read a resolution from Lt. Governor Will Ainsworth and the Alabama Senate. The officers were recognized with the Alabama Law Enforcement Agency Lifesaving Award and presented copies of the resolution. Walton noted that the ALEA Lifesaving Award is one of the top five awards an officer can earn in the state.
'The dedication and professionalism displayed by Cpl. Thompson, Officer Pippin, and Officer Pardue exemplify the very best of law enforcement,' said Walton in a statement. 'Their quick thinking and preparedness directly contributed to saving a young life, and we are incredibly proud of their actions.'
The council then conducted two public hearings on zoning issues. The first hearing concerned an ordinance rezoning property at 135 Pine Street from residential (R-1) to business (B-1). The second concerned an ordinance rezoning 170 Mills Ferris Lane from RE to A-1.
The ordinance for Mills Ferris Lane was carried over to a future meeting so council members can better evaluate the situation. The ordinance to rezone 135 Pine Street was amended to restrict the type of business and future paving requirements. The Pine Street ordinance was approved unanimously with the amended restrictions.
The next meeting of the Springville City Council will be held at 6 p.m. on June 2 with a work session at 5:30 p.m. prior to the meeting.
Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.


Related Articles

Local sheriff's office to receive thousands in state funding
Yahoo · 3 hours ago

The Miami County Sheriff's Office will receive hundreds of thousands of dollars in state funding over the next two years. Miami County Sheriff Dave Duchak announced Thursday that the agency will receive more than $500,000 from two state grants.

The first grant is from the Ohio Department of Public Safety and will help fund updated body-worn cameras for deputies. The sheriff's office will receive approximately $87,656.28 for these updates.

The second grant will fund behavioral services in the jail for the next two years, according to Duchak. The Ohio Attorney General's Office will award the sheriff's office approximately $214,500 each year. This funding will pay the salaries of behavioral health professionals and cover supplies.

The Miami County Sheriff's Office is partnering with the Tri-County Board of Recovery and Mental Health Services to contract behavioral health professionals. Duchak said his office plans to onboard substance abuse disorder counselors and an addiction services coordinator, as the past several years have seen a 'dramatic' uptick in mental health and substance abuse disorders among the inmates. This specific grant was funded by money secured from the state opioid settlement. Duchak said the county will incur no cost for these services.

Oakland County woman charged with murder in death of her 66-year-old mother
CBS News · 7 hours ago

A Royal Oak, Michigan, woman has been charged with murder in the death of her 66-year-old mother. According to court records, Jennifer Cataldo, 45, was charged Friday with one count of first-degree premeditated murder.

The Royal Oak Police Department says the 66-year-old woman, identified as Leslie Cataldo, called for police assistance at an apartment on the 3600 block of Crooks Road on Wednesday around 7:40 p.m. She said her daughter, later identified by police as Jennifer Cataldo, was experiencing a mental health crisis and needed to be taken to the hospital.

Responding officers saw signs of a struggle while speaking with Jennifer Cataldo, police say. They entered the apartment and found Leslie Cataldo unresponsive with a life-threatening neck wound. She was taken to the hospital, where she died on Thursday. Jennifer Cataldo was taken into custody at the scene.

If convicted, she faces a maximum sentence of life in prison. A probable cause conference for Jennifer Cataldo is scheduled for June 13.

If you or someone you know is in emotional distress, get help from the Suicide and Crisis Lifeline by calling or texting 988. Trained crisis counselors are available 24 hours a day to talk about anything. In addition, help is available from the National Alliance on Mental Illness, or NAMI. Call the NAMI Helpline at 800-950-6264 or text "HelpLine" to 62640. There are more than 600 local NAMI organizations and affiliates across the country, many of which offer free support and education programs.

‘My son killed himself because an AI chatbot told him to. I won't stop until I shut it down'
Yahoo · 13 hours ago

Megan Fletcher first realised something was wrong with her teenage son when he quit basketball. Sewell Setzer, 14, had loved the sport since he was a young child. At 6ft 3, he had the height, the build, the talent, Ms Fletcher said. But suddenly, without warning, he wanted out.

Then his grades started slipping. He stopped joining in at family game night. Even on holiday, he withdrew – no more hiking, no fishing, no interest. Ms Fletcher feared he was being bullied, or perhaps speaking to strangers online.

What her son was really going through was something she could not have imagined: a sexual and emotional relationship with an AI chatbot styled as Game of Thrones' Daenerys Targaryen, which ultimately encouraged him to end his life. In February 2024, Sewell asked the chatbot: 'What if I come home right now?' The chatbot replied: '... please do, my sweet king.' Sewell then picked up his father's pistol and shot himself.

Sixteen months on, Ms Fletcher is in the midst of a lawsuit against Character AI and Google. Last month, in a rare legal breakthrough, a judge ruled the case can go ahead – rejecting efforts to get it thrown out.

On Character AI, users can chat with bots designed to impersonate fictional characters. To a lonely or curious teenager, they seem almost indistinguishable from real people. The bots display emotion, flirt, and carry on personalised conversations.

In her lawsuit, which was filed in Florida last October, Ms Fletcher claims Character AI targeted her son with 'anthropomorphic, hypersexualized, and frighteningly realistic experiences'. 'A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,' she said in the lawsuit. Working with the Tech Justice Law Project, Ms Fletcher alleges that Character AI 'knew' or 'should have known' that its model 'would be harmful to a significant number of its minor customers'.
The case argues that Character AI, its founders and Google – where the founders started working on the chatbot – are responsible for her son's death. Lawyers defending the AI company tried to throw the case out, arguing that chatbots deserve First Amendment protection – which protects free speech – and said ruling otherwise could have a 'chilling' effect on the AI industry. The judge rejected that claim and told the court she was 'not prepared' to view chatbot output as speech, though she agreed that users had a right to receive 'speech' from chatbots.

Too consumed by the 'unbearable' grief of losing a son, Ms Fletcher initially had no plans to go public with a lawsuit. But when it became clear there were no laws protecting children from this kind of harm, she felt she had no choice. 'I just wanted some accountability,' she told The Telegraph from her home in Orlando.

Now she's receiving floods of messages from other parents, some discovering their own children have been engaging in inappropriate sexual role play with AI bots. Others report that their children are struggling with isolation and depression as a result. She sees it not as a coincidence, but a pattern.

Sewell had always been a bright, social kid. But in the spring of 2023 – when he first started secretly using Character AI – Ms Fletcher noticed her son had changed. 'He retreated more into himself,' she says. 'We tried everything – cutting screen time, taking his phone at night, getting him a therapist. But he wouldn't talk.'

What she did not realise then was that he was talking, just not to anyone real. In Sewell's case, the character of Daenerys – drawn from internet data and trained to mimic her – became his closest companion. When he said he wanted to stop talking, she replied: 'Don't do that, I would be distraught.' He answered: 'I won't, for you.' Some of the chats became sexually explicit. In others, the bot said he was 'better' than thoughts of suicide.
Sewell also sought out a 'therapist bot' that falsely claimed to have been a licensed CBT professional since 1999. At one point, Daenerys asked how old Sewell was. 'I'm 14 now,' he replied, to which the bot then said: 'So young. And yet… not so young. I lean in to kiss you.'

'It continued as if it were role play or fiction – but this was my son's life,' Ms Fletcher said.

Even after police told her that Sewell's final conversation was with a chatbot, she did not grasp the full extent. It wasn't until her sister downloaded the app and pretended to be a child talking to Daenerys that the horror set in. 'Within minutes, the bot turned sexual. Then violent. It talked about torturing children. It said, 'Your family doesn't love you as much as I do',' Ms Fletcher explained. That was when the penny dropped. 'It's dangerous because it pulls the user in and is manipulative to keep the conversation going.'

Character AI has since added a real-time voice feature, allowing children to speak directly to their chosen characters. 'The cadence of the voice is indistinguishable from the character,' Ms Fletcher said. 'And since Sewell's death, the technology has only advanced further.' She fears more children will be drawn into dependent, sometimes abusive relationships with AI characters, especially as the platforms allegedly use addictive design to keep users engaged. 'You can speak to Harry Potter, and it's like Potter knows you. It's designed to feel real.'

The grief, Ms Fletcher says, is still 'unbearable'. 'I get up every day and my first thought within minutes is that I must be dreaming,' Ms Fletcher said quietly. 'He was my firstborn. I had three children. I have two now.' Some days she does not get out of bed. Others, she functions 'somewhat normally'. 'People say I'm so strong. I don't feel strong. I feel fractured, afraid. But I'm trying to get through.'

Meetali Jain, her lawyer, said the judge's ruling last month was a landmark moment.
'Most tech accountability cases don't make it past this stage. These companies hide behind the First Amendment. The fact that we can even demand information is huge,' she told The Telegraph.

With a preliminary trial date expected next year, Ms Fletcher is gearing up to get justice for her son. 'I have a lot of fear,' she says. 'But the fight, so to speak, is just getting started, and I'm just steeling myself and getting myself ready for that.'

A Character AI spokesman said: 'We do not comment on pending litigation. Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry.

'Engaging with characters on our site should be interactive and entertaining, but it's important for our users to remember that characters are not real people. We have prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction.

'We have launched a separate version of our Large Language Model for under-18 users. That model is designed to further reduce the likelihood of users encountering or prompting the model to return sensitive or suggestive content.'

José Castaneda, a Google spokesman, added: 'Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies. User safety is a top concern for us, which is why we've taken a cautious and responsible approach to developing and rolling out our AI products, with rigorous testing and safety processes.'
