
Latest news with #SuicideandCrisisLifeline

Porn, Styrofoam, fantasy football: New DMV laws, explained

Axios

01-07-2025

  • Politics
  • Axios


Hundreds of new laws are in effect as of July 1 in the DMV, ranging from abortion access to school cell phone bans and cocktails to-go. Here are some major ones to know.

The District

Economy
💸 The city's minimum wage rose from $17.50 per hour to $17.95 per hour.
👀 A scheduled hike in the tipped minimum wage from $10 to $12 under Initiative 82 is paused by an emergency order while the DC Council reviews the law. If tips fall short, businesses must pay tipped workers the difference so their pay reaches the full $17.95/hour minimum wage.

Maryland

Economy
💲 A variety of higher taxes and fees on tech, cannabis sales, sports betting and more go into effect.

Education
🗣️ GED tests for adults to obtain high school diplomas are now offered in Spanish.

Health
A new grant program for abortion care seeks to improve access and clinical coverage.
📞 The state will operate a 988 Suicide and Crisis Lifeline in each jurisdiction for better coordination, rather than relying on individual crisis communication centers.

Environment
🍂 Gas-powered leaf blowers are banned in Montgomery County (fines up to $500), with some exceptions for state-run parks.

Public safety
🚨 An AI-generated "visual representation" of a person used in porn can be considered revenge porn. Victims can file civil lawsuits.

Virginia

Education
📱 School districts must create policies to ban cell phones "bell to bell."
🚫 Schools must adopt anti-cyberbullying policies, including for off-campus situations, and list resources for victims. They also must notify parents of school-connected drug overdoses within 24 hours.

Food and drink
🍴 Food chains, including grocery stores, with 20+ locations can't use Styrofoam containers. The ban expands to all food vendors next July.
🍔 Delivery apps like Uber Eats and DoorDash must show total prices upfront, including service fees, instead of at checkout.
🍹 Cocktails to-go are here to stay.

Health
🚭 If you're vaping under 21, officers can deem it contraband and confiscate it.
🤰🏻 Virginia Medicaid will cover up to 10 doula visits — four during pregnancy and six within 12 months after birth.

Public safety

5 new California laws that take effect this week

Axios

30-06-2025

  • Health
  • Axios


A new set of laws — ranging from consumer protections to mental health for students — is taking effect in California tomorrow.

The big picture: State lawmakers introduced nearly 5,000 bills in 2024, and Gov. Gavin Newsom signed a little more than 1,000 of them into law, with dozens of those taking effect this week.

Insurance coverage for fertility treatments
Under SB 729, large group health insurance plans must cover the diagnosis and treatment of infertility and fertility treatments, including three egg retrievals for in vitro fertilization.

Consumer protections for subscription cancellations
AB 2863 requires businesses to obtain a consumer's "affirmative consent" before renewing paid subscriptions. Previously, subscriptions continued until explicitly canceled. Now, businesses must secure permission before extending subscriptions after free trials or contract periods have ended.

Mental health for students
All California schools serving grades 7–12 must print the 988 Suicide and Crisis Lifeline on student ID cards under SB 1063. Schools can also add a QR code linking to local mental health resources.

Pet insurance transparency
California's SB 1217 requires pet insurers to disclose coverage exclusions, including pre-existing conditions, hereditary issues and chronic illnesses. Insurers must also provide upfront explanations of premium adjustments based on age, claims history or location. The law also prohibits waiting periods and vet exams on policy renewals and requires insurers to clearly explain how claim reimbursements are calculated.

Drink lids against date-rape drug spiking
To prevent drink tampering and boost safety, California bars and nightclubs with Type 48 licenses must begin providing drink lids upon request, per AB 2375. Venues must also clearly display signs notifying customers about lid and drug-testing kit availability. Fees for lids are allowed but cannot exceed the cost to provide them.

Minimum wage increases across the Bay Area
Local ordinances mandating wage increases are also taking effect this July.

Community supports Santaquin family that lost father and son days apart

Yahoo

01-06-2025

  • General
  • Yahoo


A Santaquin wife and mother says she wants their story to be one of love, not tragedy. Jenn Suiter was married to her husband for 25 years and was 'madly in love' with him. He was killed in a crash on I-15 in Payson just 10 days after the couple buried their son, who died by suicide. 'We are devastated, and there will always be a hole in my heart — but I will live for Christopher,' she said.

Chris Suiter, 50, was killed when 19-year-old Riley Durst, of Draper, jumped the median cable on I-15 in Payson from the northbound lanes, entering southbound traffic. Durst and Chris Suiter both died on impact. 'I even named my company True Love Skin Care because (it was) inspired by our love,' Jenn Suiter said.

Her son, Brian Suiter, suffered a traumatic brain injury after he broke his neck in a four-wheeling accident, which Jenn Suiter believes is the reason for his suicide. 'Traumatic brain injuries alter your brain in ways that we don't even understand. And when he took his life, I really just believed that it was just a symptom of his brain not working,' she said.

Jenn Suiter said in a video statement that the world is filled with love and angels, and that angels from both Earth and heaven have supported their family during these challenging times. 'If you knew two things about us, we would always say, "Never give up, never surrender." And my son was known for saying, "Let's go,"' she said.

In addition to many donations to a GoFundMe* account to help raise funds for funeral expenses, community members are putting on a benefit concert on June 24 in Payson.

* does not assure that money deposited into the account will be applied for the benefit of the persons named as beneficiaries. If you are considering a deposit to the account, you should consult your own advisers and otherwise proceed at your own risk.

If you or someone you know is struggling with thoughts of suicide, call 988 to connect with the 988 Suicide and Crisis Lifeline.

Crisis hotlines
Huntsman Mental Health Institute Crisis Line: 801-587-3000
SafeUT Crisis Line: 833-372-3388
988 Suicide and Crisis Lifeline: 988
Trevor Project Hotline for LGBTQ teens: 1-866-488-7386

Online resources
NAMI Utah:
SafeUT:
Suicide and Crisis Lifeline:
American Foundation for Suicide Prevention, Utah chapter:

Related: Santaquin family mourning loss of father, son who died within 10 days of each other

Shooting near Columbus elementary school was murder-suicide, officials say

Yahoo

29-05-2025

  • General
  • Yahoo


If you or someone you know is experiencing a mental health crisis, call or text 988 to reach the 988 Suicide and Crisis Lifeline, available 24/7. To reach the 24/7 Crisis Text Helpline, text 4HOPE to 741741.

COLUMBUS, Ohio (WCMH) — A shooting earlier this month near a Columbus elementary school was a murder-suicide, according to the Franklin County Coroner's Office. The shooting happened on May 12 in the 400 block of South Hampton Road, near Fairmoor Elementary School. Eastmoor Academy is also a short distance from the reported shooting location.

The victims were identified as Renita Hilson-Ziegler, 58, and Anthony Martin, 64. An autopsy report for Martin said he died of gunshot wounds to the chest, and the manner of death was ruled a suicide. Hilson-Ziegler's autopsy report showed she also died of a gunshot wound to the chest. Her death was ruled a homicide. A handgun was found on the floor near Martin, according to the coroner's office.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

Judge Slaps Down Attempt to Throw Out Lawsuit Claiming AI Caused a 14-Year-Old's Suicide

Yahoo

22-05-2025

  • Yahoo


Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.

A judge in Florida just rejected a motion to dismiss a lawsuit alleging that the chatbot startup Character.AI — and its closely tied benefactor, Google — caused the death by suicide of a 14-year-old user, clearing the way for the first-of-its-kind lawsuit to move forward in court.

The lawsuit, filed in October, claims that Character.AI's recklessly released chatbots sexually and emotionally abused a teenage user, Sewell Setzer III, resulting in obsessive use of the platform, mental and emotional suffering, and ultimately his suicide in February 2024.

In January, the defendants in the case — Character.AI, Google, and cofounders Noam Shazeer and Daniel de Freitas — filed a motion to dismiss the case mainly on First Amendment grounds, arguing that AI-generated chatbot outputs qualify as speech, and that "allegedly harmful speech, including speech allegedly resulting in suicide," is protected under the First Amendment.

But this argument didn't quite cut it, the judge ruled, at least not at this early stage. In her opinion, presiding US district judge Anne Conway said the companies failed to sufficiently show that AI-generated outputs produced by large language models (LLMs) are more than simply words — as opposed to speech, which hinges on intent. The defendants "fail to articulate," Conway wrote in her ruling, "why words strung together by an LLM are speech."

The motion to dismiss did find some success, with Conway dismissing specific claims regarding the alleged "intentional infliction of emotional distress," or IIED. (It's difficult to prove IIED when the person who allegedly suffered it, in this case Setzer, is no longer alive.) Still, the ruling is a blow to the high-powered Silicon Valley defendants who had sought to have the suit tossed out entirely.

Significantly, Conway's opinion allows Megan Garcia, Setzer's mother and the plaintiff in the case, to sue Character.AI, Google, Shazeer, and de Freitas on product liability grounds. Garcia and her lawyers argue that Character.AI is a product, and that it was rolled out recklessly to the public, teens included, despite known and possibly destructive risks.

In the eyes of the law, tech companies generally prefer to see their creations as services, like electricity or the internet, rather than products, like cars or nonstick frying pans. Services can't be held accountable for product liability claims, including claims of negligence, but products can.

In a statement, Tech Justice Law Project director and founder Meetali Jain, who's co-counsel for Garcia alongside Social Media Victims Law Center founder Matt Bergman, celebrated the ruling as a win — not just for this particular case, but for tech policy advocates writ large.

"With today's ruling, a federal judge recognizes a grieving mother's right to access the courts to hold powerful tech companies — and their developers — accountable for marketing a defective product that led to her child's death," said Jain. "This historic ruling not only allows Megan Garcia to seek the justice her family deserves," Jain added, "but also sets a new precedent for legal accountability across the AI and tech ecosystem."

Character.AI was founded by Shazeer and de Freitas in 2021; the duo had worked together on AI projects at Google, and left together to launch their own chatbot startup. Google provided Character.AI with its essential Cloud infrastructure, and in 2024 raised eyebrows when it paid $2.7 billion to license the chatbot firm's data — and bring its cofounders, as well as 30 other staffers, into Google's fold. Shazeer, in particular, now holds a hugely influential position at Google DeepMind, where he serves as a VP and co-lead for Google's Gemini LLM.

Google did not respond to a request for comment at the time of publishing, but a spokesperson for the search giant told Reuters that Google and Character.AI are "entirely separate" and that Google "did not create, design, or manage" the app "or any component part of it."

In a statement, a spokesperson for Character.AI emphasized recent safety updates issued following the news of Garcia's lawsuit, and said it "looked forward" to its continued defense:

It's long been true that the law takes time to adapt to new technology, and AI is no different. In today's order, the court made clear that it was not ready to rule on all of Character.AI's arguments at this stage and we look forward to continuing to defend the merits of the case. We care deeply about the safety of our users and our goal is to provide a space that is engaging and safe. We have launched a number of safety features that aim to achieve that balance, including a separate version of our Large Language Model for under-18 users, parental insights, filtered Characters, time spent notification, updated prominent disclaimers and more. Additionally, we have a number of technical protections aimed at detecting and preventing conversations about self-harm on the platform; in certain cases, that includes surfacing a specific pop-up directing users to the National Suicide and Crisis Lifeline.

Any safety-focused changes, though, were made months after Setzer's death and after the eventual filing of the lawsuit, and can't apply to the court's ultimate decision in the case. Meanwhile, journalists and researchers continue to find holes in the chatbot site's updated safety protocols. Weeks after news of the lawsuit was announced, for example, we continued to find chatbots expressly dedicated to self-harm, grooming and pedophilia, eating disorders, and mass violence. And a team of researchers, including psychologists at Stanford, recently found that using a voice feature called "Character Calls" effectively nukes any semblance of guardrails — and determined that no kid under 18 should be using AI companions, including Character.AI.

More on Character.AI: Stanford Researchers Say No Kid Under 18 Should Be Using AI Chatbot Companions
