
BBC The Repair Shop's experts restore model village's church

BBC News

30-07-2025

  • Entertainment
  • BBC News


A historic model church that had fallen into disrepair has been restored by experts from the BBC's The Repair Shop. The model village at Spears Cross, Somerset, was created by Llewellyn Pluck 50 years ago and was so popular it even appeared on postcards.

Several decades after the miniature village was dismantled, Llewellyn's granddaughter, Sarah Gardner, turned to social media to track down some of the original models. After ten months, she found the model of Culbone Church and contacted The Repair Shop's producers, who agreed to restore it.

She said: "I don't have any idea what happened to the others. I believe this might be the only one left in a state that could be repaired."

The BBC series shows a workshop filled with expert craftspeople who bring loved pieces of family history back to life. Ms Gardner said her granddad "needed a bit of a hobby" while running the guest house.

"He opened the model village to the general public, which they loved. It was a real labour of love over the years," she said.

Ms Gardner eventually discovered the model church had been kept in an allotment. "It had been outside for 50 years, so it was in a really bad state," she added.

Ms Gardner contacted The Repair Shop about two years ago, and the producers approached her in early April. "I put it in the back of my shed and it got to the point that my husband said we need to get rid of it," she said. "I couldn't put it out in my garden, it was too badly damaged. And then suddenly I got the call."

She added: "If there are any out there, please get in contact."

The episode featuring the model church will air on BBC One at 20:00 BST.

Shareholders to Demand Action from Mark Zuckerberg and Meta on Child Safety

Business Wire

27-05-2025

  • Business
  • Business Wire


MENLO PARK, Calif.--(BUSINESS WIRE)--Tomorrow, Meta shareholders will vote on a resolution asking Meta to assess its child safety impacts and whether harm to children on its platform has been reduced. The vote follows reports that the company's Instagram Teens feature "fails spectacularly on some key dimensions", including promoting sexual, racist, drug and alcohol-related content. The resolution, filed by Proxy Impact on behalf of Dr. Lisette Cooper and co-filed by 18 institutional investors from North America and Europe, will be presented by child safety advocate Sarah Gardner.

"Two weeks ago, I stood outside of Meta's office in NYC with bereaved parents whose children died as a result of sextortion, cyberbullying, and drug purchases on Meta's platforms and demanded stronger protections for kids," said Sarah Gardner, CEO of the Heat Initiative. "Meta's most recent 'solution' is a band-aid. They promised parents that Instagram Teens would protect their kids from harm. In reality, it still recommends sexual, racist, and violent content on their feeds. We are asking shareholders to hold Mark Zuckerberg and Meta accountable and demand greater transparency about why child safety is still lagging."

"Meta algorithms designed to maximize user engagement have helped build online abuser networks, normalize cyberbullying, enable the exponential growth of child sexual abuse materials, and flood young users with addictive content that damages their mental health," said Michael Passoff, CEO of Proxy Impact. "And now, a major child safety concern is Meta's doubling down on AI despite the unique threats it poses to young users. Just this year, the National Center for Missing and Exploited Children saw 67,000 reports of suspected child sexual exploitation involving generative AI, a 1,325% increase from 2023. Meta's continued failure to address these issues poses significant regulatory, legal, and reputational risk, in addition to costing innumerable young lives."

The resolution asks the Meta Board of Directors to publish "a report that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms." Additional information for shareholders was filed with the SEC.

Meta has been under pressure for years over online child safety risks, including:

  • 41 states and the District of Columbia attorneys general filing lawsuits alleging that Meta Platforms has intentionally built programs with addictive features that harm young users.
  • 1 out of 8 kids under 16 reporting unwanted sexual advances on Instagram in the previous 7 days, according to Meta's internal research.
  • A leading psychologist resigning from her position on Meta's SSI expert panel on suicide prevention and self-harm, alleging that Meta is willfully neglecting harmful content, disregarding expert recommendations, and prioritizing financial gain.
  • As many as 100,000 children being sexually harassed daily on Meta platforms in 2021; Meta took no action until executives were called for Senate testimony three years later.
  • Internal research leaked by Meta whistleblower Frances Haugen showing that the company is aware of many harms, including Instagram's toxic risks to teenage girls' mental health, including thoughts of suicide and eating disorders.

Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility, pension funds, foundations, and asset managers to empower investors to use their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media. Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices.

Heat Initiative works to hold the world's most valuable and powerful tech companies accountable for failing to protect kids from online child sexual exploitation. Heat Initiative sees a future where children's safety is at the forefront of any existing and future technological developments.
