After their mom was killed in murder-suicide, 2 dogs find forever home in Mass.


Yahoo | April 18, 2025

After their owner was killed 'in a tragic domestic violence incident,' workers at a Rhode Island rescue were determined to find two resilient dogs a new home.
'They are truly such resilient and amazing dogs with the kindest eyes & goofiest personalities,' wrote Liv Phelps, a worker for Hotel for Homeless Dogs in Cumberland, Rhode Island. 'Spanky a gentle giant who welcomes you with wet slobbery kisses and big ol hugs. Darla with her ever so soft fur and beautiful face. What's not to love?'
Their owner, Loren Marino, 24, of Haverhill, was found dead in a home along with her boyfriend in March 2024, WCVB reported. Authorities said it was a murder-suicide.
Marino was born in Lowell and lived in East Providence, Rhode Island, before she moved back to Massachusetts to live in Haverhill in 2022. She worked as a mental healthcare professional at Anodyne Medical Services on assignment for Vinfen Corporation, 'where she was dedicated to her work,' her family wrote in her obituary.
The dogs were traumatized after losing their owner, WCVB reported, and one of them didn't easily trust men.
The dogs were brought to the shelter in hopes of finding a new family. But a year later, they were still without a home.
'I can only imagine Loren and the worry she would have for these dogs,' Hotel for Homeless Dogs Executive Director Susan Joseph told the news outlet. 'It was really important for me to close this chapter for her, but we had to close it right.'
On April 14, that changed.
'We did it! Darla and Spanky have found their forever home,' the rescue wrote on Facebook. 'After spending a week with their new mom and dad they just fell in love. Both Darla and Spanky molded perfectly into their lifestyle.'
The Massachusetts couple, Mike Pollock and Karen Roy, said they will care for their two new dogs in Loren's memory.
'I hope she can rest easy because we will love them,' Roy told WCVB.
If you are a victim of domestic or dating violence, you are not alone.
SafeLink offers a 24/7 toll-free hotline:
(877) 785-2020
(877) 521-2601 (TTY)
National Domestic Violence Hotline:
(800) 799-7233

Related Articles


Consultant on trial in N.H. for AI-generated robocalls mimicking Biden says he has no regrets

Boston Globe | 10 hours ago

Kramer, who faces decades in prison if convicted of voter suppression and impersonating a candidate, testified in Belknap County Superior Court that his goal was to highlight the dangers of AI in elections. 'This is going to be my one good deed this year,' he recalled thinking. He said his goal wasn't to influence an election, because he didn't consider the primary a real election: at Biden's request, the Democratic National Committee had declined to sanction it.

Kramer, who owns a firm specializing in get-out-the-vote projects, argued that the primary was a meaningless straw poll unsanctioned by the DNC. At the time the calls went out, voters were disenfranchised, he said. Asked by his attorney, Tom Reid, whether he did anything illegal, Kramer said, 'I'm positive I did not.' Later, he said he had no regrets and that his actions likely spurred AI regulations in multiple states.

Kramer, who will be questioned by prosecutors Thursday, also faces a $6 million fine from the Federal Communications Commission but told The Associated Press on Wednesday that he won't pay it. Lingo Telecom, the company that transmitted the calls, agreed to pay $1 million in a settlement in August.

The robocalls appeared to come from a former New Hampshire Democratic Party chair, Kathy Sullivan, and told voters to call her number to be removed from the call list. On the witness stand earlier Wednesday, Sullivan said she was confused and then outraged after speaking to one of the recipients and later hearing the message. 'I hung up the phone and said, "There is something really crazy going on,"' she said. 'Someone is trying to suppress the vote for Biden. I can't believe this is happening.'

Months later, she got a call from Kramer in which he said he used her number because he knew she would contact law enforcement and the media. He also described his motive of highlighting AI's potential dangers, but she didn't believe him, she testified.

'My sense was he was trying to convince me that he'd done this defensible, good thing,' she said. 'I'm listening to this thinking to myself, "What does he think I am, stupid?" He tried to suppress the vote.'

New Orleans pushes to legalize police use of 'facial surveillance'

Washington Post | 14 hours ago

New Orleans is considering easing restrictions on the police use of facial recognition, weeks after The Washington Post reported that police there secretly relied on a network of AI-powered surveillance cameras to identify suspects on the street and arrest them.

According to the draft of a proposed ordinance posted to a city website, police would be permitted to use automated facial recognition tools to identify and track the movements of wanted subjects, missing people or suspected perpetrators of serious crimes, reversing the city's broad prohibition against using facial recognition as a 'surveillance tool.' The proposed rule, which was written by a New Orleans police official, is scheduled for a city council vote later this month, according to a person briefed on the council's plans who spoke on the condition of anonymity because the person was not authorized to speak about them publicly. If the rule passes, New Orleans would become the first U.S. city to formally allow facial recognition as a tool for surveilling residents in real time.

In an emailed statement, a police spokesperson said the department 'does not surveil the public,' and that surveillance is 'not the goal of this ordinance revision.' But the word 'surveillance' appears in the proposed ordinance dozens of times, including explicitly giving police authority to use 'facial surveillance.'

Many police departments use AI to help them identify suspects from still images taken at or near the scene of a crime, but New Orleans police have already taken the technology a step further. Over the past two years, the department relied on a privately owned network of cameras equipped with facial recognition software to constantly monitor the streets for wanted people and automatically ping an app on officers' mobile phones to convey the names and locations of possible matches, The Post reported last month.

In April, after The Post requested public records about this system, New Orleans Police Superintendent Anne Kirkpatrick paused the automated alerts and ordered a review into how officers used the technology and whether the practice violated local restrictions on facial recognition. David Barnes, a New Orleans police sergeant overseeing legal research and planning, who wrote the proposed ordinance, said he hopes to complete the review and share his findings before the city council vote. The facial recognition alerts are still paused, he said Wednesday.

There are no federal regulations around the use of AI by local law enforcement. New Orleans was one of many cities to ban the technology during the policing overhauls passed in the wake of the Black Lives Matter protests of 2020, with the city council saying it had 'significant concerns about the role of facial recognition technologies and surveillance databases in exacerbating racial and other bias.' Federal studies have shown the technology to be less reliable when scanning people of color, women and older people.

New Orleans partly rolled back the restrictions in 2022, letting police use facial recognition for searches of specific suspects of violent crimes, but not for general tracking of people in public places. Each time police want to scan a face, they must send a still image to trained examiners at a state facility and later provide details about these scans to the city council, guardrails meant to protect the public's privacy and prevent software errors from leading to wrongful arrests.
Now, city leaders want to give police broad access to the technology with fewer limitations, arguing that automated surveillance tools are necessary for fighting crime. Violent crime rates in New Orleans, like much of the country, are at historic lows, according to Jeff Asher, a consultant who tracks crime statistics in the region. But facial recognition-equipped cameras have proven useful in a few recent high-profile incidents, including the May 16 escape of 10 inmates from a local jail and the New Year's Day attack on Bourbon Street that left 14 dead.

'Violent crime is at an all-time low but mass murders and shootings are at an all-time high,' Oliver Thomas, one of two council members sponsoring the ordinance, said in an interview this week. 'This is a tool to deal with some of this mass violence and mass murders and attacks.'

After The Post informed Thomas there were just over 300 fatal and nonfatal shootings in New Orleans last year, by far the lowest number in the 14 years the city council has published these statistics on its online crime data dashboard, he acknowledged that shootings are down and partly attributed the decline to his work with young people and ex-offenders.

Nora Ahmed, the legal director for the ACLU of Louisiana, said council members are using public concern over recent news to justify the widespread adoption of facial recognition technology, or FRT, a powerful technology with the potential to strip people of their rights. 'In the name of making FRT available for a once-in-a-decade jail break, this bill opens up FRT to being used by federal and state entities, and enterprising local police departments,' Ahmed said in a text message. 'This type of surveillance should not exist in the United States period.'

The new ordinance would give police the ability to use 'facial surveillance' and 'characteristic tracking' systems to actively monitor the streets looking for people with warrants or people under investigation. It would require them to continue sharing data about facial searches with the city council and begin reporting details about the software they use and its accuracy. While the ordinance says police cannot use facial surveillance tools to target abortion seekers or undocumented immigrants, Ahmed says those protections are 'paper thin' and worries officers would find ways around them.

It's not clear whether New Orleans plans to keep working with Project NOLA, a privately funded nonprofit group that has provided automated facial recognition alerts to officers despite having no contract with the city. Barnes, the police sergeant, said Project NOLA would need to enter into a formal data-sharing agreement with the city if it wanted to continue sending automated alerts to officers who have logged into a Project NOLA system to receive them. Under the new ordinance, Project NOLA could also be required to publish information about all of its searches to the city council.

Such data reporting could be complicated with a live facial recognition system, in which cameras are constantly scanning every face in their vicinity. With hundreds of cameras potentially scanning thousands of faces a day, Project NOLA or the city could theoretically need to report information about millions of facial recognition scans in each of the quarterly data reports the department is required to provide to the city council. Bryan Lagarde, Project NOLA's founder, declined to comment this week, saying he was on vacation.
New Orleans's embrace of the term 'surveillance' — which appears 40 times in the text of the proposed ordinance — appears at odds with statements made by Kirkpatrick, the city's top police official. In an interview last month, Kirkpatrick said she believes governments should be prevented from surveilling their citizens, especially when they are in public exercising their constitutional rights. 'I do not believe in surveilling the citizenry and residents of our country,' Kirkpatrick said at the time. 'Surveilling is an invasion of our privacy.'
