11-year-old accidentally shot by sibling dies in Wisconsin

Yahoo — May 20, 2025

An 11-year-old boy died after his sibling accidentally shot him on Friday, authorities said. The incident happened at a home in Racine, Wisconsin, about a half hour's drive south of Milwaukee.
Officers went to a hospital in the area where the boy's family had taken him to treat his gunshot wound, the Racine Police Department said in a news release. He died from his injuries, according to the department.
Police said a suspect has been "identified and apprehended" and that their investigation is ongoing.
"Racine Police investigators are interested in any additional information that anyone may have about this incident," the department said, asking that anyone with knowledge of the shooting contact the police investigations unit or report tips anonymously through Crime Stoppers.
Hundreds of unintentional shootings by children happen every year in the United States, according to the nonprofit Everytown for Gun Safety, which has tracked such incidents annually since 2015. In 2023, Everytown recorded 411 unintentional shootings by children nationwide, which resulted in 158 deaths and 269 injuries. It was the highest number of incidents counted in a single year since the nonprofit started tracking them.
At the time, a study published by the U.S. Centers for Disease Control and Prevention looked at a two-decade rise in children's deaths while playing with guns and found the vast majority of cases involved guns that were loaded and not securely stored. The study's authors concluded that unintentional deaths from firearms were preventable.
Not including Friday's incident in Racine, at least 63 unintentional shootings by children have already occurred this year, according to Everytown. They resulted in 28 deaths and 36 injuries reported in 28 states. In Wisconsin, a 6-year-old boy unintentionally shot and killed himself with a handgun on April 1 at a home in Milwaukee, the data shows.

Related Articles

Opinion - Mission possible: An alternative to facial recognition technology

Yahoo — 20 hours ago

For decades, Hollywood has presented audiences with futuristic disguises that were once thought possible only in science fiction. Silicone masks, fake contact lenses, and 3D-printed biometrics are staples of popular spy movies like the 'Mission: Impossible' franchise. But these forms of 'spyware,' once found only on the silver screen, are, in fact, a reality. The advent of the internet and facial recognition technology has turned disguise work into a matter of national security.

Just look at how our adversaries abuse facial recognition technology. Whether it is the Social Credit System or cameras lining public streets to monitor dissent against the Chinese state or oppress minority groups, China's surveillance state is built on facial recognition technology — some of the most sophisticated in the world, due to the amount of data it can access through measures such as its National Security Law. The Chinese Communist Party's monitoring system can essentially control the life of any individual across its regions, freeze payments and track purchases anywhere in the country.

With China as an example of how not to use facial recognition technology, the U.S. should be clear-eyed about the vulnerabilities and potential abuses posed by these increasingly outmoded forms of biometric security. Advancements in artificial intelligence, deepfakes and three-dimensional printing are successfully tricking facial recognition tools, which should affirm that we cannot continue to rely on them to protect locations critical to national security.

Threat actors have developed a variety of tactics to spoof facial recognition software. Some are known as 'replay attacks' and occur when a video is presented to a facial biometric system by an actor other than the intended user. Static photos are another form of attack with the same intention. Currently, the success rates for bypassing facial recognition technologies with these methods are 98 percent and 96 percent, respectively — a staggering statistic. Europol even recently noted how artificial intelligence is successfully compromising phones, issuing a warning about 'increased use of artificial fingerprints, deepfake media, and voice cloning to bypass security protocols.'

Facial recognition technology also presents challenges with accurate identification. While cheap or generic silicone masks perform poorly in fooling individual biometric devices, they prove effective in evading facial recognition technology in a crowd. Let us also not forget that some older versions of facial recognition technology struggled to distinguish between people with darker skin. Knowing this challenge, we must seriously question efforts to use facial recognition to verify voter identity before casting a ballot — one of America's foundational processes.

There are three things we must do as we move away from facial recognition technology. First, to protect the privacy of Americans, and until new technology is put into place, the U.S. should begin by exploring implementation of proven biometric security tools on a solely voluntary basis. For example, the Transportation Security Administration uses facial verification technology and permits individuals to opt out. Second, consequences must be imposed when insecure technology is developed or adversaries cross the line. While the Committee on Homeland Security is currently undertaking the challenge of changing these economic models in cybersecurity, we have an opportunity to get ahead of them now by pursuing more secure and accurate biometric security tools. We cannot become overly dependent on fallible technology — the risks are simply too high. Finally, while we seek alternatives to facial recognition technology that ensure U.S. law enforcement entities have the best tools to protect us, the U.S. must clearly call out China for its abuse of facial recognition technology. The Chinese Communist Party's use of facial recognition technology to control its citizenry is unacceptable and should concern all Americans.

The U.S. must address the risks of facial recognition technology head-on to protect the liberties we cherish. It is time we work closely with our innovators to champion biometric solutions that are secure, reliable and aligned with American values.

Mark Green, M.D., represents Tennessee's 7th Congressional District and is chairman of the House Homeland Security Committee. Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.


Supreme Court: US Gun Makers Not Liable for Cartel Violence

Yahoo — a day ago

In a unanimous blow to gun control advocacy groups, the Supreme Court shut down Mexico's $10 billion claim targeting U.S. gun makers in a cross-border lawsuit. Mexico originally filed the suit in 2021, arguing that U.S. gun companies were responsible for the weapons that fueled cartel violence. Mexico received support in its lawsuit from American gun control advocacy groups such as Everytown and the March for Our Lives Action Fund. The Supreme Court ruling, written by Justice Elena Kagan, found that the manufacturers' alleged failure to exercise "reasonable care" does not meet the standard necessary to be found liable for "aiding and abetting" the sale of illegal firearms in Mexico. Mexico had asked the court for $10 billion in damages and additional court-imposed injunctive relief in the form of restrictions on manufacturers. According to a lawyer who spoke to RCP, siding with Mexico on the injunctive relief "would have likely severely prohibited the distribution of the manufacturers' products" within the United States.

A federal district court judge initially ruled that the Protection of Lawful Commerce in Arms Act protected the gun manufacturers from the suit. In 2024, the First Circuit Court of Appeals revived the lawsuit. In response, gun manufacturer Smith & Wesson brought the case to the Supreme Court. The PLCAA, signed into law in 2005 by President George W. Bush, shields gun manufacturers and dealers from liability when crimes are committed with their products. The law includes exceptions which Mexico's lawyers sought to invoke.

The original suit by Mexico, which named multiple U.S.-based gun manufacturers as defendants, claimed that Mexicans "have been victimized by a deadly flood of military-style and other particularly lethal guns that flows from the U.S. across the border." It also argued that U.S. companies were negligent in their sales practices, claiming that the gun companies "are not accidental or unintentional players in this tragedy; they are deliberate and willing participants, reaping profits from the criminal market they knowingly supply." In response, lawyers for Smith & Wesson argued in a filing that the lawsuit "faults the defendants for producing common firearms" and for "failing to restrict the purchase of firearms by regular citizens." They made the case that "aiding and abetting criminal activity must involve something more than making products generally." Ultimately, the Supreme Court agreed with this reasoning.

In reference to the injunctive relief that Mexico asked the court to grant, lawyers for Smith & Wesson asserted that the lawsuit was "inflicting costly and intrusive discovery at the hands of a foreign sovereign that is trying to bully the industry into adopting a host of gun-control measures that have been repeatedly rejected by American voters."

According to some estimates, more than 250,000 firearms are smuggled from the United States into Mexico each year. In contrast, Mexico has one gun store and issues fewer than 50 new gun permits each year. The U.S. is the largest firearm exporter in the world, partly due to relaxed gun laws within the country.

The unanimous decision marks the first Supreme Court ruling to cite the PLCAA and could serve as precedent for protecting weapons manufacturers in future cases. The 9-0 ruling suggests strong judicial consensus on the limits of civil liability for gun manufacturers under federal law. It is seen as a win by gun rights activists, with the NRA arguing in its amicus brief on the case that "Mexico has extinguished its constitutional arms right and now seeks to extinguish America's."

Justices Clarence Thomas and Ketanji Brown Jackson each issued concurring opinions, with Jackson writing that Mexico's lawsuit targeted industry-wide practices that Congress has chosen not to prohibit, and Thomas arguing that violations of U.S. law must be established in court for the PLCAA exceptions to apply.

James Eustis is an intern at RealClearPolitics. He studies politics at Washington & Lee University.
