Latest news with #Autopilot


Daily Mail
2 days ago
- Automotive
- Daily Mail
Terrifying dashcam footage shows what it's like trapped inside a self-driving Tesla that crashed and KILLED woman
Terrifying dash camera footage shows the moment a self-driving Tesla rammed into a couple, killing a young woman and leaving her boyfriend with life-changing injuries. Naibel Benavides Leon, 22, died after the Tesla Model S slammed into her and her boyfriend Dillon Angulo, then 27, in 2019. The couple had pulled over to look at the stars at the side of a road near Key Largo, Florida, when they were struck by the vehicle after driver George McGee took his eye off the road to reach for his phone.

Footage from the Tesla's front camera shows the car blowing through a red light as McGee speeds down the road at nearly 70mph. The car passes a stop sign and crashes through several other road signs before striking the couple's vehicle, which was parked 40 feet off Card Sound Road by County Road 905. Benavides Leon was thrown 75 feet and died at the scene, while Angulo suffered serious injuries, according to a wrongful death lawsuit filed against Tesla by the woman's estate.

The filings accuse Tesla of advertising its self-driving system, Autopilot, 'in a way that greatly exaggerated its capabilities and hid its deficiencies'. 'The McGee Tesla Model S had an Autopilot system that was still in Beta, meaning it was not fully tested for safety, and, further, the system was not designed to be used on roadways with cross-traffic or intersections,' the lawsuit states. 'Nevertheless, Tesla programed the system so that it could be operated in such areas.'

The documents call out Tesla boss Elon Musk and allege he ignored previous reports about issues with the Autopilot feature, listing 56 alleged incidents. 'Tesla and its CEO, Elon Musk, made the intentional decision to continue encouraging Tesla drivers to over-rely on its Autopilot system,' the filing states. 'Tesla chose to continue profiting from the sales of their defective vehicles and software systems rather than heed warnings from government agencies, experts, and other car companies.'

The lawsuit details allegations about the fatal crash on April 25, 2019. 'While McGee was reaching for his phone, the vehicle detected a stop sign, a stop bar, the road's edge, a pedestrian, and a parked Chevrolet Tahoe, but the Vehicle did not provide McGee with any audio alert or other warning of the obstacles and never engaged its emergency brakes,' court documents said.

McGee told cops he was driving in 'cruise' and took his hand off the wheel to retrieve his dropped cell phone before hitting the truck, according to the complaint. 'McGee stated to officers, "[i]t was actually because I was driving. I looked down and I've been using cruise control, and I looked down, I didn't realize (INAUDIBLE) and then I sat up. The minute I sat up, I hit the brakes and saw his truck,"' the legal filings state. The lawsuit also claims that McGee told a 911 operator he was not paying attention during the drive. 'Shortly after the crash, McGee called 911, telling the operator: "Oh my God, I wasn't looking," "I don't know what happened. I ended up missing the turn. I was looking down," and "I dropped my phone. Oh my God,"' the document said.

The complaint claims that McGee relied on Tesla's Autopilot feature to drive him home. Tesla's Traffic Aware Cruise Control is claimed to help drivers maintain a safe distance from the car in front, brake automatically and assist with lane control. The Elon Musk-founded company has said its features are meant for 'fully attentive' drivers holding the steering wheel, and that the features do not make its vehicles autonomous.
Proceedings in the case are scheduled to begin on July 14, marking the first time a wrongful death case against Tesla is heading to trial. Daily Mail has contacted Tesla for comment on this story.

In June, Tesla failed to persuade a federal judge to end the lawsuit after the judge said the plaintiffs offered sufficient evidence that Autopilot defects were a 'substantial factor' in their injuries. McGee, who is not a defendant, conceded he was not driving safely, but that did not automatically make him solely responsible, 'particularly given McGee's testimony that he expected Autopilot to avoid the collision,' US District Judge Beth Bloom said. Bloom said the failure-to-warn claim survived in part because Autopilot's risks might be hard to find in the owner's manual on Model S touchscreens.

'Tesla deliberately blurs the distinction between whether its automation system is merely a "driver assist" system or a fully autonomous system that does not require the driver's constant attention,' the complaint states. The filings go on to quote Musk in September 2016, when he asserted: 'The exciting thing is that even if the vision system doesn't recognize what the object is because it could be a very strange looking vehicle, it could be a multi-car pileup, it could be a truck crossing the road, it really could be anything – an alien spaceship, a pile of junk metal that fell off the back of a truck,' per the lawsuit. 'It actually doesn't matter what the object is, it just knows that there's something dense that it is going to hit – and it should not hit that.'

The lawsuit claims Tesla is liable because its promises about Autopilot are what motivated McGee to purchase the vehicle. 'At all material times, George McGee purchased the vehicle in large part because of the Autopilot and other safety features advertised by Tesla,' the document states. An order from Judge Bloom denying summary judgment and allowing the plaintiffs to pursue punitive damages states: 'McGee testified that his beliefs about the capabilities of Autopilot came from "looking at information on the [V]ehicle" . . . [and] Plaintiffs contend that he likely watched videos online or on Tesla's website about the [V]ehicle's features and how they work . . . [including] [o]ne video show[ing] Tesla['s] drivers operating the vehicle without their hands.' Bloom also dismissed the estate's manufacturing defect and negligent misrepresentation claims. The lawsuit seeks unspecified damages, and the trial got underway this week.


Canada News.Net
2 days ago
- Automotive
- Canada News.Net
High-stakes Miami trial puts Tesla's safety claims under scrutiny
NEW YORK CITY, New York: A high-stakes trial involving Tesla began this week in Miami, where a jury will determine whether the company bears any responsibility for the death of a university student and the serious injury of her boyfriend in a 2019 crash involving one of its vehicles.

The incident occurred near Key Largo, Florida, when a Tesla Model S, traveling nearly 70 mph, ran through flashing red lights, a stop sign, and a T-intersection before crashing into a parked Chevrolet Tahoe. The collision killed Naibel Benavides Leon, who had been stargazing nearby, and seriously injured her boyfriend, Dillon Angulo. She was thrown 75 feet into a wooded area.

The plaintiffs argue that Tesla's driver-assistance feature, Autopilot, should have recognized the vehicle ahead and either warned the driver or slowed down. They claim Tesla's system failed to do so, despite detecting the Tahoe. According to the lawsuit, the driver, George McGee, relied on Autopilot and was distracted, reaching for a dropped phone when the car crashed. McGee was sued separately, and that case has been settled.

Tesla, however, rejects any blame. In a statement, the company said, "The evidence clearly shows that this crash had nothing to do with Tesla's Autopilot technology. Instead, like so many unfortunate accidents since cellphones were invented, this was caused by a distracted driver." Tesla also emphasized that its user manuals instruct drivers to remain alert and ready to take control at all times, noting that its vehicles are not fully autonomous.

What makes this case particularly significant is that U.S. District Judge Beth Bloom has allowed the plaintiffs to seek punitive damages, a rare development in lawsuits against Tesla. In her ruling last month, she dismissed claims of manufacturing defects and negligent misrepresentation but allowed other liability claims to move forward. "A reasonable jury could find that Tesla acted in reckless disregard of human life for the sake of developing their product and maximizing profit," Judge Bloom wrote.

The lawsuit contends that Tesla should have restricted the use of Autopilot to the major roads for which it was designed, preventing drivers from activating it on smaller, rural roads like the one where the crash occurred. The plaintiffs cite data and video evidence showing the system detected the Tahoe but failed to act appropriately.

Tesla has since updated its Autopilot and Full Self-Driving systems, but concerns remain. In 2023, the company recalled 2.3 million vehicles after federal safety regulators found Autopilot did not do enough to ensure driver attention. Regulators later opened an investigation into whether Tesla had truly addressed the issue.

Despite ongoing scrutiny, Elon Musk continues to tout the capabilities of Tesla's "Full Self-Driving" technology, which he claims allows vehicles to operate independently. Federal officials have cautioned that such claims can mislead drivers into overreliance, potentially leading to crashes. The Full Self-Driving system has been linked to at least three fatal accidents and is under investigation for poor performance in conditions like sun glare and fog. Tesla is pushing forward with plans to deploy a fleet of driverless robotaxis in the U.S. by the end of next year. Early test runs in Austin, Texas, have been largely successful, though isolated incidents, such as a car veering into the wrong lane, highlight persistent challenges.


The Star
3 days ago
- Automotive
- The Star
Tesla spars in court over Autopilot alert two seconds before crash
A Tesla vehicle passes the Wilkie D. Ferguson Jr. U.S. Courthouse as jury selection began in connection with allegations regarding the safety of Tesla's Autopilot system on July 14, 2025 in Miami, Florida. The federal case follows a fatal 2019 crash in which a Tesla on Autopilot struck a parked car in Key Largo, Florida. The collision led to the death of 22-year-old Naibel Benavides Leon and the serious injury of her boyfriend, Dillon Angulo. — AFP

The final two seconds before a Tesla Model S crashed into a parked SUV took centre stage on July 17 in a court showdown over who's responsible for the 2019 collision – the distracted driver or his car's Autopilot system. Tesla is seeking to show a jury that the company's technology performed as it should and that the driver is fully to blame for running through a stop sign at a T intersection in the Florida Keys and ramming into a Chevrolet Tahoe, killing a woman who stood next to the SUV and seriously injuring her boyfriend.

A three-week trial in Miami federal court over a suit filed by the woman's family and the boyfriend is putting close scrutiny on a decade-long experiment with semi-autonomous driving at Elon Musk's electric vehicle maker. A verdict against Tesla would be a blow at a time when the company is staking its future on self-driving and pushing to launch a long-promised robotaxi business.

The first few days of the trial have taken jurors deep into how the technology works and what its limitations are. The company's lawyer, Joel Smith, pressed a key witness for the plaintiffs to agree that an audible alert 1.65 seconds before impact – when the car's automated steering function aborted – would have been enough time for the driver to avoid or at least mitigate the accident. Smith demonstrated what the alarm sounds like for jurors to hear.

Data recovered from the car's computer shows that driver George McGee was pressing the accelerator to 17 miles (27.4 kilometers) per hour over the posted speed limit, leading him to override the vehicle's adaptive cruise control before he went off the road. He hit the brakes just 0.55 seconds before impact, but it remains in dispute whether he saw or heard warnings from the Model S while he was reaching to the floorboard for his dropped cell phone.

Safety expert Mary "Missy" Cummings, an engineering professor at George Mason University, acknowledged in her second day on the witness stand that McGee may have braked in response to the alert, but she suggested his reaction time was too slow to know for sure. Cummings, who has criticised Tesla's technology in the past and previously served as an adviser to the National Highway Traffic Safety Administration, didn't yield much to Smith's questioning.

At one point the lawyer highlighted past comments by Musk, in which the Tesla chief executive officer said the use of "beta" to describe the Autopilot system is meant to convey that the software is not a final product and to discourage drivers from "complacency" and taking their hands off the steering wheel. "I do not have any evidence in front of me that the word 'beta' is trying to communicate anything to drivers," Cummings said. "What it is trying to do, in my professional opinion, is avoid legal liability."

The jury also heard Thursday from an accident reconstruction specialist, Alan Moore, who argued that if Tesla had programmed its software not to operate on roadways it wasn't designed for – like the one on Key Largo – "this crash would not have happened".
But he also testified that McGee had a history of disregarding alerts. Moore explained to jurors that Autopilot automatically disengages if a driver fails to put hands on the wheel after receiving three audible warnings. "Almost every time he commuted from his office to his condo, he would get a strikeout," Moore said. When that happened, McGee would pull over, put the car in park, shift it back into drive and turn Autopilot back on, the witness said.

In his opening argument, Smith had said the data history for McGee showed that he'd safely travelled through the intersection where the crash happened almost 50 times in the same Model S. "The only thing that changed was his driver behaviour," Smith told the jury. "He dropped something and was trying to pick it up." – Bloomberg


Toronto Star
3 days ago
- Automotive
- Toronto Star
Tesla's Autopilot system is in the spotlight at a Miami trial over a student killed while stargazing
NEW YORK (AP) — A rare trial against Elon Musk's car company began Monday in Miami, where a jury will decide whether the company is partly to blame for the death of a stargazing university student after a runaway Tesla sent her flying 75 feet through the air and severely injured her boyfriend. Lawyers for the plaintiffs argue that Tesla's driver-assistance feature called Autopilot should have warned the driver and braked when his Model S sedan blew through flashing red lights, a stop sign and a T-intersection at nearly 70 miles an hour in the April 2019 crash. Tesla lays the blame solely on the driver, who was reaching for a dropped cell phone.

Business Standard
3 days ago
- Automotive
- Business Standard
Tesla spars in court over autopilot alert 2 seconds before 2019 crash
Tesla is seeking to show a jury that the company's technology performed as it should and that the driver is fully to blame for running through a stop sign at a T intersection.

Bloomberg

The final two seconds before a Tesla Model S crashed into a parked SUV took center stage Thursday in a court showdown over who's responsible for the 2019 collision – the distracted driver or his car's Autopilot system. Tesla is seeking to show a jury that the company's technology performed as it should and that the driver is fully to blame for running through a stop sign at a T intersection in the Florida Keys and ramming into a Chevrolet Tahoe, killing a woman who stood next to the SUV and seriously injuring her boyfriend.

A three-week trial in Miami federal court over a suit filed by the woman's family and the boyfriend is putting close scrutiny on a decade-long experiment with semi-autonomous driving at Elon Musk's electric vehicle maker. A verdict against Tesla would be a blow at a time when the company is staking its future on self-driving and pushing to launch a long-promised robotaxi business.

The first few days of the trial have taken jurors deep into how the technology works and what its limitations are. The company's lawyer, Joel Smith, pressed a key witness for the plaintiffs to agree that an audible alert 1.65 seconds before impact – when the car's automated steering function aborted – would have been enough time for the driver to avoid or at least mitigate the accident. Smith demonstrated what the alarm sounds like for jurors to hear.

Data recovered from the car's computer shows that driver George McGee was pressing the accelerator to 17 miles (27.4 kilometers) per hour over the posted speed limit, leading him to override the vehicle's adaptive cruise control before he went off the road. He hit the brakes just 0.55 seconds before impact, but it remains in dispute whether he saw or heard warnings from the Model S while he was reaching to the floorboard for his dropped cell phone.

Safety expert Mary 'Missy' Cummings, an engineering professor at George Mason University, acknowledged in her second day on the witness stand that McGee may have braked in response to the alert, but she suggested his reaction time was too slow to know for sure. Cummings, who has criticized Tesla's technology in the past and previously served as an adviser to the National Highway Traffic Safety Administration, didn't yield much to Smith's questioning.

At one point the lawyer highlighted past comments by Musk, in which the Tesla chief executive officer said the use of 'beta' to describe the Autopilot system is meant to convey that the software is not a final product and to discourage drivers from 'complacency' and taking their hands off the steering wheel. 'I do not have any evidence in front of me that the word "beta" is trying to communicate anything to drivers,' Cummings said. 'What it is trying to do, in my professional opinion, is avoid legal liability.'

The jury also heard Thursday from an accident reconstruction specialist, Alan Moore, who argued that if Tesla had programmed its software not to operate on roadways it wasn't designed for – like the one on Key Largo – 'this crash would not have happened.' But he also testified that McGee had a history of disregarding alerts. Moore explained to jurors that Autopilot automatically disengages if a driver fails to put hands on the wheel after receiving three audible warnings. 'Almost every time he commuted from his office to his condo, he would get a strikeout,' Moore said.
When that happened, McGee would pull over, put the car in park, shift it back into drive and turn Autopilot back on, the witness said. In his opening argument, Smith had said the data history for McGee showed that he'd safely traveled through the intersection where the crash happened almost 50 times in the same Model S. 'The only thing that changed was his driver behavior,' Smith told the jury. 'He dropped something and was trying to pick it up.'