Elon Musk's Tesla fined over ₹1,996 crore in Florida Autopilot crash case; jury flags 'responsibility' amid tech failure

Mint | a day ago
Elon Musk's Tesla has been ordered to pay more than $240 million in damages to victims of a deadly car crash in Florida involving its Autopilot driver-assist technology, after a Miami jury found the EV maker partly responsible for the incident.
The federal jury found that Tesla bore significant responsibility because its technology failed, and that not all of the blame could be placed on the driver, even though he admitted he was distracted by his mobile phone when he hit a young stargazing couple.
The jury's decision comes at a time when Elon Musk is trying to convince Americans that Tesla's cars can safely drive themselves, as he prepares to roll out a driverless taxi service in several cities.
The verdict ends a four-year-long case that stands out not just for its outcome but for the fact that it made it to trial at all. Many similar cases against Tesla have previously been dismissed or settled by the company to avoid high-profile trials.
'This will open the floodgates,' said Miguel Custodio, a car crash lawyer not involved in the Tesla case. 'It will embolden a lot of people to come to court.'
The case also included startling charges by lawyers for the family of the deceased, 22-year-old Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident. Tesla, after being shown the evidence, said it had made an honest mistake and had genuinely believed the material did not exist.
'We finally learned what happened that night, that the car was actually defective,' said Benavides' sister, Neima Benavides. 'Justice was achieved.'
Relatives of victims in other Tesla crashes have previously accused the company of being slow to hand over crucial data, accusations the carmaker has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up.
'Today's verdict is wrong,' Tesla said in a statement, 'and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement lifesaving technology.' The company said the plaintiffs concocted a story 'blaming the car when the driver – from day one – admitted and accepted responsibility.'
In addition to a punitive award of $200 million, the jury said Tesla must also pay $43 million of a total $129 million in compensatory damages for the crash, bringing the total borne by the company to $243 million.
'It's a big number that will send shock waves to others in the industry,' said financial analyst Dan Ives of Wedbush Securities. 'It's not a good day for Tesla.'
Tesla said it will appeal.
Even if the appeal fails, the company says it will end up paying far less than the jury decided because of a pre-trial agreement that caps punitive damages at three times Tesla's share of the compensatory damages: three times its $43 million share is $129 million, which together with the $43 million itself comes to $172 million, not $243 million. But the plaintiffs say the deal was based on a multiple of all compensatory damages, not just Tesla's portion, and that the figure the jury awarded is the one the company will have to pay.
It's not clear how much of a hit the verdict in the Miami case will deal to Tesla's reputation for safety. Tesla has vastly improved its technology since the crash on a dark, rural road in Key Largo, Florida, in 2019.
But the issue of trust in the company came up several times in the case, including in closing arguments Thursday. The plaintiffs' lead lawyer, Brett Schreiber, said Tesla's decision to even use the term Autopilot showed it was willing to mislead people and take big risks with their lives, because the system only helps drivers with lane changes, slowing the car and other tasks, falling far short of driving the car itself.
Schreiber said other automakers use terms like 'driver assist' and 'copilot' to make sure drivers don't rely too much on the technology.
'Words matter,' Schreiber said. 'And if someone is playing fast and loose with words, they're playing fast and loose with information and facts.'
Schreiber acknowledged that the driver, George McGee, was negligent when he blew through flashing lights, a stop sign and a T-intersection at 62 miles an hour before slamming into a Chevrolet Tahoe that the couple had parked to get a look at the stars.
The impact spun the Tahoe around so violently that it launched Benavides 75 feet through the air into nearby woods, where her body was later found. It also left Angulo, who walked into the courtroom Friday with a limp and a cushion to sit on, with broken bones and a traumatic brain injury.
But Schreiber said Tesla was at fault nonetheless. He said Tesla enabled reckless driving by not disengaging Autopilot as soon as drivers begin to show signs of distraction, and by allowing them to use the system on smaller roads it was not designed for, like the one McGee was driving on.
'I trusted the technology too much,' said McGee at one point in his testimony. 'I believed that if the car saw something in front of it, it would provide a warning and apply the brakes.'
The lead defense lawyer in the Miami case, Joel Smith, countered that Tesla warns drivers that they must keep their eyes on the road and hands on the wheel yet McGee chose not to do that while he looked for a dropped cellphone, adding to the danger by speeding. Noting that McGee had gone through the same intersection 30 or 40 times previously and hadn't crashed during any of those trips, Smith said that isolated the cause to one thing alone: 'The cause is that he dropped his cellphone.'
