Latest news with #Challinger
Yahoo | Automotive | 13-02-2025
A Cybertruck on Autopilot slammed into a light pole, and it went viral
A Florida man recently shared an image on social media showing his Cybertruck crashed head-first into a light pole. The image is grist for the mill of Cybertruck haters, but there's a more profound concern: the vehicle did this to itself. The driver was using Tesla's self-driving mode, which handles the vehicle's basic functions, like steering and braking. Unfortunately, the driver wasn't paying attention, and the rest is history.

The now-totaled Cybertruck was driving on what Jonathan Challinger, a software developer from Florida, says was an empty road. The image he shared on social media suggests he was driving at night, possibly very late. Challinger says his Cybertruck "failed to merge out of a lane that was ending and made no attempt to slow down." Sleuths have discovered where the crash happened, and the pole does appear to be in an awkward spot, on a curb that juts into the road near a crosswalk. The lane has arrows indicating it will end well ahead of the pole and curb, but we imagine that even humans paying attention sometimes have trouble merging out of the lane before they are forced to stop to avoid hitting the pole.

"I don't expect [FSD] to be infallible, but I definitely didn't have a utility pole in my face while driving slowly on an empty road on my bingo card," he added. Challinger says his Cybertruck was on FSD (Full Self-Driving) version 13.2.4. He admits he was "complacent" in the crash, and it doesn't seem to be the first time he has allowed Tesla's Autopilot feature to take the wheel. In a social media post from January, Challinger mused, "Sometimes I decide to go somewhere and turn on Tesla FSD, and then I forget where I decided to go, and then it starts turning into Taco Bell or whatever, and I'm like wtf is it doing and then I'm like oh right Taco Bell." He admits the crash was a "big fail" and accepts his role in allowing Autopilot to act autonomously and unsupervised.
When version 13 of FSD launched, Tesla CEO Elon Musk said it was "mind-blowing" and promised it would be a big step toward unsupervised self-driving before the end of 2025. This has led many who want to be on the bleeding edge of unsupervised, autonomous driving technology to take unnecessary risks.

In a comedic sense, this story has it all: a Florida man, a tech guy with a Cybertruck, poor city planning, autonomous driving, and Taco Bell shoutouts. It's gold. As compassionate humans, we're happy Challinger escaped unharmed. His was a serious accident, autonomous driving or not, and he's lucky to be alive. This incident distills to one thing: the driver wasn't paying attention. It's important to note that while Musk and Tesla influencers hype Full Self-Driving, the promise of unsupervised FSD has not been realized. Drivers must pay attention, even if the vehicle handles everything they don't want to do, like navigation, steering, accelerating, and braking.
Yahoo | Automotive | 10-02-2025
Tesla Driver Issues Warning After His Cybertruck Totals Itself on "Full Self-Driving" Mode
The owner of a Tesla Cybertruck had a terrifying crash while using the carmaker's infamous "Full Self-Driving" (FSD) feature. In a now-viral thread on X-formerly-Twitter, Florida-based owner Jonathan Challinger said his truck "crashed into a curb and then a light post" after failing to "merge out of a lane that was ending." His vehicle "made no attempt to slow down or turn until it had already hit the curb."

Challinger used the opportunity to issue a warning: "Big fail on my part, obviously," he tweeted. "Don't make the same mistake I did. Pay attention. It can happen." "It is easy to get complacent now — don't," he added. Puzzlingly, Challinger also took the opportunity to praise Tesla for "engineering the best passive safety in the world."

For years now, regulators have been investigating Tesla's driver assistance software, finding last year that owners like Challinger are often lulled into a false sense of security, in large part due to Tesla and its CEO Elon Musk's woefully misleading marketing. Musk has promised that autonomous driving will be realized "next year" every single year for over a decade now. But reality has struggled to catch up. Tesla still maintains that its FSD software requires drivers to be engaged and ready to take over at all times. In practice, drivers like Challinger often zone out. Over the years, FSD and its overarching Autopilot suite have been linked to hundreds of crashes and dozens of deaths and have been implicated in countless lawsuits.

The timing of Challinger's easily avoided collision is especially noteworthy, considering Musk has promised that an "unsupervised" version of FSD will be made available later this year. Challinger's comments also highlight a bizarre and difficult-to-reconcile loyalty to Tesla's brand and its mercurial CEO. "I do have the dashcam footage," he wrote in his tweet.
" I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material." "Spread my message and help save others from the same fate or far worse," he added. Challinger's controversial account of the collision had other netizens shaking their heads. "It's completely wild to me that the car's own built-in paid software totals an $80k vehicle and the owner's response is to say "thank you Tesla, the passive safety is so good," one Reddit user wrote in response. "Feels like satire, and yet here we are..." "I don't understand how on current versions of FSD a person is able to look away from the road long enough to drive straight into a pole," another user wrote. "I can barely shoulder check a lane change without the eye tracking nagging at me. And it's at night too so it's not like they had sunglasses on." The latest incident highlights some glaring shortcomings of the tech — and a baffling level of trust on the part of Tesla drivers. More on FSD: Elon Musk Finally Admits That Teslas Don't Have What It Takes for Full Self-Driving
Yahoo | Automotive | 10-02-2025
Tesla's "self-drive" tech accused of causing Cybertruck crash as owner urges Musk to "save others from the same fate or far worse"
Jonathan Challinger claims his Cybertruck drove headlong into a streetlight while in Full Self-Driving mode. Elon Musk wants the technology to be ready for a June robotaxi launch.

Wrapped around a pole with its right wheel dangling, the image of a wrecked Tesla Cybertruck lying motionless on the side of the road is shocking. Driver Jonathan Challinger posted the undated picture on Sunday. He claims Tesla's automated Full Self-Driving (FSD) software caused his vehicle to crash into a light post while he wasn't looking. While Challinger escaped without harm, he warned others might not be so lucky. "Spread my message and help save others from the same fate or far worse," he wrote.

The post received 2 million views, sparking fierce debate as to whether FSD is good enough to be used without humans behind the wheel. It comes less than five months before Tesla CEO Elon Musk's crucial launch of an autonomous driving robotaxi service, which is a core pillar supporting Tesla's more than $1.1 trillion market cap.

According to Challinger's account, the car failed to leave a lane that was ending, even though no vehicles would have impeded a merge into the adjacent lane, and it made no attempt to slow down or turn until it was too late. Google Maps and Street View imagery show that the road layout matches the photo in Challinger's post. An official for the Reno Police Department confirmed to Fortune that there was a crash involving a driver named Challinger on Feb. 6, but declined to give further details pending a full report being filed.

Challinger tagged Musk, AI director Ashok Elluswamy, Tesla's entire AI team, and Cybertruck lead engineer Wes Morrill in the tweet. The carmaker constantly collects data from FSD for training, and in the past it has promptly denied crash accounts that were not true. At the time of publication, Tesla had not responded to Fortune's request for comment. Fortune also contacted Challinger for comment but did not receive a reply.
Tesla only rolled out FSD to the Cybertruck in September, a full 10 months after the vehicle launched. The pickup has larger dimensions, a higher stance on the road, and more complex engineering—it uses all four wheels to steer—than a Tesla saloon.

One of the best-known and most impartial Tesla FSD testers attested to the plausibility of Challinger's account of the crash. "The situation you describe is very common, where the planner or decision-making to get in the appropriate lane early enough often gets the vehicle in a bind, where it runs out of options," replied Chuck Cook, who was tagged in the post by Challinger. "You are NOT the only one."

Challinger was quick to admit negligence and accept ultimate responsibility for failing to supervise the system, as Tesla requires of all owners who use FSD. "Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen," he warned, requesting a means to deliver dashcam footage to Tesla's AI team for analysis. Challinger, moreover, dismissed accusations that he was acting in bad faith by trying to capitalize on the scrutiny surrounding Musk, Tesla, and FSD ahead of the commercial launch in June. "I just want to get the data to Tesla if I can. I tried everything I could think of to get in touch with them," he said.

Challinger had previously acknowledged early last month, in a separate post he didn't flag widely, that he had been involved in a serious accident. Responding to a question about the Cybertruck's structural ability to absorb energy in a frontal collision, he wrote in early January: "Having crashed mine, can confirm that it crumples just fine." He said he had repeatedly tried to get the dashcam footage to Tesla. Challinger specified that the crash occurred while using FSD v13.2.4, a software version that had only been widely rolled out to all FSD users roughly a week after his earlier post.
Just months ahead of a planned June launch, CEO Musk has yet to publish any independently verifiable data to back up his claim that FSD is ready to be used in an unsupervised, fully autonomous robotaxi. By comparison, rivals like Waymo report their disengagements to state regulators. Tesla, however, has used a legal loophole to avoid this transparency for years.

Musk has also repeatedly misstated facts. Tesla's AI director, Elluswamy, testified in court that Musk ordered him to doctor a marketing video to mislead consumers about Tesla's FSD capabilities. More recently, Musk admitted that Teslas running on older AI3 inference computers have, in fact, failed to live up to his claim that all cars built after 2016 are capable of autonomous driving. He plans to replace that hardware with the newest generation in those vehicles where customers purchased FSD. How exactly that can be done, and at what cost, is unclear.