
Latest news with #DawnProject

The Dawn Project Urges Legislators To Ban Tesla Full Self-Driving Over Critical Safety Defects Published in New Report to Congress

Yahoo

08-07-2025

  • Automotive
  • Yahoo

SANTA BARBARA, Calif., July 08, 2025 (GLOBE NEWSWIRE) -- Public safety advocacy group The Dawn Project has called on legislators to take action to ban Tesla's Full Self-Driving software from public roads over the litany of critical safety defects uncovered by the group's safety tests of the software. Today, The Dawn Project shared a report of its findings with legislators and key regulators, including the National Highway Traffic Safety Administration (NHTSA), which has numerous open investigations into Tesla Full Self-Driving.

The report comes amid Tesla's rollout of its 'Robotaxi' service in Austin, Texas, which has been plagued with safety-critical errors such as the Robotaxis driving on the wrong side of the road, blocking intersections, and nearly colliding with other cars.

The Dawn Project recently held a live demonstration of its safety tests in Austin which proved that the latest publicly available version of Tesla Full Self-Driving will run down a child crossing the road while illegally blowing past a stopped school bus with its red lights flashing and stop sign extended. The Dawn Project was recreating a tragic incident in North Carolina in which a child was run down by a self-driving Tesla as they exited a school bus. The self-driving Tesla had blown past the school bus's red flashing lights and stop sign before running down the child, who suffered a fractured neck and broken leg and was placed on a ventilator.

The Austin test was run eight times, and Full Self-Driving ran down the child mannequin while illegally blowing past the school bus on every single test. Tesla's Full Self-Driving software did not disengage or even alert the driver to the fact there had been a collision on any of the test runs.

The Dawn Project has catalogued thousands of safety-critical errors committed by Tesla's Full Self-Driving software on public roads and constantly updates this database with new errors as they are identified. The Dawn Project also maintains a publicly accessible database of safety-critical and other driving errors committed by Tesla's Robotaxis in Austin. NHTSA has reported 50 fatalities and 2,185 crashes involving Tesla's self-driving technology.

The Dawn Project's Report to Congress outlines the key findings from the group's safety tests and demands that legislators take immediate action to protect road users from Tesla Full Self-Driving by banning the software until Tesla conclusively proves it is safe.

Dan O'Dowd, Founder of The Dawn Project, commented: 'Self-driving software that illegally blows past stopped school buses and runs down children crossing the road must be banned immediately. It is only a matter of time before a child is killed while getting off a school bus because of Elon Musk and Tesla's utter negligence and contempt for public safety.'

'The National Highway Traffic Safety Administration must step up and ban Tesla Full Self-Driving from public roads to protect children, pedestrians, and other road users. It is disappointing that the federal regulator in charge of road safety has taken no action to hold Tesla accountable. NHTSA must do its job and ban Tesla's defective Full Self-Driving technology from public roads before more people are killed.'

'Legislators should protect their constituents from Tesla Full Self-Driving by calling for this dangerous and defective software to be banned immediately.'

A copy of The Dawn Project's annual report can be viewed here. Contact: info@

First Tesla Drives Autonomously From Dealer to Buyer's House, Ends in Embarrassing Flub

Yahoo

30-06-2025

  • Automotive
  • Yahoo

On Saturday, Tesla announced that it had made the world's first fully driverless delivery of a car, achieving a key promise Elon Musk had made ahead of the rocky launch of his robotaxi service.

In a promotional video shared by the automaker, a Model Y rolls out of Tesla's Gigafactory in Austin, Texas. With no one inside, the gleaming EV drives itself across highways and city streets until finally reaching its new owner's apartment, making automotive history in the process.

All's well — except for what seems to be one major, embarrassing oversight: where this genius feat of engineering decided to park. We draw your attention to the bright red curb that the Tesla stopped at, which reads in conspicuous white text: "NO PARKING FIRE LANE." It's theoretically possible that Tesla somehow received special permission to park the vehicle there. But if it didn't, it's another glaring example of the automaker's autonomous cars flouting traffic laws.

"As usual, Elon Musk's latest PR stunt prioritizes showmanship over public safety," Dan O'Dowd, CEO and founder of the watchdog group the Dawn Project, wrote on X. "Is the fine for blocking a fire lane included in the purchase price of a new Tesla?"

Tesla's self-driving software has been the subject of intense public and regulatory scrutiny, which has ramped up following the launch of its robotaxi service in Austin earlier this month. Up until that point, the automaker had never demonstrated it was capable of deploying a fully autonomous driving system. Its popular Full Self-Driving (Supervised) feature still requires the driver to remain alert and ready to take over at a moment's notice.

Predictably, major cracks began to show once the 10 to 20 robotaxis in Tesla's fleet began offering rides to its exclusively Tesla-fanboy clientele, who eagerly documented their driving experiences. Thanks to them, we have footage of the robotaxis committing errors including randomly slamming the brakes, nearly rear-ending a UPS truck, and dropping off passengers in the middle of a busy intersection.

Notably, a few of the videos appear to show Tesla's robotaxis violating traffic laws. In one instance, a robotaxi blazes through a 15 mile per hour zone at 27 miles per hour. In another, a robotaxi wildly starts turning the steering wheel side to side before clearly crossing the road's solid double yellow lines to barge into a left-turn lane.

These incidents earned Tesla the attention of the National Highway Traffic Safety Administration, which is in talks with the automaker regarding the apparent traffic violations. No investigation has been launched yet, but these talks are sometimes the precursor to one. This latest stunt may add fuel to the fire.

Nonetheless, Musk is claiming victory. On X, he proclaimed the delivery was the "first fully autonomous drive with no people in the car or remotely operating the car on a public highway." This isn't true. As CNBC notes, robotaxi leader Waymo has been testing its fully autonomous cars on highways in Phoenix, Arizona, since 2024, though it's currently only offering rides in this capacity to employees.

In any case, we're pretty much just taking Tesla at its word, which is a precarious thing. It admitted to staging a popular promotional video from 2016 that purported to show one of its cars fully driving itself with someone behind the wheel; it turned out that engineers had pre-mapped the route taken in the video and that the car had crashed at least once during the shoot.
More on Tesla: While Tesla's Robotaxi Program Crumbles, Its Sales Are Falling Apart

Disturbing Test Shows What Happens When Tesla Robotaxi Sees a Child Mannequin Pop Out From Behind a School Bus

Yahoo

18-06-2025

  • Automotive
  • Yahoo

After promising self-driving robotaxis for more than a decade, it's probably no surprise that Tesla CEO Elon Musk has blown way past his promised rollout date of June 12. Now scheduled to formally roll out on the streets in Austin, Texas on either June 22 or 28 — not even Musk seems to know at this point — the tech billionaire's self-driving charade is running seriously behind. As consumer interest in Tesla continues to plummet, one generational hater is showing just how far Musk's last-ditch effort to save Tesla is really lagging.

In a live demonstration on public roads in Austin, a recent media stunt showed that "self-driving" Tesla robotaxis have no qualms about running down children in cold blood. The demonstration, sponsored by the Dawn Project — a watchdog group founded by Musk's fellow billionaire and longtime Pentagon contractor Dan O'Dowd — showed what happens when a Tesla running on current-gen self-driving software comes up on a stopped school bus. Making no attempt to slow down, the Tesla Model Y barrels past the bus and its blinking stop light. When a child mannequin darts across the road, the EV plows right through it, taking a full car's length to come to a complete stop.

"What it shows is a full self-driving Tesla will not stop at a bus, and if a kid steps out, it will mow them down," O'Dowd told Futurism.

Despite a two-year campaign by the Dawn Project to highlight the school bus issue — which included a full-page ad in the New York Times and a Super Bowl commercial — O'Dowd says lawmakers, not to mention Tesla itself, have done nothing to fix the issue. Sadly, that inaction has already had real consequences. In North Carolina in 2023, for example, a student was struck while exiting a school bus by a self-driving Tesla, and had to be airlifted to a hospital with life-threatening injuries.

"What's happened since? Tesla has not fixed the bug," O'Dowd said. "And still the government hasn't forced a recall."

For O'Dowd and the Dawn Project, Tesla is unique among carmakers adding self-driving features to their vehicles. Asked about the Dawn Project's stance on Waymo — which has its own history of dangerous traffic maneuvers — O'Dowd says "Waymo gets the job done."

"They actually can do ten million rides and not kill anybody," he notes. "Tesla can't do any self-driving and it's killed a lot of people. So that's my question, how many people have they killed? How many people have they injured?"

(As a point of fact, Waymo has logged a total of 696 accidents since 2021, with one Waymo-involved fatality recorded so far. Teslas, for comparison, have notched well over 2,146 incidents, with 553 Tesla-involved fatalities.)

While the writing might be on the wall for Tesla, O'Dowd continues to pour resources into raising awareness about the dangers Musk's vehicles pose to civilians, partnering with movements like the nationwide Tesla Takedown initiative. Despite the obvious, O'Dowd says it's not over 'til it's over.

"I've never been willing to go out there and short Tesla," the billionaire told us. "It's too risky. The guy's too good at making up stories to convince people he can make a bunch of money."

More on Tesla: Terrifying Footage Shows Self-Driving Tesla Get Confused by the Sun, Mow Down Innocent Grandmother

Tesla's FSD runs over child mannequin

Daily Telegraph

17-06-2025

  • Automotive
  • Daily Telegraph

Two Tesla foes have joined forces to attack Elon Musk and his semi-autonomous driving technology. The Dawn Project and the Tesla Takedown movement have partnered to highlight what they claim are 'critical safety defects' in Tesla's Full Self-Driving (Supervised) software.

In a recent test conducted in the United States, a Tesla Model Y equipped with the latest version of Full Self-Driving (version 13.2.9) was presented with a common scenario: a school bus stopped on the side of the road with its flashing lights and stop signs activated. A child-sized mannequin was then pulled across the street, simulating a child attempting to catch the bus.

Anti-Tesla activists testing the FSD system. (Picture: The Dawn Project)

The Tesla, travelling at an average speed of approximately 32 km/h, failed to stop for the school bus's stop sign and proceeded to strike the mannequin in each of the eight test runs. The system also reportedly failed to alert the driver to the collision.

The tests come as Tesla prepares to launch robotaxis in the US, fully autonomous vehicles designed for taxi services. While Tesla CEO Elon Musk has stated that the company is 'being super paranoid about safety' regarding its forthcoming robotaxi launch, organisers like The Dawn Project and Tesla Takedown aren't convinced.

Tesla runs past the stop sign. (Picture: The Dawn Project)

The Dawn Project said, 'Full Self-Driving ran down the child mannequin while illegally blowing past the school bus on every single attempt.' 'Tesla's Full Self-Driving software did not disengage or even alert the driver to the fact there had been a collision on any of the test runs,' they added.

However, it's important to note that Full Self-Driving (Supervised) is not fully autonomous but rather semi-autonomous. Tesla states explicitly that the system is designed for 'use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.'

Autonomous driving is a key pillar of investment for Tesla. Having introduced its 'Autopilot' driver assistance system more than a decade ago, Tesla doubled down on 'full self-driving' in the US.

Recently, Tesla faced a significant challenge after Chinese electric vehicle manufacturer Build Your Dreams (BYD) unveiled its new driver-assistance system, 'God's Eye.' This technology, which BYD has installed for free in some of its models, enables cars to drive themselves on highways and in urban environments. Some experts argue that 'God's Eye' is more advanced than Tesla's Full Self-Driving (FSD) system, which costs nearly US$9,000 ($13,800) in China.

Tesla's Full Self-Driving capability in Australia is currently being tested and is not yet fully legal for public use. However, the system could be arriving soon. Earlier this year, the EV giant published a video of a Tesla Model 3 with prototype software successfully negotiating busy streets in inner-city Melbourne.

2025 Tesla Model Y. (Picture: Mark Bean)

The brand's country director for Australia, Thom Drew, says an expansion of Tesla's driverless features is high on Elon Musk's list of priorities. 'That's Elon's push,' Drew said. 'We have a global engineering team that are working across markets around a lot of FSD… actively working across all our markets to roll it out.'

Critics are watching closely as Tesla's Autopilot and FSD systems remain under investigation following a series of crashes and fatalities.

Originally published as Tesla's Full Self-Driving system fails in 'safety test'
