Concerns raised over Tesla Full Self-Driving, but some tests didn’t use it right

Several tests showed Tesla vehicles hitting child mannequins using FSD, but at least one test failed to engage the system correctly

Update 2022/08/10 at 4:01pm ET: According to Electrek, the Dawn Project may not have properly engaged Tesla’s Full Self-Driving during the safety test. Some have questioned the Dawn Project’s motives as well, given its founder, Dan O’Dowd, launched a Senate campaign in California focused on one issue: Tesla’s FSD.

However, the other video from Taylor Ogan appears to be from a separate test unrelated to the Dawn Project.

Despite concerns with the accuracy of these tests, criticism of the safety of Tesla’s FSD remains legitimate (and it’s absolutely insane that people want to test FSD on actual, real children).

Recent tests suggest Tesla’s ‘Full Self-Driving’ (FSD) Beta technology fails to recognize children, further building on concerns about its safety as the company makes it available to more users.

First, a safety test conducted by the ‘Dawn Project’ (via The Guardian) found that a Tesla Model 3 with FSD “repeatedly struck [a] child mannequin in a manner that would be fatal to an actual child.” The Dawn Project seeks to improve the safety and reliability of software by stopping the use of commercial-grade software in safety-critical systems.

Further, investor Taylor Ogan shared a short video on Twitter showing a comparison between a Tesla and a vehicle equipped with LiDAR tech from Luminar — in the video, the Tesla hits the child mannequin while the LiDAR-equipped car manages to stop. In follow-up tweets, Ogan criticizes Tesla for not adopting LiDAR technology for its autonomous vehicle software.

LiDAR, for those unfamiliar with the term, stands for light detection and ranging (or laser imaging, detection, and ranging). The tech determines the distance to an object by bouncing a laser pulse off it and measuring how long the light takes to return.
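The time-of-flight principle described above boils down to a simple formula. Here’s a minimal sketch (the function name and the 100-nanosecond example are illustrative, not drawn from any actual LiDAR system):

```python
# Hypothetical illustration of LiDAR time-of-flight ranging.
C = 299_792_458  # speed of light, in metres per second

def lidar_range(round_trip_time_s: float) -> float:
    """Distance to an object given a laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path: d = (c * t) / 2.
    """
    return C * round_trip_time_s / 2

# A pulse returning after 100 nanoseconds implies an object roughly 15 m away.
print(round(lidar_range(100e-9), 2))  # prints 14.99
```

Real automotive LiDAR units fire millions of such pulses per second across a scanning pattern, building a 3D point cloud of the scene rather than a single distance reading.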

The Dawn Project test will form part of an advertising campaign intended to encourage U.S. Congress to ban Tesla’s FSD.

Tesla and CEO Elon Musk have so far disputed concerns over the safety of FSD. At the same time, the U.S. National Highway Traffic Safety Administration (NHTSA) has launched investigations and requested information from the company about FSD. It’s worth noting that Tesla has also recently made FSD available in Canada.

A common line of defence is that FSD still requires driver supervision and is not fully autonomous. And while Tesla does state this on its website, the name — Full Self-Driving — suggests otherwise. Moreover, Tesla has made the software available to thousands of owners to use on public roads, many of whom have misused FSD. Tesla has also delayed or pulled FSD updates over bugs and other issues several times, and even fired an employee who shared a video of flaws in the FSD system.

There are clear safety concerns at play here, and critics have highlighted these concerns in an attempt to get governments to regulate the use of autonomous driving systems on public roads until the systems are safer and more reliable. Tesla fans have responded by attacking these critics online, with one Twitter user going so far as to request a child volunteer to run in front of their FSD-equipped Tesla to prove it will stop.

“I promise I won’t run them over,” the person wrote. Yea, sure bud.

Source: Dawn Project, Taylor Ogan (Twitter) Via: The Guardian