Tesla’s biggest hater airs Super Bowl ad against FSD

Safety advocacy group The Dawn Project is taking its campaign to ban Tesla’s Full Self-Driving (FSD) system to the Super Bowl.

The 30-second ad, airing during the Super Bowl broadcast in Washington, D.C., and in state capitals including Austin, Tallahassee, Albany, Atlanta and Sacramento, outlines several alleged critical safety defects in Tesla FSD, the automaker’s advanced driver assistance system (ADAS).

FSD is not actually fully self-driving, although it can perform some automated driving tasks, like maneuvering through city streets and highways, without driver input. The $15,000 system isn’t perfect, though, and drivers must remain alert to take over in case the system malfunctions or encounters something it can’t handle. There have been several reports of accidents occurring while Autopilot, Tesla’s lower-level ADAS, was engaged. As a result, Tesla has been criticized, investigated and sued for falsely marketing the capabilities of its automated driving systems.

The latest critique comes as Tesla rolls out the newest version of FSD to around 400,000 drivers in North America, renewing concerns about the system’s safety. Last month, a Tesla engineer testified that a 2016 demo in which the company claimed its car was driving itself was actually staged.

The Super Bowl ad features a collection of incriminating videos of Teslas behaving erratically while a voiceover claims FSD will “run down a child in a school crosswalk, swerve into oncoming traffic, hit a baby in a stroller, go straight past stopped school buses, ignore ‘do not enter’ signs, and even drive on the wrong side of the road.”

The Dawn Project asserts that Tesla’s “deceptive marketing” and “woefully inept engineering” are endangering the public, and it calls on the National Highway Traffic Safety Administration and the Department of Motor Vehicles to turn off FSD until all of the safety defects are fixed.

The Dawn Project’s founder, Dan O’Dowd, is also the CEO of Green Hills Software, a company that builds operating systems for embedded safety and security systems, as well as its own automated driving systems. That fact at once lends credence to the organization’s potential subject-matter expertise and makes clear that Green Hills competes with Tesla’s FSD. Last year, The Dawn Project took out a full-page ad in The New York Times claiming Tesla’s FSD has a “critical malfunction every eight minutes.”

O’Dowd, who ran for a U.S. Senate seat last November and lost, says he’s investing in the new ad campaign because he wants to pressure politicians into prioritizing ADAS safety. Some politicians, like Sens. Richard Blumenthal (D-Conn.) and Edward J. Markey (D-Mass.), have called for more oversight of Tesla’s tech, but the issue hasn’t exactly gone mainstream.

After The Dawn Project aired a commercial last summer showing a Tesla Model 3 striking four child-sized mannequins on a test track in California, Tesla sent the organization a cease-and-desist letter. The letter disputed all of the campaign’s claims, doubled down on Tesla’s commitment to safety and called The Dawn Project’s methodology into question.

“The purported tests misuse and misrepresent the capabilities of Tesla’s technology, and disregard widely recognized testing performed by independent agencies as well as the experiences shared by our customers,” wrote Dinna Eskin, a Tesla lawyer, in last year’s cease-and-desist. “In fact, unsolicited scrutiny of the methodology behind The Dawn Project’s tests has already (and within hours of you publicly making defamatory allegations) shown that the testing is seriously deceptive and likely fraudulent.”

Tesla supporters also rushed to defend the technology, including one investor who tested the FSD beta using his own kid. O’Dowd offered to rerun the test in person with Musk and other critics to demonstrate the accuracy of his methodology.

“Tesla continues to focus on features and marketing gimmicks, not fixing critical safety defects,” said O’Dowd in a statement. “Elon even stated that Tesla’s priorities were Smart Summon, Autopark and Optimus, not making sure that FSD will not run down children. It is clear that the priorities at Tesla are wrong, and it is time for the regulator to step in and switch the software off until all of the issues we have identified are fixed.”

Tesla hasn’t responded publicly to the Super Bowl ad, but CEO Elon Musk replied to a tweet showing the ad with the rolling-on-the-floor-laughing emoji. Tesla disbanded its PR department in 2020, so TechCrunch couldn’t reach the company for comment.

In addition to the Super Bowl ad, The Dawn Project is taking out a series of full-page ads in Politico and running additional TV ads in Washington, D.C., “where regulators are located,” calling for FSD to be disabled until its critical safety defects are fixed.


Tesla fights US Senate campaign ad showing its EVs hitting child-sized mannequins

Dan O’Dowd, the California U.S. Senate candidate behind a controversial anti-Tesla ad, has pledged to continue airing the campaign despite receiving a cease-and-desist letter from the automaker.

O’Dowd’s call for a ban on Tesla’s Full Self-Driving (FSD) software comes as the automaker enlists more than 100,000 drivers to begin beta tests on public streets, while defending itself from a National Highway Traffic Safety Administration investigation into fatal crashes involving its precursor Autopilot system.

FSD, which Tesla said Tuesday will be offered as a $15,000 option, allows the car to perform certain traffic maneuvers without driver input but is not a fully self-driving system because it requires the driver to be ready to take over the vehicle at any time.

The 30-second commercial sponsored by the Dawn Project, a safety advocacy group founded by O’Dowd, shows a Tesla Model 3 striking four different child-sized mannequins while driving a test track at Willow Springs International Raceway in Rosamond, California.

“This happens over and over again,” O’Dowd, who is also the CEO of Green Hills Software, a safety software maker and potential Tesla competitor, says in a voiceover. “A hundred thousand Tesla drivers are already using Full Self-Driving on public roads.”

Tesla accused the Dawn Project of depicting the “unsafe and improper use” of its FSD software and demanded the group remove the videos. CEO Elon Musk responded to critics on Twitter with trademark truculence.

The campaign ad has drawn public scrutiny, as well as criticism from Tesla’s supporters, since its debut earlier this month. Some alleged that the Model 3 shown in the video was not engaged in FSD. At least one Tesla fan conducted a comparison test, showing the car stopping to avoid hitting both a child-sized mannequin and a real child. YouTube removed the video after it was flagged as harmful content.

“The purported tests misuse and misrepresent the capabilities of Tesla’s technology, and disregard widely recognized testing performed by independent agencies as well as the experiences shared by our customers,” Tesla deputy general counsel Dinna Eskin wrote in the August 11 letter.

The Dawn Project said in a statement that the June 21 footage from Willow Springs featured a 2019 Model 3 running Tesla’s Full Self-Driving Beta version 10.12.2.

O’Dowd said his campaign against Tesla’s Full Self-Driving software will run “until @ElonMusk proves it won’t mow down children.”

Tesla and the Dawn Project did not immediately respond to a request for comment.

New York Times ad warns against Tesla’s “Full Self-Driving”

A full-page advertisement in Sunday’s New York Times took aim at Tesla’s “Full Self-Driving” software, calling it “the worst software ever sold by a Fortune 500 company” and offering $10,000, the same price as the software itself, to the first person who could name “another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes.”

The ad was taken out by The Dawn Project, a recently founded organization aiming to ban unsafe software from safety-critical systems that can be targeted by military-style hackers, as part of a campaign to remove Tesla Full Self-Driving (FSD) from public roads until it has “1,000 times fewer critical malfunctions.”

The founder of the advocacy group, Dan O’Dowd, is also the CEO of Green Hills Software, a company that builds operating systems and programming tools for embedded safety and security systems. At CES, the company said BMW’s iX vehicle is using its real-time OS and other safety software, and it also announced the availability of its new over-the-air software product and data services for automotive electronic systems.

Whatever the potential competitive bias of The Dawn Project’s founder, Tesla’s FSD beta software, an advanced driver assistance system that Tesla owners can use to handle some driving functions on city streets, has come under scrutiny in recent months after a series of YouTube videos showing flaws in the system went viral.

The NYT ad comes just days after the California Department of Motor Vehicles told Tesla it would be “revisiting” its opinion that the company’s test program, which uses consumers and not professional safety operators, doesn’t fall under the department’s autonomous vehicle regulations. The California DMV regulates autonomous driving testing in the state and requires other companies, like Waymo and Cruise, that are developing, testing and planning to deploy robotaxis to report crashes and system failures called “disengagements.” Tesla has never issued those reports.

Tesla CEO Elon Musk has since responded vaguely on Twitter, claiming Tesla’s FSD has not resulted in a single accident or injury since its launch. The U.S. National Highway Traffic Safety Administration (NHTSA) is investigating a report from the owner of a Tesla Model Y, who said his vehicle went into the wrong lane while making a left turn in FSD mode, resulting in the vehicle being struck by another driver.

Even if that was the first FSD crash, Tesla’s Autopilot, the automaker’s ADAS that comes standard on vehicles, has been involved in around a dozen crashes.

Alongside the NYT ad, The Dawn Project published a fact check of its claims, referring to its own FSD safety analysis that studied data from 21 YouTube videos totaling seven hours of drive time.

The videos analyzed included beta versions 8 (released December 2020) and 10 (released September 2021), and the study avoided videos with significantly positive or negative titles to reduce bias. Each video was graded according to the California DMV’s Driver Performance Evaluation, the test human drivers must pass to earn a driver’s license. To pass, drivers in California must commit 15 or fewer scoring maneuver errors, such as failing to signal when changing lanes or failing to maintain a safe distance from other moving vehicles, and zero critical driving errors, such as crashing or running a red light.

The study found that FSD v10 committed 16 scoring maneuver errors on average in under an hour and a critical driving error about every 8 minutes. There was an improvement in errors over the nine months between v8 and v10, the analysis found, but at the current rate of improvement, “it will take another 7.8 years (per AAA data) to 8.8 years (per Bureau of Transportation data) to achieve the accident rate of a human driver.”
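For readers who want to see how those pass/fail criteria translate into numbers, here is a minimal sketch of the DMV-style grading rule and the error-interval arithmetic the study describes. The function names and sample figures are illustrative, not drawn from The Dawn Project’s raw data.

```python
# Minimal sketch of the California DMV-style pass/fail rule and the
# error-rate arithmetic described above. Function names and sample
# figures are hypothetical, not taken from the study's dataset.

def passes_dmv_evaluation(scoring_errors: int, critical_errors: int) -> bool:
    """Pass requires at most 15 scoring maneuver errors (e.g., a missed
    turn signal) and zero critical driving errors (e.g., running a red
    light)."""
    return scoring_errors <= 15 and critical_errors == 0

def minutes_per_critical_error(critical_errors: int, drive_minutes: float) -> float:
    """Average minutes of driving between critical errors."""
    return drive_minutes / critical_errors

# FSD v10's reported averages -- 16 scoring errors in under an hour and
# at least one critical error -- would fail on both counts.
print(passes_dmv_evaluation(scoring_errors=16, critical_errors=1))  # False

# Illustrative: ~52 critical errors over the study's 7 hours of footage
# works out to one roughly every 8 minutes.
print(minutes_per_critical_error(critical_errors=52, drive_minutes=7 * 60))  # ~8.1
```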

The Dawn Project’s ad makes some bold claims that should be taken with a grain of salt, particularly because seven hours of footage is far too small a sample to support statistical conclusions. If, however, that footage is representative of an average FSD drive, the findings could point to a larger problem with Tesla’s FSD software and speak to the broader question of whether Tesla should be allowed to test this software on public roads with no regulation.

“We did not sign up for our families to be crash test dummies for thousands of Tesla cars being driven on the public roads…” the ad reads.

Federal regulators have started to take some action against Tesla and its Autopilot and FSD beta software systems.

In October, NHTSA sent two letters to the automaker targeting its use of non-disclosure agreements for owners who gain early access to FSD beta, as well as the company’s decision to use over-the-air software updates to fix an issue in the standard Autopilot system that should have been a recall. In addition, Consumer Reports issued a statement over the summer saying the FSD version 9 software upgrade didn’t appear to be safe enough for public roads and that it would independently test the software. Last week, the organization published its test results, which revealed that “Tesla’s camera-based driver monitoring system fails to keep a driver’s attention on the road.” CR found that Ford’s BlueCruise, on the other hand, issues alerts when the driver’s eyes are diverted.

Since then, Tesla has rolled out many different versions of its v10 software – 10.9 should be here any day now, and version 11, with a “single city/highway software stack” and “many other architectural upgrades,” is expected in February, according to CEO Elon Musk.

Reviews of the latest version, 10.8, are mixed, with some online commenters saying it’s much smoother and many others saying they don’t feel confident using the tech at all. A thread reviewing the newest FSD version on the Tesla Motors subreddit shows owners sharing complaints about the software, with one even writing, “Definitely not ready for the general public yet…”

Another commenter said it took too long for the car to turn right onto “an entirely empty, straight road…Then it had to turn left and kept hesitating for no reason, blocking the oncoming lane, to then suddenly accelerate once it had made it onto the next street, followed by a just-as-sudden deceleration because it changed its mind about the speed and now thought a 45 mph road was 25 mph.”

The driver said they eventually had to disengage the system entirely because it completely ignored an upcoming left turn, one that was to occur at a standard intersection “with lights and clear visibility in all directions and no other traffic.”

The Dawn Project’s campaign highlights a warning from Tesla that its FSD “may do the wrong thing at the worst time.”

“How can anyone tolerate a safety-critical product on the market which may do the wrong thing at the worst time,” said the advocacy group. “Isn’t that the definition of defective? Full Self-Driving must be removed from our roads immediately.”

Neither Tesla nor The Dawn Project could be reached for comment.