Dan O’Dowd is a man on a mission: to get Tesla’s Full Self-Driving (FSD) technology outlawed. He’s likely spent millions of his own money—including a reported $600K on a recent Super Bowl ad and a full-page ad in The New York Times last November—to accomplish this goal via his nonprofit group The Dawn Project. And he’s incurred the wrath of Tesla CEO Elon Musk and Musk’s most rabid fans.

“He’s called me batshit crazy,” O’Dowd said of Musk last week in Santa Barbara, where he set up another series of tests to prove FSD is dangerous and which MotorTrend attended. “He said my tests are fake and my company Green Hills Software is a pile of trash.”

Last August The Dawn Project conducted tests and shot a video showing a Tesla Model 3 using FSD plowing into a mannequin meant to represent a child. Tesla sent The Dawn Project a cease-and-desist letter calling the video “defamatory.” O’Dowd responded in a tweet by calling Musk a “crybaby” for complaining about the test and offered to reproduce the tests for the media and regulators.

That’s what brought us to Santa Barbara last week, where Green Hills Software, which O’Dowd founded in 1982 and of which he is president and CEO, is based. Green Hills develops ultra-secure software for aviation, including the operating system (OS) for the Boeing 787 and the B-1B bomber, as well as for the Orion crew exploration vehicle manufactured by Lockheed Martin and operated by NASA. Green Hills is also the first and only software company to develop an OS that meets the NSA’s certification for EAL 6+ High Robustness, making it almost impossible to hack.

Green Hills also develops automotive software—it accounts for about 40 percent of the company’s business, O’Dowd said—and the company is a software supplier for the 2022 BMW iX EV crossover. This has led O’Dowd’s critics and Tesla fans to call out The Dawn Project’s conflict of interest and question the organization’s motives.

O’Dowd isn’t critical of Tesla vehicles themselves and owns five of them: two Roadsters, two Model 3s, and a Model S. The Dawn Project is attacking not only Tesla FSD but any computer system it deems a danger. “Connecting the power grid, hospitals, and millions of cars to the Internet with software riddled with bugs and security defects has turned these systems into potential weapons of mass destruction at the mercy of hackers,” the organization says on its website.

“We’ve been taking all the things that our lives depend on—water treatment plants and hospitals—where when something fails thousands or even millions of people could die,” O’Dowd said.

People Are Dying

In the case of Tesla Autopilot and FSD, people are dying: According to a Washington Post report, there have been 17 fatalities (11 since May 2022) and 736 crashes in the U.S. since 2019 involving Autopilot. One of the latest injuries involved a 17-year-old named Tillman Mitchell, who was hit by a 2022 Tesla Model Y after exiting a school bus and walking across the street to his house. He survived the crash and was last listed in good condition.

The driver, Howard Gene Yee, 51, failed to stop for a school bus, which the North Carolina Highway Patrol said employed all its warnings and stop signs. The driver was charged with reckless driving, and NHTSA opened a special investigation into the crash.

The Dawn Project tried to re-create the North Carolina crash—sans a student getting hit—in Santa Barbara last week by having a Tesla Model Y drive past a school bus with its foldable stop signs deployed on the driver’s side. The car was using Tesla’s Traffic Light and Stop Sign Control, which “is designed to recognize and respond to traffic lights and stop signs, slowing Model Y to a stop when using Traffic-Aware cruise control or Autosteer,” according to the owner’s manual.

In a half dozen passes we witnessed, the Model Y never stopped for the school bus stop sign, and it also blew through Do Not Enter and Road Closed signs set up just past the school bus. To be fair, the school bus stop sign is smaller than a regular stop sign (the Model Y stopped at a sign just around the corner from the bus), and Tesla’s Traffic Light and Stop Sign Control isn’t designed to recognize and react to Do Not Enter and Road Closed signs. The event didn’t include testing FSD with a mannequin.

Earlier in the day, O’Dowd and Tesla investor Ross Gerber, president and CEO of Gerber Kawasaki Wealth and Investment Management, live-streamed a drive around Santa Barbara testing FSD in a Model S. The two had several close calls, including the car running a stop sign, and O’Dowd tweeted “FSD tried to kill us.”

“It was a four-way stop with two cars going [by],” O’Dowd recalled. “We would’ve hit one of those two cars if Ross didn’t slam on the brakes and stop us.”

The Teslarati have since analyzed the incident in minute detail, claiming that Gerber was over the speed limit and wasn’t operating the system correctly. One Tesla owner even re-created the drive in the same location to show the technology works as intended. Tesla fans repeatedly and relentlessly attack and mock O’Dowd’s testing, and two Tesla fanatics even used their own small children to show that FSD and Autopilot work properly.

Won’t Back Down

O’Dowd, who ran for one of California’s seats in the U.S. Senate last year as a Democrat but lost, has developed a thick skin and, like Musk, isn’t backing down. “I knew what I was getting into when I started,” he said. “I knew his unwillingness to compromise. Everything that goes wrong, he [Musk] just doubles down. I could never see General Motors or Toyota or Hyundai shipping a product with this many problems. If their cars could run down a kid in a crosswalk, they’d have 50 engineers on it tomorrow trying to fix it.”

O’Dowd said he’s going after FSD because the government isn’t. In February NHTSA issued a recall notice for nearly 400,000 Tesla vehicles equipped with FSD. Tesla in turn issued an over-the-air update for FSD to address the concerns NHTSA identified in its recall notice and add several other changes.

“Three months later, they still haven’t done anything about it,” O’Dowd said. “How do you ship a product that will blow past a school bus and kill or severely injure a child?”
