NHTSA Investigating Waymo for Bad Driving
Waymo is under investigation. The National Highway Traffic Safety Administration (NHTSA) said its preliminary evaluation into an estimated 444 Waymo vehicles follows 22 reports of incidents, including 17 collisions.
The agency said in some of those cases the automated driving systems appeared to disobey traffic safety control devices and some crashes occurred shortly after the automated driving systems exhibited unexpected behavior near traffic safety control devices.
This is the latest in a series of investigations opened by NHTSA into the performance of self-driving vehicles, after it initiated probes into General Motors' Cruise (GM.N) and Amazon.com's Zoox (AMZN.O).
In February, Waymo recalled 444 self-driving vehicles after two minor collisions in quick succession in Arizona, saying a software error could result in automated vehicles inaccurately predicting the movement of a towed vehicle.
🫣 Waymo always show up doing some insane stuff on my timeline
It looks like a 10 year old taking a joy ride in grandma’s car pic.twitter.com/FbgJGauTnZ
— Greggertruck (@greggertruck) May 14, 2024
Waymo going the wrong way. If it was Tesla it would be national news. pic.twitter.com/f12FGZC3Hg
— Tesla Owners Silicon Valley (@teslaownersSV) May 6, 2024
Tesla FSD Status and Waymo Safety
When comparing Tesla FSD and Waymo, miles driven per accident is not the same metric as miles per critical disengagement. If a human safety driver had been in the car during the Waymo wrong-way incident above, it would have counted as a critical disengagement. As it was, the Waymo appears to have been lucky: an actual accident was avoided despite the unsafe automated driving.
Waymo reports about 17,000 miles between critical disengagements. But Waymo has no human safety driver in the car, so how can it have a disengagement? Two ways: either the system recognizes a flaw and pulls over or stops, or a remote human operator takes over.
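To see why these metrics are not directly comparable, it helps to put them on a common per-mile basis. The sketch below uses the figures quoted in this article; note the Waymo number counts critical disengagements, not crashes, so it measures a different (and more sensitive) kind of event.

```python
# Figures quoted in this article (assumptions, not independently verified):
WAYMO_MILES_PER_CRITICAL_DISENGAGEMENT = 17_000   # Waymo's reported interval
US_MILES_PER_POLICE_REPORTED_CRASH = 670_000      # NHTSA/FHWA 2022 figure

def events_per_million_miles(miles_per_event: float) -> float:
    """Convert a 'miles between events' interval into an event rate."""
    return 1_000_000 / miles_per_event

waymo_rate = events_per_million_miles(WAYMO_MILES_PER_CRITICAL_DISENGAGEMENT)
us_rate = events_per_million_miles(US_MILES_PER_POLICE_REPORTED_CRASH)
print(f"Waymo critical disengagements per 1M miles: {waymo_rate:.1f}")
print(f"US police-reported crashes per 1M miles: {us_rate:.2f}")
```

The point of the comparison is that a critical disengagement is a much lower bar than a crash, so the two intervals cannot be read as equivalent safety figures.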
In early 2023, Tesla reported that vehicles with FSD Beta engaged experienced an airbag-deployment crash about every 3.2 million miles, roughly five times safer than the most recently available US average of one police-reported crash every 0.6 million miles.
In the last 12 months, a Tesla with FSD Beta engaged experienced an airbag-deployed crash about every 3.2 M miles, which is ~5x safer than the most recently available US average of 0.6M miles/police-reported crash pic.twitter.com/ft4m3Jyq5Q
— Tesla (@Tesla) March 1, 2023
Teslafsdtracker.com publishes crowdsourced data on the status of Tesla FSD. It shows 98% of drives with Tesla FSD 12.3.6 have no critical disengagements, versus 60% or less two years ago, and 70% have no disengagements at all, versus 20% two years ago.
Tesla reports its performance for Autopilot crashes and safety.
In the fourth quarter, Tesla recorded one crash for every 5.39 million miles driven with Autopilot engaged. For drivers not using Autopilot, Tesla recorded one crash for every 1.00 million miles driven. By comparison, the most recent data available from NHTSA and FHWA (from 2022) shows that in the United States there was an automobile crash approximately every 670,000 miles.
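The relative multiples implied by those quarterly figures can be checked with simple arithmetic. This is an illustrative sketch using only the numbers quoted above:

```python
# Crash intervals quoted in this article (Q4 figures as stated, unverified):
MILES_PER_CRASH_AUTOPILOT = 5_390_000     # Tesla, Autopilot engaged
MILES_PER_CRASH_NO_AUTOPILOT = 1_000_000  # Tesla, Autopilot not engaged
MILES_PER_CRASH_US_AVERAGE = 670_000      # NHTSA/FHWA 2022

# Ratio of crash intervals = how many times fewer crashes per mile
autopilot_vs_us = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_US_AVERAGE
autopilot_vs_manual = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_NO_AUTOPILOT
print(f"Autopilot vs US average: {autopilot_vs_us:.1f}x longer between crashes")
print(f"Autopilot vs Tesla without Autopilot: {autopilot_vs_manual:.1f}x")
```

On these numbers, Autopilot-engaged driving shows roughly 8x the miles between crashes of the US average, though the populations (highway-heavy Autopilot miles vs all US driving) are not directly comparable.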
Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.