Tesla FSD faces yet another probe after fatal low-visibility crash
Tesla is facing yet another US government investigation into the safety of its Full Self-Driving (FSD) software after a series of crashes in low-visibility conditions.
In its latest investigation resume [PDF], the National Highway Traffic Safety Administration (NHTSA) said it is examining Tesla's self-driving systems following four reported crashes, one of which caused an injury and another of which fatally struck a pedestrian. All four crashes occurred in "an area of reduced roadway visibility conditions" with FSD, either Beta or Supervised, engaged.
"In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust," the NHTSA noted, adding that it was opening the inquiry to assess "the ability of FSD's engineering controls to detect and respond appropriately to reduced roadway visibility conditions."
Tesla CEO Elon Musk has long opposed the use of ultrasonic sensors, radar, and lidar in his quest to achieve unassisted full self-driving, preferring to let AI and cameras do the job instead.
The EV maker has since committed entirely to computer vision, eliminating those extra sensors from its latest model-year vehicles. Aside from being able to see in the dark, as most modern cameras can, that means Tesla vehicles are ostensibly no better at seeing through fog, sun glare, or excess airborne dust than a human driver, with the added caveat that it's an AI making the decisions, not a human.
Plenty of other self-driving companies and their engineers have disagreed with Tesla over its vision-only approach, and this may be the first time FSD has been formally evaluated on whether it can actually handle low-visibility environments better than a human.
The type of cameras Tesla uses are, we're told, vulnerable to blinding at dusk and dawn, in fog and rain, and when driving into the sun, because of their low dynamic range. Relying solely on cameras could also be an issue because they are passive receivers of light: a single camera cannot measure distance directly, only a pair set up in stereo can, and Tesla's cameras reportedly are not.
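To see why stereo matters here: two horizontally offset cameras recover depth from the classic relationship Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two camera centers, and d is the disparity, the horizontal shift of a point between the two images. A single camera produces no disparity term at all, so it has to infer distance from learned cues such as object size and perspective. Here's a minimal sketch of that geometry, using illustrative numbers rather than anything from Tesla's actual camera setup:

```python
# Classic stereo depth-from-disparity: Z = f * B / d.
# All numbers below are hypothetical, not Tesla camera specs.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth in meters of a point seen by two horizontally offset cameras.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity means the point is at infinity or unmatched")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 30 cm baseline, 6 px disparity -> 50 m away.
print(stereo_depth(1000.0, 0.3, 6.0))  # 50.0
```

Note that depth precision degrades as disparity shrinks: distant objects shift by only a pixel or two between views, so even stereo rigs lose accuracy at range, which is part of why other self-driving outfits supplement cameras with radar and lidar.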
Not that Tesla hasn't had to patch FSD several times over the years to address other safety issues, including twice last year alone. Tesla patched FSD in January 2023 to address NHTSA reports that it was acting unsafely around intersections, and again in December because its attention controls were "insufficient to prevent misuse" of the system by drivers paying less attention than they should.