Tesla's "full self-driving" faces another NHTSA probe after fatal crashes in low-visibility conditions

zohaibahd

In a nutshell: Elon Musk's vision of a world where humans are no longer required to keep their hands on the wheel has hit another snag. The NHTSA has launched a new probe into the safety of Tesla's FSD system after a series of troubling incidents.

The National Highway Traffic Safety Administration investigation centers on four accidents in which Teslas running the Full Self-Driving (FSD) beta crashed after entering areas of reduced visibility, such as sun glare, fog, or dust clouds. One of those crashes killed a pedestrian, while another resulted in injuries.

The NHTSA wants to determine whether FSD can "detect and respond appropriately" when visibility is poor. Investigators will also check whether any similar low-visibility crashes have occurred and scrutinize any updates Tesla has made to the system or to its claimed safety benefits. The skepticism is understandable.

The FSD system relies solely on cameras, without even a stereoscopic setup; Waymo, by contrast, combines camera, lidar, and radar hardware. Additionally, because Tesla allows the beta software on older models with less advanced hardware, hundreds of thousands of cars are running FSD despite having inferior sensors.

These accidents aren't the first to land Tesla's autonomous driving technology in hot water with regulators. Early last year, the NHTSA forced a recall to fix the software on almost 363,000 Teslas after finding it posed safety risks by blowing through intersections and traffic signals before drivers could intervene. In December, the agency made Tesla recall two million vehicles to restrict misuse of its regular Autopilot, following an investigation into nearly 1,000 crashes involving that system.

Despite the mounting safety concerns, Musk is still betting big on FSD and even more advanced autonomy as Tesla's path to profitability. After all, FSD is sold as an $8,000 add-on or a $99 monthly subscription, making it a lucrative revenue stream. Last week, Musk unveiled a planned Tesla robotaxi service, suggesting owners could rent out their self-driving cars for extra cash when not using them.

However, investors seemed skeptical of another potential case of Musk over-promising, which sent Tesla shares tumbling almost nine percent after the event. The reveal was also deliberately light on details and featured human-controlled Optimus robots, contributing to the distrust.


 
No one making an FSD system deserves a break or a breather. This is one place where "good enough" or "release then debug in the field" will not fly from these tech companies.
This has to be damn near perfect, and that's not something they have ever aspired to. Not even close. It's not some dipshit that forgot to add rewind to a favorite streaming service. It's life and death.
 
Yet they still allow it to be named, and advertised, as "Full Self Driving" despite it being illegal to not have a human in the driver's seat, paying attention, hands on the wheel, and ready to step in. That marketing is a dangerous ruse that is way too effective for anyone who spends time on the road. I've talked to plenty of people who appear convinced that their Teslas really are full self driving and all those legalities are just so much fine print for worry-warts.

This is also the automobile company that got its start selling highly subsidized luxury vehicles to rich people.

Tesla's gotten nothing but breaks from the government as far as I'm concerned.
 
Article confirms Tesla's usual BS: camera only. That's not good enough.
Maybe if it sees UV to IR with its sensors - could be wrong, but wouldn't such a sensor have different image planes for focusing?
Suppose if it's good enough for hoomans, then it's good enough for cars. Yet it isn't good enough for humans; I know I'm driving blind in certain situations. Anyone on a motorbike can really get freaked out.
Glare, haze, sunstrike, deluge, fog, night, crap flung onto the windscreen/visor, trucks throwing up huge amounts of water, etc. Ignoring insects in eyes/camera, etc.

Anyone here who thinks their hooman eyes can handle all this is full of crap.
 
No one making an FSD system deserves a break or a breather. This is one place where "good enough" or "release then debug in the field" will not fly from these tech companies.
This has to be damn near perfect, and that's not something they have ever aspired to. Not even close. It's not some dipshit that forgot to add rewind to a favorite streaming service. It's life and death.
The question isn't whether FSD is perfect or not. It will never be perfect, much like our current human drivers aren't perfect either. Far from it. If we drove perfectly, we wouldn't have accidents, ever.

The question is: is it safer than a human driver or not? The only statistics I could find are about Autopilot ASSISTING the human driver, which makes accidents 8 times rarer. But unsupervised FSD is another story.
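For a claim like "8 times rarer" to mean anything, the comparison has to be normalized per mile driven and made over comparable conditions. A quick sketch, with entirely invented numbers (nothing here is real Tesla or NHTSA data), shows how the arithmetic works:

```python
# Hypothetical numbers for illustration only - not real Tesla or NHTSA data.
# The point: raw crash counts mean nothing without normalizing by miles
# driven, and the driving conditions have to be comparable.

def crashes_per_million_miles(crashes, miles):
    """Crash rate normalized to one million miles driven."""
    return crashes / (miles / 1_000_000)

# Assistance systems are mostly engaged on highways, where crash rates are
# lower for everyone - so a naive comparison can flatter the system.
human_rate = crashes_per_million_miles(crashes=2_000, miles=500_000_000)   # 4.0
assisted_rate = crashes_per_million_miles(crashes=250, miles=500_000_000)  # 0.5

print(f"ratio: {human_rate / assisted_rate:.0f}x")  # prints "ratio: 8x"
```

Even a clean per-mile ratio can still mislead if assisted miles skew toward easy highway driving, which is exactly the caveat about "assisting" versus unsupervised use.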
 
The question isn't whether FSD is perfect or not. It will never be perfect, much like our current human drivers aren't perfect either. Far from it. If we drove perfectly, we wouldn't have accidents, ever.

The question is: is it safer than a human driver or not? The only statistics I could find are about Autopilot ASSISTING the human driver, which makes accidents 8 times rarer. But unsupervised FSD is another story.
I agree with this general line of thinking...
I'm VERY curious about the data comparisons between self-driving vehicles and human-driven vehicles (and cannot find any online that are concise enough - possibly due to the minimal dataset available yet).
IF we reach a point where FSD vehicles statistically have fewer accidents than human-driven vehicles, then wouldn't it be logical to shift focus to FSD vehicles?
Don't fewer accidents = fewer injuries and deaths?
 
Tesla has supervised and unsupervised FSD options.
He was just being specific.
Yet they still allow it to be named, and advertised, as "Full Self Driving" despite it being illegal to not have a human in the driver's seat, paying attention, hands on the wheel, and ready to step in. That marketing is a dangerous ruse that is way too effective for anyone who spends time on the road. I've talked to plenty of people who appear convinced that their Teslas really are full self driving and all those legalities are just so much fine print for worry-warts.

This is also the automobile company that got its start selling highly subsidized luxury vehicles to rich people.

Tesla's gotten nothing but breaks from the government as far as I'm concerned.
Tesla does NOT sell unsupervised self driving. Look at their website. It is only advertised as supervised self driving.
Article confirms Tesla's usual BS: camera only. That's not good enough.
Maybe if it sees UV to IR with its sensors - could be wrong, but wouldn't such a sensor have different image planes for focusing?
Suppose if it's good enough for hoomans, then it's good enough for cars. Yet it isn't good enough for humans; I know I'm driving blind in certain situations. Anyone on a motorbike can really get freaked out.
Glare, haze, sunstrike, deluge, fog, night, crap flung onto the windscreen/visor, trucks throwing up huge amounts of water, etc. Ignoring insects in eyes/camera, etc.

Anyone here who thinks their hooman eyes can handle all this is full of crap.
The only people who can’t see well enough to drive safely are those who lose their licenses because they’re going blind. In ANY situation, you can see well enough if you react properly. Otherwise, you’re doing something wrong.
No one making an FSD system deserves a break or a breather. This is one place where "good enough" or "release then debug in the field" will not fly from these tech companies.
This has to be damn near perfect, and that's not something they have ever aspired to. Not even close. It's not some dipshit that forgot to add rewind to a favorite streaming service. It's life and death.
Complete lies. They are aspiring to near-perfect driving. If they weren't, why would they keep releasing software updates? And if someone is "making" a full self-driving system, how are they supposed to make it if it's only allowed to exist once it's already perfect?

The NHTSA investigates every FSD accident, as they should. There is no special treatment for Tesla. If Tesla is aspiring to achieve driving far safer than humans, then this is what the NHTSA wants. They work with Tesla closely, and Tesla does what they say even when it disagrees. An example is rolling stops (which almost every human does at empty intersections). Every FSD Tesla goes all the way to 0 mph at every stop sign. And the NHTSA does not fine or sue Tesla, because they are following all government regulations.

And if you think Tesla debugs in the field, then you're crazy. They delay FSD software updates for months because they're testing for every possible bug they can find and going back to fix them. They absolutely limit who is allowed to use supervised FSD based on driving behavior, because they only allow people to use it if they care about their own safety.
 
"... Every FSD Tesla goes all the way to 0 mph at every stop sign. And the NHTSA does not fine or sue Tesla because they are following all government regulations."

It's great that Tesla stops at a stop sign. Now if they can just fix that little bug that occasionally mows down innocent pedestrians crossing the street, well, then, it's perfect! Last I looked, that latter part is still against most government regulations. Right?
 
Tesla does NOT sell unsupervised self driving. Look at their website. It is only advertised as supervised self driving.

Sure, they finally tacked on the (Supervised) disclaimer, while still calling it Full Self Driving (Supervised). It's not called Supervised Self Driving or SSD, or even FSDS, it's just FSD.

I'm glad one of their lawyers finally talked a little sense into them, but it's too little too late. The previous marketing and advertising has already taken hold. In countless discussions and articles, it is called only FSD. Check this very article: seven references to "FSD", not a single one to "FSD (Supervised)".
 
I'm VERY curious as to the data comparisons between self driving vehicles and human driven vehicles (and cannot find any online that are concise enough - possibly due to the minimal dataset available yet).

How could there be any real data? It is not legal for Tesla vehicles to be used fully autonomously, and they are supposed to prevent such operation.

Potentially we could soon get some data from the Waymo type vehicles, but it's not really the same thing (they operate in specific pre-approved regions, under specific conditions, some/all may not even drive on highways yet, etc.)

I also think it'd be dangerous to look at broad averages. My sense is there's a pretty big difference between the safest human drivers and the least safe. We may already be at, or at least close to, the point where we'd all be better off if serious habitual drunk drivers had their cars exchanged for FSD-only Teslas. That doesn't mean we'd be better off if every driver on the road was using one.
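A toy calculation (all numbers invented purely for illustration) shows how a broad average can hide exactly this split:

```python
# Toy numbers, all invented, showing how a broad average can mislead.
# Crash rates are per million miles; "share" is the fraction of drivers.
groups = {
    "safe majority": {"share": 0.90, "rate": 2.0},
    "habitual drunk drivers": {"share": 0.10, "rate": 40.0},
}
fsd_rate = 5.0  # assumed: worse than the safe majority, far better than the worst

# Fleet-wide human average: 0.9*2.0 + 0.1*40.0 = 5.8
fleet_avg = sum(g["share"] * g["rate"] for g in groups.values())

# Swap ONLY the worst group into FSD-only cars: 0.9*2.0 + 0.1*5.0 = 2.3
after_swap = 0.90 * 2.0 + 0.10 * fsd_rate

# FSD "beats the average human" (5.0 < 5.8), yet putting EVERY driver in
# an FSD car would more than double the safe majority's rate (5.0 vs 2.0).
```

So the same system can be a clear win when it replaces the worst drivers and a net loss when it replaces the best ones, even though it "beats the average."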

Anyway, I'm a long-term believer in the incredible benefits this technology will one day bring to future generations. My beef is with those who try to imply, with a wink and a nod, that it's for sale today.
 
The only people who can’t see well enough to drive safely are those who lose their licenses because they’re going blind. In ANY situation, you can see well enough if you react properly. Otherwise, you’re doing something wrong.

Is that why the USA has those videos of endless car pile-ups in fog and whiteout conditions? You drive to the conditions; maybe some ***** will rear-end you.

You never had a good old boy with extra-powerful lights (maybe not legal) full-beam you with their trusty pickup? I stopped driving in Canada at night in my travels, as too many *****s weren't dipping their lights in that province.

As I stated, anyone who has done a goodly amount of driving has had their vision impeded. Trying to avoid it helps: the right tools (eg good driving glasses), choosing to delay driving into the setting sun on a rough road with possible potholes or just gravel, or in sudden dust or whiteout conditions.
Has your windscreen never been hit by a huge wash from a truck coming the other way? Most highways in NZ are not barriered multi-lanes. You are only a meter or two away from a vehicle going the other way at 60-plus mph on windy roads.

Yes, we have silly humans as well, setting off on a frosty foggy morning, saying the windscreen will defrost in time, or the car will demist, etc.

Going back to my motorbike days, I have avoided stuff I can't see simply by sound alone, ie accelerating away - I'm not wasting time looking around at the source of the noise and taking a hit, just dropping a gear and opening it up.

So I strongly doubt you have never been temporarily blinded, as has happened to me over a thousand times, or impaired, ie needing to follow the white line on the passenger side as a car comes at you on full beam on a rainy foggy night, or a sun/mirror strike hits my visor.
Plus humans rely on more senses than sight, eg hearing, the feel of the road/car (for issues), or smell if there's fire ahead. I always think cyclists/drivers wearing headphones with loud music are *****s, same as those who drive with the sun behind them, or on a dark day in heavy rain with no lights ("oh, I don't need my lights to see" - FN *****s).
 
The car states clearly when it has low visibility - it warns you clearly. People who trust a system that specifically tells them it cannot operate fully due to low visibility are just *****s.
 
I use my Cadillac Lyriq's SUPERCRUISE every single day. While there are phantom braking incidents - especially when surrounded by cars - which may confuse or "scare" the sensors, for the most part Supercruise has performed extremely well over 5000 miles of use. I regularly update surveys about it.

Tesla's Self Driving system has problems. For one: Elon took out the ultrasonic sensors and wanted to rely solely on cameras. He also doesn't like LiDAR claiming it's "too noisy".

True "Self Driving" needs every single sensory input it can get. It's the fusion of information that allows it to drive better - and that's a matter of programming.

FSD doesn't understand certain situations like school buses with their own stop signs, ice cream trucks (and why you should slow down), railroad crossings, or flooded areas.

When AI gets better, perhaps these situations will be overcome, but simply relying on cameras just isn't enough to make it safe.
 
Is that why the USA has those videos of endless car pile-ups in fog and whiteout conditions? You drive to the conditions; maybe some ***** will rear-end you.

You never had a good old boy with extra-powerful lights (maybe not legal) full-beam you with their trusty pickup? I stopped driving in Canada at night in my travels, as too many *****s weren't dipping their lights in that province.

As I stated, anyone who has done a goodly amount of driving has had their vision impeded. Trying to avoid it helps: the right tools (eg good driving glasses), choosing to delay driving into the setting sun on a rough road with possible potholes or just gravel, or in sudden dust or whiteout conditions.
Has your windscreen never been hit by a huge wash from a truck coming the other way? Most highways in NZ are not barriered multi-lanes. You are only a meter or two away from a vehicle going the other way at 60-plus mph on windy roads.

Yes, we have silly humans as well, setting off on a frosty foggy morning, saying the windscreen will defrost in time, or the car will demist, etc.

Going back to my motorbike days, I have avoided stuff I can't see simply by sound alone, ie accelerating away - I'm not wasting time looking around at the source of the noise and taking a hit, just dropping a gear and opening it up.

So I strongly doubt you have never been temporarily blinded, as has happened to me over a thousand times, or impaired, ie needing to follow the white line on the passenger side as a car comes at you on full beam on a rainy foggy night, or a sun/mirror strike hits my visor.
Plus humans rely on more senses than sight, eg hearing, the feel of the road/car (for issues), or smell if there's fire ahead. I always think cyclists/drivers wearing headphones with loud music are *****s, same as those who drive with the sun behind them, or on a dark day in heavy rain with no lights ("oh, I don't need my lights to see" - FN *****s).
In fog, there is a simple solution—drive slower. Driving into the sunset, you can look at a different angle to the lane lines (closer to your vehicle), and you can still see vehicles ahead (though it’s painful). Cameras do not get tired looking into a super bright light. In fact, there’s a reason there are 3 forward looking cameras on Teslas… They look at different distances and allow different levels of exposure.

And finally, you may forget, but computers have perfect memory. So if visibility is somehow completely obscured by something landing on your windshield, the only right solution is to pull over safely. Plus, the 3 rear cameras are in completely different locations. It won't be possible to obscure all three of them, and then pulling over is easy.
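As a rough illustration of why different exposure levels matter - this is a generic exposure-fusion heuristic, not Tesla's actual pipeline, which isn't public:

```python
# A generic exposure-fusion heuristic - NOT Tesla's actual pipeline, which
# isn't public. With one exposure, sun glare saturates pixels (255) and
# detail is lost; with several, you can keep the best-exposed reading.

def best_exposed(readings, target=128):
    """Pick the reading closest to mid-range, ie the best-exposed one."""
    return min(readings, key=lambda v: abs(v - target))

# The same scene point captured at three different exposure levels:
glare_pixel = [255, 255, 180]   # only the shortest exposure keeps detail
shadow_pixel = [90, 20, 5]      # only the longest exposure keeps detail

print(best_exposed(glare_pixel))   # 180
print(best_exposed(shadow_pixel))  # 90
```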
Sure, they finally tacked on the (Supervised) disclaimer, while still calling it Full Self Driving (Supervised). It's not called Supervised Self Driving or SSD, or even FSDS, it's just FSD.

I'm glad one of their lawyers finally talked a little sense into them, but it's too little too late. The previous marketing and advertising has already taken hold. In countless discussions and articles, it is called only FSD. Check this very article: seven references to "FSD", not a single one to "FSD (Supervised)".
There is no inconsistency. Before, it was always very clear that FSD was a feature coming out in the future. Now Tesla drivers actually have access to supervised self-driving, so the advertising is for the feature available now. The same unsupervised FSD feature is the one that's going to come out in the future.

Feel free to use the Wayback Machine and see for yourself.
 
In fog, there is a simple solution—drive slower. Driving into the sunset, you can look at a different angle to the lane lines (closer to your vehicle), and you can still see vehicles ahead (though it’s painful). Cameras do not get tired looking into a super bright light. In fact, there’s a reason there are 3 forward looking cameras on Teslas… They look at different distances and allow different levels of exposure.

And finally, you may forget, but computers have perfect memory. So if visibility is somehow completely obscured by something landing on your windshield, the only right solution is to pull over safely. Plus, the 3 rear cameras are in completely different locations. It won't be possible to obscure all three of them, and then pulling over is easy.

There is no inconsistency. Before, it was always very clear that FSD was a feature coming out in the future. Now Tesla drivers actually have access to supervised self-driving, so the advertising is for the feature available now. The same unsupervised FSD feature is the one that's going to come out in the future.

Feel free to use the Wayback Machine and see for yourself.

Some good points about multiple cameras. I still think with such a system there will always be downsides, and some YouTuber who hates Tesla will continue to show Tesla's cars running over mannequins (yes, they probably find edge cases, and not the same car).

Thing is, for dense cities we have had a solution for decades: ban all noncommercial vehicles, and restrict commercial ones to certain times, plus utility and emergency.

Just have cheap electric self-driving bubble cars that follow an electric grid and have some sensors for collisions - less noise, no need for parking spaces, hook to a phone app, less pollution, etc.
Only problem, and why we can't have nice things: vandals, messy eaters, etc.

Same for highways - could make self-driving cars safer by implementing some simple extra tech.

Too much 1984, but if every car had a locator beacon, or cars could talk to overhead street cameras...

Try, as a human, to pull out into a fairly fast stream of traffic in a sedan when huge SUVs are parked on the corner blocking the view. TBF, a self-driving car may have a very forward camera exactly for this; I tend to look through the windows of SUVs either beside me or parked.
The USA has a lot of traffic lights, so maybe it's less of a problem.

Plus automated bots can be made to mill around like headless ants when they get congested and stopped.

The other problem is pranksters would just play tricks on these cars - something that would not stop a human, but the car thinks something is there that is not, or something there is not there, ie like people hacking AI scene recognition - but again, why we can't have nice things.
 
I didn't say they weren't trying. I said that is not something tech companies have ever aimed for.
I also said FSD has to be as close to perfect as possible.

Please pay attention in the future so I don't have to correct you again.
You're funny, because you didn't correct me on anything :joy: Close-to-perfect driving IS something they are aiming for. Please provide proof that they aren't aiming for or aspiring to near-perfect vehicles. I already provided proof that they are.

FSD will be as close to perfect as possible, eventually. It's impossible to get there without real-world driving experience, just as it's impossible for us to drive as perfectly as possible without sticking 15-year-old kids behind the wheel of 2-ton vehicles. But with supervised self-driving vehicles, you have a licensed adult able to override mistakes. There are minimal FSD-caused car accidents despite there being hundreds of thousands of supervised FSD Teslas on the road.
 
That IS something they are aiming for. Please provide me proof that they aren't aiming for or aspiring to have near perfect vehicles.
Ok Pluto you are a good guy, but something here is way off.
In my first post all I said was
"This has to be damn near perfect, and that's not something they have ever aspired to."
Would your argument be that the tech sector is known for releasing quality software and hardware?

And in my last post I said this in the post YOU quoted:
"I didn't say they weren't trying. I said that is something tech companies have ever aimed for."

Please put up what I said that made you think I don't believe they are trying to perfect FSD.
 
This is one market where AI might actually be the solution. Due to the complexity, the code base is so large that no human, or even group of humans, has much of a clue anymore of what does what. At which point AI's black-magic mystery-box approach might actually end up safer.

Or perhaps it'll require both: humans largely write the code but need AI assistance to figure out why certain things work the way they do.
 
LOL... retaliation. People being burned alive in Teslas didn't prompt an investigation, but Elon's political stance did. Ironic.
 
The Govt. regulators would be wise to simply put a total ban on Tesla's assisted driving until it can be proven perfect. They should order Tesla NOT to add it to any of their cars until that time .....
 
I've been saying since the beginning of their FSD journey that optical cameras alone are insufficient to replace a human. They need additional sensor data, whether that is from LIDAR, thermal imaging, sonar, whatever... they need more information to feed to the computer. They were at one point using radar as well, but radar doesn't provide a full picture of surroundings. Thermal sensors have dropped a lot in cost in the last 10 years, so they could probably add those, which could easily see pedestrians through adverse atmospheric conditions and are not affected by sun flare effects. Since people's lives are at stake, I'd also say add LIDAR. If you have two out of three sensors agreeing there is a person crossing the road, the software should be slowing the car.
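That last idea - slow down when at least two of three sensors agree - can be sketched as simple majority voting. The function and inputs below are hypothetical; real systems fuse probabilistic detections rather than booleans:

```python
# Sketch of the 2-of-3 sensor voting idea. Function and inputs are
# hypothetical; real systems fuse probabilistic detections, not booleans.

def should_brake(camera: bool, lidar: bool, thermal: bool) -> bool:
    """Slow the car when at least two independent sensors agree."""
    return sum([camera, lidar, thermal]) >= 2

# Sun glare blinds the camera, but lidar and thermal still see the person:
assert should_brake(camera=False, lidar=True, thermal=True)
# One sensor's false positive alone does not trigger braking:
assert not should_brake(camera=True, lidar=False, thermal=False)
```

The appeal of voting across sensors with different failure modes is that glare, fog, and dust rarely blind a camera, a lidar, and a thermal imager at the same time.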
 