If that is its best then FSD has no business being on the road.
It was erratic and didn't clearly signal its intent to the other cars around it.
I'm not defending Tesla's FSD, which I think is inferior because of Tesla's reluctance to use LIDAR in addition to cameras. Current car AIs are imperfect; they need as much information as they can get if we expect them to drive more safely than humans.
That's not to excuse anyone, but it may have been a contributing factor.
It put on its hazards and gradually slowed to a stop. Nothing fast or abrupt. Easy for any driver paying attention to recognize and be ready for.
It was nothing like what happened in this video.
Personally, I doubt this was the car trying to stop safely due to driver inattention. It seems more likely an Autopilot/FSD bug, or phantom braking all the way to a stop at the worst possible time.
Or terrible driving by the driver, if they were actually in full control and are lying about FSD.
We were going past a bicyclist in their lane, with appropriate clearance. I had Autopilot engaged. As we passed the cyclist, they moved over towards us slightly, but never left the bike lane.
The car suddenly braked to a full stop, and it was a pretty quick stop.
I can understand that the car may have thought the cyclist was going to cut in front of it, but it wasn't a scenario in which a human would have slammed on the brakes. My wife commented that we're lucky there weren't cars behind us, because they might not have reacted as quickly.
To me, this shows the potential danger of these safety systems' false positives. That said, I'll continue to use them, because I believe these systems make me safer overall than driving without them.
But as always, the danger isn’t the car I’m in, it’s the one driven by another driver who’s not paying attention.
If the sole problem was that the driver wasn't paying attention, the least it could have done would have been to put on the hazards and very slowly come to a stop. It would have been even better if it simply continued until there was a safe shoulder to stop on. The chance of having an accident by stopping on the bridge is much higher than just driving on autopilot to the end of the bridge and then pulling over.
If the problem is that this would create a moral hazard where drivers stopped holding onto the steering wheel when on bridges with no shoulder, the car could play loud obnoxious noises inside the car when this happened.
I was watching the video, and the whole time I kept thinking that if I saw my car slowing down for no reason I would have immediately taken control (my Autopilot, not FSD, has slowed down, "ghosted," before). Why didn't the driver do that?
> the car could play loud obnoxious noises inside the car when this happened.
I'll be interested to hear what happened there, but I know that if you don't toggle the wheel or take action while on Autopilot, the car will flash blue and beep at you to take control, and if you keep ignoring it, it will disengage Autopilot until you stop and park the car.
I haven't used "FSD" much at all, so I'm not 100% familiar with it, but there seems to be a host of problems here, and the software powering "FSD" is only one of them. It saddens me, though, that people aren't asking the best question here, which is: why are we driving everywhere in the first place?
Which is erratic and reckless.
If it was an emergency situation it should have put on hazard lights, waited until all surrounding cars passed, then changed lanes and very slowly decelerated until it stopped.
All while aggressively alerting the driver to what is happening.
Interesting: because a computer was involved, we are incensed. "No business being on the road"... but for the millions of other car crashes, it's "oh well, humans make mistakes!"
Tesla's claims of being "safer than human drivers" are as much a testament to their technology as they are to how poor human drivers are.