I believe the answer is far more complicated than it seems, and in practice having the cars stay still might have been the safest option all of the parties could agree on (Waymo, the city's traffic authorities, state regulators, etc.).
There are people thinking this stuff out, and those cars can 100% pull over automatically, but an explicit choice was made not to do so, for safety.
Look, I like Waymo. I think they're neat and I trust them far more than any of the other companies. But in my mind, being able to handle stuff like this is just a requirement for being on the roads in any non-trivial number. If they had two vehicles and this happened, then OK, that's a problem, but it was two vehicles in an entire city.
When you have enough on the road that you can randomly have six at one intersection you should absolutely be able to handle this by then.
I want them to do well. I want them to succeed. But, just like with airliners, this is the kind of thing where people's safety comes first.
What we saw happen looks like the safety of the Waymo and its passengers was put above everyone else's, despite there being no need for that. There are certainly some situations where just staying put is the best decision.
"The power went out and there are no other hazards on the road" is not one of them. They made things worse for everyone else, on average, in a foreseeable situation where it was totally unnecessary. And that's not OK with me.
This feels like the kind of thing that absolutely should've been tested extremely well by now, before they were allowed to drive in large volumes.
One driver doesn’t know how to handle a power outage? It’s not news. Hundreds of automated vehicles all experience the same failure? National news.
I wish the Waymos handled it better, yes, but I think that the failure state they took is preferable to the alternative.
Imagine a model that works really well for detecting cars and adults but routinely misses children; you could end up with cars that are 1/10th as deadly to adults but twice as deadly to children. Yes, in this hypothetical it saves lives overall, but is it actually a societal good? In some ways yes; in other ways it should never be allowed on any road at all. It's one of the reasons aggregated safety metrics deserve so much scrutiny.
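The hypothetical above is just arithmetic, and it's easy to check. A minimal sketch (all rates invented for illustration, per some fixed exposure like a billion miles) showing how the aggregate metric improves while the child subgroup gets worse:

```python
# Hypothetical, made-up fatality rates per unit of exposure.
human = {"adults": 100.0, "children": 10.0}

# The hypothetical model: 1/10th as deadly to adults, 2x as deadly to children.
model = {"adults": human["adults"] / 10, "children": human["children"] * 2}

def total(rates):
    """Aggregate fatality rate across all groups."""
    return sum(rates.values())

print(total(human))                          # 110.0
print(total(model))                          # 30.0 -> aggregate improves a lot...
print(model["children"], human["children"])  # 20.0 10.0 -> ...while children fare worse
```

The aggregate drops from 110 to 30, which looks like an unambiguous win, yet the children's rate doubles. That's exactly the kind of distributional shift a single headline number hides.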
Whatever, it happens.
This was a (totally unintentional) coordinated screw-up causing problems all over the city, as opposed to in one small spot.
The scale makes all the difference.
How many non-Waymo accidents happened at intersections during this time? I suspect more than zero, given my experiences with other drivers when traffic lights go out. Apparently Waymo's number is zero, so humans are gonna lose this one.
The problem here is that safety and throughput are at odds. Waymo chose safety while most drivers chose throughput. Had Waymo been more aggressive and gotten into an accident because it wouldn't give way, we'd have headlines about that, too.
The biggest obstacle to self-driving is the fact that a lot of driving consists of knowing when to break the law.
Did they? They chose their own safety. I suspect the net effect of their behavior made everyone else's safety worse.
They did such a bad job of handling it that people had to go around them, making things less safe.
We know what people are like. Not everyone is OK doing 2-3 mph for an extended time while waiting for a Waymo to feel "safe".
Operating in a way that causes large numbers of other drivers to feel the need to bypass you is fundamentally worse.
So, you're saying Waymo can't handle a regular 4 way stop sign given how everyone else on the road drives? That's not a problem?
It's all a careful risk calculation: those self-driving cars need to determine whether it's safe to continue through an intersection without the traffic lights their computers spent millions of hours training on (likewise with humans). That's a tough choice for a highly regulated/insured company running thousands of cars.
If anything, their programming should only take such a risk to move out of the way for a fire truck/ambulance.
Why would they do that? It's a hive, isn't it?
Self-driving cars should (1) know how to handle stops, and (2) know that the rules for a failed traffic light (or one flashing red) are those for an all-way stop.
Humans, luckily, never follow the rules to the letter, which is what made it reasonable to write the rules down like this: some will be more impatient/aggressive, others will establish eye contact and wave one another through, etc.
In a situation like this where you've got "drivers" who can't collaborate and learn on the spot, the rule does not make sense.
The first-arrived rule (which applies before yield-to-the-right) is usually unambiguous in a traffic-jam situation, since first position will also be the position where the last car went least recently, and everyone at the intersection will have been close enough to see the prior cycle.
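The two rules described above (first-arrived goes first; simultaneous arrivals yield to the right) can be sketched mechanically. This is a toy illustration, not Waymo's actual logic; the `Vehicle` type and the convention that approaches are numbered clockwise (so the approach to the right of `d` is `(d + 1) % 4`) are assumptions made up for the example:

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    approach: int   # 0..3, numbered clockwise around the intersection (assumed convention)
    arrived: float  # arrival timestamp at the stop line

def next_to_go(waiting):
    """Pick which waiting vehicle may proceed at an all-way stop.

    Earliest arrival wins; among simultaneous arrivals, a vehicle with no
    waiting vehicle on its right proceeds.
    """
    earliest = min(v.arrived for v in waiting)
    tied = [v for v in waiting if v.arrived == earliest]
    if len(tied) == 1:
        return tied[0]
    occupied = {v.approach for v in tied}
    for v in tied:
        if (v.approach + 1) % 4 not in occupied:  # nobody on the right
            return v
    return None  # all four arrived at once: the rules don't decide; humans negotiate

# Two simultaneous arrivals: the one with an empty approach on its right goes.
a = Vehicle(approach=0, arrived=1.0)
b = Vehicle(approach=1, arrived=1.0)
print(next_to_go([a, b]).approach)  # 1
```

Note the `None` case: a four-way simultaneous arrival is exactly the situation where the written rules bottom out and humans resolve it with eye contact and waving, which is the point being made above.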
Just pulling over and getting out of the way really would help. There's no reason a human couldn't do the same safely. Not beta testing your cars on public roads would really be ideal, especially without human drivers ready to take over.
Unfortunately HN is still not ready for that discussion despite the year being 2026 in a few days.
> but an explicit choice was made not to do so for safety.
You know this how?
However, even if that's not true, if they have more cars than human drivers there's going to be a problem until they work through the queue. And the bigger that ratio, the longer it will take.