Why are people so committed to the idea that self-driving cars are anywhere near human standards? It just seems like a groundless assertion of faith to me.
Professional drivers can go for a million miles without an accident, and I don't believe anyone's autonomous driving software can get within an order of magnitude of that without a disengagement, even in favorable conditions.
You may disagree that a disengagement is equivalent to a human having an accident, but I strongly feel that it is. In either case, the driver, whether human or software, reached a point where it was definitively unable to determine an appropriate next action on its own.