From a PR perspective it's a definite "no". Even a single self-driving fatality would lead to global news headlines and a lawyer feeding frenzy.
But from a protecting people perspective? If making it "live" quicker would speed development and adoption... then maybe.
E.g. suppose the first 5 years of adoption increase fatalities to a net 1.1x. After that the tech is actually dialed in and fatalities drop to 0.5x. Over a 10-year window at roughly today's baseline (~40K road deaths a year in the US), holding off gives you 400K fatalities whereas pushing earlier adoption gives you 320K. That would be 80K lives saved. (Under the assumptions of this shoot-from-the-hip model.)
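For anyone who wants to poke at the arithmetic, here's the same shoot-from-the-hip model as a few lines of Python. The ~40K/year baseline and 10-year window are assumptions on my part; they're just the numbers that make the 400K/320K figures above come out.

```python
# Back-of-envelope model of early vs. delayed self-driving adoption.
# Assumed inputs (not hard data): ~40K annual road fatalities, 10-year window.
BASELINE = 40_000   # assumed annual fatalities at the status quo
YEARS = 10          # assumed comparison window

# Holding off: baseline fatalities for the whole window.
hold_off = BASELINE * YEARS

# Early adoption: 5 rough years at 1.1x, then 5 dialed-in years at 0.5x.
early = 5 * BASELINE * 1.1 + 5 * BASELINE * 0.5

print(hold_off)               # 400000
print(int(early))             # 320000
print(int(hold_off - early))  # 80000 lives saved under these assumptions
```

Obviously the multipliers and the crossover year are doing all the work here; shift "dialed in" from year 5 to year 8 and the gap shrinks fast.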
It's more specific than that. The real danger is the first self-driving crash caused by a mistake that would be glaringly obvious to a human driver. Like the Tesla crash where the sensors mistook a white semi for clouds and merrily plowed through it.
If the tech is adopted too soon, it might face a death knell if computer-assisted driving proves to be much safer than full autonomy. It's much easier to fill in a human's blind spots than to replace the driver entirely.