In other words, I agree that it's better for society, but that "better for society" isn't the metric that gets used for making decisions within the system.
A significant consideration is whether owners of self-driving cars will have the right to make code modifications to their vehicles.
Captured aptly here by Cory Doctorow.
https://www.theguardian.com/technology/2015/dec/23/the-probl...
Edit: salient quotes from the article:
"Here’s a different way of thinking about [the trolley] problem: if you wanted to design a car that intentionally murdered its driver under certain circumstances, how would you make sure that the driver never altered its programming so that they could be assured that their property would never intentionally murder them?"
"If self-driving cars can only be safe if we are sure no one can reconfigure them without manufacturer approval, then they will never be safe."
"Your relationship to the car you ride in, but do not own, makes all the problems mentioned even harder."
This gives me weird visions of Google engineers wearing necklaces that explode in the event that one of their cars causes an accident :S
Unremovable, remote-controllable lethal necklaces are a central plot device during the mercenary invasion. They are put on randomly chosen civilians as "collateral" to ensure co-operation and disincentivize insurgency.