I think this is a really good perspective. Considering how often drivers are already doing things like using smartphones behind the wheel of non-self-driving cars, I think that sort of activity is only magnified by partial autonomy - which is very dangerous! Humans get distracted or bored easily, especially when completing routine tasks. I'm glad that Google is choosing to build a car that never needs human intervention rather than rushing to market with a partial solution.
Here's a video where you can see what distracted teen drivers look like. Terrifying. http://youtu.be/SDWmwxQ_NnY
On the other hand, it's easy to see why auto manufacturers and others are uninterested in an all-or-nothing goal that is likely to be decades away: they want incremental features they can sell in the interim.
Of course, their challenge is figuring out which incremental approaches work, given that humans stop paying attention once you reach a certain level of automation. Perhaps you enable full automation only in scenarios where it works reliably--say, freeways in certain weather conditions--and where it's legally allowed. (Though I suspect the first step is that people will engage "autopilots" and go ahead and play with their phones--even though they're not supposed to--given that many already do that today.)
Cruise (YC 14) is just scary. They still have that advertising video online [2] that totally oversells what they can do. All they actually have is lane-keeping and adaptive cruise control, like the other entry-level systems. It's automated driving from the "move fast and break things" crowd; they come from web and app startups.
Google is being cautious and testing heavily. But they're spending enough money to test fast, with many cars on the test track. That's the auto industry way of doing things. It takes money, but not decades, to get it right.
The CEO of Volvo has the liability issue right: when the car is in autodrive, the manufacturer is responsible. If you can't accept that, you shouldn't be building these systems.
[1] http://www.cnet.com/roadshow/news/tesla-autopilot-fail-video... [2] http://www.getcruise.com/
> At the end of the shift, the entire log is sent off to an independent triage team, which runs simulations to see what would have happened had the car continued autonomously. In fact, even though Google’s cars have autonomously driven more than 1.3 million miles—routinely logging 10,000 to 15,000 more every week — they have been tested many times more in software, where it’s possible to model 3 million miles of driving in a single day.