Cars are killing people by the millions now, so do the math: I'm willing to tolerate a lot of deaths in the process, contingent on those deaths actually helping researchers and engineers reduce future faults. I would happily take a million deaths right now if it meant driverless tech instantly became available to everyone, though I think there's an upper limit to how many accidents can realistically be examined at once.
What's the alternative position? That you're for more road deaths in total? Are you also the kind of person who wouldn't pull the lever in the trolley problem?
For the sake of the discussion, let us assume that this is hyperbole and that "happily" was an infelicitous word choice. That said, it raises a few questions:
- Uber and the other companies are presumably developing this technology for their own private benefit and do not plan to make it "available to everyone". In that case, how many deaths are acceptable?
- Should they succeed in developing a safe autonomous driving system, should they be compelled to make it "available to everyone"?
- Are these million doomed citizens volunteers with informed consent or are they to be struck down unawares at random?
- What if the technology proves more difficult to develop than you anticipate? Suppose that after one million are sacrificed for the greater good, the technology is improved but not yet good enough. Should we then continue with another million, or abandon the project after only one million fruitless deaths?
- Assuming success, should we then ban human-driven vehicles completely?
[edited for formatting]
Developing and testing the technology in actual test scenarios (build entire testing cities and fill them with paid stunt performers, if necessary) instead of in public, until you're able to prove that the tech is statistically at least as safe as human drivers. After that, you can continue testing in public, provided that you handle any changes to your system responsibly and ensure that the public is not exposed to additional danger because of your testing.
This is actually very common in the software development world. For production-critical systems, companies go to great lengths to create a staging environment that is as realistic as possible but fully separated from the production systems. You may get away with skipping that effort if the only damage you can do is preventing people from seeing stupid cat pictures on a social network. But sorry, for life-critical tech, the move-fast-and-break-things approach is irresponsible bullshit.
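To make the staging/production separation concrete, here's a minimal sketch of the kind of guard such setups use. All names (`APP_ENV` semantics, hostnames, the migration function) are hypothetical illustrations, not any particular company's setup: the point is simply that experimental code is wired so it *cannot* reach production, rather than relying on developers remembering not to point it there.

```python
# Hypothetical sketch: route work to an isolated staging environment and
# hard-refuse to run experimental changes against production hosts.

PRODUCTION_HOSTS = {"db.example.com"}  # assumed production host list


def database_url(env: str) -> str:
    """Return the connection string for the given environment.

    Anything not explicitly marked "production" falls back to an
    isolated staging replica, so the unsafe path must be opted into.
    """
    if env == "production":
        return "postgres://db.example.com/app"
    return "postgres://staging-db.internal/app"


def run_risky_migration(url: str) -> str:
    """Apply an untested change, but only to non-production targets."""
    if any(host in url for host in PRODUCTION_HOSTS):
        raise RuntimeError("staging only: refusing to touch production")
    return f"migration applied to {url}"
```

The analogy to the parent comment: the "testing city" is the staging replica, and the guard is the fence around it; only once changes survive there do they graduate to the real system, under separate controls.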
> I would happily take a million deaths right now if it meant driverless tech instantly became available to everyone
What if your own death is guaranteed to be among them? Still "happily taking" it?