1) Where do you draw the line? 2) At what number does that one become two? 3) How long do you think it will be before AI is justified in starting to kill those single-digit numbers of people?
4) What if that one person is you? (This is not that hard to imagine: suppose a fictitious near future where everyone who contributed to some extinction event is deemed killable: AI development, global warming, failing to recycle, etc.)
(not talking about this conflict in particular, just making an abstract point)
Well, the line would be at the point where you are causing more deaths than you are preventing.
Would you rather have a larger number of people die?
> What if that one person is you?
What if you are among the people whose lives would be saved, and that number is much larger?
That argument actually works in favor of the option that saves the most lives.
There is no neutral decision here. If you choose not to save the much larger group of people, those people are dead.
So your only choice is to pick which group of people will die. My preference is to make that number as small as possible. But if you would rather that number be larger, with more people dying, that requires some explanation.