While I do agree that making software better and more reliable is a good goal, I believe we would be better off making the system as a whole more robust: the system that includes humans. For every situation where a piece of software has control of something that affects society (an individual, a group, etc.), there should always be a clear and direct means of appealing or pushing back on the decision that was made. Any such appeal should involve a human reviewing the information and making a decision based on that information, not on what the computer said. There's thread after thread of us saying the exact same thing about companies like Google and Facebook; it should apply as a general rule.