> Racist outcomes are still racist regardless of whether there's a guy in Klansmen robes at the steering wheel or not.

Yes, I understand what systemic racism and implicit bias are; your condescending snark is appreciated.
Anyway, it's not racist, because the result is not the product of implicit bias or systemic racism; it's a software bug that would have been possible no matter who was working on the software. As I wrote in another comment: the whole point of ML is to adapt to what is effectively an unbounded set of inputs, so pretty much by definition there will be cases where even a team of 100% black engineers will train a model that, given the right input, fails in ways that particularly affect black people.