For the same reason that when a human pilot makes a mistake, it's treated as one pilot's mistake, but when an autopilot malfunctions, every plane of that type is grounded until the issue is found and fixed. Machines can't merely be as good as humans; they have to be close to perfect when human life is involved.
As another example, imagine a radiotherapy machine that, when operated manually, randomly kills 1 in 10,000 patients, but when operated by AI, randomly kills 1 in 100,000. I'm certain that even though that's a 10x improvement over a human operator, it still wouldn't be allowed on the market.