People realised that you maximise the throughput of the raw medium by pushing it hard enough that it has a high error rate, and then correcting those errors.
Error-correction technology has essentially reached the information-theoretic limit (if you ignore latency), so it always makes sense to use as much of it as possible. In the future I wouldn't be surprised to see raw bit error rates at close to 1:1 odds, with massive amounts of FEC on top.
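The trade-off can be sketched with Shannon's capacity formula for a binary symmetric channel, C = 1 - H(p): even a channel with a double-digit raw error rate retains a useful fraction of its capacity, so a faster-but-noisier link can carry more corrected throughput than a slower clean one. The raw rates and error rates below are made-up illustrative numbers, not measurements of any real link.

```python
import math

def h2(p):
    # Binary entropy function, in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Shannon capacity of a binary symmetric channel with crossover
    # probability p, in information bits per raw channel bit.
    return 1 - h2(p)

# Toy model: driving the medium harder raises the raw bit rate but also
# the bit error rate. Net throughput = raw_rate * capacity.
for raw_rate, ber in [(1.0, 1e-6), (2.0, 0.01), (4.0, 0.11)]:
    print(f"raw rate {raw_rate:.1f}, BER {ber:g}: "
          f"max corrected throughput {raw_rate * bsc_capacity(ber):.3f}")
```

The fast, noisy configuration wins: at an 11% raw error rate the channel still keeps about half its capacity, so quadrupling the raw rate roughly doubles the corrected throughput. (At a true 1:1 error rate, p = 0.5, capacity falls to zero, which is why the raw error rate can only ever approach that point.)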