Wrap all of this with an "IN MY OPINION"...
That would make things worse because we'd make the same mistakes again. I've been on many start-over projects (Xeon Phi, for example, threw out the P4 pipeline and went back to the Pentium's U/V pipes). It doesn't work. You know what the most robust project I've worked on is? The instruction decoder on Intel CPUs. The validation tests go back to the late 1980s!
You make progress by building on top and fixing your mistakes because there literally IS NO OTHER WAY.
Go read about hemoglobin. It's one of the oldest genes in the genome, used by everything that uses blood to transport oxygen, and it is a GIGANTIC gene, full of redundancies. Essentially, a billion years of evolution accreted one of the most robust and useful genes in our body, and there's nothing elegant about it.
I think that's where we are headed. Large systems that bulge with heft but contain so much redundant checking code that they become statistically more robust.
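A back-of-the-envelope sketch of why redundancy pays off statistically (my own toy model, not anything from the original): if each of n independent checks catches a given fault with probability p, the fault only escapes when every check misses it, so the escape probability is (1 - p)^n and shrinks geometrically as you pile on more checks.

```python
def escape_probability(p: float, n: int) -> float:
    """Probability that a fault evades all n independent checks,
    where each check catches it with probability p."""
    return (1 - p) ** n

# Even mediocre checks (50% catch rate each) compound quickly:
for n in (1, 3, 10):
    print(n, escape_probability(0.5, n))
# 1  -> 0.5
# 3  -> 0.125
# 10 -> 0.0009765625
```

The independence assumption is doing a lot of work here (correlated checks help far less), but it's the basic reason accreted, redundant validation can beat one elegant gatekeeper.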