I apologize for my tone; it was more patronizing than I had intended it to be.
> The craziness with undefined behavior is a fairly recent phenomenon. In fact, I started programming in C before there even was a standard, so all behavior was "undefined", yet no compiler manufacturer would have dreamed of taking the liberties that are taken today.
I suspect the current renewed focus on optimizing compilers was born out of the general slowing of Moore's law and the stagnation of hardware advances, combined with improvements in program analysis borrowed from other languages. Just my personal guess as to why.
> Exactly: "matches what you are trying to do". The #1 cardinal rule of optimization is to not alter behavior. That rule has been shattered to little pieces that have now been ground to fine powder.
The optimizing compiler has a different definition of "altering behavior" than you do. If you want something that follows exactly what you wrote, write assembly; that's the only way to guarantee that the code you wrote is the code that executes. A close, though imperfect, approximation is compiling C at -O0, which matches the behavior of older compilers: generate assembly that looks basically like the C you wrote, and perform little to no analysis on it. Finally, there are the optimization levels, where you tell the compiler to make your code fast; in return, you promise to follow the rules of the standard. If you hold up your side of the bargain, the compiler will hold up its own: it will make fast code that doesn't alter your program's visible behavior.