The idea (I assume) is to let compilers optimize away loops without having to prove that they terminate. I.e. the rule is aimed mainly at loops that do in fact terminate; it just spares compilers from proving it.
Grandparent presented this as a "surprising optimization" but I'd argue it's exactly what you'd expect when a compiler sees while(expression) -- he just happened to pick a trivial expression.