This largely derives from C developers believing they understand the language because they think of it as a loose wrapper around assembly, rather than as an abstract machine with specific rules and requirements.
Within the bounds of the abstract machine described by the standard, things like signed integer overflow, strict-aliasing violations, etc., don't make any sense; they are intuitively undefined. If C developers actually read the standard instead of pretending C describes the 8086 they learned in undergrad, they wouldn't be worried about the compiler doing "sane" things with their unsound code, because they wouldn't gravitate toward that code in the first place. No one thinks dereferencing an integer makes sense, and no one accidentally writes that code, because it intuitively doesn't work even in misguided internal models.
This doesn't solve problems like buffer overflows, of course, which are much more about the logical structure of the program than its language rules. For that style of logical error there's no hope for C in the general case, although static analyzers help.