This is why Valgrind, ASan, and friends exist: they move the error diagnostic to the place where the error actually happened.
If your C++ program exhibits undefined behaviour, the compiler is allowed to format your entire hard drive. Or encrypt it and display a "plz pay BTC" message. That's called a vulnerability. Real and meaningful security checks have been removed as "dead code" because of signed integer overflow (which is undefined behaviour by default).
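To make that concrete, here's a minimal sketch of the pattern (hypothetical function name): the overflow guard itself relies on signed overflow, so an optimiser that assumes UB never happens can prove the branch unreachable and drop it:

```c
/* Sketch: intended to return -1 instead of overflowing at INT_MAX.
 * But `x + 1 < x` can only be true via signed overflow, which is UB,
 * so the compiler may treat the condition as always false and delete
 * the guard along with the "security check" it implements. */
int checked_increment(int x) {
    if (x + 1 < x)
        return -1;
    return x + 1;
}
```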
If anything, I would guess the gross misunderstanding sprouted somewhere between the specs and the compiler writers. Originally, UB was mostly about bailing out when the underlying platform couldn't handle a particular case, or about explicitly ignoring edge cases to simplify implementations. Now, however, it's also a performance thing: anything marked as UB is fair game for the optimiser, even if it could easily be well defined, like signed integer overflow on 2's complement platforms.
No, it isn't. That's a complete fabrication. And if you had a compiler that was going to do that, then what the standard says, or whether there's undefined behavior, is obviously not relevant in the slightest.
The majority of UB optimization complaints arise because the compiler couldn't tell that UB was happening. It didn't detect UB, let out an evil laugh, and go insane. That's not how this works.
Compilers cannot detect UB and then do things in response within the rules of the standard. Rather, they are allowed to assume UB doesn't happen. That's it, that's all they do. They just behave as though your source has no UB at all. As far as the compiler is concerned, UB doesn't exist and can't happen.
When a compiler can detect that UB is happening, it'll issue a warning. It never silently exploits it.
> Real and meaningful security checks have been removed as "dead code" because of signed integer overflow (which is undefined behaviour by default).
Real and meaningful security checks have been removed because the security check happened after the values were already used in specific ways, not because of UB. The earlier usage had already specified, in the source code, what those values could be. UB is just a shield for developers to hide behind so they can avoid admitting they wrote a bug.
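That pattern looks like this (hypothetical names, sketch only): the value is dereferenced before it's validated, so the compiler concludes the pointer can't be null and the "security check" is provably dead:

```c
struct packet { int len; };

/* Sketch: p is dereferenced before it is checked. If p were NULL,
 * the dereference would already be UB, so the compiler may assume
 * p != NULL and remove the check below as dead code. */
int packet_len(const struct packet *p) {
    int n = p->len;     /* use comes first               */
    if (p == NULL)      /* check comes too late: removed */
        return -1;
    return n;
}
```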
Use UBSan next time.
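For the record, here's a minimal program UBSan flags at runtime (build with `cc -fsanitize=undefined demo.c`; the exact diagnostic text varies by compiler):

```c
#include <limits.h>
#include <stdio.h>

int main(void) {
    volatile int x = INT_MAX;   /* volatile so nothing is folded away  */
    printf("%d\n", x + 1);      /* signed overflow: UBSan reports it   */
    return 0;
}
```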
> even if it could easily be well defined, like signed integer overflow on 2's complement platforms.
Signed integer overflow is defined behavior, that's not UB. Also, platform-specific behavior is something the standard doesn't define; that's why it was UB in the first place.
It is kinda ridiculous it took until C++20 for this change, though.
> No, it isn't. That's a complete fabrication.
Ever heard of viruses exploiting buffer overflows to achieve arbitrary code execution? One cause of that can be a clever optimisation which noticed that the only way the check could fail is if some UB had already happened. Since UB "never happens", the check is dead code and can be removed. And if the compiler notices this after it has moved past the error-reporting stage, you may not even get a warning.
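A well-known instance (hypothetical function name, sketch only) is a wraparound guard on pointer arithmetic. Forming a pointer past the end of the object is UB, so the compiler is entitled to assume the guard is always false:

```c
#include <stddef.h>

/* Sketch: checks whether len bytes starting at buf stay within
 * [buf, buf_end]. The first test is meant to catch pointer
 * wraparound, but out-of-bounds pointer arithmetic is UB, so an
 * optimiser may assume `buf + len < buf` is always false and
 * delete the guard entirely. */
int in_range(const char *buf, const char *buf_end, size_t len) {
    if (buf + len < buf)
        return 0;                 /* "never happens" per the optimiser */
    return buf + len <= buf_end;
}
```

The portable fix is to compare lengths instead of forming out-of-range pointers, e.g. `len <= (size_t)(buf_end - buf)`.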
You still get the vulnerability, though.
> UB is just a shield for developers to hide behind so they can avoid admitting they wrote a bug.
C is what it is, and we live with it. Still, it would be unreasonable to deny that the amount of UB it harbours is absolutely ludicrous. It's like asking children to cross a poorly mapped minefield and blaming them when they miss a subtle cue and blow themselves up.
Also, UBSan is not enough. I ran some of my code under ASan, MSan, and UBSan, and the TIS interpreter still found a couple of things. And I'm talking about pathologically straight-line code where, once you test all input sizes, you have 100% code path coverage.
> Signed integer overflow is defined behavior, that's not UB.
The C99 standard explicitly states that left shift is undefined on negative integers, as well as on signed integers when the result overflows. I had to work around that one personally by replacing x << n with x * (1 << n) in carry propagation code.
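For reference, the workaround looks something like this (sketch; the multiplication is only defined when the mathematical result fits in the type):

```c
#include <stdint.h>

/* Sketch: shift a possibly-negative value left without invoking the
 * C99 UB on `x << n` for negative x. Assumes 0 <= n <= 30 so that
 * (int32_t)1 << n is representable; still UB if x * 2^n overflows. */
static int32_t shift_left(int32_t x, unsigned n) {
    return x * ((int32_t)1 << n);
}
```

In practice the compiler turns the multiply back into a single shift instruction, so it costs nothing.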
Strangely enough, I cannot find an explicit mention of signed integer overflow for the regular arithmetic operators, but apparently the C++ standard has one: https://stackoverflow.com/questions/16188263/is-signed-integ...
> Also, platform-specific behavior is something the standard doesn't define; that's why it was UB in the first place.
One point I was making is that compiler writers didn't get that memo. They treat any UB as fair game for their optimisers. It doesn't matter that signed integer overflow was made UB for portability reasons; it still "never happens".