UB is simply the latest stick to hit C with. In day-to-day working nobody worries about UB at all as you generally don't notice it.
Same with lack of a GC; this is a plus point for C for most applications, not a negative.
(and yes, I do have plenty of experience in it, I've been using it for the past 20 years, have you?).
I agree that in day-to-day working nobody worries about it, and I see odd behaviors all the time because of it. In particular, as fewer applications are written in C, a much higher fraction of C code is systems and embedded work, where you are more likely to accidentally run afoul of choices that were made to compete with FORTRAN on numerical performance.
A read from NULL will crash on most unixen, but it will not crash on some targets without an MPU, or when running in kernel mode, so the bug may be left lurking (see the Linux kernel).
The C89 aliasing rules in particular are completely at odds with a lot of kernel and device driver code. In addition, code written where int was 2 bytes can hit signed overflow when recompiled where int is 4 bytes, because the usual arithmetic conversions now promote 16-bit unsigned types to signed int, where before the behavior was well defined:
UINT2 x; // 16-bit unsigned integer
...
x *= x; // arithmetic mod 2**16 where int is 16 bits; where int is 32 bits,
        // x promotes to signed int and the multiply overflows: undefined behavior.
These are some real-world bugs I've dealt with.
> (and yes, I do have plenty of experience in it, I've been using it for the past 20 years, have you?).
I've been using it professionally for only about 15 years, but I started using C at home in '92.
[edit]
> Same with lack of a GC; this is a plus point for C for most applications, not a negative.
This is a bit of a non-sequitur, as I didn't mention memory management at all. C doesn't need a GC. It could use more memory safety, though. There's been plenty of academic research on improving C's memory safety without significant runtime overhead; a lot of those techniques were used in Rust. There are plenty of tools that can catch a large fraction of memory errors at compile time, which is a good thing.
Honest question: which is the case?
(0) You find it easy to determine, by visual inspection, whether a piece of code has undefined behavior.
(1) Your coding practices make it difficult to accidentally introduce undefined behavior in the first place.
> Same with lack of a GC; this is a plus point for C for most applications, not a negative.
Agreed. C addresses use cases for which GC (or any other feature requiring heavy runtime support) is simply unacceptable.
The vast majority of all code you write will not invoke UB; most people tend to stick to an 'easy' subset of the syntax, unlike, say, C++, where everyone uses a different subset of features, making it in effect multiple languages.
A combination of testing the known edge cases, wraparound issues, and size issues, plus static analysis and tooling, means that running into an example of UB is extremely rare in most cases.
It used to be that people used dynamic memory allocation to beat C with, but that is just a resource management issue. TBH, this is not rocket science: if you need dynamic memory allocation, you had damn well better know how to use it properly.
It's an example of laziness and of people ignoring the machine.
Another example is performance: saying that a language comes within a factor of 2 of C's performance and is therefore fast is absolutely ridiculous. A factor of 2 is huge.
You have to remember that people who write C are dealing with machine specifics day in, day out. We're bit-fiddling and writing MPU code, drivers, etc.
Basically, we're much more aware of the machine than higher-level softies, so what would normally be UB is actually DB in most cases: it's defined by the compiler and hardware that we're intimately familiar with.
...and that isn't to say that you can't write high-level, abstracted code in C; the simplicity of the language lends itself to (properly) efficient implementation, not efficient in the sense of Java or Ruby ;o)