C gives you enough rope to shoot yourself in the foot. And rightfully so. It came out in a time when everyone was coding assembly. It's meant not to hold you back from doing voodoo with low-level stuff, therefore it won't hold your hand.
Not very practical in today's world, when we've been spoiled by 'better' languages and need to quickly ship stuff that mostly works without worrying about the little things, but at the time it was revolutionary.
This was before my time, but I think it's a common misconception (only true for operating system development). When C was created, there were already Lisp, COBOL, Fortran, Algol, Simula, BASIC... and Smalltalk and Prolog were just around the corner, and most of those are much higher level than C.
...interesting that you mention that. I think functions and structs are the essential 'core abstraction tools' that get you at least 80% of the way to any higher-level abstraction invented since, and that is exactly why C is still quite popular. Its feature set is just enough to count as a high-level language that enables abstraction, but no more (and especially none of the fads and fashions that came and disappeared again).
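To make that concrete, here's a minimal sketch of how structs plus function pointers recover hand-rolled "interfaces" and dynamic dispatch; all the names (Shape, Rect, rect_area, total_area) are invented for illustration:

```c
/* A "virtual method table" of one slot, done by hand with a struct. */
typedef struct Shape Shape;
struct Shape {
    double (*area)(const Shape *self);  /* the "virtual" method slot */
};

typedef struct {
    Shape base;   /* embed the interface as the FIRST member, so a    */
    double w, h;  /* Rect* can be safely viewed as a Shape* and back  */
} Rect;

double rect_area(const Shape *s) {
    const Rect *r = (const Rect *)s;  /* ok: Shape is Rect's first member */
    return r->w * r->h;
}

/* Callers see only Shape and dispatch through the function pointer: */
double total_area(const Shape *shapes[], int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += shapes[i]->area(shapes[i]);
    return sum;
}
```

This is essentially the pattern that later languages baked in as classes and interfaces, which is the point: the core tools were already there.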
It is an urban myth that C was the first, one naturally pushed by UNIX folks.
JOVIAL, ESPOL, NEWP, PL/I, PL/S, PL.8, PL/M, Bliss, Mesa, Modula-2, VMS Basic, VMS Pascal,...
If this was intended to illustrate the unexpected consequences of undefined behavior it succeeded remarkably well!
e.g. C leaves signed integer overflow undefined. On most modern, broadly similar architectures it has a predictable effect, but that is by no means guaranteed, and forcing every architecture to compute overflow a particular way seems like a negative.
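A small sketch of the distinction (the function names are invented): unsigned overflow is defined to wrap, while signed overflow is UB that optimizers may exploit.

```c
#include <limits.h>

/* Unsigned arithmetic is defined to wrap modulo 2^N. */
unsigned int wrap_add(unsigned int a, unsigned int b) {
    return a + b;   /* wrap_add(UINT_MAX, 1u) == 0u, guaranteed */
}

/* Signed overflow has no such guarantee. On two's-complement hardware
 * INT_MAX + 1 usually appears to wrap to INT_MIN, but a conforming
 * compiler may assume signed overflow never happens: at -O2, GCC and
 * Clang typically fold `x + 1 > x` to a constant 1, even for INT_MAX. */
int always_true(int x) {
    return x + 1 > x;
}
```

That folding is exactly the "predictable on my machine, but not guaranteed" trap: the wrapped value you observe in a debug build is not promised by the language.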
That being said, everyone has a pet example of a compiler doing some really odd and deep optimisation - I suspect that's mostly due to successive optimisation passes adding up to have unexpected effects, rather than a deliberate effort by compiler writers - but I'm no expert on the matter.
Section 4. Conformance says "A strictly conforming program shall only use those features of the language and library specified in this International Standard. It shall not produce output dependent on any unspecified, undefined, or implementation-defined behavior, and shall not exceed any implementation limit."
Compilers are not allowed to produce output dependent on UB for strictly conforming ISO C programs; they must optimize those statements out. Treating UB as impossible is required for ISO C. It's NOT required for GNU C, Clang C, or Microsoft Visual C, but they usually do so anyway (even though they're not compiling strictly conforming ISO C programs).
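Here's the classic shape of "treating UB as impossible", as a hypothetical sketch (the function name is invented): a null check placed after a dereference can legally be deleted, because the dereference already promised the pointer wasn't null.

```c
#include <stddef.h>

int first_byte(const char *p) {
    char c = *p;        /* if p == NULL, this line is already UB...      */
    if (p == NULL)      /* ...so the compiler may assume p != NULL here  */
        return -1;      /* and, at -O2, delete this branch entirely      */
    return (unsigned char)c;
}
```

For valid pointers the function behaves as written either way; the surprise only shows up when the caller passes NULL and expected the check to save them.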
I will utterly kill all humor by explaining it:
They made a funny observation about a thing that happens. The thing that happens is (today) called UB. The funny observation is that that comment kind of exhibited the outward appearance of what the effects of UB could look like.
It began reciting one metaphor, "enough rope to hang yourself", but midway unexpectedly switched to a different metaphor, "shoot yourself in the foot", producing a combined, invalid, nonsensical output. As though a program suffered some UB in the routine for looking up and printing metaphors.
The comment author might have done it on purpose. Maybe they intended to make exactly that joke.
The history of the term UB has no more bearing than the history of any of the other words used.
Expect to hear from @pjmlp on this!