This is a rather strange and insulting article. I'm not sure why Zed can't help "old programmers" nor do I understand why he's angered that individuals know about undefined behavior in C. Is there any background to this or did he have the misfortune of being insulted on IRC?
Edit -- I googled for a bit and discovered this was in response to someone doing a pretty good job technically reviewing the book for free! http://hentenaar.com/dont-learn-c-the-wrong-way Perhaps the title was a bit inflammatory.
Zed's rebuttal is at https://zedshaw.com/2015/09/28/taking-down-tim-hentenaar/ and is a great example of how not to react to constructive criticism. My favorite part is his safercopy function and the lack of size_t.
And finally, to leave us all with a quote from Zed's rebuttal:
"Over this next week I’m going to systematically take down more of my detractors as I’ve collected a large amount of information on them, their actual skill levels, and how they treat beginners. Stay tuned for more."
Wow.
All I can say is the order of topics, the choice of topics, and the quoted explanations would make for a very confused beginner. Especially the crusade he seems to have against strings and functions called incorrectly. That makes me think he should be teaching the language as it is, not the language he wishes it were. Of course these are selective quotations, so I can't draw too many conclusions.
Going by my time teaching C, I wouldn't even mention Duff's device or safer, better strings at this level. There are better ways to introduce defensive programming, along with a discussion of the pros and cons.
Oh, I'm past 50 so am clearly "doomed" and beyond help. Not that I'm sure what I need help with. Oh well. :)
If you read the first part of that same sentence, it should give you a clue.
I disagree. Low-level languages, especially C, are the easiest to master. The K&R book is the only book you need to read to know everything about C. All you need after you understand the fundamentals is a bit of discipline.
C++ on the other hand is extremely difficult to master. Just have a look at the rules for Rvalue references and you will see what I mean.
It may be easier for a complete novice to write some code that doesn't crash in C++ than it is in C, but that's not mastering it, or even being good at it.
I am a huge C fan but this is not true at all. C has tons of pitfalls, especially with modern UB-aggressive optimizing compilers. There are a lot of rules you need to be aware of that are not naturally-occurring results of the fundamentals.
http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1570.pdf is the most recent draft before the official, purchase-only C11 was published according to http://www.open-std.org/jtc1/sc22/wg14/www/standards. I don't know if it's identical, but it should be close, and it's free.
Absolutely not.
The problem with C is that even a C master can't necessarily write correct code, because C is a very programmer-unfriendly language, making developers remember to do various actions manually and perform error-prone calculations.
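To make that concrete, here is a minimal sketch of the kind of manual bookkeeping C demands (dup_string is an invented name, essentially a hand-rolled strdup):

```c
#include <stdlib.h>
#include <string.h>

/* A classic manual-bookkeeping pitfall: the programmer must remember
   the +1 for the NUL terminator. Forget it and the copy writes one
   byte past the allocation. */
char *dup_string(const char *s) {
    size_t n = strlen(s) + 1;   /* the easy-to-forget +1 */
    char *copy = malloc(n);
    if (copy == NULL)
        return NULL;            /* error handling is also manual */
    memcpy(copy, s, n);
    return copy;
}
```

Nothing in the language reminds you about the +1, the NULL check, or the eventual free(); get any of them wrong and you have a silent bug.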
C++ is definitely harder to master (after many years, I can't say I master every corner of the language), but it's much easier to write correct code in C++ and it will be just as fast, run on as many platforms, etc, etc.
C lost this battle a long time ago, it's surviving because of nostalgia, still having good street cred and inertia. The number of domains where one must use C is shrinking and now that we also have Go and Rust this will accelerate. All for the better, really.
But man, the guy is insecure to the point of requiring therapy or something. He seems obsessed with his image and status, and the slightest criticism will cause him to lash out in an immature and ridiculous manner. Past rants have him making lewd comments about penis-sizes and challenging others to a physical fight[1].
It's a shame, because if he just relaxed a bit and took criticism gracefully, he'd probably find himself to be a bit more valuable to the community and employers, and would actually be a pretty decent dude. Instead, his writing seems to reek of a constant need to validate and defend himself.
This is probably an unfair comparison, but I can't help but think of Terry Davis: a brilliant programmer hindered by mental issues. Schizophrenia is obviously not the same as insecurity, but I think the situation here is somewhat similar.
[0] http://programming-motherfucker.com/
[1] http://harmful.cat-v.org/software/ruby/rails/is-a-ghetto
I don't think Zed's doing anything wrong. He's saying what he thinks needs to be said, he's challenging the complacent, and he's not pulling any punches. If you don't like his attitude there's plenty of other people to listen to. I appreciate that he's out there making noise, getting people to re-think their assumptions about programming.
If you live life by particular principles sometimes you have to take the hard road. You can't argue it hasn't been an interesting path.
Screw passion, I'd rather have discipline and a healthy interest instead.
I don't have an opinion on the actual topic, but whether someone's goal really is others' perception of them, or they're merely using that perception as a proxy for their sense of self-worth, attacking every criticism head-on can undermine how others perceive them, and it wastes time and mental energy that could go to the things they actually care about more than how others measure their achievements.
I do agree that we should be moving away from C and C++, though. It's pretty simple, really: C was a pretty good language in 1978. We didn't know a lot of things in 1978 that we do now in 2016. It now makes sense to revisit those decisions in light of nearly 40 years of practice. The so-called "PL Renaissance" has given us a whole host of new languages which have steadily chipped away at the dominance of C and C++, and I think this is a healthy trend that ought to continue.
The fact that C arrays decay to pointers without any bounds is single-handedly responsible for a huge chunk, possibly even the majority, of all RCEs, worms, malware, and exploits. Ever. In the history of computing.
It was a bad design.
It was a bad design in 1978.
It was known to be a bad design in 1978.
Other languages knew that checking array bounds was important, including for security. The internet made the impact of using C much more devastating but people were exploiting buffer overflows in the 80s to great effect. Some of C's predecessors/contemporaries passed a length as the first part of an array so bounds-checking was possible, though that has the downside of not being able to pass slices of an array without copying.
C could have included an arrayref type that was a length + base pointer, and let array l-values decay to an arrayref instead of a pointer. Then taking a slice of an array would not require copying elements. You could still take the address of an individual element. This would not have required much work to implement, even in 1978! Maybe the first compilers didn't insert array bounds checks, but at least the entire design wouldn't preclude them. Let's say you even spell arrayref as []. It would mean sizeof() works on arrays passed to functions.
void wat(int[] values) {
    for (int i = 0; i < sizeof(values); i++) {
        printf("look ma, no buffer overflows! %d\n", values[i]);
    }
}
(Yes, I know this is not K&R syntax)
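For what it's worth, you can approximate that arrayref idea in standard C today as a fat-pointer struct. This is just a sketch under my own naming (arrayref_int, at, and slice are invented here), not a real proposal:

```c
#include <stddef.h>

/* A "fat pointer": base pointer plus length travel together,
   so bounds checks and copy-free slicing become possible. */
typedef struct {
    int *ptr;
    size_t len;
} arrayref_int;

/* Bounds-checked element access; returns fallback when out of range. */
int at(arrayref_int a, size_t i, int fallback) {
    return (i < a.len) ? a.ptr[i] : fallback;
}

/* Slicing without copying: just adjust the base pointer and length. */
arrayref_int slice(arrayref_int a, size_t start, size_t end) {
    if (end > a.len) end = a.len;
    if (start > end) start = end;
    return (arrayref_int){ a.ptr + start, end - start };
}
```

The catch, of course, is that nothing in the language forces you to use it, and every library API still takes bare pointers.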
Maybe you can forgive C for the stupid header compilation model (why let the compiler do what you can make the programmer do by hand?). You can understand why they might not have foreseen the need for namespaces. K&R didn't invent the macro system, so that's not even their fault.
What is unforgivable is the horribly stupid design of C's arrays.
I actually think it would be beneficial if the standards committee added arrayref now. It won't fix all the busted C code but at least you could start improving the #1 problem. Compilers could eventually adopt a flag to prohibit arrays from decaying directly to pointers. You'd probably have to introduce lengthof() to avoid confusion and use some other syntax to declare one, maybe array(int) or something.
When C was designed, and even today, there are systems without pipelining, where it is expensive (in time) to de-reference a memory address and follow that pointer.
I do not dispute that the design you suggest would be safer, and would even have advantages for slicing; but that's really not the kind of programming C was intended to serve.
Also, C is supposed to scale down to //really// simple systems. Systems that lack indirect addressing modes, caches, MMUs, etc. It is literally intended to be a thin veneer over actual assembly for those systems, and why so many operations are specified in terms of /minimum standard unit size/ (for portability of that almost machine code between systems).
What you advocate is more like what C++ actually /should/ have been; a reason to use something more than C to gain advances in safety and ease of design.
This model enables binary-only distribution of libraries: you get the code as a .a (or .lib, .so, .dll, or whatever) and the API declaration as a header file.
You can write code against a library without having the library, using only the header. You can't do the final linking of course, but you can write the code.
The alternative, I guess, would be to embed this information in the library itself, and have the compiler extract it, which sounds as if it would have been scary from a performance point of view 40 years ago (and also somewhat hard).
I used to think it was hopeless, especially as each new language that came out required garbage collection or worse targeted the JVM. Perhaps cloud services will motivate people to fix this stuff since now computing costs are a hard line item on the books.
We surely did know: Burroughs was selling an operating system written in ESPOL (later NEWP) as early as 1961. Nowadays Unisys still sells it as MCP.
We did know that the Flex machine was written in ALGOL 68RS in 1980.
We did know that VME was written in S3 in 1970.
We did know that Pilot was written in Mesa in 1977.
We did know that Lilith was written in Modula-2 around 1980.
There are lots of other examples.
The main difference was that the UNIX (and consequently C) source code was available for free because AT&T could not sell it, while people had to pay for the others, or they were behind research walls.
There's still a lot of C code out there, and a lot of new C code still being written.
I don't agree: it's been a long process, but the trend is unmistakable. It's hard to remember now, but in the early '90s C and C++ were completely dominant. Nowadays they're much more specialized: you're as likely to build your company on Java or even Python/Ruby as you are to build it on C++. People talk about how it's hard to hire C++ engineers nowadays, while in the '90s "C++ engineer" was pretty much synonymous with "programmer". And so on.
We also have nearly 40 years of infrastructure built on C, which needs to be maintained and updated.
This is the same old argument advocating for rewriting everything from scratch just because someone somewhere managed to develop a new flavor of the month.
There are plenty of reasons why the whole world still has a heavy demand for COBOL and FORTRAN developers, and the development of new flavor of the month isn't a good enough reason to eliminate this demand.
I'm not saying rewrite everything for no reason. I'm saying that there are reasons, and we've gotten a very good idea of what those are over the last 40 years.
But C++ won't be easy to replace, and I'm not sure it needs to be, since rewrites are highly risky, time consuming and disruptive. With some luck and depending on how the language evolves we might be moving from C++ to a safer C++.
But using the C++ features that make it safer than C is only an option in small, security-motivated teams.
Sadly, the majority of C++ teams, at least in the enterprise space, tend to use it as "C with classes", thus voiding most improvements the language has to offer over plain C.
> "You're right, but you're wrong that their code is bad." I cannot fathom how a group of people who are supposedly so intelligent and geared toward rational thought can hold in their head the idea that I can be wrong, and also right at the same time.
Zed, you're right, period. But I think you probably just hurt people's feelings because they revere Kernighan and Ritchie and this is one prominent item of their legacy.
> But C? C's dead. It's the language for old programmers who want to debate section A.6.2 paragraph 4 of the undefined behavior of pointers. Good riddance. I'm going to go learn Go (or Rust, or Swift, or anything else).
Amen. The union of those three are likely to address all use cases that C handled in the past.
BTW the blog post would be clearer if titled: " 'Deconstructing K&R C' is dead". Gotta love mixing up C with natural language operator precedence ambiguity. :)
I think that C should rapidly be moving toward obsolescence, and I hold K&R in great esteem.
This is hilarious, because programmers by and large love to pride themselves about being stoic, logical, and practical in lieu of letting emotion dictate what they do.
Since when do programmers give a shit if people's precious fee-fees contradict what is technically correct? (The best kind of correct!)
Programmers, at least the ones I've seen in my life, are not from Vulcan :). In other words, we humans are all driven by our emotions, like it or not. The problem is that some people choose to believe that they are pure rational beings, and therefore that they are always right.
Well then, in that case shouldn't you really refer to it as K&&R?
Not a single actual quote from any of his detractors, for the reader to judge for him or her self if their criticisms have any validity.
The categorical declaration of "I cannot help old programmers," without providing the evidence he has for this claim. Lots of name calling, though.
No link to the original content, to determine for ourselves whether or not it was fair to K&R's work.
I suppose Zed just meant this to be personally cathartic, and didn't realize he posted it on a public web site where other people can read it?
Yes. I can't figure out exactly what he's ranting about. He writes "I will make it clear that my version of C is limited and odd on purpose because it makes my code safe." Does this mean he defined a safer subset of C? (There are lots of those. I've taken a crack at that myself [1], but it's politically hopeless. Rust is the way forward.)
Why would anyone want to write K&R C today? It's awful. It didn't even check function parameter types. Struct fields were just offsets; you could use one on a pointer of the wrong type and the compiler wouldn't complain. (Considering that Pascal predated C by some years, and had a sane type system, this was kind of lame. But they were trying to compile in 64K of 16 bit words in one pass. That was an adequate excuse in the 1970s.) The first ANSI C at least had a sane type system.
[1] http://www.animats.com/papers/languages/safearraysforc43.pdf
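For anyone who hasn't seen the pre-ANSI style the parent is describing, here's a sketch of the contrast. A compiler that only saw the old-style declaration `int add();` had no argument information to check calls against (note: K&R-style definitions still compile under C17 with most compilers, but were finally removed in C23):

```c
/* K&R-style (pre-ANSI) definition: parameter types are declared
   between the parameter list and the body. Callers elsewhere saw
   only "int add();", so passing the wrong number or type of
   arguments compiled silently. */
int add(a, b)
    int a;
    int b;
{
    return a + b;
}

/* ANSI C (C89) prototype: argument types are part of the
   declaration, so a mismatched call is a compile-time diagnostic. */
int add_ansi(int a, int b) {
    return a + b;
}
```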
He's done these kinds of rants repeatedly. It's his counter-productive style. I can't judge his arguments on a technical level (I do think his introductory guides to various languages are excellent), but these kinds of rants surely just alienate more people than they persuade?
There is nothing wrong with carefully crafted C code for applications where it is the best-suited tool. Sure, there are sharp edges. True, you can write crappy, security-nightmare code.
You do make some good points. I agree Go is fantastic. Rust is coming along as well. However, C still runs the world. That's not changing anytime soon. Not with the explosion of IoT and GPU-type devices. And hello, the Linux kernel and all the glorious command-line tools on *nix.
Try using Go or Rust (love both, x2 for Go) to allocate say a hundred GB of memory for some huge/fast in-memory data processing. Let me know how far you get.
Your rant is as polarizing as those who are blind to C's flaws (yes, there are a few). Stop saying "don't write C", that's just childish. Rather, what about "let's write better, less security flaw prone C."
As an engineer, one ought to choose wisely when choosing tools. This means pros and cons and balanced unemotional decision making. Not a holy war against a given tool.
And I am a professional programmer.
Let's do C where C makes sense.
(Edit: fixed typos)
I'm currently working on a couple of bugfixes for a Rust program I wrote last year which regularly allocates north of 500GB of RAM per-node on a cluster. It's wicked fast (regularly matching or beating comparable workloads implemented in C/C++), and Rust's ergonomics and safety guarantees made it very easy to extract much greater amounts of parallelism than the previous C++ version had, while never once having to chase down a bug from memory corruption, data races, or iterator invalidation.
Um, what's wrong with that in Rust?
> Rather, what about "let's write better, less security flaw prone C."
We've been trying this for the past 40 years and we've completely failed to stem the constant tide of new game-over security flaws. I think it's time to admit that if we couldn't do it in 40 years, we've failed.
> Try using Go or Rust (love both, x2 for Go) to allocate
> say a hundred GB of memory for some huge/fast in-memory
> data processing.
Why would this be a problem in Rust? It literally doesn't impose any overhead on memory consumption, at least not any that C doesn't (e.g. padding). Dropbox has clusters of machines that manage exabytes of data whose core is written in Rust. There is no fundamental reason why this should be slower or harder in Rust. Rust generally compiles down to more or less the same code C does.
There are reasons why this could be slower in Go, but it really depends on what program you're writing, so it might even just work fine. If you don't hit the GC, for example (and Go gives you ample opportunities to not hit the GC), data processing should be quite fast. But it depends.
I'd love to hear real-world experiences with such systems in Go.
The busiest node traffic-wise had average GC time over the past 20min of 3.4ms every 54.5s. 95th percentile on GC time is 6.82ms.
That node is sitting at 36GB in-use right now, and has allocated (and freed) an additional 661GB over the past 20min.
Can't really speak to how fast this is vs other environments, but it's smooth sailing overall. /shrug
A consistent theme throughout the article is that he's actually more interested in teaching people to write C well than in fighting with pedants. He's not torching his book; he's updating it and removing the contentious chapter. "Let's write better, less security flaw prone C" is exactly what he's trying to say - the "don't write C" bit at the end is more about C being a dinosaur than a childish huff, though there is a little of that in the comment.
Good riddance.
Unfortunate he uses this categorization. The problem is a mindset that can exist in any generation.
Z: K&R's strcpy is broken, e.g., you can forget to null-terminate the string. Mine is safer.
Others: It's not broken; of course it'll do something unpredictable if you break its preconditions.
Z: strcpy is still broken.
Others: Your function will break too if you pass it the wrong length.
Z: This cannot happen, K&R strcpy is broken, mine is safe.
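To spell out what "Others" are saying: both functions have preconditions, just different ones. A sketch (krcopy and boundedcopy are illustrative names, not the actual code from either book):

```c
#include <stddef.h>
#include <string.h>

/* K&R-style strcpy (renamed to avoid shadowing the library function):
   correct only if src is NUL-terminated and dst is large enough.
   Violate either precondition and behavior is undefined. */
char *krcopy(char *dst, const char *src) {
    char *d = dst;
    while ((*d++ = *src++) != '\0')
        ;
    return dst;
}

/* A bounded variant in the spirit of "safercopy": it trades one
   precondition (NUL termination of src) for another (a correct
   buffer size n). Pass the wrong length and it is just as broken. */
char *boundedcopy(char *dst, const char *src, size_t n) {
    size_t i;
    for (i = 0; i + 1 < n && src[i] != '\0'; i++)
        dst[i] = src[i];
    if (n > 0)
        dst[i] = '\0';  /* always terminate, truncating if needed */
    return dst;
}
```

Neither version can validate its own inputs; the contract has just moved from "give me a terminated string" to "give me the right length".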
I wish that was true, but you will be surprised how many things you use everyday are written in C. Even the ones you would never imagine.
Node.js for example, a large part is in C. Redis, C. Memcached, C. PHP itself is written in C.
Make of that what you will, but given all of the other ways that C can blow up due to programmer error, it seems reasonable to expect programmers to pass a valid string to a string function.
Mind you, we're talking about the stdlib here. You can swap this stuff out. Some people do: djb is a fairly well-known example.
a. I haven't written a program in C in over 10 years. I wrote software 5 days a week for those 10 years.
b. I wouldn't want to write a program in C now.
c. The first "high level" programming language I learned was C, from a book (not K&R C), while travelling in Asia, without a computer. It taught me well, but I immediately went on to other languages.
e. I can't shake the idea that there is some value to knowing that low level stuff, even though I don't use it much myself.
Maybe linux kernel hackers will keep it alive. I know game programmers use it a lot as well. But for the majority of us, it's kind of an arcane skill now.
That's fine. Perhaps the kind of programs you have been writing are not a good fit for what C is great at doing. That does not take away from C or its use for appropriate work.
When all else fails... come back and say this again! But for the time being, ignorance is bliss.
"But C? C's dead. It's the language for old programmers who want to debate section A.6.2 paragraph 4 of the undefined behavior of pointers"
Someone has to build the low-level stuff. Dear boys in too-tight pants and hippie mustaches: your high-level things and gluten-free snacks do not grow on trees.
Some of us were already doing it in much better languages, before C had any meaning outside AT&T's walls.
Next time before getting pissed off about the response you get, think what could it be that you have said or done that may have triggered it.
You can document its shortcomings, its dangers and all the headache-inducing choices. But while you're doing that, people all over the world are building wonderful and terrible things with it.
So you're moving on to Go or Rust? Great! Good choices! But remember that there are people who may disagree and be wrong, and also do something interesting with that wrongness.
If anyone is interested in what he removed, you can find it here: https://web.archive.org/web/20150101224641/http://c.learncod...
Zed is frustrating sometimes.
https://web.archive.org/web/20150106191636/http://c.learncod...
The Plan 9 dialect of C is another example. There is a portable mk package, which includes the core libs (libbio, libutf, etc., which also served as core libs for earlier versions of Golang), to appreciate what C was supposed to be.
I would paraphrase: attention-seeking by attacking classics is poor style.
"I’ve more or less kept my mouth shut about some of the dumb and plain evil stuff that goes on in the Rails community. As things would happen though I’d take notes, collect logs, and started writing this little essay. As soon as I was stable and didn’t need Ruby on Rails to survive I told myself I’d revamp my blog and expose these fucks."
and:
"After Mongrel I couldn’t get a gang of monkeys to rape me, so forget any jobs. Sure people would contact me for their tiny little start-ups, but I’d eventually catch on that they just want to use me to implement their ideas. Their ideas were horrendously lame. I swear if someone says they’re starting a social network I’m gonna beat them with the heel of my shoe."
So that is very much his style of writing.
We detached this subthread from https://news.ycombinator.com/item?id=11727718 and marked it off-topic.
I'll take it as dead when the Linux kernel, or its futuristic replacement, is written in something other than C.
If you are talking about at the user-space level, then yes I can see that. But you shouldn't assume your single use case, higher level user space apps, is the only use case.
There's no argument that the Linux kernel is currently written in C. But that doesn't prove that nothing exists that can replace C.
Right now C is just the tried-and-true solution; the rest are possibilities only.
It's an older home wiring technology that works fine for years if undisturbed, is still present and working OK in homes all over, was invented in the early days of electrified homes, requires considerable skill to install properly, tends to be unsafe if not handled skillfully, is expensive and delicate to modify, has no hidden components, allows interesting wiring layouts because conductors are separated, ...
One could go on with the obvious parallels. (I learned on a PDP-11.)