Exactly! I cannot agree more.
I have a small test program that I port to different languages to compare code length and execution speed. Of course, it only represents a single use case.
* C is first, of course.
* at x2, come Pascal, D and... Crystal!
* x3 to x5, come Nim, Go, C++ (and Unicon).
* x6 to x9, come Tcl, Perl, BASIC (and Awk).
* x15 to x30, come Little, Falcon, Ruby and Python.
* x60 to x90, come Pike, C#, Bash.
* x600 to x1000, come Perl6 and Julia.
This list looks byzantine, I know :-) The trends I can get out of it:
* the last 2 are languages with JIT compilation, and the warm-up cost of JIT is horrid for short programs.
* the "old" interpreted (or whatever you call it nowadays) languages (Tcl, Perl) are not so bad compared to compiled languages, and much faster than the "modern" ones (Ruby, Python). (Again, this is only valid for my specific use.)
* compiled languages should all end up in the same ballpark, shouldn't they? Well, they don't. The more they offer nice data structures, the more you use them. And the more they encourage some kind of functional style (I mean the tendency to create new variables all the time instead of modifying existing ones), the more you allocate, create and copy loads of data. In the end, being readable and idiomatic in those languages means being lazy and inefficient, but what's the point of using those languages if you don't use what they offer? C forces you to design proper data structures instead of re-using ready-made ones; it comes naturally. What is unnatural in C is to copy the data again and again: it is simpler to modify it in place and work on the right parts of it, rather than pass whole chunks around every time you need a single bit. In more evolved languages, the compiler won't save you with some hypothetical magic tricks; it cannot remove the heavy, continuous data copying and moving you instructed your program to do. And that is what made the difference in speed between C on one side, and D, C++ and Go on the other.