> As one might have guessed, this is not an essay. It's a transcript of the following talk by R. Martin with some substitutions made (SmallTalk -> Haskell, Ruby -> Rust, and others). You are free to make any conclusions from this.
If there's one thing that has arguably "killed" Haskell, and could potentially kill Rust too, it's probably not this but rather packages bifurcating into a million incompatible implementations of everything, because so little is in the standard library.
This was my problem. I played with Haskell some. I actually liked it quite a bit, but I just couldn’t force myself to program in it. I’m so used to Java, where you have a huge standard library. Even without that, I’ve done PHP and Python, where the standard libraries were quite good.
Haskell felt like a giant step back. I know there are reasons it ended up this way, and in many ways that makes sense. But even if you ignore all the functional programming concepts that are new to many people, it’s an incredible pain to get started in Haskell because of the lack of a standard library and the huge number of extensions.
And if you ask around you find out that lots of people don’t use Haskell without like 12 different extensions turned on. And then you have to use _these_ libraries to get anything done.
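To illustrate the kind of boilerplate being described, a module in such a codebase often opens with a stack of pragmas before any actual code. This is a hypothetical sketch; the specific extension names are common choices picked for illustration, not taken from the comment above:

```haskell
{-# LANGUAGE OverloadedStrings   #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE DeriveGeneric       #-}
{-# LANGUAGE DeriveAnyClass      #-}
{-# LANGUAGE LambdaCase          #-}
{-# LANGUAGE RecordWildCards     #-}
{-# LANGUAGE FlexibleContexts    #-}
{-# LANGUAGE TypeApplications    #-}

module Main where

-- Only after the pragma preamble does the program itself begin.
main :: IO ()
main = putStrLn "hello"
```

Each pragma has to be discovered, understood, and justified by a newcomer before the first line of real code, which is the friction being complained about here.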
And that’s before you get into the religious wars over whether you should be using lenses or something else.
As a new user, my preference would be this:
1. Stop pretending GHC isn’t Haskell. For all intents and purposes it is the one and only implementation that matters. Just like PHP, Ruby, and Go have canonical implementations. Bonus points for renaming GHC to Haskell.
2. Bring all the modern stuff people expect into the standard library. It’s been too long for me to remember exactly what’s missing, but I’m sure people who are active in the community know what the “must use” libraries are.
3. Language extensions are for academics. There’s nothing wrong with having them, but you shouldn’t need to mess with any of them to write good apps. Figure out what should be on by default and compile it in. No normal programmer should have to mess with them.
I feel like fixing those three things would make the language 100 times more approachable. I don’t think it will ever happen; too many people would have to agree on too many things. But I just don’t see Haskell going anywhere without it, the same way I don’t see Emacs ever becoming a “normal” program that people use without lots of changes.
It’s OK if you want to stay “obscure” because you like the way things are done. But don’t expect to take over the world.
"Avoid success at all costs" is the Haskell motto, so...
> Bring all the modern stuff people expect into the standard library.
GHC already ships with lots of these, they're just not in the base package. You might find Relude [0] more to your liking?
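For example, one common way to adopt an alternative prelude like Relude is through cabal mixins, so that every `import Prelude` resolves to Relude's replacement. This is a sketch based on Relude's suggested setup; the package name and version bounds are placeholders:

```
-- my-project.cabal (sketch; names and version bounds are placeholders)
library
  build-depends:
      base   >=4.14
    , relude >=1.0
  mixins:
      base hiding (Prelude)
    , relude (Relude as Prelude)
  default-language: Haskell2010
```

With that stanza, modules get Relude's richer exports (Text, containers, safer functions) without changing any import lines.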
> Language extensions are for academics.
GHC2021 happened and mostly alleviates this! But it’s now one language extension that turns on everything most people (on the committee) think everyone is fine with. We’re not really going to see it become built-in without a new Haskell standard, and that effort (Haskell Prime) fizzled out due to lack of time from the committee members.
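Concretely, with GHC 9.2 or later GHC2021 can be selected per project or per file; a sketch (the module and its contents are illustrative):

```haskell
-- In the .cabal file (GHC >= 9.2):
--   default-language: GHC2021
--
-- Or per module, as an ordinary pragma:
{-# LANGUAGE GHC2021 #-}

module Example where

-- Extensions bundled into GHC2021, such as TypeApplications,
-- are then available without being listed individually.
answer :: Int
answer = id @Int 42
```

So the pragma soup shrinks to one line, but as noted above, you still have to write that one line.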
There are a lot of project templates already existing that do this btw.
I think it isn't a big issue to most Haskell users because it's really only an issue when starting up new projects, which isn't what programmers are doing most of the time.
- In my daily work, the work that pays my bills, I regularly use software built in Rust, e.g. ripgrep.
- About the only time I ever cross paths with Haskell is in those times when I'm going out of my way to do so as part of my tech hobbying.
I think that's an important difference.
(Fizzling is a likely outcome for any new language, and does not suffer the mess of interpretation that "dying" carries.)
The thing is, there was a time when Ada had a lot more industry support and investment than Rust has today. OS kernels were coded in it. Aircraft and satellite instrumentation was coded in it, with billion-dollar contracts. Ada fizzled.
Erlang had rapidly growing support in the '90s. It had unique strengths. Erlang fizzled, too.
As hot as Rust is on HN, it could still fizzle in exactly the same way, for the same reasons. You might code something good in your work, but when you move on, will anybody be found to maintain it? People with Erlang skills today are most in demand to rewrite Erlang systems in something, anything not Erlang.
The rabid response and reflexive downvoting seen for anything even slightly critical of Rust betrays a level of insecurity that does not bode well for Rust.
For Rust to succeed, it probably needs at least a 10x improvement to compilation speed. That will probably require uncomfortable and difficult changes to the language and core libraries. Is there appetite for such changes among the faithful, who have come to terms with extremely slow builds already? To succeed, Rust needs to attract hundreds more for each current user. They have not accepted its weaknesses yet, and might never.
Rust needs to be easier to adopt. Rust is already as complicated to learn and use effectively as C++, but with nowhere near the level of industry support. Rust enthusiasts are little islands in the ocean; you have to go online to talk over design details, most places. It needs to not demand arcane, abstruse apparatus for ordinary things. (C++ has been good at sealing off difficult apparatus in easy-to-use libraries.)
When you think Rust must be really taking off already, consider that more people pick up C++ for the first time, in any week, than the total who are employed full-time coding Rust. Rust really can still fizzle, as unpleasant as that is to consider. Vociferous advocacy is not what will save it. It needs much harder measures that will appeal much more to people not now coding Rust than to the scattered few who already do. For now.
I see a lot of claims to this effect but having tried to learn both I found Rust much easier. It's also safe by default, which C++ isn't. How exactly do you think it's equally complicated?
Most of the complexity of C++ comes from its long history, with new features layered on old, extra junk you have to say to get the new stuff, and old stuff to stay clear of. E.g., constructors should be explicit, and lots of things should be const, and you have to call std::move() to pass a value that way, and implicit conversions can happen accidentally. But actually using it, sticking to the modern bits, doesn't depend on much in the way of new concepts. Writing libraries for general use depends on deep lore, but that doesn't leak out to users of the libraries, who just get libraries that can do more for them.
In Rust, you need to learn a whole new regime to do even basic things. And, you need to learn workarounds for what C++ does but Rust doesn't, yet. There are literally thousands of times as many people who can answer questions about C++ than about Rust, so mysteries are shallower.
Rust is the first memory-safe language without a garbage collector [*].
That is why it is a big deal. Those are table stakes now.
Everything else is just styling and bonus points.
As for the article title: what killed Haskell was deviously unpredictable performance from large-scale programs. Haskell performance analysis is intractable for big programs. Rust's "just say no to GC" is about as diametrically opposed to that as you can possibly get.
[*] aside from research prototypes -- much respect to MLKit and Cyclone: https://elsman.com/mlkit/pdf/mlkit-4.3.0.pdf
By contrast people really do seem to love building things in rust.
Anything that was written in C or C++ before, and anything written for the web (Rust seems to me to be the only language that cross-compiles easily to WASM, and is useful in the required GC-less context).
So now there are a lot of C/C++ programs that people have been using for years that regularly cause some kind of issue, because it's really hard to get memory safety right in those languages.
And there are a lot of things written for the web, where people want something saner than JS. Even though TypeScript already filled that hole in a sense, there is so much space that Rust still has little competition there.
Rust appears to have an entirely different goal, which is to take everything that has been learned over the last 40+ years of practical systems programming, and make a language that provides safety, speed and concurrency and works well for teams of programmers on large projects. [2] In that sense, it really is looking for wide adoption for the sorts of projects C and C++ have been used for during that time.
In its original version (Smalltalk and Ruby), this talk always struck me as a fun polemic that probably didn't warrant too much serious thought, but here it's been stretched beyond belief, so I'm pretty surprised it's being posted here. Just as an example from the first paragraph: is there really anyone who seriously thinks Rust is "Haskell without higher-kinded types"? That substitution was made because it vaguely fits the format of the original talk, but it's wildly off-base.
[1] https://haskell.foundation/whitepaper/ [2] I'm paraphrasing here, but check out https://doc.rust-lang.org/book/ch00-00-introduction.html for their explanation
This is a transcript of a Robert C. Martin (Uncle Bob) talk with some substitutions made (SmallTalk -> Haskell, Ruby -> Rust, and others). The original talk was mediocre; the changes haven't improved it.
after clicking: yes!
The functional-language conceit exists, and it's a problem.
It's interesting that Go is mentioned as a target of the conceit. There is a LOT of important software in modern IT systems that is golang-based. There has been impressive productivity from that language and its programmers, and I'm someone who doesn't like golang.
Functional languages just haven't produced something on the scale of Kubernetes, or a database, or a web framework, or... really anything. Not even something like Ruby on Rails, which is now replaced with JS flotsam/React, but was significant.
Functional languages seem to assume that if you build it, they will come. It hasn't happened... like, ever. The quip is: if you keep doing the same thing expecting a different outcome, that's insanity.
What functional languages need is a significant application. Since they are (theoretically) better for multicore and we are in the age of core scaling rather than serial speed improvement, and have been for a decade... WHERE IS YOUR APP?
It has to be the IQ filter. I simply think that FP requires higher IQ people, and that filters out a massive amount of people that just want to get stuff done and go home to their families or rave. Once you get over a certain IQ threshold, then the people in that cohort actually repel other people. If you've ever been to a Mensa meeting, you should know what I mean.
I think Rust is fine, it is focused on practical goals: rewriting vulnerable C programs, improving Firefox speed and safety, etc.
I don't know of any Rust program many people not coding Rust need. There are plenty of rewrites of existing programs, such as ripgrep and alacritty, but neither offers compelling reasons to switch. (Grep speed has never been an issue for me, and kitty is much faster than alacritty.)
Rust still has no compelling usage story: a program people need that would be so hard to write in any other language that no one has succeeded. (People talk up Servo, but who is using it?)
That's a biiiiiig ask. Absolutely massive. The fact that there is some movement toward supporting Rust in the Linux kernel shows it is largely successful. While you're right that it doesn't answer the "show me the apps" criticism, it is a big win.
"Show me the apps" is not about whether you CAN write apps in it. All I'm saying is that something is culturally wrong, because the people who want to write practical, useful, mass-market software choose NOT to use functional programming, even though it allegedly has massive advantages and superiority.
The Lisp essay by Paul Graham is informative: Lisp enabled one really smart programmer to out-scale a team of programmers, but it hit its limits. It did not scale beyond that and was, I suppose, too inscrutable to be picked up and supported by others.
So was it Lisp that allowed him to compete for a while? Or the fact that he wrote it and knew it top to bottom and was really really smart and motivated? Probably a bit of both, I personally would argue 80% superprogrammer, 20% language.
Before you say it, it's quite popular and used already. But there's a barrier beyond which a lot of programmers just don't tread, and will stay in Python / PHP / JS regardless (while raving about how awesome Elixir is).
My guess is market size and job security concerns but who knows, maybe some of what you say applies as well.
If Rust finds only the amount of penetration that Erlang has, it will have failed.
> As one might have guessed, this is not an essay. It's a transcript of the following talk by R. Martin with some substitutions made (SmallTalk -> Haskell, Ruby -> Rust, and others). You are free to make any conclusions from this.
I think that means that this isn't a serious enough argument to be worth engaging with.