If my core value prop were real-time messaging or streaming data, BEAM/OTP would be my first choice.
Slightly OT: I'm still undecided about Cloudflare Pages, but I'm sure that the platform for the backend reboot that we're starting will be in TypeScript. There's so much advantage in having one language for everything.
I've been using emacs + elixir-mode + alchemist since Elixir came out and I have zero complaints, it all works beautifully complete with autocomplete, jump to definition and documentation popups. But obviously emacs isn't for everyone.
In [2], Wojtek Mach explains how the team behind Elixir built Livebook Desktop. He explains how the project started, some subtle bugs found when building the app for macOS, some limitations of wxWidgets on Windows, and many other implementation details.
It would be awesome if the Elixir team released something like elixir-desktop based on Livebook. That is, forking the Livebook repo and releasing an official template project for generating desktop applications based on LiveView.
It sucks that syntax matters but it kind of does. I, for one, am somewhat put off by the meta-programming and flexibility of Elixir. I struggle to see the benefits of having to learn a new syntax per library. I have never programmed Ruby or Lisp, so perhaps I have not experienced the joy of what Elixir has to offer...
For something to be production-ready I'd expect you to at least cover major things like "latency to serve x in Elixir instead of lang y is k% better" or "EMFU we got when training x in Elixir was comparable to lang y".
These are two random metrics that are of course biased to my experience but the article just feels empty without numbers.
Corporate sponsorship: even with curly braces and mutability, Go probably wouldn't have gained its mindshare without Google.
Performance: It's getting much better, but the BEAM was never designed for maximum performance. People don't like slow platforms, despite the other advantages.
Scale: much like one of the databases written in Erlang, Riak, you typically don't need a BEAM language until your solution is large enough that you've already written it in something else.
I love Erlang, but I'm a lost cause.
Doing AI/ML without Python is doable but not quite as mainstream.
> Scale: much like one of the databases written in Erlang, Riak, you typically don't need a BEAM language until your solution is large enough that you've already written it in something else.
These two seem to contradict one another. How can it not be fast but scale well (I'm thinking of a way as I type this but I'll let you answer)?
I've also repeatedly seen this idea from relative newbies that you can replace things like Redis with a simple Erlang key/value store, possibly using ETS, and the result is always much, much worse in terms of both performance and reliability. A lot of the older Erlang/Elixir proponents will tell you to just use Redis.
Most of the popular statically typed languages also have decent abstractions for concurrency and parallelism now while having far better runtimes, far better performance in almost all cases, far more libraries and much larger communities. Erlang/Elixir will never be more than a small niche.
I have a theory about why there's more getting done with Elixir than you perceive. It's very productive, and you can get a lot done alone or with a very small team. You don't need as much devops. You maybe don't need as many frontend devs, because LiveView. And so on. Elixir teams don't need to hire as much, so when you look around, you don't see a ton of Elixir job postings. Fewer jobs discourages people from leaning into it as a career choice, which means fewer devs, strangling the overall growth of the community.
Also, at least in large tech companies, small productive teams are anathema to managers. Headcount is everything, and you have much more political power managing 10 Java devs than 3 Elixir devs. Never mind that the company is spending more on salaries, because that's the company's money, not yours. Anything you can do to have a bigger team is a win.
I don’t know why it has not. However, it has such a strong set of advantages that people who know what it can do for them keep describing it as a “secret weapon”.
As others have mentioned, it will scale up on a single machine to make use of all the cores (unlike Nodejs, Python, or Ruby). It can already scale horizontally by clustering. Because of the way it is designed, I never have to define a Kubernetes liveness probe for an Elixir app, whereas I have seen Dotnet and Nodejs apps freeze without crashing (a cooperative async reactor can go into infinite loops; Nodejs is very bad about orphaning async execution, by design).
A lot of AI apps are going to involve connecting with unreliable third-party services (for example, agents making calls to other APIs to get information or initiate actions), and may even run on hardware with unreliable networks (IoT and edge). This is where BEAM/OTP shines ahead of pretty much every other runtime and platform.
HN and elsewhere are riddled with Elixir developers extolling its competitive advantages for years … I have had very smart people argue to me why Typescript makes Nodejs so much better, but at this point, I have very little incentive to persuade them. Hence, “secret weapon”.
Python definitely does "use all cores" on a machine with the multiprocessing package, not sure what you mean?
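To make the parent's point concrete, here is a minimal sketch of what "use all cores" means with the stdlib multiprocessing package (the function name and workload are illustrative, not from the comment):

```python
# Hedged sketch: CPU-bound work fanned out across cores with multiprocessing.
# Each worker is a separate OS process, so the GIL does not serialize them.
from multiprocessing import Pool, cpu_count

def burn(n):
    # Deliberately CPU-bound: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(burn, [200_000] * 4)
    print(results[0])
```

The caveat still applies: each worker is a separate process, so any data crossing the boundary is pickled unless you arrange shared memory yourself.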
But Python doesn't vertically scale very well. A language like Elixir can grow to fit the size of the box it is on, making use of all the cores, and then without too much additional ceremony scale horizontally as well with distributed Elixir/Erlang.
Elixir getting a good story around webdev (Phoenix and LiveView) and more recently a good story around ML is going to increase its adoption, but it's not going to happen overnight. Maybe tomorrow's CTOs will take the leap.
Strong disagree; this is mostly a skill issue. I have written C++ modules for Python with pybind11 to speed up some code significantly, but ended up reverting to pure Python once I learned how to move memory efficiently. NumPy is very good at what it does, and if you really have custom code that needs to go faster you can run it externally through something like pybind11. If you are writing ultra-low-latency code, then you're right.

You can make Python really fast if you are hyper-aware of how memory is managed; I recommend tracemalloc. For instance, instead of pickling NumPy arrays to send them to child processes, you can use shared memory and mutexes to define a common buffer that can be represented as a NumPy array and shared between parent and child processes. Massive performance win right there, and most people simply never realize Python is capable of such things.
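A hedged sketch of the shared-memory approach described above, using the stdlib multiprocessing.shared_memory module. The `fill` function, shape, and sentinel value are illustrative; a real setup would also add the mutexes mentioned, if multiple writers are involved:

```python
# Sketch: back a NumPy array with a shared memory block so a child process
# can write into the parent's buffer without pickling the array.
from multiprocessing import Process, shared_memory
import numpy as np

def fill(name, shape, dtype):
    # Child: attach to the existing block and mutate the array in place.
    shm = shared_memory.SharedMemory(name=name)
    arr = np.ndarray(shape, dtype=dtype, buffer=shm.buf)
    arr[:] = 42            # writes land directly in the shared buffer
    del arr                # drop the exported view before closing
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=8 * np.dtype(np.int64).itemsize)
    arr = np.ndarray((8,), dtype=np.int64, buffer=shm.buf)
    arr[:] = 0
    p = Process(target=fill, args=(shm.name, arr.shape, arr.dtype))
    p.start()
    p.join()
    assert (arr == 42).all()   # child's writes are visible; nothing was pickled
    del arr
    shm.close()
    shm.unlink()
```

Note the `del arr` before `close()`: the mmap underlying the block refuses to close while a NumPy view is still exported from its buffer.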
I agree that the article falls flat on providing descriptive reasons for how Elixir complements machine learning. But that shouldn't be an argument against the product, but rather against its cult following, which does not provide sufficiently detailed and advanced examples and explanations. Something extremely common in tech journalism.
Good ideas and technologies do get underappreciated, and I would wager that widespread adoption is not at all correlated with quality. JavaScript, C++, Java and others are good examples. Yes, they each took over the world at one point, but there are better-designed languages out there. It just seems that people don't want to learn a new paradigm, and you can write software in anything.
one simple example: what Jose Valim calls Distributed²
You can have a livebook (similar to a notebook, but distributed and live) and distribute your load not only across multiple GPUs on your machine, but across multiple GPUs on multiple machines
with basically zero effort
https://news.ycombinator.com/item?id=7277957
And also "taking over the world" is generally a larger task than people realize. C is still fairly strong, and at this point it is basically the baseline against which we programmers issue our complaints, with only rare and very contrarian compliments given to it... yet there it is. Language turnover is really quite slow.
Elixir developers are also almost non-existent in the United States. It is far more productive and far less risky from a business perspective to hire someone who knows Python for the backend and is forced to work with mypy and Pydantic using a major web framework.
The trope that functional programming leads to better code and is more maintainable is also a lie. Defect rate is just as high if not higher per module because of the typing problems.
But no need to put too much weight on this, lots of languages have more than enough users to be viable for a long time. The top stack overflow languages tend to have poor S/N in the user communities.
At least when we used Elixir about eight years ago, we had to also learn Erlang as many libraries running on the BEAM VM that Elixir and Erlang share were only written in Erlang. So that made my brain melt further.
Once I got into the flow, I really liked it, and am glad I got the opportunity to learn it as a junior developer, but I've largely dropped it as there aren't many jobs in my area looking for Elixir.
* Want something flexible and easy and don't care about performance? Use Python
* Care about performance? Use Rust, etc.
Just because it is distributed and cloud-native doesn't mean it is fast for single requests or functions. Almost all SaaS applications are request-based and do not need P2P or real-time communication. The BEAM is worse here. If a company does need that, they can specialize that part of their application.
To each their own, I suppose.
1. From the outside, the benefits appear almost entirely theoretical, and in many cases, they are, and the issue with that is that in order to understand the real, non-theoretical benefits of having everything integrated into Elixir and OTP, you have to go all-in. Elixir has to own your app. Elixir is amazing, but when you start talking about its ability to do async IO, people say, "my language has a framework for async IO". When you start talking about its ability to automatically timeout and restart tasks and be fault tolerant, people say, "my language has a framework for network timeouts and I have systemd/k8s/whatever for application restarts". When you start talking about having ETS tables built in, people go, "I have Redis and it works amazingly well, why do I need it inside my app?" You can't adopt OTP piecemeal like you can with e.g., Tokio, or Redis, or Sidekiq, or whatever other thing. It doesn't really make sense to. Other languages might not be as powerful and not as well integrated, but through sheer force of effort, people make existing languages compose with general purpose tools and this is often "good enough". Of course you can write a single small service in Elixir, and this is fine and works, but this often runs into the issue of "now we have an Nth language beyond our normal languages, and we already have caching/monitoring/supervision figured out, so what is this actually adding?"
2. The library ecosystem is still very weak if you're used to e.g. Ruby, Python, Java, or Go (or even Rust, which already has way broader library coverage that is often significantly higher quality) and being able to find a mature driver/integration for a given API or native library. I don't know if this will ever catch up or reach a critical level where most people will find it acceptable. Stuff like the AWS library languished for, literally, years. Just critical stuff that's totally sorted in other ecosystems. This is purely anecdotal on my part, but it seems that way less people as a percentage of the whole community are working on libraries than applications. I have no idea why this is.
3. The performance - especially around latency and garbage collection - is great, but it's probably not "better enough" to actually matter in the domains where people tend to adopt Elixir. What I mean by this is, if you're already working on a project where the performance profile of Ruby or Python or Java is fine (web stuff, relatively simple network or CLI stuff), Elixir may be better in certain respects, but it's not so much better that it provides enough of a step change to make the other issues worth it. For our application, we had some very CPU intensive math stuff that dominated processing times, but we wrote that in Rust anyway, so the higher-level control layer could have been literally anything and it wouldn't have had any effect on performance at all. We picked Elixir because we liked it, but the product would not have failed for performance reasons if we had picked Ruby.
4. It is still, somehow, relatively difficult to find Elixir developers that understand OTP and the more operational concepts of the ecosystem (releases, deployments, observability etc.) at more than a beginner level. The community is still rather beginner oriented and there are a ton of folks who have done simple Phoenix apps and never gone further. I want to be clear that I mean this simply as an observation and not in a derogatory way at all. There is absolutely nothing wrong with having beginners in your community. Beginners are essential to ensure the community continues to grow, and a language community without beginners is dying. But over time you have to build up a significant base of experts that can start and lead teams and companies independently, without having to rely on a class of consultants (which the Elixir community seems dominated by - another conversation entirely that I will not get into here). In Elixir, it seems like the proportion of experts has stayed small relative to the number of experts in e.g. Python, Ruby, Clojure, Go, or Rust. I can go out today and hire someone who understands the inner workings of Python as well as anyone working on the Django team. Anecdotally, as someone who has hired for Elixir roles, this seems difficult to do in Elixir, and I don't know why this is. I have been trying to understand why I have this perception and whether this perception is correct for literally years now, and I haven't been able to figure it out. (Strangely, Erlang on the other hand almost tilts too far in the other direction, with a ton of senior-level people and very few junior or mid-level.)
I could say more but it's probably better that I just leave it at that. I also want to say that I write all this as someone who really loves Elixir but wants to be honest about the issues that I have seen hamper adoption at the companies I've been at.
Over the years I've tried to learn more about releases, deployments, distributed applications, fault tolerance in environments with multiple nodes, and hot code upgrades; but none of the projects I've worked on have reached sufficient scale or complexity for those things to really matter. My impression is that it's difficult to learn about these topics without real hands-on experience (something that seems hard to replicate in small/solo projects).
Would you mind expanding on that last point? I'd be curious to learn more about these knowledge gaps that you observed, as well as any recommendations for "leveling up" past them, whether it be projects that one could build to learn more, books to read, etc.
Hiring juniors (and getting work AS a junior I imagine) is a challenge. I don't think Elixir is that hard to learn (and has fewer foot-guns than many competitors), but it makes evaluation more difficult.
Erlang is a different story. If you're interested in the BEAM, it isn't terrible. It has a few of the same problems, but the combinatorics are reduced.
People really need to unwarp their brains from how they judge libraries in Elixir compared to other ecosystems. Erlang is 30 years old. Elixir sits on top of that stability. Elixir will very likely never reach 2.0 because it doesn't need to. And if a 2.0 does come it will be simply to remove deprecated functionality.
Not having 12 major version releases per year means what you think are "abandoned" are actually stable and perfectly fine to use.
In Elixir, we don't really care about the last time a version was pushed. I regularly use and rely on libraries that haven't been touched in years because they don't need to be touched.
As for "two string types" maybe I'm not working on hard enough problems, but in ~4 years of Elixir I've never once needed to use a charlist. My understanding is that it's a backwards-compatibility thing from Erlang and I'm not even sure when I'd ever need to use it over a string.
- two different string types: You have undercounted (three types in Erlang, and Elixir adds a fourth), and this is a non-issue for the vast majority of Elixir code. Most Elixir code deals with `String.t()` (`"string"`), which is an Erlang `binary()` (`"string"` in Elixir, `<<"string">>` in Erlang) with a UTF-8 guarantee on top. `binary()` is in turn a byte-aligned case of `bitstring()`, which stores arbitrary bit sequences efficiently and interoperably. Code interacting directly with Erlang often needs `charlist()`, which is literally a list of integer character codes (written `'charlist'` in Elixir and `"charlist"` in Erlang); most Elixir code would just call `String.to_charlist(stringvar)` in the cases where required.
Compare and contrast this with the py2 to py3 string changes and the proliferation of string prefix types in py3 (https://docs.python.org/3/reference/lexical_analysis.html#li...).
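For reference, a small sketch of the Python 3 side of that comparison; the prefixes make the type and interpretation of each literal explicit, and crossing the str/bytes boundary is always an explicit call:

```python
# The Python 3 literal prefixes the comment alludes to.
s = "text"        # str: Unicode text
b = b"text"       # bytes: raw byte values
r = r"C:\new"     # raw string: backslashes are literal, no \n escape
f = f"{1 + 1}"    # f-string: interpolated at runtime

assert s.encode("utf-8") == b   # str -> bytes must be explicit
assert len(r) == 6 and "\\" in r
assert f == "2"
```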
- three different types of exception: true, but inherited from Erlang and the difference is mostly irrelevant. The three types are exceptions (these work pretty much as people expect), throws (non-local returns, see throw/catch in Ruby; these are more structured than `goto LABEL` or `break LABEL` for the most part), and process exits. In general, you only need to worry about exceptions in most code, and process exits only if you are writing something outside of your typical genserver.
- return AND throw versions for almost every func: trivially untrue, but also irrelevant. Elixir is more sparse than Ruby, but still comes more from the TIMTOWTDI approach, and most libraries that offer bang (`!`) versions define one in terms of the other. That is, `Keyword.fetch!` effectively calls `Keyword.fetch` and raises an exception if the result is `:error`. It also doesn't affect you if you don't use it. (Compare the fact that anyone who programs C++ is choosing a 30–40% subset of C++ because the language is too big and strange.)
- slow compiler without much in the way of compile-time checking: I disagree with "slow", even from a 2020 perspective, and "compile-time checking" is something that has only improved since you decided that Elixir wasn't for you. Even there, though, different people expect different things from compilers, and not every compiler is going to be the Elm compiler where you can more or less say that if it compiles it will run as intended. (I mean, the C++ compiler is both slow and provides compile time checks that don't improve the safety of your code.)
- constant breakages over trivial renamings "just because": false‡. Elixir 1.0 code will still compile (it may have deprecation warnings). To the best of my knowledge, nothing from 1.0 has been hard deprecated. ‡If you always compile `--warnings-as-errors`, then yes you will have to deal with the renaming. But that is a choice to turn that on, even though it is good practice.
- having to burrow into erlang errs half the time since many elixir funcs are just wrappers: not an issue in my experience, and I can only think of a handful of times where I have had the Erlang functions leak out in a way where I needed to look at Erlang error messages.
Elixir isn't suitable for everything, but frankly your list of so-called shortcomings is pure sour grapes.
Personally I don't like FP.
is it wrong to love it then?
Haskell is one of the best in the world at this. But it also happens to be incredibly complex and weird in its syntax, to the point that it makes code fringe and unreadable.
There is a real difference here - a language that can help you fact-check your logic and reason about whether you are accessing a potentially null value in a given context can be better, if its drawbacks don't outweigh that benefit.
Elixir isn't static, so it doesn't do this, but it does blow up a lot compared to other dynamic languages. Its abstraction of processes and threads does actually make your code more portable and more modifiable. It offers a specific paradigm for programming that works great across multiple processes, and the whole language is built around it. So if you are going to be using that paradigm anyway, it would be a better choice.
It's also just pretty how it fits together.
For example, I have not found a language whose error handling I like. Go and Rust are super meticulous, and you get error-handling code everywhere, which is good for stability but not great for developer experience. TS/JS/Node suffers from the opposite: error handling is an afterthought, and you never know what exception will come at you, from where, to bomb your whole server, so you end up relying on cloud solutions running large redundant arrays of processes. Isolation of errors within a process plus message passing is a great abstraction.
Another thing I spend a lot of time on is setting up Redis/cloud stuff to do basic queueing, caching, and cron. That stuff takes time, increases complexity, creates new sources of error, and grows institutional knowledge. Using a system that has those built into the language is a major improvement. Vercel/Deno/Bun/etc. are solving some of the above by including them in their cloud offerings with relatively good DX, but it still increases complexity, takes you out of code, and locks you in with the vendors.