Not (only) because that's faster to run, but because it's faster to change while still mostly working correctly. And the current crop of compilers not only can produce stunningly fast code, but also awe-inspiringly good error messages that put the cryptic ones of the '90s and noughties to shame. Try that with a dynamic language!
I think it’s interesting how static and dynamic languages have grown closer together since this was done. I’m not sure there’s really all that much to argue about anymore. Your static languages tend to have many of the features people like about dynamic languages and vice-versa, though of course that depends on the specific language.
Yes, you can still do dynamic typing. But I'd argue that with `dyn Any` you can do so even in Rust, which is a statically typed language if I ever saw one.
Otherwise I completely agree about the languages growing closer together.
Sure, a handful of people use Rust and Swift and Go now, but I think you missed the whole point of the fine article.
But the key thing is that none of this is for performance purposes. Which is kind of the whole point of this talk: speed isn't everything, and the productivity of languages without types outweighs that of languages with types. But it turns out that types are really useful even without any sort of performance benefit, hence why a lot of languages are turning back to typing code without using those types at runtime at all.
Or similarly, he makes a point about how it's often possible to statically analyse dynamic languages, which is true, but it turns out that it's still so much easier to analyse statically typed languages that adding types back in often makes sense. If you read library documentation for packages using gradual typing, this is often one of the things they specifically mention as a reason for using their library with static types instead of without.
The point, as I understood it, was that you don't need static types to still get lots of cool things (performance, analysis, etc). Which is stuff I don't disagree with. But the quality of those things with (well-designed) static languages is still so much higher than it is in dynamic languages, which is why so many languages are now trying to support both modes.
That's a really good observation. Many people ask why Python is the lingua franca of ML. It's a glue language that allows you to prototype quickly and use low-level libraries like numpy for matrix calculations, etc.
I wish Python type hints were taken more seriously. It's crazy that you can type them in function definitions but then Python completely ignores them. mypy does a much better job at that, but that's not the Python most people use.
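A minimal illustration of that point: CPython stores annotations as metadata but never enforces them, so a call that mypy would reject runs without complaint.

```python
def add(a: int, b: int) -> int:
    return a + b

# The interpreter ignores the annotations entirely: this call "works"
# and returns "ab", even though mypy would flag it as a type error.
result = add("a", "b")
print(result)  # ab

# The hints are just stored as metadata on the function object:
print(add.__annotations__)  # {'a': <class 'int'>, 'b': <class 'int'>, 'return': <class 'int'>}
```

That's the whole gap the comment is pointing at: the checking lives in external tools like mypy or pyright, not in the language runtime.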
It's a lot more than a glue language. It's just the right amount of high-level abstraction, with easy extensibility, to make it applicable to most applications out there.
Even for things that are performance sensitive, extra containers/instances are cheaper than developers.
We're seeing a swing to what I call "pragmatically typed" languages: those with extensive type inference and possible escape hatches.
TypeScript isn't faster than JavaScript; it doesn't even change its emitted JavaScript based on types, which are completely erased.
Similarly, Python with type annotations is not a statically typed language either.
A lot of ML startups use Python in production. OpenAI uses Flask to power their API for ChatGPT and their other models. Almost every AI startup uses Python in production for more than their ML/AI stack.
Python, JavaScript, PHP, and Ruby are still very popular languages today. More popular than TypeScript, Rust, Swift, and pretty much every typed language outside Java, Go, C, and C++.
I think, in general, we've seen the hype cycle shift from static to dynamic to static, and we see that e.g. startups and otherwise highly opinionated people who care about such things maybe change their preferences (or simply new people entering the field with new preferences), but...
the millions of people writing Java code didn't go away when Ruby and later Node were all the rage, and all the PHP jobs also still exist.
Trends come and go, but dynamic and static typing have coexisted since forever. LISP is the second oldest language and if that's too niche, Smalltalk was decently popular for a while.
I also agree though that modern statically typed languages are much better than e.g. Java used to be.
There are highly dynamic languages like Lisp that have implementations that have generated “stunningly fast code” for ages.
Dynamic languages increase productivity; static languages learn from dynamic languages and find ways to statically verify the patterns previously thought to require dynamism.
It turns out the reason static typing seemed like a pain at the time is because we didn't have good tools. You'd write code for a while, then you'd run the compiler, and UGH there's all these errors to go back through and fix.
Now that my IDE highlights the errors as I go, not to mention has good auto-complete and jump-to-definition, I am much more productive in a statically-typed language than a dynamic one.
Interestingly there are still areas where most people seem to prefer dynamic typing: service APIs. JSON everywhere. Is it because JSON is actually better, or is it because we don't yet have good enough tools for schema-driven APIs (e.g. Protobuf, Cap'n Proto, etc.)? If we had those tools, would schema-driven APIs be widely seen as being more productive? (I suspect so but I am perhaps biased.)
Java IDEs were certainly highlighting errors, auto-completing, refactoring etc in 2008. Admittedly IntelliJ (the most impressive one) didn't have a free version then.
The error checking nature of compilers was never a real pain point. It's a question of if you want pain now or later.
You can still have schemas in the code, though. We use a lot of pydantic at work for this. You have your data schema class, which is statically typed and which you interact with, and it decodes/encodes to JSON in the background.
All services expose an API to fetch their JSON schemas, so you can automatically regenerate the remote types used by other services when they change.
Using schemas for remote services has made life simpler for us, so I would say schemas are a huge win. Dropping JSON support is not something I see happening anytime soon, though multiple parallel encodings based on schemas might be a possibility.
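For anyone unfamiliar with the pattern: pydantic does this properly, but here's a stdlib-only sketch of the same idea (the `User` class and `_SCHEMA` names are made up for illustration). The typed class is what your code touches; JSON only exists at the boundary, where it gets validated dynamically.

```python
import json
from dataclasses import dataclass, asdict

_SCHEMA = {"id": int, "name": str}  # the "source of truth" lives in code

@dataclass
class User:
    id: int
    name: str

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, payload: str) -> "User":
        data = json.loads(payload)
        # Validate dynamically at the service boundary; inside the
        # program everything is a statically known dataclass.
        for field, typ in _SCHEMA.items():
            if not isinstance(data.get(field), typ):
                raise TypeError(f"field {field!r} is not a {typ.__name__}")
        return cls(id=data["id"], name=data["name"])

u = User.from_json('{"id": 1, "name": "Ada"}')
print(u.to_json())  # {"id": 1, "name": "Ada"}
```

pydantic additionally generates the JSON Schema document for you, which is what makes the "fetch the schema, regenerate the remote types" workflow above possible.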
The main issue is that static typing is a global property of a program, and big distributed systems don't have such global properties. Each part can be upgraded independently, at any time.
In general, you don't own both sides of the wire. People who have worked at Google are used to owning both sides of the wire :) (I also think the model of the data center as a single computer stopped scaling, and that's why it's so hard to write software there these days)
The argument I usually make is: Why isn't the entire Internet statically typed? Why don't we have statically typed HTTP and SMTP and IRC and XMPP?
If you admit there's a problem there, then there are also problems with static typing in the areas where people use JSON.
---
Dynamic typing is basically for when static typing stops scaling / runs out of steam.
I wrote a long post about this - A Sketch of the Biggest Idea in Software Architecture, i.e. about software composition at runtime, not compile time:
https://www.oilshell.org/blog/2022/03/backlog-arch.html
Also, static typing doesn't scale to the code even on a SINGLE machine, on either Windows (COM and successors) or Linux (Debian-style ABI compatibility, and shell-style composition)
https://lobste.rs/s/sqtnxf/shells_are_two_things#c_pa4wqo
Some people scratched their heads at that argument, but I would say it's only irrelevant if you don't care whether your system works when it's deployed. If all you want is for the IDE to show green so you can commit your code, then you can just lean on static typing. But if you care about the problem end-to-end, you should also care about dynamic typing and runtime software composition :)
The other argument I make is that SREs are responsible for all the problems that escaped the static type system, and ~10 years ago SREs started making as much or more money than SWEs. So that is a lot of problems.
The problems that static types catch aren't the most important ones; they're just the ones that affect certain people's jobs.
---
Protobufs do a pretty good job of evolution, but I've noticed it takes a while for people to understand that field presence is dynamic, not static. They want their Maybe<> type, but that kind of static typing simply doesn't work in distributed systems.
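To make the presence point concrete, here's a sketch using plain dicts standing in for decoded messages (no generated protobuf classes, field names invented): "field absent" and "field explicitly set to the default" are different wire-level facts, and only a runtime check can tell them apart, because the sender's schema version isn't known at your compile time.

```python
old_msg = {"id": 7}                # sender predates the "retries" field
new_msg = {"id": 7, "retries": 0}  # sender explicitly set it to 0

def retries_or_default(msg: dict) -> int:
    # Presence must be checked at runtime; a static Optional on the
    # reader's side can't know which fields the sender's schema even has.
    return msg["retries"] if "retries" in msg else 0

print(retries_or_default(old_msg))  # 0 (defaulted)
print(retries_or_default(new_msg))  # 0 (explicitly sent)
print("retries" in new_msg)         # True -- only the runtime check sees the difference
```

This is essentially what protobuf's `HasField` / explicit-presence machinery does for you.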
I'd say better tools could help in some ways, but you still have the fundamental problem that even if I go and download Github's or Stripe's schema from their codebase and statically link it into my code, I don't control when they deploy their systems.
They can literally update it in the FUTURE, and static checks fundamentally can't handle that -- only dynamic checks can.
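A sketch of the kind of dynamic check that handles this (field names like `id` and `amount` are hypothetical, not any real provider's API): validate only the fields you actually depend on and pass unknown ones through, so a future deploy on their side that adds fields doesn't break you.

```python
import json

def parse_charge(payload: str) -> dict:
    """Accept a remote JSON object, checking only what we depend on.

    Fields added by a future deploy on the other side pass through
    untouched; a static type covering the full message would have to
    change in lockstep with a codebase we don't control.
    """
    data = json.loads(payload)
    if not isinstance(data.get("id"), str):
        raise ValueError("missing or malformed 'id'")
    if not isinstance(data.get("amount"), int):
        raise ValueError("missing or malformed 'amount'")
    return data

# A payload from "the future", with a field our code has never seen:
charge = parse_charge('{"id": "ch_1", "amount": 500, "dispute_state": "none"}')
print(charge["amount"])  # 500
```

The checks run at the moment of use, against whatever version of the schema the remote side is actually serving right now.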
> They can literally update it in the FUTURE, and static checks fundamentally can't handle that -- only dynamic checks can.
You're not wrong, but you are underselling what reasonably disciplined adherents to a static regime can use to their advantage.
----
My favorite commentary (in favor of your position of what static can't do*) include:
- some remarks Gilad Bracha once made on some podcast (might've been Software Engineering Radio) about how hardware at base is not static, which feels somewhat counterintuitive when low-level languages like C are in the same room suggesting that the truth is otherwise
- Lars Bak giving an interview about V8 at Microsoft to Erik Meijer and Charles Torre(?) where Lars breaks the latter's brain by pointing out that even if JS hadn't won and you were dealing with a purportedly better static language like C# compiled down to CIL, then the engine would still apply the same treatment to the payload it received, insofar as performing "inefficient" dynamic validation
* which happens to be my position, too, to be clear
I can fault Ruby for many things, but Bundler is easy to use and just works (except in the case of C bindings, but I don't think that's in the scope of Bundler to fix), and from what I hear, cargo should be similar.
[0] Although this is also an ecosystem problem; in e.g. Ruby, libraries specify version ranges for subdependencies, and Bundler will then try to satisfy all the ranges, reporting an error if there's no way to resolve them. Usually this works quite well. In Java, a library A will just specify a particular version of library B, but you have no guarantee that the version of B you get when you add A to your build is that same one. There are various solutions to this, such as BOMs, but if the library you use isn't in a BOM, you're out of luck. In any case, this is not a theoretical problem: I've spent countless hours debugging conflicting dependency versions that caused errors at runtime because certain classes or methods didn't exist.