Nim makes fast, small executables. It has an excellent heterogeneous JSON data structure and a good dataframe library. It prefers the stack so strongly that dynamic data structures (sequences and tables, basically its lists and dictionaries) are pointers on the stack to heap data, with the lifetime managed by the stack frame. I don't think I have any dynamic references anywhere in my program, and I don't have to worry about GC at all. The type system is simple, sensible, and guides you to correctness with ease. Nim also defaults to referential transparency; everything is passed immutably by value unless you opt out. Generics are powerful and work exactly as you expect, no surprises. Universal function call syntax is ridiculously powerful: you can write the equivalents of methods and interfaces on types just by writing procedures and functions that take a first parameter of that type; not needing those abstractions greatly simplifies and flattens code structure. It's just procedures and objects (functions and structs) all the way down.
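To make the UFCS point concrete, here's a minimal sketch (the `Circle` type and procs are hypothetical, not from the post): any proc whose first parameter is some type can be called as if it were a method on that type.

```nim
# UFCS: a plain proc taking `Circle` first can be called
# with method-style dot syntax on a Circle value.
type Circle = object
  radius: float

proc area(c: Circle): float =
  3.14159 * c.radius * c.radius

proc scaled(c: Circle, factor: float): Circle =
  Circle(radius: c.radius * factor)

let c = Circle(radius: 2.0)
echo c.area()            # identical to area(c)
echo c.scaled(3.0).area  # calls chain left to right via the dot
```

No class declarations or interface boilerplate are needed; the "methods" are just free procs, which is what flattens the code structure.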
It's been a real joy to work with and reminds me of when I discovered D back in the day, only it's even better. If you imagine native-compiled type-annotated Python where nearly 100% of your code is business logic with no cruft, you're getting close to the Nim experience.
Isn’t that the same as a C++ vector or map on stack? They allocate internally as needed, and the whole container is destroyed when it goes out of scope.
So now the language can credibly claim the same as C++ - no room left closer to the metal. But it's packaged in a much nicer syntax (IMHO), and has features like macros which we might see in C++ in maybe 10 years, if we're lucky.
You can also compile C projects with Nim like bearssl [1]. Nim takes care to compile the C files and recompile them when config flags change. It's actually really nice.
1: https://github.com/status-im/nim-bearssl/blob/99fcb3405c55b2...
I hope it gets more popular; it seems like a much, much easier-to-use Rust.
I think most of them are available via nimble.
After programming professionally for 25 years, IMO Nim really is the best of all worlds.
Easy to write like Python, strongly typed but with great inference, and defaults that make it fast and safe. Great for everything from embedded to HPC.
The language has an amazing way of making code simpler. Eg UFCS, generics, and concepts give the best of OOP without endless scaffolding to tie you up in brittle data relationships just to organise things. Unlike Python, though, ambiguity is a compile time error.
I find the same programs are much smaller and easier to read and understand than most other languages, yet there's not much behind the scenes magic to learn because the defaults just make sense.
Then the compile time metaprogramming is just on another level. It's straightforward to use, and a core part of the language's design, without resorting to separate dialects or substitution games. Eg, generating bespoke parsing code from files is easy - removing the toil and copypasta of boilerplate. At the same time, it compiles fast.
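As a small taste of the compile-time evaluation described above (a hedged sketch, not from the original post): `const` and `static` push ordinary Nim code into the compiler, so the work happens before the program ever runs.

```nim
import std/strutils

# `const` forces evaluation at compile time: this split
# happens in the compiler's VM, not at runtime.
const fields = "name,age,email".split(',')

# `static` blocks also run at compile time, so this
# assertion fails the *build* if the data is wrong.
static:
  assert fields.len == 3

echo fields  # @["name", "age", "email"]
```

The same mechanism scales up to reading schema files with `staticRead` and generating bespoke parsing procs via macros, which is where the boilerplate savings come from.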
IMHO it's easier to write well than Python thanks to an excellent type system, but matches C/C++ for performance, and the output is trivial to distribute with small, self contained executables.
It's got native ABI to C, C++, ObjC, and JS, a fantastic FFI, and great Python interop to boot. That means you can use established ecosystems directly, without needing to rewrite them.
Imagine writing Python-style pseudocode for ESP32 and it being super efficient without trying, and with bare-metal control when you want. Then writing a web app with backend and frontend in the same efficient language. Then writing a fast-paced bullet hell and not even worrying about GC because everything's stack allocated unless you say otherwise. That's been my Nim experience. Easy, productive, efficient, with high control.
For business, there's a huge amount of value in hacking up a prototype like you might in Python, and it's already fast and lean enough for production. It could be a company's secret weapon.
So, ahem. If anyone wants to hire a very experienced Nim dev, hit me up!
It really is a nicer, better Python. And I say that as someone who does like Python.
How does that work? What I mean specifically is: how convenient is it to use JS interop at dev time, rather than just compiling Nim to JS as a standalone lib?
Can we simply call something like a browser API directly from Nim (or with a fairly simple wrapper)?
To be fair, I did have to spend like 2 hours tuning my ESP32 code for handling a 22 kSPS ADC where microseconds matter. ;) Mostly just to avoid extra allocations as I was pretty new to Nim at the time.
Ah, but no major regressions in performance or changes needed for ~4 years!
Contact info?
Nim has been my language of choice for the past decade and I'm really happy with the new features in Nim 2.0. Some of them are real gamechangers for my projects. For example, default values for objects theoretically allow me to make Norm [1] work with object types along with object instances. And without the new overloadable enums, Karkas [2] wouldn't be possible at all (it's still WIP though).
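For readers who haven't tried 2.0 yet, here's a minimal sketch of the default-values feature (the `User` type is hypothetical): fields can declare defaults that apply when a construction expression omits them.

```nim
# Nim 2.0: object fields may carry default values.
type User = object
  name: string
  active: bool = true   # used when the constructor omits it
  retries: int = 3

let u = User(name: "alice")  # active and retries fall back to defaults
echo u.active   # true
echo u.retries  # 3
```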
More and more large companies and startups are adopting Nim.
Super excited for Nim 2.0 and huge thanks to all who contributed!
Interesting.
Are there any stats / data on this, or is it anecdotal?
Even if anecdotal, can you name some names?
The only downside is some of the included modules being moved to 3rd-party repositories, as mentioned at the very bottom. It's not a big deal, but it was nice having SQLite support built into the standard library. I suppose once you support some databases, you'll be pressured to support more and more. I am a bit surprised to see MD5 and SHA1 support moved out though.
While it's nice to have path or logging support in the batteries, some other things are better as third parties, to allow them to evolve.
I find Nim to be an absolutely fascinating language. I've been trying to find a reason to use it on my job (my work is mobile-adjacent so the idea of compiling to JS and to ObjC is fascinating) but haven't gone beyond playing around with it so far. I've been comparing it to Rust and it's just so much simpler to get started with.
This is great for reusing Nim code in a web app, and possibly for performance critical code.
One thing I'm also missing is out-of-the-box interop with C/C++ libraries without creating your own adapters (so that you can just import a header and be done with it).
Another thing: I wish it had similarly easy interop with Rust, both to increase adoption and because in Rust it's easier to find high-quality cross-platform crates that work without hassle even on mobile devices.
I worry that in a few years either Python will catch up (because of faster Python, no-GIL, Nuitka, Briefcase for mobile, etc.) or Mojo will eat Nim's lunch.
That said Nim does have the nimpy library that allows for pretty seamless interop with python. Which means you can just import PyTorch, or scipy, or opencv and use them in Nim.
Zig is nice and I like its optionals support and error handling approach. But I was put off by its noisy syntax, e.g. !?[]u8 to represent an error union of an optional pointer to a many-pointer of uint8. Also having to prepare and weave allocators throughout most of the code that needs to dynamically allocate (which is most of the code) gets in the way of the main logic. Even little things like string concatenation or formatting becomes a chore. Zig also doesn't have dynamic dispatch, which makes polymorphic code hard to write; you have to work around it through some form of duck typing. In the end I realized that Zig is not for me.
[1] https://github.com/khaledh/axiom [2] https://github.com/khaledh/axiom-zig
- its module system, especially not being able to have mutually recursive imports (there has been a 7 year old proposal[1])
- order-sensitive declarations of procs (i.e. you can't use a proc defined further down in the file unless you add a forward declaration for it). There's an experimental pragma[2] for this, but it often stops working once you introduce mutually recursive calls
- object variants requiring declaration of a separate enum instead of allowing inline declaration of the variant cases, and a related issue[3] of not being able to define the same field names under different variant cases.
[1] https://github.com/nim-lang/rfcs/issues/6
[2] https://nim-lang.org/docs/manual_experimental.html#code-reor...
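A small sketch of the pattern being criticized (the types are hypothetical): the discriminator must be a separately declared enum, and field names must be unique across all branches.

```nim
type
  NodeKind = enum        # must be declared separately,
    nkInt, nkStr, nkPair # not inline in the variant
  Node = object
    case kind: NodeKind
    of nkInt: intVal: int      # "val" couldn't be reused
    of nkStr: strVal: string   # in another branch
    of nkPair: a, b: float

let n = Node(kind: nkInt, intVal: 42)
case n.kind
of nkInt: echo n.intVal   # 42
of nkStr: echo n.strVal
of nkPair: echo n.a, " ", n.b
```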
It doesn't have it as a language feature, but it does have VTables just like C would. The `std.mem.Allocator` is an example of this.
I think looking at the examples (which is essentially the same code in different languages) gives you a high level idea, but they only scratch the surface when it comes to language features (for instance the Zig examples don't use any comptime features):
Zig: https://github.com/floooh/sokol-zig/tree/master/src/examples
Nim: https://github.com/floooh/sokol-nim/tree/master/examples
Odin: https://github.com/floooh/sokol-odin/tree/main/examples
Rust: https://github.com/floooh/sokol-rust/tree/main/examples
I would probably use Nim for CLI tools, server applications, maybe GUI applications and games too.
The Zig team seems to be putting much more effort into the whole compiler infrastructure, which is really amazing in my experience. There are some great innovations there.
I wouldn't necessarily prefer Nim for any of the things you listed, but that's not for the same reason as with games and Odin (which has great tools and libraries for making games, and gives a much better overview of the important things you'll have to care about when making them, in terms of performance, etc.).
Rather, it's because I've found that Nim belongs with the other languages that think that complexity can be managed by being hidden well enough, which I've found is simply not the case when something actually needs to be debugged or you need to understand the behavior of the program.
Hiding/ignoring allocation errors, not making allocation explicit, not making deallocation explicit, etc., makes for a much worse time actually understanding what's going to happen. Adding tons of GC options like alternative GC implementations isn't going to fix it and this new one is really just another example of trying even harder to hide complexity.
I think the ultimate irony of these languages with magical features like move semantics is that they adopt those features in the name of performance, but in practice writing well-performing code with all this space technology and non-obvious behavior is so complicated that the end results are worse than in much, much simpler languages. I've also found that the simpler languages' development cycles (for the end user) aren't that much longer than the space-tech ones', because there is ultimately much, much less to use in them, so people end up just writing the actual code instead of trying to wrangle all of the magic.
There were loads of specific differences, but if I could characterize both languages in a simple way:
- Nim seems to emphasize being a swiss army knife in the way that Python is, except as a compiled language.
- Zig is a much more focused language that tries to hit a certain specific niche - being a successor and replacement for C - and hits that mark spectacularly.
I think language preference comes down to what your personal needs and wants out of a new language that isn't being served by whatever you're using currently. I personally landed in the "Zig" camp because the way it approaches its ambition of being a C successor is intriguing, but I could see why other people might land on Nim.
On top of that you have only indirect control over memory allocation and deallocation, which goes completely against Zig's values where custom allocators are used and everything that allocates should take an allocator as an argument (or member in the case of structures). In contrast to that there isn't even the concept of an allocator in the Nim standard library.
I would say that my experience with Nim has made me fairly certain that Nim has absolutely no desire to make things obvious but rather chooses convenience over almost everything. It's not so much a competitor (in performance or clarity) to Odin or Zig as it is a competitor to Go or something with a much higher-level baseline.
On top of all of this it doesn't really have tagged unions with proper support for casing on them and getting the correct payload type-wise out of them, which is an incredibly odd choice when all of its competitors have exactly that or an equivalent.
Overall I would say that coming from Odin or Zig (or Go) and actually liking those languages it's very hard to like Nim. I could imagine that if someone came from a much higher-level language where performance is nearly inscrutable anyway and nothing is really obvious in terms of what it's doing, Nim would feel like more of the same but probably with better performance.
Edit:
Often while reading the Nim manual, news and forum posts, etc., I get the sense that Nim is really just an ongoing research project that isn't necessarily trying to solve simpler problems it already has along the way. If you look at some of the features in this announcement, it's hard to see anyone ever asking for them, yet here they are. In many ways it's way worse than Haskell, which often gets derided as "just a research language". A lot of what Nim has makes for a much worse experience learning and using the language and I'm sure it doesn't get easier in the large.
That seems accurate. Dealing with raw pointers as one does in Odin or Zig is very much de-emphasized in favour of dealing with safe references, and a lot of effort is put into optimizing out all the overhead of those reference checks (hence ARC/ORC) and writing code to evade them. The manual memory management features of Nim are there for flexibility and fallbacks and are not really the main way to write code: even for embedded. The stuff that Zig (and Odin?) do surrounding allocators and alignment, and constructs for slightly-safer pointers, are really very interesting yet are most helpful if you are indeed working with pointers and worrying about offsets: which you usually aren't in Nim.
I am curious as to what you mean about comptime, though. I have gotten the impression that equivalent constructs in Nim are more powerful. You have `static` blocks and parameters, `const` expressions, `when` conditionals, and then also both templates and typed macros operating on the AST (before or after semantic checking)... `when` even provides for type-checking functions with varying return types (well, monomorphized to one type) via `: auto` or the `: int | bool | ...` syntax.
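A minimal sketch of the `when` + `: auto` combination mentioned above (the `describe` proc is hypothetical): only the taken branch is type-checked, so each generic instantiation can settle on a different return type.

```nim
# `when` branches at compile time; the discarded branches
# are never type-checked for this instantiation.
proc describe[T](x: T): auto =
  when T is int:
    x * 2            # this instantiation returns int
  elif T is string:
    x & "!"          # this one returns string
  else:
    $x               # fallback: stringify

echo describe(21)       # 42
echo describe("hello")  # hello!
```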
I will also defend "naked imports" as a feature that works very well with the rest of the language: functions are disambiguated by signature and not just name and so conflicts scarcely occur (and simply force qualification when they do). And, this allows for the use of uniform function call syntax - being able to call arbitrary functions as "methods" on their first parameter. This is incredibly useful and allows for chaining function calls via the dot operator, among other things. Besides, if you really want you can `from module import nil` and enforce full qualification.
Interest in proper structural pattern matching sparked back up again recently and some complementary RFCs were proposed: https://github.com/nim-lang/RFCs/issues/525 and https://github.com/nim-lang/RFCs/issues/527.
Edit: Ouch. Just found this thread. Very disappointing, and actually makes a greater case for institutional ownership: https://forum.nim-lang.org/t/10312
That said, I'm a Slav which is the origin of the word "slave" because in Europe, slaves were predominantly Slavs once. I don't really mind it because it feels irrelevant today. Connotations of "master" doesn't feel that ancient yet though, considering that black people weren't allowed to live in Palo Alto, CA (heart of Silicon Valley today) until 1950's.
If you can't handle a little silliness from humanity, might as well bow out now.
It's not surprising he has his head screwed on straight. There is clear genius in Nim's design. I'm not a genius, and I don't know much about compilers, just scant knowledge of some data structures and algorithms, but what I do know is that being able to make something so powerful be used by mere mortals like me is very much genius (an idiot values complexity and all that jazz).
But I left it because of recursive imports. I basically had to put all my types into one file and use them from various others. For a relatively medium-sized project (~10k LOC), it's a bit of a hassle. Refactoring is an issue.
That being said, the language is fantastic. Can anybody with experience suggest me what HTTP library/framework do they prefer for servers?
Chronos is probably the most feature rich and uses async. Mummy is newer and uses a threading model. Both are used in production.
That being said, the Nim team is working on it as per a few issues: [1] https://github.com/nim-lang/rfcs/issues/6 [2] https://forum.nim-lang.org/t/2114
I love the language, and this is probably the only bottleneck for me.
Even though it's older than its peers like Rust and Go, it's still quite the underdog.
Hope more people start paying attention to it.
Now you've got me really interested.
At some point dlang-betterc + zig + nim should have an interoperability article and share libraries.
An ABI for languages with a proper type system seems fantastic. Swift, Rust, Nim, D all share very similar type systems (and memory management systems) and it would be very cool to see what kinds of interop easy dynamic linking would allow.
I hope this will help with bindings for C++ libraries that have historically been tricky to wrap.
For example, I would like to use Qt from a compiled language that's a pleasure to use, and this project looks promising:
Questions:
- value/reference semantics: I peeked at some code and I can't tell what is a value type and what is a reference type. Is everything heap allocated?
- tooling: what's the state of their language server? does it work with all of their language features?
- debugging: does gdb/lldb understand nim's types and slices?
And finally: is a no-gc mode available?
I'll play with it later today, it's always been in my todo list of languages to try, now is the perfect time
So "var i: int" is value, "var i: ref int" is a heap allocated reference that's deterministically managed like a borrow checked smart pointer, eliding reference counting if possible.
You can turn off GC or use a different GC, but some of the stdlib uses them, so you'd need to avoid those or write/use alternatives.
Let me say though, the GC is realtime-capable and not stop-the-world. It's not like Java; it's not far off Rust, without the hassle.
>is a no-gc mode available?
You can disable gc, but most of standard library depends on it. But in Nim 2.0 there's finally support for ARC and ORC (ARC + cycle collector).
That's not exactly true: anything smaller than 24 bytes is passed by value; for larger arguments the compiler optimizes the call and passes by reference implicitly.
If a type is declared with 'ref', it's a reference type. Otherwise, it's a value type.
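A minimal sketch of that distinction (the `Point` types are hypothetical): object types copy on assignment, while `ref` types share one heap allocation.

```nim
type
  Point = object        # value type: copied on assignment
    x, y: int
  PointRef = ref Point  # reference type: heap-allocated, shared

var a = Point(x: 1, y: 2)
var b = a          # full copy of the object
b.x = 99
echo a.x           # 1: `a` is unaffected

let p = PointRef(x: 1, y: 2)
let q = p          # copies only the reference
q.x = 99
echo p.x           # 99: both names see the same object
```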
I did ask him about it eight years ago: https://forum.nim-lang.org/t/1392#8675
But that was a little early on and there have been other priorities for the language.
Now that ARC/ORC is considered "complete," are there any remnants of the old GC still in the language, or has the entire ecosystem hopped over?
proc echoLine(): void = discard
Discard looks cool! In Rust I had to use the unimpl macro crate [0] to get sane error messages. It would be good if that was built in, though.

> Improved type inference
...
let foo: seq[(float, byte, cstring)] = @[(1, 2, "abc")]
This looks like a normal type declaration to me, why is there any inference involved?

Previously Nim didn't do any "reverse" type inference, so you'd need to say `@[(1'f64, 2'byte, "abc")]`. That was because it's a constraints problem that can become exponentially expensive to solve. Exploding compile times in Rust and Swift are good examples of this. But there are limited subsets which can still be solved quickly and are helpful, like this case.
But that example looks about as simple as it can be, so I clearly must miss something.
> The compiler needs to infer that the 1 is a float type, 2 is a byte, and compile it appropriately.
And I don't understand _why_ it has to infer anything, as the type is explicitly declared. I mean, there are 2 possibilities:

- 1 is both a valid integer and a float literal => Nim needs the type declaration on the left to unify the type (from "integer or float" or "numeric" or whatever the type checker inferred) to `float`.
- 1 is not a valid float literal (but an integer) => the type is not inferred, but implicitly converted to `float`.

In both cases the solution does not involve inference?
- npeg lets you write PEGs inline in almost normal notation: https://github.com/zevv/npeg
- owlkettle is a declarative macro-oriented library for GTK: https://github.com/can-lehmann/owlkettle
- ratel is a framework for embedded programming: https://github.com/PMunch/ratel
- futhark provides for much more automatic C interop: https://github.com/PMunch/futhark
- nimpy allows calling Python code from Nim and vice versa: https://github.com/yglukhov/nimpy
- questionable provides a lot of syntax sugar surrounding Option/Result types: https://github.com/codex-storage/questionable
- nlvm is an unofficial LLVM backend: https://github.com/arnetheduck/nlvm
- chronos is an alternative async/await backend: https://github.com/status-im/nim-chronos
- cps allows arbitrary procedure rewriting in continuation passing style: https://github.com/nim-works/cps
A longer list can be found at https://github.com/ringabout/awesome-nim.
https://github.com/status-im/nimbus-eth2
https://github.com/orgs/status-im/repositories?language=nim&...
I gave a talk about it here: https://www.youtube.com/watch?v=elNrRU12xRc including some more intense use of Nim (for inline PEG grammars and data-parallel processing with Weave)
https://git.sr.ht/~bptato/chawan
Also, there exists another Nim web browser project; from what I can tell, it's in somewhat earlier stages of development.
https://github.com/sergiotapia/torrentinim
It's easy-to-understand code.
Shameless plug: I'm working on a programming language called Yaksha that is also inspired by Python like syntax, however, philosophy differs from nim. Please take a look and let me know what you think :) https://yakshalang.github.io/documentation.html
Question - does dark theme and light theme both have the contrast issue?
https://github.com/SciNim https://scinim.github.io/getting-started/
Generally, projects created by Mamy Ratsimbazafy (mratsim) are a good start since he's very adept at optimising data science-related libraries.
You might want to ask in the #science channel of the Nim Discord server since although it's often quiet, that's where people working on these repositories hang out.
https://archive.fosdem.org/2022/schedule/event/nim_hpcfrompy...
The big issue Nim faces isn't performance but rather the relative community sizes, and thus how many libraries are available (and also how much help you might find when you run into problems).
This cleans up Nim's syntax a little, we use it in production with not much maintenance.
Could someone share some bad experiences when adopting Nim so I can weigh that in? I'm seriously considering it.
- Tooling is not great. The language server has a tendency to silently crash on occasion, and it's no rust-analyzer to begin with. A tooling rewrite has been delayed behind proper incremental compilation, which has been delayed behind ARC/ORC...
- Interfaces ("concepts") are experimental and there are two differing implementations.
- It lacks proper sum types and structural pattern matching in the core language. There are a number of quite good macro-based libraries that provide for this, however: fusion/matching, andreaferretti/patty, beef331/fungus, alaviss/union...
- Optional types are not the standard: the stdlib will throw exceptions. This is more so a personal preference than anything.
But that's about it. I do like Nim quite a lot.
We've talked about this before! You know it has sum types, just not the variation you want.
How is interop with Rust these days?