In addition to the linked examples, I have some code of my own which is made simpler due to this feature: https://github.com/RustAudio/ogg/commit/b79d65dced32342a5f93...
Previously, the table was present as a C-style array literal; now I can remove it once I decide to require the 1.46 compiler or later for the library.
Link to the old/current generation code: https://github.com/RustAudio/ogg/blob/master/examples/crc32-...
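For readers who haven't seen the pattern: here is a hedged sketch (not the actual ogg crate code) of what replacing a hardcoded table with a const fn looks like. It builds a CRC32 lookup table at compile time using only `while` and `if`, which became legal in const fn as of Rust 1.46. The polynomial 0x04c11db7 and MSB-first table layout match the Ogg CRC, but the function name and structure are mine.

```rust
// Sketch: generating a CRC32 lookup table in a const fn instead of
// pasting a 256-entry array literal into the source.
const fn crc_table() -> [u32; 256] {
    let mut table = [0u32; 256];
    let mut i = 0;
    while i < 256 {
        let mut r = (i as u32) << 24;
        let mut j = 0;
        while j < 8 {
            // Shift left; fold in the polynomial when the top bit falls off.
            r = if r & 0x8000_0000 != 0 {
                (r << 1) ^ 0x04c1_1db7
            } else {
                r << 1
            };
            j += 1;
        }
        table[i] = r;
        i += 1;
    }
    table
}

// Computed entirely at compile time; no runtime initialization needed.
static CRC_TABLE: [u32; 256] = crc_table();
```

Note that the loops are written with `while` and explicit counters because `for` loops (which go through the `Iterator` trait) are still not allowed in const fn.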
This sounds promising. Can you give examples? I don't know Rust at all, and the reason I like C++ is its metaprogrammability.
> All boolean operators except for && and || which are banned since they are short-circuiting.
I guess I'm missing something obvious but why does the short circuiting break const-ness?
(I'm a little surprised they weren't stabilized at the same time! Edit: they were! I just didn't look closely enough.)
Until this version of Rust, all conditional branches were banned from const functions.
I guess to keep things simple they just banned any feature that might cause branching.
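To make the change concrete, here is a small example of what 1.46 newly allows: branching inside a const fn, evaluated at compile time. (`&&` and `||` were excluded at the time because they short-circuit, but ordinary comparisons and `if`/`match` are fine.)

```rust
// Since Rust 1.46, `if`, `match`, `while`, and `loop` are allowed
// inside const fn bodies.
const fn abs_diff(a: u32, b: u32) -> u32 {
    if a > b { a - b } else { b - a }
}

// Evaluated by the compiler, not at runtime.
const DIFF: u32 = abs_diff(3, 10);
```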
That said, coming from a FP background (mostly Haskell/JS, now TS) Rust is... hard. I do understand the basic rules of the borrow checker, I do conceptually understand lifetimes, but actually using them is tricky.
Especially in a combinator world with lots of higher order functions/closures it’s often completely unclear who should own what. It often feels my library/dsl code needs to make ownerships decisions that actually depend on the usage.
Anyways, I guess this gets easier over time, right? Should I avoid using closures all over the place? Should my code look more like C and less like Haskell?
[edit] great answers all, providing useful context, thanks
Yes.
> Should I avoid using closures all over the place?
Not necessarily.
> Should my code look more like C and less like Haskell?
Yes. Others sometimes don't like to hear this, but IMO, Rust is not at all functional. Passing functions around is not ergonomic (how many function types does Rust have again? Three?). Even making heavy use of Traits, especially generic ones, is difficult.
Rust is very much procedural. Java-style OOP doesn't work because of the borrowing/ownership. And FP style function composition doesn't work without Boxing everything. But then you'd need to be careful about reference cycles.
Depending on what you meant, there are more than three:
* There are 3 traits, used by closures depending on their needs:
* Fn(Args) -> Output
* FnMut(Args) -> Output
* FnOnce(Args) -> Output
* *Every* `fn` is its own type (`fn() {foo}`)
* Function pointers (`fn()`), which is how you pass the above around in practice
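A minimal sketch of the three traits in action (names like `apply` are mine, not std API): which trait a closure implements depends on how it captures its environment — by shared borrow (Fn), by mutable borrow (FnMut), or by value (FnOnce).

```rust
fn apply(f: impl Fn() -> i32) -> i32 { f() }
fn apply_mut(mut f: impl FnMut() -> i32) -> i32 { f() }
fn apply_once(f: impl FnOnce() -> String) -> String { f() }

fn demo() -> (i32, i32, String) {
    let x = 10;
    let read = || x + 1; // only reads x: implements Fn

    let mut count = 0;
    let bump = || { count += 1; count }; // mutates count: FnMut

    let s = String::from("moved");
    let consume = move || s; // gives away s: FnOnce, callable once

    (apply(read), apply_mut(bump), apply_once(consume))
}
```

Every Fn closure also implements FnMut and FnOnce, so `apply_once` would accept any of the three.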
> Rust is very much procedural

I think this is like saying Python is very much procedural: true, but it loses some nuance. Rust has some attributes of OOP, some attributes of FP. Some constructs from OOP and FP are made harder once you involve borrowing. Saying it is procedural conjures images of Pascal and K&R C in people's minds. To bolster your argument, though, I mostly use method chaining for iterators, but every now and then I need to turn it into a `for` loop to keep the lifetimes understandable for the compiler, myself, and others.
It has to, right? ATS has many function types as well, plus stack-allocated closures (I think Rust has that too??)
I've been using Rust for a little over a year, almost daily at work, and for several projects. I have a pretty good intuition about how the borrow checker works and what needs to be done to appease it. That said, I don't think I'm any closer to understanding lifetimes. I know conceptually how they are supposed to work (I need the reference to X to last as long as Y), but any time I think I have a situation that could be made better with lifetimes, I can't seem to get the compiler to understand what I'm trying to do. On top of that, very little of my code, or of the code I read, actually uses lifetimes.
When the compiler starts complaining about lifetimes issues, I tend to make everything clone()able (either using Rc, or Arc, or Arc+Mutex, or full clones).
Because if you start introducing explicit lifetimes somewhere, these changes are going to cascade, and tons of annotations will need to be added to everything using these types, and their dependent types.
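A hedged sketch of that escape hatch (the type names are hypothetical): instead of a reference field that forces a lifetime parameter onto every containing type, share ownership with Rc and clone the cheap pointer.

```rust
use std::rc::Rc;

// Instead of `Client<'a> { config: &'a Config }`, which would make
// `Client`, and everything holding a `Client`, generic over 'a:
struct Config { name: String }
struct Client { config: Rc<Config> }

fn demo() -> (usize, String) {
    let config = Rc::new(Config { name: String::from("prod") });
    let _a = Client { config: Rc::clone(&config) };
    let b = Client { config: Rc::clone(&config) };
    // All three handles co-own the same allocation; no lifetimes appear
    // in any signature.
    (Rc::strong_count(&config), b.config.name.clone())
}
```

Swap Rc for Arc (and add a Mutex for mutation) when threads are involved, as the comment above describes.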
A simple example I often run into is wanting to do something with a string without taking owned parts of it: it's very intuitive how the &str matches the lifetime of the owned value.
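That simple case might look like this (my example, not the commenter's code): the borrowed &str is tied to the owned String, and lifetime elision means no annotations are needed at all.

```rust
// The returned &str borrows from the input; the compiler infers that
// its lifetime matches the argument's.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

fn demo() -> String {
    let owned = String::from("hello world");
    let word = first_word(&owned); // valid as long as `owned` lives
    word.to_string()
}
```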
On the other hand, the other day I was trying to write a piece of software where:
1. I wanted to deserialize a large tree of JSON nodes. I had the potential to deserialize these nodes without owning the data - since Serde supports lifetimes, I could deserialize strings as &strs and hypothetically not allocate a lot of Strings.
2. In doing that, because a tree could be infinitely large, I couldn't keep all of the nodes together. Nodes could be kept as references, but eventually they would need to be GC'd to prevent infinite memory use.
3. To do this, I _think_ lifetimes would have to be separate between GC'd instances. Within a GC'd instance, you could keep all the read bytes and deserialize with refs to those bytes. When a GC took place, you'd convert the remaining partial nodes to owned values (some allocation) to end the lifetime, and restart the process with the owned node as the start of the next GC lifetime. ... or so my plan was.
I have, I think, just enough understanding of lifetimes to _almost_ make that work. I _think_ some allocations would be required due to the GC behavior, but it would still remove ~90% of the allocations in the algorithm.
Unfortunately, I got tired of designing this complex API and just wrote a simple allocating version.
Conceptualizing allocations and the lifetimes to make it work is... interesting. Especially when there is some data within the lifetime that you want to "break out of" the lifetime, as in my example (where I had a partial node and I made it owned).
I still think I understand enough to do it - it'll just take a fair bit of thinking and working through the problem.
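The "break out of the lifetime" step can be sketched with std's Cow (a hypothetical simplification of the node type described above, not the actual design): a node borrowing from a read buffer is converted to an owned node with one allocation, so the buffer can be dropped.

```rust
use std::borrow::Cow;

// A node that may borrow its name from the input buffer.
struct Node<'a> { name: Cow<'a, str> }

impl<'a> Node<'a> {
    // Pay for one allocation to sever the tie to the buffer's lifetime.
    fn into_owned(self) -> Node<'static> {
        Node { name: Cow::Owned(self.name.into_owned()) }
    }
}

fn demo() -> Node<'static> {
    let buffer = String::from("root"); // the read bytes
    let borrowed = Node { name: Cow::Borrowed(buffer.as_str()) };
    borrowed.into_owned() // survives `buffer` being dropped here
}
```

This is essentially the plan in step 3: borrowed nodes within one GC generation, converted to owned values at collection time.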
* If you need the input to remain unchanged, you must use either a reference to each item (.iter()) or a copy of each item (.iter().cloned())
* If you don't need the input ever again, you should move the items (.into_iter())
These rules apply at each step of the chain.
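The rules above, side by side in a small (made-up) example:

```rust
fn demo() -> (Vec<usize>, Vec<String>, Vec<String>) {
    let v = vec![String::from("ab"), String::from("cde")];

    // Borrow each item: `v` is still usable afterwards.
    let lens: Vec<usize> = v.iter().map(|s| s.len()).collect();

    // Clone each item: also leaves `v` intact, at the cost of copies.
    let copies: Vec<String> = v.iter().cloned().collect();

    // Move the items out: `v` cannot be used after this line.
    let moved: Vec<String> = v.into_iter().collect();

    (lens, copies, moved)
}
```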
I very very often write very Functional code in Rust and I find it natural and easier to reason about than imperative-style code. The example I could find the fastest: https://github.com/thenewwazoo/aoc2019/blob/master/src/day10...
Edit: another example (this one uses types that are Copy so the copies are implicit) https://github.com/thenewwazoo/cryptopals/blob/master/src/tr...
Another edit: I am not a Functional programmer, and have never known Haskell or any Lisp. Erlang is as close as I've ever gotten. I've found Rust to be a fantastic language for writing Functionally.
It ends up being doable. I dabbled in ATS, developed Stockholm syndrome, and now Rust ain't too bad.
Higher-order functions are difficult in Rust or with linear/affine types in general. Haven't looked at what Rust does recently.
> Should I avoid using closures all over the place? Should my code look more like C and less like Haskell?
When in Rome do as the Romans :)
Anyway, some fun imperative programming stuff you can do in Rust that is fickle in Haskell (or OCaml/Standard ML).
example:
fn add(mut self) -> Self { self }
fn add(self) -> Self { self }
instead of:
fn add(&mut self) {}
fn add(&self) {}
With this, you will be able to ‘store’ closures easily and apply them later. No more fighting with the borrow checker over where to borrow as mut or not. You will also avoid a few copies.
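Here is a small sketch of that by-value-`self` style (my own toy example): a chainable builder that moves cleanly into a stored closure, with no `&mut` borrows to track.

```rust
struct Acc { total: i32 }

impl Acc {
    // Takes self by value, mutates its own copy, hands ownership back.
    fn add(mut self, k: i32) -> Self {
        self.total += k;
        self
    }
}

fn demo() -> i32 {
    // The closure owns everything it needs, so it can be stored and
    // applied later without any lifetime or mut-borrow bookkeeping.
    let build = || Acc { total: 0 }.add(2).add(3);
    build().total
}
```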
There is a lot to like, and I understand lifetimes conceptually, but it's hard.
It is definitely not easier than C++, in contrast with D, which is easier than C++.
However, the program worked correctly on the first try, which I guess is also a consequence of the Rust model.
Now that's damning with faint praise.
Are there any particular set of problems that I can solve systematically, so that I can learn all the features of Rust?
https://doc.rust-lang.org/stable/rust-by-example/ is the "by example" introduction, which is all about sample programs, but feels a bit dated, IMHO. Still not incorrect, but not up-to-date.
You may also like the O'Reilly book, or Rust In Action, which use more fully-featured example programs more heavily than The Book does.
I found that approach for Rust in particular to not work well at all, and have colleagues who've reported the same. There are some fairly complicated, fundamental concepts that are unique to Rust that I think need to be tackled before you can really do much of anything (mostly borrowing and lifetimes), and that's not immediately obvious from starter programs -- because of lifetime elision, some early programs can look deceptively familiar, but there's a bunch of barely-hidden complexity there, and as soon as you start to stray from the tutorial path, you'll run headfirst into a wall of compiler errors that you're not yet equipped to understand. For Rust I'd highly recommend just reading a book cover to cover first (either TRPL or the O'Reilly one), and then starting to write code.
The manual is safer even though it's harder to find your exact problem and solution, especially when you're just starting out.
As always, feel free to drop into the Rust Stack Overflow chat room[1], or any of the official Rust discussion channels, and ping me or other Stack Overflow contributors to review and update answers.
It also taught me about Boxes and Rc's, which are essential for certain kinds of things, and which I don't remember being covered in the main Rust Book at all.
Question for Rust experts: On what ETL tasks would you expect Rust to outperform Numpy, Numba, and Cython? What are the characteristics of a workload that sees order-of-magnitude speed ups from switching to Rust?
Now, if you're doing lots of computation in Python itself - not within the confines of Numpy - that's where you might see a significant speed boost. Again, I don't know precisely how Rust and Cython would compare, but I would very much expect Rust to be significantly faster, just as I would very much expect C++ to be significantly faster.
That way you leverage a more developed data ecosystem, can call python when necessary and avoid writing low level code.
Depends on the task of course.
With Rust you can stream each record and leverage the insane parallelism and async-IO libs (rayon, crossbeam, tokio) with a very small memory footprint. Sure, you have asyncio in Python, but that's nowhere near the speed of tokio.
It is the Rust way of specifying a function as being _pure_. In other words the output is dependent only on the function arguments, and not on any external state.
This means they can be evaluated at compile time. I suppose in the future, it could also allow better compiler optimizations.
const functions can't directly do any IO or even allocation - at the moment.
But this can be easily circumvented, e.g. by using a proc macro that does IO.
Sidenote: even in Haskell the function signature doesn't guarantee purity, due to unsafePerformIO.
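A minimal example of what that purity buys (my own illustration): a const fn depends only on its arguments, so the compiler can run it wherever a constant is required, while it remains an ordinary function at runtime.

```rust
// Pure function of its arguments: usable in const and static initializers.
const fn square(x: u64) -> u64 { x * x }

const AREA: u64 = square(12);                       // evaluated at compile time
static TABLE: [u64; 3] = [square(1), square(2), square(3)];

fn runtime_use(n: u64) -> u64 {
    square(n) // const fns are still callable with runtime values
}
```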
Sadly it looks like the wayback machine does not have a copy of the original. Does anyone know how to get one?
In that case, you currently have to write code like:
if let (Some(username), Some(password)) = (username, password) {
/* both are set */
}
else {
/* at least one is not set */
}
With zip this can be written as:

if let Some((username, password)) = username.zip(password) {

In this case it doesn't look like a big difference, but it does allow you to chain other Option combinators more easily if you were doing that instead of writing if-let / match. Using combinators is the "railroad-style programming" that kevinastone was talking about. For example, you can more easily write:

let (username, password) = username.zip(password).ok_or("one or more required parameters is missing")?;

You could of course still do this without .zip(), but it would be clunkier:

let (username, password) = username.and_then(|username| password.map(|password| (username, password))).ok_or("one or more required parameters is missing")?;
The zip form does lose the information of which of the two original Options was None, so if you do need that information (say the error message needs to specify which parameter is missing) you'd still use the non-zip form with a match.

One thing that people may not realize, especially now that we have `loop` in const fn: you might expect this to hang the compiler:
const fn forever() -> ! {
loop {
}
}
static FOO: u32 = forever();
But it won't:

error[E0080]: could not evaluate static initializer
--> src/lib.rs:2:5
|
2 | / loop {
3 | |
4 | | }
| | ^
| | |
| |_____exceeded interpreter step limit (see `#[const_eval_limit]`)
| inside `forever` at src/lib.rs:2:5
This does place an upper limit on any given const fn.

The upside, of course, is that any computation you perform at compile time is a computation you don't perform at runtime. For some applications this trade-off is definitely worth the cost of admission.
At the end of the day it's a trade off that will have to be made in light of the scenario it's being used in. Being able to make that decision is a good thing.
> while, while let, and loop
> the && and || operators
Common Lisp user here. Why just that? How come you can’t have the entire language as well as all your language customizations available at compile time for evaluation?
Why isn’t `const fn` like this too? One word answer: determinism. Rust takes type/memory/access/temporal safety very seriously, and consequently you can’t use anything in a `const fn` that isn’t fully deterministic or that depends in any way on platform-specific behavior. This includes, for example, any floating-point computation, or any random number generation, or any form of I/O, or handling file paths in an OS-specific way, or certain kinds of memory allocation. The span of things possible in `const fn`s has been expanding over time, and will in the nearish future largely overtake C++’s direct counterpart (`constexpr` functions) in capability. But some things will intentionally never be possible in `const fn`s, for the reasons given above.
I can't wait for when we'll be able to `const fn` all the things. Regex, expensive constants that feel as though they should be literals, etc.