Is it expensive to maintain all the binaries for all of Hackage?
You can actually do this today. The buildRustCrate function in nixpkgs creates a derivation for a single Rust crate. You can combine this with crate2nix [1] to make derivations for a crate and all its dependencies in a Cargo.lock file. Every compiled crate will be a separate output in the Nix store and you can upload them to a binary cache such as Cachix.
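A rough sketch of that workflow (the cache name `mycache` is a placeholder, and this assumes you're in a crate with a Cargo.lock and have Nix and Cachix set up):

```shell
# Sketch of the buildRustCrate + crate2nix + Cachix workflow.
# "mycache" is a placeholder Cachix cache name, not a real cache.
cargo install crate2nix                  # install the generator
crate2nix generate                       # emit Cargo.nix from Cargo.lock
nix-build Cargo.nix -A rootCrate.build   # build the crate and all deps as derivations
cachix push mycache ./result             # upload the resulting store paths
```

Each dependency becomes its own derivation, so subsequent builds only rebuild crates whose inputs changed; everything else is substituted from the cache.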
This is what I have been using for CI in many of my Rust projects in combination with Cachix to get very fast builds.
There are 15,000 packages on Hackage, but 48,000 on crates.io. According to http://www.modulecounts.com/, Hackage is getting about four new packages a day right now, while crates.io is getting 54. So the expense question looms larger for crates.io.
I don't know about Haskell, but in Rust, many projects tweak compiler flags, for performance or other reasons. You'd need the cached binary to have been built with exactly the same flags. It's possible there's a long tail effect here.
https://doc.rust-lang.org/nightly/rustc/platform-support.htm... is way longer than https://gitlab.haskell.org/ghc/ghc/-/wikis/platforms (I am not 100% sure the latter is canonical). This isn't a hard blocker, and there's probably a long tail effect here too, but it's still a factor in whether you get a precompiled binary or not.
Rust relies more heavily on inlining than Haskell does, and relies heavily on monomorphization. This means more work to do during compilation; it also means that even with "precompiled binaries" for your dependencies, a lot of code generation still has to happen during the final crate's compilation.
There are existing tools like sccache that you can use to implement a binary cache, so people that really want this feature also have the ability to get it, and some do. It's just not a globally available thing.
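For the curious, a sketch of how sccache is typically wired in (assumes sccache is already installed; the shared-backend setup, e.g. S3, is extra configuration not shown here):

```shell
# Point cargo's rustc invocations at sccache so compiled crates
# are cached locally or in a shared backend.
export RUSTC_WRAPPER=sccache
cargo build --release
sccache --show-stats    # inspect cache hits and misses
```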
If you look at Rust's compile times, even once your dependencies are compiled, builds are still slower than we'd like. And you only compile dependencies once, so while binary libraries might help your initial build, they don't affect later builds at all, and those are most builds. So more effort is going into improving that, rather than focusing on pre-compiled dependencies.
That's off the top of my head...
In my naive view, there should be a way to at least cache the latest successful build of each package version. Then when users run `cargo install ..`, it would check whether a build already exists and retrieve it.
Tools like guix and nix aim to describe packages in such a way that for the same input (package dependencies) you get the same output.
If the output is a function of the input, and you know a cached build of a package was built with the same input, then using the cached build of a package would be the same as if you'd built it yourself.
It does depend on whether the packages are written in a "pure" way (rather than an "impure" one, where building with the same inputs might produce different outputs). https://github.com/NixOS/nix/issues/2270
The problem with caching on package name and version alone is that there are many possible flavours. For example, different revisions of the source code can carry the same version number, and things such as target architecture, optimisation level, and optional features complicate matters further. Nix therefore uses a hash of all the inputs as the cache key.