I hover over a Clojure function in IntelliJ and I get a pop-up of the ClojureDocs entry, with a description and example usages for that function.
It's great, and I don't know why more IDEs don't do this. Why isn't VS linked to MSDN for C#/.NET, so I can get the information for that function / class / library etc. straight in my IDE?
Following the `# Arguments:` convention is redundant (I get that it's petty, but I'm annoyed every time I write it), but more than that, it's error-prone and limiting. Because argument names in docs are just a convention that isn't strictly enforced, nothing automatically checks that the names are correct, and it limits the ability of tools like cbindgen and flapigen (love them both) to transform parameter-specific docs.
Of course this is somewhat fickle and makes it harder for competing documentation generators to get started, but as a user, when rustdoc is really good it's a nice benefit.
[1] https://juliadocs.github.io/Documenter.jl/stable/man/guide/#...
Really? I've found that, in general, about the same number of people bother to go deep into explanations as in e.g. JS, and always having rustdoc for the ones that don't is far better than reading the source or TypeScript definitions.
I have my dependencies documented locally, I have the standard library documented locally, both of these work well with the ability to do searches just like the online docs. The problem is they're separate. The local docs for one of my crates cannot link to my local standard library docs; instead, I have to jump around different browser tabs and manually look things up.
There used to be some hacks that could work around this, but those hacks stopped working.
[0] - https://elixir-lang.org/getting-started/mix-otp/docs-tests-a...
let my_arr: [u32; 3] = my_vec.try_into().expect("msg");

pub const fn from_be_bytes(bytes: [u8; 2]) -> u16
you can't pass it a slice. This would let you pass a vector to this function. Now, because this wasn't possible previously, many things take a slice and then check that the length is what they expect, because that ends up being easier for folks. Stuff like this helps it be done properly.
(This specific function is one example that is done in this style, and it's a pain. I have real world code that looks like
let foo = u16::from_be_bytes([some_slice[0], some_slice[1]]);
This gets even worse with, say, u128::from_be_bytes.)

A situation I've encountered several times already is implementing statically sized FIFOs. At the moment in Rust I can't implement a type "FIFO of depth N" where N is a generic, static parameter. My only choices are implementing "FIFO of depth n" where n is provided dynamically at runtime (and implemented internally using something like a VecDeque), or a completely fixed-depth FIFO type that I need to duplicate for every depth (FIFO32, FIFO16, FIFO10, etc...).
If you require very high performance, a dynamically checked FIFO can incur a fairly large overhead, whereas a well-optimized static FIFO can implement most operations in a couple of opcodes at most.
The 1.49 release will have a new tier 1 target (aarch64-unknown-linux-gnu) as well as Apple Silicon as a tier 2 target. The 1.50 release will have min const generics as well as stable backtraces.
As the releases are every 6 weeks, an individual one might seem small. But over time they add up.
Note though that I do consider the rustdoc improvements to be major. Previously I wasn't bothering to link directly to referenced items because you had to figure out the HTML names. Now it's very easy and I plan to write more links.
There have been releases where ARM (maybe I was using armv7 rather than aarch64 then, but I'm on aarch64 now) was totally broken, and now I know that won't happen on 1.49 or beyond.
Min const generics...I'm not sure I'll find much use for it until const_evaluatable_checked happens, but I'm glad to see progress.
Stable backtraces will mean I can stop using the deprecated failure crate without giving up my quality diagnostics.
Similarly, there are regularly ~350 PRs merged into Rust each week. (The libification and chalkification are ongoing: chalk is the next-gen solver for the type/trait system, and alongside it there's a refactor of the compiler to make it more like a usable library, so rust-analyzer can use it to provide more immediate/incremental feedback during development.)
Did I miss something or is there some progress being made in that respect?
Try the newer and future default plugin, rust-analyzer (currently only 110K installs).
It’s much better and improving each week. With RA and VSCode the rust developer experience is fantastic.
Link: https://marketplace.visualstudio.com/items?itemName=matklad....
Maybe it was still downloading some required binaries in the background when you were trying it out?
Even if you had to use Notepad to program, or had only very basic syntax highlighting, wouldn't what the language provides be the more compelling basis for these choices?
Ada/SPARK/Ravenscar are a perfect example: they provide incredibly powerful tools for provably correct programming. They are open source. The ergonomics are nowhere near what you're probably used to, and that's why the odds are very high you've never used them.
In Java with a good IDE, you can ^Space your way through a lot of problems without knowing the libraries too well. And even if you don't just pick the first thing that sounds right, scrolling through the documentation of the auto-suggested methods is very convenient.
On the other hand, my dev laptop currently has about 20 different rustdoc pages open at the moment to keep track of... the methods Iter<T> has, the methods Itertools has, what Vec has, what slices have, what package FromStr was in again, what methods of std::fmt::Formatter you have to implement for Display in order to implement Error (which needs to be imported), where HashMap is, what methods those have...
I've been there multiple times over the last decade or two with multiple languages and rust is compelling enough to work through that.
But if you compare the ergonomics of modern Java, Go, or Python IDEs with my current vscode+rust-analyzer setup, Rust isn't winning on IDE ergonomics. At least in my setup.
And that's very much a part of language choice.
Usability is a feature, and many people consider a language’s IDE support to be a critical aspect of usability.
Absolutely yes.
I no longer consider a programming language finished unless it has at least tab-complete and proper debugging support. Inline popup doc-comments are nearly mandatory too.
Not designing a new language for modern tooling is a cardinal sin. It's giving up decades of progress for... what exactly? Some sort of ascetic purity?
One disappointing design aspect of Rust specifically is its tendency to "hide" functions (notably trait methods) unless they are explicitly imported. This makes things difficult for IDE tab-completion. It gives the false impression that some functions are missing, when in fact they were simply not imported into scope.
The language seems to be designed for developers that know ahead of time what they will or will not use, to the finest detail, before they even start typing code.
I've only ever met one programmer that could do that: start at the top of the file and type continuously, left-to-right and top-to-bottom without editing, without pause, without having to go back and fill in anything he had missed. Let's just say he was a "special sort" and leave it at that.
The rest of us are much more productive with IDEs and debuggers.