1. Clear audience target: They aren't going after C++ gurus or C magicians but people who are new to systems programming. From Klabnik to Katz to literally everyone in the community, they are consistent with this messaging.
2. As part of 1, they have invested a lot in teaching systems programming 101 (heap v. stack, etc.), i.e., stuff that you learn in the first course in systems programming in college, but many self-taught, higher-level programmers might not know. This is a great example of authentic content marketing based on a clear strategy working out.
3. Their community is very inclusive. My experience (as a marketing guy who barely remembers how to code) is that people are very helpful when you ask questions, submit a patch, etc. This has been the case for me not just with Rust itself but a couple of Rust projects that I've interacted with.
For C++ people, Rust's generics remain less powerful than template metaprogramming (which is Turing-complete, with people building real programs in the tarpit), so there are reasons you might not switch.
Meanwhile, Rust does make it a lot easier to get started with systems programming, which is good! Every tool should both empower beginners and extend the reach of experts. For example, writing zero-copy parsers in C is fairly hard to get right, and might not be worth the debugging or validation time that even an expert would have to put in. C string manipulation works, but it's verbose and fiddly. In Rust, it's trivial to use the lifetime system to make sure you keep all the input data around long enough and don't read outside the buffer. You could even use #[must_use] and affine types to check that every character of input data ends up attributed to exactly one terminal.
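To make the zero-copy point concrete, here's a minimal sketch (the function and names are illustrative, not from any real parser): the returned tokens are slices borrowing from the input, so the compiler enforces that the backing buffer outlives every token and that no access reads outside it.

```rust
// Zero-copy tokenizer sketch: each &str in the result borrows from
// `input`, so the borrow checker guarantees the buffer stays alive as
// long as any token does, with no copying.
fn tokens(input: &str) -> Vec<&str> {
    input.split_whitespace().collect()
}

fn main() {
    let data = String::from("GET /index.html HTTP/1.1");
    let toks = tokens(&data);
    assert_eq!(toks, ["GET", "/index.html", "HTTP/1.1"]);
    // drop(data); // using `toks` after this line would not compile
    println!("{:?}", toks);
}
```

In C the equivalent pointer-into-buffer scheme works too, but nothing stops you from freeing the buffer while the tokens still point into it.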
Or if Rust doesn't support your OS yet.
I am working on porting LLVM and writing a MIR to C++ translator in parallel. We'll see which one I get further on.
Because I'd love nothing more than to use Rust.
But my main problem is a lack of a project -- a ThingIWantToDo that would be well suited to a systems programming language like Rust. And I don't even know what kinds of problems or projects are well suited to systems programming -- so far, when I've had an itch to scratch and gone to scratch it, I've found Python able to do what I want.
Now I realize that Python is in no way appropriate for all classes of problems, and that there are problems for which it is not fast enough. But thus far, the only project I'd like to tackle that I know Python will be too slow for is doing real-time audio processing on Linux with lv2 plugins and JACK. But lv2 and JACK are C APIs, so that's incentive for me to learn C, not Rust.
Understand, this isn't a knock against Rust. As I said, it's caught my eye. I just haven't found a compelling reason to actually get involved yet. I am hoping I eventually will.
Maybe, but it might be an incentive to learn just enough C that you can wrap the C interface in Rust. The point of having an API is to have well-defined behavior at a particular boundary, which removes many (though not all) of the reasons to use the implementation language on the caller side.
I suspect learning Rust will probably make you familiar enough with the basics of C that you won't have to do much C-specific learning to use most libraries.
I just hope Rust practitioners can do a few things where they have to use 'C' properly, much as I think assembly is a good thing for 'C' programmers to learn.
I hope the relationship between 'C' and Rust is collegial - ideally, it would approach being the same people over time because of legacy code. Nothing divides like language, and flexibility is a great way to harden your skillset.
If this actually is their strategy, is it being done voluntarily or out of necessity? I ask, because I've witnessed enough scepticism about Rust from C and C++ programmers. Rightly or wrongly, there are enough of them who don't appear to be receptive to Rust, and likely never will be. So the Rust community may never be able to appeal to these C and C++ programmers, even if they wanted to. The only option may be to appeal to the non-C and non-C++ programmers.
I don't think there's a single language on the planet that hasn't had skeptics, especially at the beginning. Remember how skeptical everyone was of Python at first due to its significant whitespace?
Programmers are very tribal about their tools.
let the_bits: usize = unsafe { std::mem::transmute(pointer) };
You can also use `std::mem::forget(*pointer)` to avoid fighting with Rust about who manages the memory. So I don't think it's fair to say that C "gets out of the way". It won't let you get at the overflow flag, or alias arbitrary pointers, for instance.
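Expanding the snippet above into a compilable sketch (the values are illustrative): `transmute` reinterprets a reference's bits as an integer, and `forget` opts a value out of Rust's ownership-based cleanup.

```rust
use std::mem;

fn main() {
    let boxed = Box::new(42u32);
    let pointer: &u32 = &boxed;

    // Reinterpret the reference's bits as a plain integer.
    let the_bits: usize = unsafe { mem::transmute(pointer) };
    // A raw-pointer cast yields the same bits and is the usual spelling.
    assert_eq!(the_bits, pointer as *const u32 as usize);

    // Tell Rust not to run the Box's destructor: the allocation is now
    // ours to manage (it would leak if we never reclaimed it)...
    mem::forget(boxed);
    // ...so reconstruct the Box to free the memory exactly once.
    unsafe { drop(Box::from_raw(the_bits as *mut u32)) };
}
```

Both escape hatches are fenced off behind `unsafe` (or, for `forget`, behind a function whose misuse "only" leaks), which is the point: the checks are opt-out, not absent.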
What is that?
> They aren't going after C++ gurus or C magicians
Don't they have anything to gain from using Rust?
Rust does market to C/++ people, a lot. We just don't market to the super-awesome C++ folks. It's the same reason I preferred Word 2003 over the new stuff for half a decade: I had years of memorized shortcuts, custom macros, and general UI familiarity. I could eventually learn the shiny new Word and become as good, but the activation energy for that was too much and I was happy with 2003.
Rust isn't only going after non-systems folks. The community is roughly half systemsy. But it may seem this way because Rust tries very hard to not alienate non-systems people with jargon and unexplained systems concepts.
C++ gurus and C magicians already have invested too deep into their languages to throw everything away and start from zero.
For example I love Rust and play occasionally with it, but for the time being C++ is my native language on the job when I need to use a native language outside .NET or JVM.
I know it since the C++ARM "standard" and we depend on standard OS tooling that Rust is still catching up with.
The day will come when our customers will be able to do mixed debugging between JVM/.NET and Rust. Or produce COM as easy as C++ compilers do.
But these are things that beginners in systems programming aren't usually doing.
>Don't they have anything to gain from using Rust?
Who's "they"?
Sure they do, but they're not throwing away a decade of hard-won experience in their specialties just to tinker; at least not with production code bases. The barrier of entry for beginner systems programmers is lower.
Even if they can succeed by going after the C++ gurus as well.
Rust is what it took C++ 20 years to become, except anew and reimagined.
It's ready and usable, today.
I think Rust would be great for building common crypto infrastructure and things such as cryptocurrency. It seems risky to me to build something like Bitcoin with C++, where millions can easily be at stake if the system doesn't work.
I am an application programmer so I might not be the primary target, but I started programming with Swift, and although it isn't the same as Rust it has some similarities. It's a lot stricter language than C++, C, Lua, Python and Objective-C, which I have used most in the past. So many bugs are caught at compile time. I used to be skeptical of static typing, primarily because languages like C++ and Java made types so awful to work with. But with the newer OOP languages with more functional inspiration, it is getting easier to deal with strict typing.
You don't have to choose between productivity and safety so much anymore.
Exactly. If I had to sum up Rust's philosophy in one sentence, this would basically be it. (Add "and performance" after "safety" too.) :)
Choose any three.
(Taking a hint from SQLite.)
ImageMagick has a long history of issues (although I appreciate the most recent, major issue could have occurred in any language). Mozilla only just audited libjpeg-turbo and found a series of issues, and a quick Google will point to most of the options being terrible.
I'm sure someone will (if not already) write a decent Rust alternative - but what everyone is missing at the moment is bindings for their favourite high level language with comparable APIs to their existing tools.
Yes, you take a 20% or so performance hit for using a microkernel. Big deal.
At one time, you could download the QNX kernel sources and look at them.[1] This would be helpful in getting the microkernel architecture right. It's very hard to get that right. See Mach or Hurd.
[1] http://community.qnx.com/sf/sfmain/do/downloadAttachment/pro...
And there are others: http://wiki.osdev.org/Rust
Here's the QNX architecture document on this, which discusses how message passing and CPU scheduling integrate.[1] Microkernel designers need to read this very carefully. A good test is to run a message-passing benchmark on an idle system, then run it again with a CPU-bound process of equal priority in round-robin mode also running. If the message-passing task starves, or the CPU-bound task starves, message passing was misdesigned. If, on a multiprocessor, a simple message pass causes a CPU switch, message passing was done wrong.
If message passing and scheduling do not play very well together, a service-oriented architecture (sorry, "microservices" architecture) will be sluggish. This is where most microkernels fail.
[1] http://www.qnx.com/developers/docs/6.4.1/neutrino/sys_arch/i...
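The starvation test described above can be approximated in user space. This is only a crude sketch with ordinary threads and channels - it models neither kernel IPC nor priority scheduling - but it shows the shape of the measurement: time a round trip while idle, then again with a CPU-bound competitor running.

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Instant;

// Average nanoseconds per message-passing round trip through an echo thread.
fn round_trip_ns(iters: u32) -> u128 {
    let (tx, rx) = mpsc::channel();
    let (tx2, rx2) = mpsc::channel();
    let echo = thread::spawn(move || {
        for msg in rx {
            tx2.send(msg).unwrap(); // bounce every message straight back
        }
    });
    let start = Instant::now();
    for i in 0..iters {
        tx.send(i).unwrap();
        rx2.recv().unwrap();
    }
    let per_msg = start.elapsed().as_nanos() / iters as u128;
    drop(tx); // close the channel so the echo thread's loop ends
    echo.join().unwrap();
    per_msg
}

fn main() {
    let idle = round_trip_ns(10_000);
    // CPU-bound competitor at the same (default) priority.
    let busy = thread::spawn(|| loop {
        std::hint::spin_loop();
    });
    let loaded = round_trip_ns(10_000);
    println!("idle: {} ns/msg, loaded: {} ns/msg", idle, loaded);
    drop(busy); // detach; the spinner dies when the process exits
}
```

On a well-integrated scheduler the loaded number should degrade gracefully rather than blow up, and neither the ping-pong pair nor the spinner should starve.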
If their code was written in Rust, that sort of bug could not have occurred.
Nearing a half-century of momentum in the community, which includes developers, mature tools, etc. Until Rust arrived it was nearly the only game in town for predictable low-latency systems programming.
Yes, now that Rust's here there's a bit of an alternative. But if you've got a team of 30+ software devs who know C/C++ and an existing well-tested codebase of millions of lines, even if you had multiple Rust champions it would take a very long time to evolve towards Rust.
If the answer is "no static analysis tool", there's your problem.
But as my first paragraph implied, they can't catch everything, so "we installed many layers of protection and it still got through even so" is definitely a possibility. Rust may have a different set of such issues but they will of course always exist.
If you mean this one[1], it's merged. Still lots of work to do, and even more corners where things will shake out[2], but there's definitely progress.
[1]: https://github.com/rust-lang/rust/pull/33460 [2]: https://github.com/rust-lang/rust/pull/34174
The code base is so modified by macros/typedefs that it's hardly even C anymore.
Not to mention that LLVM has to support the embedded device you are targeting, and Rust's support for legacy CPUs (8008, 8080, 80386, 68000) is lacking.
Rust won't change anything about this; if there had been interest in changing the situation, other alternatives like Ada have been available for years.
( no snark; I hope you get my point )
Ironically, reliability actually is a value of merit with 'C'/C++ - in some cases. It's just that the ways of achieving it seem rather inaccessible these days, or the flow of people past seeing them is not working out.
I don't think there will ever be a way around developing proper test vectors. It's quite interesting work but it tends to go unrewarded.
Programming languages aren't just fashion; we invent them because we think we can solve old problems in better ways.
Most of my co-workers doing C++ never even wanted to look at the alternatives. It being the only thing they have ever done, they don't even realize how bad it is. They have just internalized it.
There are other safe languages they could have used which have a longer track record than Rust, e.g. Ada. It's used in avionics. Why shouldn't it be used here?
I do not know what kind of unsafe memory access happened in their systems, but you can do all sorts of memory operations, and as long as the explicit typecasts are a-ok, MISRA won't flinch.
1. Most of those platforms don't have compilers for any languages other than C(++). If the platform has a lot of history behind it, maybe you could write it in Ada, but that's pretty much it.
2. Development tools (debuggers, static analyzers, standards compliance verification tools and so on) for C and C++ are very hard to match, both in strength and in sheer availability. In the meantime, Rust still relatively recently got decent GDB support.
3. A lot of Rust's features simply aren't needed when writing this kind of software (e.g. the breadth of features related to memory management is largely unneeded because everything is statically allocated).
4. For better or for worse, C is well-understood (C++ is... well, not that I haven't seen good safety-critical code written in C++, but in my experience, C++ code is a lot easier to get wrong, both by humans and compilers). Rust isn't, not yet in any case. There's no Rust equivalent for e.g. MISRA, and not because Rust doesn't need one.
5. To, uh, put it bluntly -- C and C++ are very well known in the far corners of the world where a lot of this software is outsourced. Rust -- not so much, because outsourcing companies don't really encourage their employees to learn this kind of stuff.
6. There's a lot of commercial risk involved. I'm not sure about autonomous vehicles, this is probably a more volatile field, but many safety-critical systems have to be maintained for a very long time (10 years is fairly common, and 15-20 isn't unheard of). Rust may well be dead and buried ten years from now, whereas language enthusiasts have been singing requiems to C (on roughly the same tune as Rust, no less) for almost thirty years now.
Rust is a great development in this field and I can't wait for the day when we'll finally put C (and especially frickin C++, Jesus, who writes that!) to sleep, but it's at least five years away from the point where I'd even half-heartedly consider it for a project with critical safety requirements.
> If their code was written in Rust, that sort of bug could not have occurred.
I don't know the specifics of the bugs you mentioned, so I can't really comment on this, but in my experience, most of the similar claims that float around the Interwebs are somewhat exaggerated when put in their proper context. E.g. Heartbleed, which wasn't because C something something PDP-11, but because someone decided to be smart about it and implement their own (buggy) memory management system so as to make the damn thing run decently on twenty year-old operating systems.
I've seen people write that kind of code, for similar reasons, in Java and Go -- and, at least once, with Heartbleed-like results. The ways in which a language can be misused rarely reveal themselves before that language breaks out of its devoted community.
To be clear on it though -- I think Rust is a step in the right direction, and one that we should have taken a long, long time ago. If it can make it through its infancy, and if it can get enough commercial support, it will be a great alternative to C and C++.
True, but I don't think that Rust's other features wouldn't be useful here. References which know about mutability/immutability, sum/enum types, "fat" pointers/slices w/ bounds checking, the ability to construct library APIs which enforce non-memory safety through session/affine/linear types, sane integer typing, etc, could all still be useful to a fully-statically-allocated program.
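As a small illustration of bounds-checked slices pulling their weight even in a fully statically allocated program (the table and function are invented for the example):

```rust
// Static allocation only, as in much safety-critical code: a fixed
// lookup table plus checked indexing. No heap, no allocator.
static TABLE: [u16; 4] = [10, 20, 30, 40];

fn lookup(idx: usize) -> Option<u16> {
    // `get` returns None instead of reading out of bounds; the C
    // equivalent `table[idx]` would silently read whatever is there.
    TABLE.get(idx).copied()
}

fn main() {
    assert_eq!(lookup(1), Some(20));
    assert_eq!(lookup(9), None);
}
```

Sum types, slices that carry their length, and `Option` in the signature all work the same whether the memory came from an allocator or a `static`.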
Rust doesn't exactly need these, no? Most static analysis in C/++ is safety/UB focused. Rust doesn't need this, unless you're going to spend a lot of time with `unsafe` code.
Rust does have clippy, a lint library with >150 lints which catch things ranging from correctness to style to safety issues. I'm one of the maintainers, so I'm biased, but I've personally found it to be much better than its equivalents in C++land. Perhaps not Javaland.
Except Rust allows for less undefined behavior. I wouldn't be surprised if it improved at a faster rate than C. Or C++ (*)
(*) Please don't say C/C++. They are different beasts.
On the hardware of yesteryear, a parallel compile could build Postgres in about 45 seconds (750-1305 KLOC, depending on measurement), and user-mode Linux (which doesn't compile so many drivers) in about a minute.
The real improvements will come when incremental compilation lands. The precursor requirements are just landing now; so it won't be immediately here, but it will be soonish.
Well, MSVC still hasn't even fully implemented C99. One of the big draws of C, as I see it, is its wide support on many operating systems and architectures. If you're going to abandon that by using new C features, you might as well use a language with less cruft.
Neither does GCC. Both don't fully implement C11 either.
Most of these features are things C compilers don't need to support themselves: special integer types, for instance, can be placed in libraries instead of compilers.
Also, the bounds-checking interfaces are a performance loss and not included in C compilers despite being part of the C11 standard (well, they're optional).
IMO, a better evolution is to do what the Rust folks have done - define a new language. This way it has a new name and you don't have to qualify which version of 'C' you mean.
So if I specify both unsigned char and uint8_t as cases in the same _Generic expression, with GCC 6.1 I get:
foo.c:6:2: error: ‘_Generic’ specifies two compatible types
uint8_t: "uint8_t", \
^
...
foo.c:5:2: note: compatible type is here
unsigned char: "unsigned char", \
^
and with Apple clang-7001.81 I get:
foo.c:12:22: error: type 'uint8_t' (aka 'unsigned char') in generic association
compatible with previously specified type 'unsigned char'
Another issue with _Generic: you have to be careful with type promotion, especially because everything smaller than int is quickly promoted to int in most kinds of expressions. Another issue is type qualifiers: (int) is different from (const int) is different from (volatile int) is different from (const volatile int). _Atomic and restrict increase the permutations.
I have a fuzzy memory that early clang had a wrong implementation of _Generic that didn't obey the standard. But as far as I know, today both clang and GCC have identical behavior. Whether Microsoft implements it compatibly if they add it is another question. For example, Microsoft has an idiosyncratic interpretation of macro tokenization and evaluation that makes implementing certain C99 variable argument macro constructions difficult.
Rust aficionados will say that their compiler is getting better, but so are C's. clang has gotten faster than gcc on some benchmarks, and on some others gcc has caught up and is now faster than clang again.
But what if you don't need optimal performance? Then you can use Rust. But then you can also use Go, Python, SBCL, Haskell, Java, C#...
On non-SIMD tasks Rust/C are neck and neck https://benchmarksgame.alioth.debian.org/u64q/rust.html
You're just cherry-picking benchmarks. In the cases where you care about raw number-crunching power, you'll likely be using a GPU, not SIMD instructions, as CPUs are roughly 3-4 orders of magnitude slower than GPUs at pure number-crunching tasks.
Not that SIMD isn't important, as its instructions also cover things like AES, SHA-1/2, random numbers, cache pre-loading/eviction, memory fences, and fast loading paths. But so few programmers worry about these things that you are really hitting a niche market.
Cool. Let's call that language with SIMD and inline assembly support FutureRust(tm) to differentiate it from the currently released and available Rust. We can have a discussion about how fast FutureRust will be vs C, but this discussion is about Rust vs C. Or rather clang 3.6.2/gcc 5.2.1 vs Rust 1.9.0 since language performance is very implementation dependent.
> On non-SIMD tasks Rust/C are neck and neck https://benchmarksgame.alioth.debian.org/u64q/rust.html
In 5 of 10 benchmarks, C is twice as fast as Rust. In one of the benchmarks where it is neck and neck, like pidigits (https://benchmarksgame.alioth.debian.org/u64q/performance.ph...) it appears to be so because both the C and the Rust variant are wrapping libgmp. GMP is written in C.
Other than that, Rust has way better zero-cost abstractions, so in practice it allows writing faster code; in C there are sanity limits after which you give up and write slower but easier-to-manage code, as the macro preprocessor sucks and the type system is trying to stab you in the back at every step.
A good example is `qsort`: http://www.tutorialspoint.com/c_standard_library/c_function_...
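The contrast: `qsort` calls a `void*`-based comparator through a function pointer on every comparison, which typically blocks inlining and discards type checking, while Rust's `sort_by` is generic over the closure's type and monomorphized for it. A minimal sketch of the Rust side (the function is illustrative):

```rust
// `sort_by` is monomorphized for this particular closure type, so the
// comparison is type-checked and can be inlined -- no function-pointer
// indirection as with C's qsort(base, nmemb, size, compar).
fn sort_desc(v: &mut Vec<i32>) {
    v.sort_by(|a, b| b.cmp(a)); // descending order
}

fn main() {
    let mut v = vec![3, 1, 4, 1, 5];
    sort_desc(&mut v);
    assert_eq!(v, [5, 4, 3, 1, 1]);
    println!("{:?}", v);
}
```

The abstraction costs nothing at runtime, which is what lets you keep writing the clean version past the point where C would push you toward hand-specialized code.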
I don't have much faith in microbenchmarks. Usually all they measure is how much effort the author put into overoptimizing code.
How many skilled assembly programmers do you know that are able to write better code than the collective intelligence embedded in current compilers? Even if you have a few of them handy, aren't their resources better spent hand-optimizing the compiler output for critical sections only?
Sometimes you just need safety and correctness.
Also, given how prone microbenchmarks are to depending on hand-optimization over compiler quality, benchmarks should be contributed to by the community -- I don't think anyone in Rust has heard about this one.
Oh also, Rust is as fast as C/C++ there. It's just not faster than C++Cached, _which is a different algorithm_. That's the problem with microbenchmarks, you end up measuring differences in the algorithm used.
Really? Are there really people who write (a lot of) assembly in order to get code "a fair bit" faster than C? What on earth are they working on?
I doubt it, the mental overhead of doing "safe memory programming" in Rust is very high.
Edit: all good replies, want to clarify and forgot to mention that I was comparing to languages with a GC, since I'm seeing Rust being used for lots of stuff, in a general purpose programming language sense (like creating web frameworks for example). Also, for non-very-low-level stuff I guess this cognitive load will be less if/when they introduce an optional GC.
Re: reducing mental state for a programmer, algebraic datatypes in general decrease the size of the state space of your program by making many illegal states unrepresentable. Without advanced forms of dependent types (maybe quotients), you can't make all illegal states unrepresentable, but you shrink the size of the state space hugely compared to writing everything as product types (as you would in C). A programmer has to reason about all the possible values their variables can take on, so it pays to minimize the cardinality of that set.
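A small sketch of the idea (the connection type is invented for illustration): with a sum type the invalid combinations simply don't exist, whereas the product-type encoding a C struct forces on you admits nonsense states.

```rust
// Sum type: a connection is either disconnected, or connected with a
// session id. "Connected but no session" and "disconnected with a
// session" are unrepresentable, so no code path needs to handle them.
enum Connection {
    Disconnected,
    Connected { session_id: u64 },
}

// The C-style product type would be roughly:
//   struct { bool connected; uint64_t session_id; }
// which has 2 * 2^64 states, about half of them meaningless.

fn describe(c: &Connection) -> String {
    match c {
        Connection::Disconnected => "offline".to_string(),
        Connection::Connected { session_id } => format!("session {}", session_id),
    }
}

fn main() {
    assert_eq!(describe(&Connection::Disconnected), "offline");
    assert_eq!(describe(&Connection::Connected { session_id: 7 }), "session 7");
}
```

The `match` is also exhaustive: add a third variant and every `match` that ignores it stops compiling, which is exactly the state-space bookkeeping the parent comment is describing.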
Besides, you're still going to be doing safe memory programming regardless of whatever language you use. (unless you're just saying "writing broken code is easier")
Of course, some people still don't like it. Not every language can be to everyone's liking, a plurality of languages is a good thing. Plus, we do have some stuff in the pipeline to increase the number of programs the borrow checker will understand; some people can get frustrated when they want to write a valid program that gets rejected, but this is going to be the case with any kind of static analysis.