If 32-bit x86 support can be dropped for pragmatic reasons, so can these architectures. If people really, really want to preserve these architectures as ongoing platforms for the future, they need to step up and create a backend for the Rust toolchain that supports them.
There are other languages considered acceptable, even desirable, for writing applications (e.g., Java, PHP, Go), but Rust is really the first language that competes closely enough with C's competence for people to contemplate adding it to the base-system-languages list. I'd say only Go has ever come close to that threshold, but I've never seen it contemplated for something like systemd.
Interestingly, I wonder if the debates over the addition of C++, Python, and Perl to the base system language set were this acrimonious.
I think any project run by people who see themselves as "X people" (like Python people or Perl people) always has a bit of an "ick" reaction to new languages being added to projects they see as part of their language's community.
So say you're a C++ developer who has contributed to APT over the years and sees all of it as linked to the C++ community you're part of, and someone wants to start migrating parts of it to Rust/$NewLang. For these people it can affect more than just the code; it might even feel like an "attack" (strong word, perhaps) on their sense of identity, for better or worse.
Debian has ongoing efforts to make many shell scripts (like the postinst scripts in packages, etc.) non-bash-specific.
A minimal Debian installation doesn't contain bash, but rather dash, which doesn't support bash extensions.
I don't know if you've tried to get someone else's Python running recently, but it has devolved into a disaster effectively requiring containers to accurately replicate the exact environment it was written in.
Core system applications should be binaries that run with absolutely minimal dependencies outside of default system-wide libraries. Heck, I would go as far as to say applications in the critical path to repairing a system (like apt) should be statically linked since we no longer live in a storage constrained world.
> indeed, there's quite a few commenters here who I think would be surprised to learn that not only is C++ on this list, but that it's been on it for at least 25 years
... isn't so surprising.

> Critical infrastructure still written in C - particularly code that parses data from untrusted sources - is technical debt that is only going to get worse over time.
But hasn't all that foundational code been stable and wrung out already over the last 30+ years? The .tar and .ar file formats are both from the 70s; what new benefits will users or developers gain from that thoroughly battle-tested code being thrown out and rewritten in a new language with a whole new set of compatibility issues and bugs?
After all, the library wasn't designed around safety: we assumed the .debs you pass to it are trusted in some way - because you publish them to your repository, or you are about to install them and they have root maintainer scripts anyway.
But as stuff like hosting sites and PPAs came up, we have operators publishing debs for untrusted users, and hence suddenly there was a security boundary of sorts and these bugs became problematic.
Of course memory safety here is only one concern, if you have say one process publishing repos for multiple users, panics can also cause a denial of service, but it's a step forward from potential code execution exploits.
I anticipate the rewrites to be as close to 1:1 as possible to avoid introducing bugs, but then adding actual unit tests to them.
Not necessarily. The "HTTP signature verification code" sounds like it's invoking cryptography, and the sense I've had from watching the people who maintain cryptographic libraries is that the "foundational code" is the sort of stuff you should run away screaming from. In general, it seems to me to be the cryptography folks who have beat the drum hardest for moving to Rust.
As for other kind of parsing code, the various archive file formats aren't exactly evolving, so there's little reason to update them. On the other hand, this is exactly the kind of space where there's critical infrastructure that has probably had very little investment in adversarial testing either in the past or present, and so it's not clear that their age has actually led to security-critical bugs being shaken out. Much as how OpenSSL had a trivially-exploitable, high criticality exploit for two years before anybody noticed.
Additionally, the fact that this comes across as so abrasive and off-putting is on brand for online Rust evangelicalism.
No: a little less than 5 years ago there was CVE-2020-27350, a memory safety bug in the tar/ar implementations.
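To make the memory-safety claim concrete, here is a minimal sketch, under assumptions of my own, of parsing the space-padded decimal size field of a classic ar(1) header in Rust. The helper name and exact behavior are invented for illustration; this is not APT's actual code, but it shows how checked UTF-8 and integer parsing replace the manual pointer arithmetic where bugs like CVE-2020-27350 tend to live:

```rust
// Hypothetical helper: parse the 10-byte, space-padded decimal size field
// of a classic ar(1) archive header. Every failure mode returns None
// instead of reading past a buffer or silently wrapping.
fn parse_ar_size(field: &[u8; 10]) -> Option<u64> {
    let text = std::str::from_utf8(field).ok()?; // reject non-ASCII garbage
    let digits = text.trim_end_matches(' ');     // field is right-padded
    digits.parse::<u64>().ok()                   // rejects '-', junk, overflow
}

fn main() {
    assert_eq!(parse_ar_size(b"1234      "), Some(1234));
    assert_eq!(parse_ar_size(b"-1        "), None); // negative sizes rejected
    assert_eq!(parse_ar_size(b"          "), None); // empty field rejected
    println!("ok");
}
```

The fixed-size `&[u8; 10]` parameter also means a caller can't even hand this function a buffer of the wrong length.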
Seeing this tone-deaf message from an Ubuntu employee would be funny if I didn’t actually use Ubuntu. Looks like I have to correct that…
- It's not an option for Debian core infrastructure until it supports at least the same platforms Debian does (arm, riscv, etc.), and it currently only supports x86_64.
- It doesn't turn C into a modern language. Since it looks like there's active development here, getting the productivity benefits of moving away from C is likely still worth it.
If all the entry-level jobs are C or C++, do you think companies would have a hard time filling them? Would the unemployed new graduates really shun gainful employment if Rust wasn't part of the equation?
Meanwhile, hiring managers left and right are reporting that within hours of a job being posted, they are flooded with hundreds of applications. And you can't find a single person because of the programming language of your stack? And to remedy this, you're going to rewrite your stack in an unproven language? Have you considered that if you can't find anyone that it might not be a programming language or tech stack problem?
A lot of the C code used in Python calls out to old, battle-tested, and niche libraries, so it is unlikely that someone will replace those any time soon, but Rust is definitely increasing as time goes on for greenfield work.
All (current) languages eventually sit on a compiler/runtime that is memory unsafe. This is basically fine because it's a tiny amount of surface area (relative to the amount of code that uses it), and its input is relatively benign, so there are enough eyes/time/... to find bugs.
There's also nothing stopping you from re-implementing python/ruby/... in a safer way once that becomes the low hanging fruit to improve computer reliability.
I don't know about that. Look at the code for the COSMIC desktop environment's clock widget (the cosmic-applet-time directory under <https://github.com/pop-os/cosmic-applets>), for example. It's pretty much unreadable compared to a C code base of similar complexity (GNU coreutils, for example: <https://savannah.gnu.org/projects/coreutils/>).
as in, that "isn't the style of code you are used to"
I don't think "how well people unfamiliar with your language can read it" is a relevant metric for most languages.
Also, IMHO, while C feels readable, it isn't when it matters, because it very often just doesn't include information you need when reading. Looking at a function header doesn't tell you whether a pointer is nullable, or whether a mutable pointer is a changeable input value or instead an out pointer that is supposed to point to uninitialized memory - nor, if there is an error, how that affects the validity of any mutable pointers passed in. And that's just to name some examples (let's not even get started on preprocessor macros pretending to be C functions). In conclusion, while C seems nice to read, it is IMHO often a painful experience to "properly" read it, e.g. in the context of a code review.
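For contrast, here is a hedged, generic sketch (not code from any project discussed here; the function names are made up) of how those missing facts end up directly in a Rust signature - nullability is an `Option`, and an "out parameter" is just a return value, so there is no uninitialized-memory or error-state ambiguity left to document:

```rust
// "Can this be absent?" lives in the type: Option<&str> instead of a
// maybe-NULL char* whose nullability lives only in a comment (if you're lucky).
fn find_user<'a>(users: &'a [&'a str], name: &str) -> Option<&'a str> {
    users.iter().copied().find(|u| *u == name)
}

// A C-style "out pointer" becomes a plain return value. On the error path
// (None), no half-initialized output exists for the caller to misuse.
fn split_version(pkg: &str) -> Option<(&str, &str)> {
    pkg.split_once('=')
}

fn main() {
    assert_eq!(find_user(&["alice", "bob"], "bob"), Some("bob"));
    assert_eq!(find_user(&["alice"], "mallory"), None);
    assert_eq!(split_version("apt=2.9.8"), Some(("apt", "2.9.8")));
    assert_eq!(split_version("apt"), None);
    println!("ok");
}
```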
As a side note: the seemingly verbose syntax of e.g. `chrono::DateTime` comes from there being two DateTime types in use in the module, one from the internationalization library (icu) and one from a generic time library (chrono). Same for Sender, etc. That isn't a super common issue, but it happens sometimes.
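A hypothetical illustration of that kind of name clash (the module names below are stand-ins, not the applet's actual icu/chrono imports): two unrelated types share the name `DateTime`, and either fully qualified paths or `as` aliases keep both usable in one module.

```rust
// Two unrelated types that happen to share the name `DateTime`.
mod icu_like {
    pub struct DateTime { pub hour: u8 }
}
mod chrono_like {
    pub struct DateTime { pub year: i32 }
}

// `as` aliases (or fully qualified paths like `chrono::DateTime`)
// are the usual way to disambiguate.
use chrono_like::DateTime as ChronoDateTime;
use icu_like::DateTime as IcuDateTime;

fn main() {
    let display = IcuDateTime { hour: 9 };
    let generic = ChronoDateTime { year: 2025 };
    println!("{} {}", display.hour, generic.year);
}
```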
If I wanted to tweak the Rust project, I’d feel pretty confident I was calling the right things with the right params.
Most of the code in that module is dedicated to the gui maintenance. The parts that do deal with time are perfectly legible.
I disagree. Both seem perfectly readable, assuming you know their preferred coding styles. As a non-C programmer, I absolutely despise running into #ifndef SOME_OBSCURE_NAME and `while (n) { if (g) {` but C (and in the latter case Go) programmers seem to love that style.
Comparing a bunch of small, barely integrated command line programs to a UI + calendar widget doesn't seem "of similar complexity" to me. Looking at a C clock widget (https://gitlab.freedesktop.org/xorg/app/xclock/-/blob/master...) the difference seems pretty minimal to me. Of course, the XClock code doesn't deal with calendars, so you have to imagine the extra UI code for that too.
I beg to differ.
Can you provide some evidence to support this? There’s a large body of evidence to the contrary, e.g. from Chrome[1].
> But we have tools to prevent that. The new security issues are supply chain attacks.
Speaking as a “supply chain security” person, this doesn’t really hold water. Supply chain attacks include the risk of memory unsafety lurking in complex dependency trees; it’s not an either-or.
[1]: https://www.chromium.org/Home/chromium-security/memory-safet...
Does it audit third-party code for you?
But Rust, you know, has one.
No. Rust is not magic; it just forces a discipline in which certain safety checks can be made automatically (or are obviated entirely). In other languages like C, the programmer needs to perform those checks, and it's technical debt if the C code is not written carefully and reviewed for such issues. If coding is careful and the code is reviewed, there is no technical debt - or perhaps I should say no more than in the unsafe parts of a Rust codebase or the standard libraries. And the safety of critical infra code written in C gets _better_ over time, as such technical debt is repaid.
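A tiny illustration of the "checks made automatically" point, as a generic sketch rather than code from any project discussed here: where a C out-of-bounds read is undefined behavior that may pass silently, the equivalent Rust access is a deterministic, catchable panic.

```rust
use std::panic;

fn main() {
    let buf = [0u8; 4];
    let i = 10usize; // runtime index, deliberately out of bounds

    // In C, buf[i] here would be undefined behavior and might quietly
    // read adjacent memory. In Rust the bounds check fires every time:
    let result = panic::catch_unwind(|| buf[i]);
    assert!(result.is_err());
    println!("out-of-bounds access was caught, not silently executed");
}
```

The check isn't free of cost, but it is free of programmer discipline: nobody has to remember to write it, and no reviewer has to verify it was written.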
> Rust is explicitly designed to be what you'd get if you were to re-create C knowing what we know now about language design and code safety.
That's not true. First, it's not a well-defined statement, since "what we know now" about language design is, as it has always been, a matter of debate and a variety of opinions. But even regardless of that, C was a language with certain design choices and aesthetics. Rust does not at _all_ share those choices - even if you tack on "and it must be safe". For example: Rust is a much richer language - in syntax, primitive types, and standard library - than C was ever intended to be.
How many decades have we tried this? How many more to see that it just hasn't panned out like you describe?
History shows again and again that this statement is impossible.
Name a large C application that’s widely used, and I’ll show you at least one CVE that’s caused by a memory leak from the project
IOW, what's your specification?
According to what?
> Rust is explicitly designed
There is no standard. It's accidentally designed.
> knowing what we know now about language design and code safety.
You've solved one class of bugs outside of "unsafe {}". The rest are still present.
Are you really claiming that you can't design a language without an official standard? Not to mention that C itself was designed long before its first ISO standard. Finally, the idea that a standards committee is a precondition for good language design is rather bold, I have to say. The phrase "design by committee" isn't typically used as a compliment...
> You've solved one class of bugs outside of "unsafe {}".
It's "only" the single most important class of bugs for system safety.
This kind of deflection and denialism isn't helping. And I'm saying this as someone who really likes C++.
I have been seeing hatred toward Rust on this forum for a long time. Initially it didn't make any kind of sense. Only after actually trying to learn it did I understand the backlash.
It actually is so difficult that most people might never become proficient in it, even if they tried - especially coming from the world of memory-managed languages. This creates pushback against any and every use and promotion of Rust. The unstated fear seems to be that they will be left behind if it takes off.
I completed my battles with Rust. I don't even use it anymore (because of lack of opportunities). But I love Rust. It is here to stay and expand. Thanks to the LLMs and the demand for verifiability.
For instance,

    struct Feet(i32);
    struct Meters(i32);

    fn hover(altitude: Meters) {
        println!("At {} meters", altitude.0);
    }

    fn main() {
        let altitude1 = Meters(16);
        hover(altitude1);
        let altitude2 = Feet(16);
        hover(altitude2);
    }

This fails at build time with:

    12 |     hover(altitude2);
       |     ----- ^^^^^^^^^ expected `Meters`, found `Feet`
       |     |
       |     arguments to this function are incorrect
Guaranteeing that I've never mixed units means I don't have to worry about parking my spacecraft at 1/3 the expected altitude. Now I can concentrate on the rest of the logic. The language has my back on the types, so I never have to waste brain cycles on the bookkeeping parts.

That's one example. It's not unique to Rust by a long shot. But it's still a vast improvement over C, where that same signed 32-bit data type is the number of eggs in a basket, the offset of bytes into a struct, the index of an array, a UTF-8 code point, or whatever else.
This really shows up at refactoring time. Move some Rust code around and it’ll loudly let you know exactly what you need to fix before it’s ready. C? Not so much.
If people from that world complain about Rust, I surely wouldn't want them around a C codebase.
There's nothing wrong about memory-managed languages, if you don't need to care about memory. But being unfamiliar with memory and complaining about the thing that help you avoid shooting your foot isn't something that inspires trust.
The hardship associated with learning rust isn't going to go away if they do C instead. What's going to happen instead is that bugged code will be written, and they will learn to associate the hardship with the underlying problem: managing memory.
I think this is more true of C than it is of Rust if the bar is "code of sufficient quality to be included in debian"
It might take some people months rather than days, but I think that is a desirable outcome.
Important low level software should be written by competent developers willing to invest the effort.
Time and time again, theoretically worse solutions that are easily accessible win
That could also be applied to C and C++ …
> Rust is already a hard requirement on all Debian release architectures and ports except for alpha, hppa, m68k, and sh4 (which do not provide sqv).
Wonder what this means for those architectures then?
It looks like the last machines of each architecture were released:

- Alpha in 2007
- HP-PA in 2008
- m68k pre-2000, though derivatives are used in embedded systems
- sh4 in 1998 (though possible usage via the "J2 core" using expired patents)

This means that most are nearly 20 years old or older.
Rust target triples exist for:
m68k: https://doc.rust-lang.org/nightly/rustc/platform-support/m68... and https://doc.rust-lang.org/nightly/rustc/platform-support/m68... both at Tier 3.
(Did not find target triples for the others.)
If you are using these machines, what are you using them for? (Again, genuinely curious)
Either legacy systems (which are most certainly not running current bleeding-edge Debian) or retro computing enthusiasts.
These platforms are long obsolete and there are no practical reasons to run them besides "I have a box in the corner that's running untouched for the last 20 years" and "for fun". I can get a more powerful and power efficient computer (than any of these systems) from my local e-waste recycling facility for free.
Here is one famous example of a dude who’s managed to get PRs merged in dozens of packages, just to make them compatible with ancient versions of nodejs https://news.ycombinator.com/item?id=44831811
Well, there are so many things where you could argue about the relevance of a userbase.
If the size of a userbase were the only argument, Valve could just drop support for the Linux userbase, which is just 2-3% of their overall userbase.
But yeah, those can figure out how to keep their own port
Cars, airplanes, construction equipment, etc.
Who is actually _running_ Debian Trixie on these platforms now?
It is counter-intuitive to me that these platforms are still unofficially supported, but 32-bit x86 [edit: and all MIPS architectures!] are not!
I am emotionally sad to see them fall by the wayside (and weirdly motivated to dig out a 68k Amiga or ‘very old Macintosh’ and try running Trixie…) but, even from a community standpoint, I find it hard to understand where and how these ports are actually used.
It’s just a bit annoying that Rust proponents are being so pushy in some cases as if Rust was the solution to everything.
It's been somewhat useful for finding weird edge cases in software where for whatever reason, it doesn't reproduce easily on AArch64 or x86, but does there. (Or vice-versa, sometimes.)
I don't know that I'd say that's sufficient reason to motivate dozens of people to maintain support, but it's not purely academic entertainment or nostalgia, for that.
(LLVM even used to have an in-tree DEC Alpha backend, though that was back in 2011 and not relevant to any version of Rust.)
[0] Looks like there is basic initial support but no 'core' or 'std' builds yet. https://doc.rust-lang.org/rustc/platform-support/m68k-unknow... This should potentially be fixable.
They will be rebranded as "retro computing devices"
Here's a thread of them insulting upstream developers & users of the Debian packages. https://github.com/keepassxreboot/keepassxc/issues/10725
Unnecessary drama as usual...
The one demanding it is the maintainer of keepassxc. It would've been better to say this is a Debian-only problem, tell him to install it like that, and just close the issue.
In fact, not having it encourages copy-and-paste, which reduces security.
What's next? Strip JavaScript support from browsers to reduce the attack surface?
I don't get how this is even a discussion. Either he is paid by canonical to be a corporate saboteur or he is completely insane.
Now, this is separate from being open for discussion if someone has some good arguments (which aren't "you break something which isn't supported and is only niche-used"), and some claim he isn't open to arguments.
And tbh, if someone exposes users to an actually relevant security risk (1) because the change adds a bit of defense-in-depth security (2), and then implicitly denounces them for "wanting crap", that raises a lot of red flags IMHO.
(1): Copy-pasting passwords is a very bad idea; the problem is phishing attacks with "look-alike" domains. Your password manager won't fill them out, but your copy-paste is prone to falling for them. In addition there are other, smaller issues related to clipboard safety and similar (hence why KC clears the clipboard after a short time).
(2): Removing unneeded functionality which could have vulnerabilities. Except we're speaking about code from the same source which, if not enabled/set up, does pretty much nothing. (It might still pull in some dependencies, though.)
but yes very unnecessary drama
It is our responsibility to our users to provide them the most secure option possible as the default.
Removing features is not the most secure option possible. Go all the way then and remove everything; only when your computer cannot do anything will it be 100% secure.

If I have a program that encrypts and decrypts passwords, then the surface area is way smaller than if it also has browser integrations and a bunch of other features. Every feature has the potential to make this list longer: https://keepass.info/help/kb/sec_issues.html which applies to any other piece of software.
At the same time, people can make the argument that software that's secure but has no useful features also isn't very worthwhile. From that whole discussion, the idea of having a minimal package and a full package makes a lot of sense - I'd use the minimal version because I don't use that additional functionality, but someone else might benefit a bunch from the full version.
Security is there to keep the features usable without interruptions or risks.
E.g., unplugging the computer from the network is not about security if the service needs to be accessible.
Yes there are absolutely some obnoxious "you should rewrite this in Rust" folks out there, but this is not a case of that.
And regardless, my point is it would be more sensible to say "I'm going to introduce an oxidized fork of apt and a method to use it as your system apt if you prefer" and then over the next year or so he could say "look at all these great benefits!" (if there are any). At that point, the community could decide that the rust version should become the default because it is so much better/safer/"modern"/whatever.
Note that I'm not saying Debian should, I'm saying it is reasonable that they would. I am not a Debian maintainer and so I should not have an opinion on what tools they use, only that adding Rust isn't unreasonable. It may be reasonable to take away a different tool to get Rust in - again this is something I should not have an opinion on but Debian maintainers should.
* It's becoming increasingly difficult to find new contributors who want to work with very old code bases in languages like C or C++. Some open source projects have said they rewrote to Rust just to attract new devs.
* Reliability can be proven through years in use but security is less of a direct correlation. Reliability is a statistical distribution centered around the 'happy path' of expected use and the more times your software is used the more robust it will become or just be proven to be. But security issues are almost by definition the edgiest edge cases and aren't pruned by normal use but by direct attacks and pen testing. It's much harder to say that old software has been attacked in every possible way than that it's been used in every possible way. The consequences of CVEs may also be much higher than edge case reliability bugs, making the justification for proactive security hardening much stronger.
On your second part: I wonder how the aviation, space, and car industries do it. They rely heavily on tested/proven concepts. What do they do when introducing a new type of material to replace another one, or when a complete assembly workflow gets updated?
uutils/coreutils is MIT-licensed and primarily hosted on GitHub (with issues and PRs there) whereas GNU coreutils is GPL-licensed and hosted on gnu.org (with mailing lists).
EDIT: I'm not expressing a personal opinion, just stating how things are. The license change may indeed be of interest to some companies.
The GPL protects the freedom of the users while MIT-licensed software can be easily rug-pulled or be co-opted by the big tech monopolists.
Using GitHub is unacceptable as it is banning many countries from using it. You are excluding devs around the world from contributing. Plus it is owned by Microsoft.
So we replaced a strong copyleft license and a solid decentralized workflow with a centralized repo that depends on the whims of Microsoft and the US government and that is somehow a good thing?
Whether the rewrite should be adopted to replace the original is certainly a big discussion. But simply writing a replacement isn’t really worth complaining about.
Furthermore, if these architectures are removed from further debian updates now, is there any indication that, once there's a rust toolchain supporting them, getting them back into modern debian wouldn't be a bureaucratic nightmare?
These architectures aren't being removed from Debian proper now, they already were removed more than a decade ago. This does not change anything about their status nor their ability to get back into Debian proper, which had already practically vanished.
i.e. they are only still around because they haven't caused any major issues and someone bothered to fix them up from time to time on their own free time
so yes, you probably won't get them back in once they are out, as long as no company shoulders the (work time) bill for it (and by that I mean long-term maintenance more than the cost of getting them in)
but for the same reason they have little to no relevance when it comes to any future changes which might get them kicked out (as long as no company steps up and shoulders the (work time) bill for keeping them maintained)
The GCCRS project can't even build libcore right now, let alone libstd. In addition, it is currently targeting Rust 1.50's feature set, with some additions that the Linux kernel needs. I don't see it being a useful general purpose compiler for years.
What's more likely is that rustc_codegen_gcc, which I believe can currently build libcore and libstd, will be stabilised first.
What I don't get is the burning need for Rust developers to insult others. Kind of the same vibes that we get from systemd folks and LP. Does it mean they have psychological issues and deep down in their heart they know they need to compensate?
I remember C vs Pascal flame back in the day but that wasn't serious. Like, at all. C/C++ developers today don't have any need to prove anything to anyone. It would be weird for a C developer to walk around and insult Rust devs, but the opposite is prevalent somehow.
... where?
I think it’s a combination of religion decreasing in importance and social media driving people mildly nuts. Many undertakings are collecting “true believers”, turning into their religion and social media is how they evangelize.
Rust is a pretty mild case, but it still attracts missionaries.
So, the people are different, Western society’s different and social media’s giving everyone a voice while bringing out the worst in them.
> Rust is a security nightmare. We'd need to add over 130 packages to main for sequoia, and then we'd need to rebuild them all each time one of them needs a security update.
What has changed? Why is 130 packages for a crypto application acceptable?
The dependency explosion is still a problem and I'm not aware of any real solution. It would have been interesting to see why their opinion changed... I'm guessing it's as simple as the perceived benefits overriding any concerns, and no major supply-chain attacks being known so far.
https://github.com/keepassxreboot/keepassxc/issues/10725#iss...
It's insane that x86 Debian is still compiling all software targeting Pentium Pro (from 1995!).
x64 Debian is a bit more modern, and you must splurge for a CPU from 2005 (Prescott) to get the plethora of features it requires
Debian 13 raised the x86 requirement to Pentium 4 because LLVM required SSE2 and Rust required LLVM.
The target before was not Pentium Pro in my understanding. It was Pentium Pro equivalent embedded CPUs. Servers and desktops since 2005 could use x86-64 Debian.
Note that Debian no longer supports x86 as of Debian 13.
The cost of supporting this old hardware for businesses or hobbyists isn’t free. The parties that feel strongly that new software continue to be released supporting a particular platform have options here, ranging from getting support for those architectures in LLVM and Rust, pushing GCC frontends for rust forward, maintaining their own fork of apt, etc.
(In my second-tier university in my developing country, the Sun workstation hadn't been turned on in years by the late 2000s, and the minicomputer they bought in the 1980s was furniture at the school.)
Edit: As for big businesses, they have support plans from IBM or HP for their mainframes, nothing relevant to Debian.
See the (relatively recent) list of manufacturers here:
https://en.wikipedia.org/wiki/List_of_x86_manufacturers
and scroll down for other categories of x86 chip manufacturers. These have plenty of uses. Maybe in another 30 years' time they will mostly be a hobby, but we are very far from that time.
But you are also completely ignoring limited-capabilities hardware, like embedded systems and micro-controllers. That includes newer offerings from ST Microelectronics, Espressif, Microchip Technology etc. (and even renewed 'oldies' like eZ80's which are compatible with Zilog's 8-bit Z80 from the 1970s - still used in products sold to consumers today). The larger ones are quite capable pieces of hardware, and I would not be surprised if some of them use Debian-based OS distributions.
BTW, today is the Pentium Pro's 30-year anniversary.
why not? I still want to run modern software on older machines for security and feature reasons
What's the long-term play for Canonical here?
Open source fundamentally is a do-ocracy (it's in literally all of the licenses). Those who do, decide; and more and more often those who do are just one or two people for a tool used by millions.
The obvious potential motivations are things like making a more reliable product, or making their employees more productive by giving them access to modern tools. I guess I could imagine preparing for some sort of compliance/legal/regulatory battle where it's important to move toward memory-safe tooling, but even there I rather imagine that Microsoft is better placed to say that they are, and any move on Canonical's part would be defensive.
Presumably it's rewriting critical parsing code in APT to a memory-safe language.
This doesn't seem like a noteworthy change to the degree to which GNU/Linux is an accurate name... though there are lots of things I'd put more importance on than GNU in describing debian (systemd, for instance).
Edit: Looks like Perl 1.0 was under the following non-commercial license, so definitely not always GPL though that now leaves the question of licensing when debian adopted it, if you really care.
> You may copy the perl kit in whole or in part as long as you don't try to make money off it, or pretend that you wrote it.
https://github.com/AnaTofuZ/Perl-1.0/blob/master/README.orig
But, there are now a lot more replacements for GNU's contributions under non-copyleft licenses, for sure.
More seriously I think Linux in general could benefit from a bit more pruning legacy stuff and embracing new so I count this as a plus
It's also "relatively easy" to add a new backend to Rust.
There's a policy document for Rust here: https://doc.rust-lang.org/rustc/target-tier-policy.html
There are a lot of things that can go wrong. You want to be able to test. Being able to test requires that someone has test hardware.
Much of the language used seems to stem from nauseating interactions that have occurred in the kernel world around Rust usage.
I'm not a big fan of rust for reasons that were not brought up during the kernel discussions, but I'm also not an opponent of moving forward. I don't quite understand the pushback against memory safe languages and defensiveness against adopting modern tooling/languages
Apparently, Rust is part of the "woke agenda"
If you could separate the language from the acolytes it would have seen much faster adoption.
As far as I read on HN, the only memory-safe language discussed here is Rust, and mostly with childish pro arguments.
I think it isn’t reasonable to infer that nobody uses something because you don’t know anybody who uses it in your niche. I know lots of embedded programmers who use Rust.
I think the linked requirement, the hype you see, and rust's own material is misleading: It's not a memory-safety one-trick lang; it's a nice overall lang and tool set.
https://github.com/rust-embedded/cortex-m
Even the embedded world is slowly changing.
That's you. At companies like Microsoft and Google, plenty of people think about and discuss Rust, with some products/features already using Rust.
EC2 (lots of embedded work on servers), IAM, DynamoDB, and parts of S3 have all used Rust heavily for quite a few years now.
We can move really fast with Rust compared to C, while still saving loads of compute and memory compared to other languages. The biggest issue we've hit is binary size, which matters in the embedded world.
Linux has added support for Rust now. I don't think Rust's future supremacy over C is doubtful at this point.
AWS might honestly be the biggest on Rust out of all the FAANGs based on what I've heard too. We employ loads of Rust core developers (incl. Niko, who is a Sr PE here) and have great internal Rust support at this point :). People still use the JVM where performance doesn't matter, but anywhere performance matters, I don't see anyone being okayed to use C over Rust internally at this point.
On top of that, part of the team didn't enjoy writing code in Rust.
We trashed the whole tool, which was a massive loss of time for the project.
This is your bias alone. I know tons of people and companies that do. Rust most likely runs on your device.
Most people nowadays who criticize Rust do so on a cultural basis of "there are people who want this so and it changes things therefore it is bad". But never on the merits.
Rust is a good language that encodes in its design some of the lessons the best C programmers have internalized. If you are a stellar C programmer you will manually enforce many of the rules that Rust enforces automatically. That doesn't mean Rust is a cage. You can always opt for unsafe if you feel like it.
But I know if my life depended on it I would rather write that program in Rust than in C, especially if it involves concurrency or multiprocessing.
Practically, on embedded the issue is that most existing libraries are written in C or C++. That can be a reason not to choose it in daily work. But it is not a rational reason why a programming language sucks. Every programming language once had only one user. Every programming language once had no dependencies written in it. Rust is excellent at letting you combine it with other languages. The tooling is good. The compiler error messages made other languages realize how shitty their errors were.
Even if nobody programmed in Rust, the good bits of that language lift the quality in the other languages.
This is entirely the wrong lens. This is someone who wants to use Rust for a particular purpose, not some sort of publicity stunt.
> I know nobody that programs or even thinks about rust. I’m from the embedded world and there C is still king.
Now’s a good time to look outside of your bubble instead of pretending that your bubble is the world.
> as long as the real money is made in c it is not ready
Arguably, the real money is made in JavaScript and Python for the last decade. Embedded roles generally have fewer postings with lower pay than webdev. Until C catches back up, is it also not ready?
Secondly the argument that because you don't use it in your area no one should use it in OS development is nonsensical.
People selling slop does not imply much about anything other than the people making the slop.
(I similarly have yet to see a single convincing argument to try to fight past the awkward, verbose and frustrating language that is rust).
No changes required. Bringing up the fil-C toolchain on weird ports is probably less work than bringing up the Rust toolchain.
It also doesn't help you to attract new contributors. With the changes we made over in Ubuntu to switch to rust-coreutils and sudo-rs, we have seen an incredible uptake in community contributions amongst other things, and it's very interesting to me to try to push APT more into the community space.
At this time, most of the work on APT happens by me staying up late, or during weekends and my two-week Christmas break; the second largest chunk is the work I do during working hours, but that's the less cool and exciting stuff :D
Adding Rust into APT is one aspect; the other, possibly even more pressing need is rewriting all the APT documentation.
Currently the APT manual pages are split into apt-get and apt-cache and so on, with a summary in apt(8); we should split them across apt install(8), apt upgrade(8), and so on. At the same time, DocBook XML is not very attractive to contributors, and switching to reStructuredText with Sphinx will hopefully attract more people to contribute.
Rust is the present and the future and it's quite logical that it becomes a key requirement in Linux distributions, but I'm really not convinced by the wording here… This last sentence feels needlessly antagonistic.
But for end users on Debian, trying to compile Rust stuff is a nightmare. They make breaking changes in the compiler (rustc) every 3 months. This is not a joke or exaggeration. It's entirely inappropriate to use such a rapidly changing language in anything that matters, because users on a non-rolling distro, LIKE DEBIAN, will NOT be able to compile software written for its constantly moving bleeding edge.
This is an anti-user move to ease developer experience. Very par for the course for modern software.
That is, in fact, a gross exaggeration. Breaking changes to rustc are extremely rare.
The rustc version will be fixed for compatibility at every release, and all Rust dependencies must be packaged for APT.
In the debian context, the burden imposed by rust churn and "cargo hell" falls on debian package maintainers.
First, Debian is not a distro where users have to compile their software. The packages contain binaries, the compilation is already done. The instability of Rust would not affect users in any way.
And second, as a developer, I have never had a more unpleasant language to work with than Rust. The borrow checker back then was abysmal. Rust is not about developer happiness - Ruby is - but its memory safety makes it a useful option in specific situations. But you can be sure that many developers will avoid it like the plague - and together with the breakage and long compile times, that's probably why moves like the one dictated here are so controversial.
I would be worried if even C++ dependencies were added for basic system utilities, let alone something like Rust.
Now, granted, I'm not an expert on distro management, bootstrapping etc. so maybe I'm over-reacting, but I am definitely experiencing some fear, uncertainty and doubt here. :-(
This is the status quo and always has been. gcc has plenty of extensions that are not part of a language standard that are used in core tools. Perl has never had a standard and is used all over the place.
... This is Debian we're talking about here?
... What distros are recommended for those who intend to continue trying to squeeze utility out of "retro computing devices"?
... And what sort of minimum specifications are we talking about, here?
I don't know if the rust compiler produces bigger binaries, but for a single program, it'll not make a big difference.
> I find this particular wording rather unpleasant and very unusual to what I'm used to from Debian in the past. I have to admit that I'm a bit disappointed that such a confrontational approach has been chosen.
Ref: https://lists.debian.org/debian-devel/2025/10/msg00286.html
Because that saves a lot of headaches down the line.
We don't want to introduce complex code here: only copying the parts that are actually reachable would be silly and introduce bugs.
But keep in mind valgrind is super buggy, and we spend quite a bit of time working around valgrind false positives (outside of amd64).
Or I guess if you interpret this as a societal scale: we've collectively used C in production a lot, and look at all the security problems. Judgment completed. Quality is low.
By Sequoia, are they talking about replacing GnuPG with https://sequoia-pgp.org/ for signature verification?
I really hope they don't replace the audited and battle-tested GnuPG parts with some new-fangled project like that just because it is written in "memory-safe" rust.
Meanwhile, GnuPG is well regarded for its code maturity. But it is a C codebase with nearly no tests, no CI pipeline(!!), an architecture that is basically a state machine with side effects, and over 200 flags. In my experience, only people who haven't experienced the codebase speak positively of it.
Loved this statement on the state of modern software using the backbone of C (in linux and elsewhere)
C → modern C++:
"int foo[5]" -> "array<int,5> foo"
It's easy to criticize simple examples like the one above, since the C++ (or Rust) version is longer than the C declaration, but consider something like this: char *(*(**foo[][8])())[];
and the idiomatic Rust equivalent: let foo: Vec<[Option<fn() -> Vec<String>>; 8]> = Vec::new();
The latter can be parsed quite trivially by descending into the type declaration. It's also visible at a glance that the top-level type is a Vec, and you can easily spot the lambda and its signature.
Another ergonomic aspect of the Rust syntax is that you can easily copy the raw type, without the variable name:
Vec<[Option<fn() -> Vec<String>>; 8]>
While the standalone C type looks like this: char *(*(**[][8])())[]
which is quite a mess to untangle ;)
Also, I think C# is generally closer to Rust than to C when it comes to the type syntax. A rough equivalent to the previous example would be:
var foo = new List<Func<List<string>>?[]>();
I can't deny that "?" is more ergonomic than Rust's "Option<T>", but C# also has a way less expressive type system than Rust or C++, so pick your poison.
> Be careful. Rust does not support some platforms well.[0] Anything
> that is not Tier 1 is not guaranteed to actually work. And
> architectures like m68k and powerpc are Tier 3.
>
> [0] <https://doc.rust-lang.org/beta/rustc/platform-support.html>.
[ The rustc book > Platform Support: https://doc.rust-lang.org/beta/rustc/platform-support.html ]
[ The rustc book > Target Tier Policy: https://doc.rust-lang.org/beta/rustc/target-tier-policy.html... ]
Thank you for your message.
Rust is already a hard requirement on all Debian release
architectures and ports except for alpha, hppa, m68k, and
sh4 (which do not provide sqv).
Create a plan to add support for {alpha, hppa, m68k, sh4} targets to the Rust compiler.
2.5pro: "Rust Compiler Target Porting Plan" https://gemini.google.com/share/b36065507d9d :
> [ rustc_codegen_gcc, libcore atomics for each target (m68k does not have support for 64-bit atomics and will need patching to libgcc helper functions), ..., libc, liballoc and libstd (fix std::thread, std::fs, std::net, std::sync), and then compiletest will find thousands of bugs ]
So, CI build hours on those actual but first emulated ISAs?
"Google porting all internal workloads to ARM, with help from GenAI" (2025) https://news.ycombinator.com/item?id=45691519
"AI-Driven Software Porting to RISC-V" (2025) https://news.ycombinator.com/item?id=45315314
"The Unreasonable Effectiveness of Fuzzing for Porting Programs" (2025) https://news.ycombinator.com/item?id=44311241 :
> A simple strategy of having LLMs write fuzz tests and build up a port in topological order seems effective at automating porting from C to Rust.
Delusional overconfidence that developer “skill” is all that is needed to overcome the many shortcomings of C is not a solution to the problem of guaranteeing security and safety.
0: https://rustfoundation.org/media/ferrous-systems-donates-fer...
(Plus, architecture quantity isn’t exactly the thing that matters. Quality is what matters, and Rust’s decision to conservatively stabilize on the subset of LLVM backends they can reliably test on seems very reasonable to me.)
The war is over. ARM and x86 won.
(We detached this subthread from https://news.ycombinator.com/item?id=45782109.)
Rust also has multiple compilers (rustc, mrustc, and gccrs) though only one is production ready at this time.
But the people who use the language have an amazing talent to make people on the fence hate them within half a dozen sentences.
They remind me of Christian missionaries trying to convert the savages from their barbarous religions with human sacrifice to the civilised religion with burning heretics.
Not sure how that’s relevant when CL is basically dead and no one wants to work with it, while Rust is flourishing and delivering value
Fast forward five centuries, and it turns out they were in fact pretty successful, as South America and central Africa are the places where Catholicism is most active today, far more than in Europe.
What would really be scary would be a distro that won't even boot unless a variety of LLM's are installed.
Boo!
I struggle to believe that this is really about a call to improve quality when there seem to be some other huge juicy targets.