The resident C/C++ experts here would have you believe that the same is possible in C/C++. Is that true?
In C++? Maybe, but you’d need to make sure you stay on top of using thread-safe structures and smart pointers.
What Rust does is flip this: the default is the safe path. So instead of risking forgetting smart pointers and thread-safe containers, the compiler keeps you honest.
So you’re not spending time chasing oddities because you missed a variable initialisation, hit a race condition, or caused some kind of use-after-free.
While there are a lot of people who say this slows you down and that a good programmer doesn’t need it, my experience is that even the best programmers forget. At least for me, I spend more time trying to reason about C++ code than Rust, because I can trust my Rust code more.
Put another way, Rust reduces how much of the codebase I need to consider at any given time to just the most local scope. I work in many heavy graphics C and C++ libraries, and have never had that level of comfort or mental locality.
For me it isn't even that it catches these problems when I forget. It is that I can stop worrying about these problems when writing the vast majority of code. I just take references and use variables to get the business logic implemented without the need to worry about lifetimes the entire time. Then once the business logic is done I switch to dealing with compiler errors and fixing these problems that I was ignoring the first time around.
When writing C and C++ I feel like I need to spend half of my brainpower tracking lifetimes for every line of code I touch. If I touch a single line of code in a function I need to read and understand the relevant lifetimes in that function before changing a single line. Even if I don't make any mistakes doing this consumes a lot of time and mental energy. With Rust I can generally just change the relevant line and the compiler will let me know what other parts of the function need to be updated. It is a huge mental relief and time saver.
Smart pointers are no panacea for memory safety in C++ though: even if you use them fastidiously, avoiding raw pointer access, iterator invalidation or OOB access will come for you. The minute you allocate and have to resize, you're exposed.
For what it’s worth, the same is true of Swift. But since much of the original Rust team was also involved with Swift language development, I guess it’s not too much of a surprise. The “unsafe” API requires some deliberate effort to use; no accidents are possible there. Anything unsafe is very verbose and goes through a very narrow window of opportunity.
Talking of C++, it can be really solid to work with your own data structures where you control the code on both ends. Using templates with something like boost::serialization or protobuf for the first time is like magic. E.g. you can serialize the whole state of your super complex app and restore it on another node easily.
Unfortunately it's just not the case when you're actually trying to work with someone else's API/ABI that you have no control over. Even worse when it's a moving target and you need to maintain several different adapters for different client/server versions.
>In 2021, society is driven by a virtual Internet, which has created a degenerate effect called "nerve attenuation syndrome" or NAS. Megacorporations control much of the world, intensifying the class hostility already created by NAS.
from Johnny Mnemonic
What can we do to make it more utopian?
If you stream without a face camera at all it generally hurts your ability to grow an audience, and unfortunately our society is still pretty focused on appearance so if you don't look great you're going to potentially get a lot of toxicity in your chat. A vtuber avatar acts as an equalizer in this sense and also lets people express their personality and aesthetics visually in a way that might not otherwise be easy - they can pick eye and hair colors they think represent them without having to use colored contacts or hair dyes, etc.
A few different people I know found that having a vtuber avatar made it much easier for them to get into streaming regularly and it did grow their audience, so I'm happy to see the technology catch on and improve.
I don't think someone sharing their craft through a virtual avatar is any more responsible for these things than the flying cars from Blade Runner would be.
Is it?
It's basically forums & avatars brought into the medium of audio and video communication.
> What can we do to make it more utopian?
A polity with an outermost shell of no-BS IC spooks, at a ratio of twenty to one cybersec defense to offense. There is the problem of sciengineers conceiving photonic computing in the labs, but the committee-member wage/salary slave cuts cost corners (or doesn't, but bloats up on unnecessary complexity) and we get the worst join on the Venn diagram in the industry spec.

What would really push it into cyberpunk territory is if it turns out this is not an actual human but an AI-controlled virtual person.
I don't have the impression that in Marcan's case it was ever about anonymity; it is more about creative expression.
Up until Lina's introduction on April 1st, I had never seen a vTuber stream, and I must say it is quite fun to watch. Though personally I wish Lina's voice were tweaked a bit, because it can be hard to understand what she is saying.
VR/AR just hasn't been done right as of now, but it's getting close. Demand is there. Imagine virtual schooling during times like Covid, but instead of Zoom, kids actually see each other in VR and can interact with each other.
More recently, it's fairly common to use a hypervisor or simulator for kernel debugging in device driver development on Windows via Hyper-V.
A lot of Linux driver development is done using qemu as well, although this is usually more targeted and isn't quite the same "put a thin shim over the OS running on the hardware" approach.
The flexibility and I/O tracing framework in m1n1 are pretty uniquely powerful, though, since it was built for reverse engineering specifically.
But modern-day development like this is crazy.
How can you manage 100+ structures in a language you just learnt (Rust), for a secret GPU the vendor shares no info about?
… I miss my NeXTs…
- The C64 shipped with the 6526, a fixed version of the 6522
- The C64 is incompatible with the 1540 anyway
They crippled the C64 for no reason other than to sell more Commodore-manufactured chips inside a pointless box. The C128 was a similar trick: stuffing a C64 with garbage leftover from failed projects and selling a computer with 2 CPUs and 2 graphics chips at twice the price. Before slow serial devices they were perfectly capable of making fast and cheaper-to-manufacture floppies for the PET/CBM systems.
It's about the dominant unholistic approach to modern operating system design, which is reflected in the vast number of independent, proprietary, under-documented RTOSes running in tandem on a single system, and eventually leading to uninspiring and lackluster OS research (e.g. Linux monoculture).
I'm guessing that hardware and software industries just don't have well-aligned interests, which unfortunately leaks into OS R&D.
https://www.osfc.io/2022/talks/i-have-come-to-bury-the-bios-...
As for the components, at least their interfaces are standardized. You can remove memory sticks by manufacturer A and replace them with memory sticks from manufacturer B without problem. Same goes for SATA SSDs or mice or keyboards.
Note that I'm all in favour of creating OSS firmware for devices, that's amazing. But one should not destroy the fundamental boundary between the OS and the firmware that runs the hardware.
DNA is the worst spaghetti code imaginable.
The design is such a hack, that it's easier to let the unit die and just create new ones every few years.
For example, Intel's ME could be a really useful feature if we could do what we want with it. Instead they lock it down so it's just built-in spyware.
(Especially because wall clock time is not the only kind of performance that matters.)
I'm totally fine with it (I'm grateful the story is being told at all), but it is a surreal tone for technical writing.
Vinesauce has been streaming since well before Twitch, and their content got significantly more "Twitch"-y after they embraced the current system. It's obvious why: if you play into the chat begging, the surface-level """interaction""", then you get more money from the parasocial twelve year olds with mom's credit card.
But I don't want my content full of ten-second interruptions as a robot voice reads off the same tired joke somebody paid ten dollars to have read out.
Couldn't they just not show themselves on camera at all?
When every statement is exciting and special, then none of them are.
I find it hard to analyze these things by numbers alone. It's context that really matters and if there truly is a baseline excitement, there really should be a high number of exclamations.
https://youtu.be/SDJCzJ1ETsM?t=1179
How can people watch this?
Mario Brothers would make more sense though. Whoever created this is a plumber par excellence.
> The compiler is very picky, but once code compiles it gives you the confidence that it will work reliably.
> Sometimes I had trouble making the compiler happy with the design I was trying to use, and then I realized the design had fundamental issues!
I experience a similar sentiment all the time when writing Rust code (which for now is admittedly just toy projects). So far it's felt like the compiler gives you just enough freedom to write programs in a "correct" way.
I don't really do unsafe/lower-level coding, so I can't speak to much there however.
The 2015 MBP was the last one with a keyboard that was passable for me; what came after is horrible. Even the new MBP that has real ports again is still not as good as the 2015 in terms of keyboard.
And the coprocessor called “ASC” also has similarities with Python: the GPU is doing the heavy lifting, but the ASC (like Python) interacts using shared memory, the same way Python does with a lot of its libraries (written in C/C++).
It's a processor, not a programming language :) The team has essentially strapped the API into something that you can poke with Python instead of with a native driver.
Awesome job.
For one example, Windows ARM kernels are pretty tied to the GIC (ARM's reference interrupt controller), but Apple has its own interrupt controller. Normally on ntoskrnl this distinction would simply need hal.dll swapped out, but I've heard from those who've looked into it that the clean separation has broken down a bit and you'd have to binary patch a windows kernel now if you don't have source access.
What you can do is have a small hypervisor to simulate the needed bits…
"Apple designed their own interrupt controller, the Apple Interrupt Controller (AIC), not compatible with either of the major ARM GIC standards. And not only that: the timer interrupts - normally connected to a regular per-CPU interrupt on ARM - are instead routed to the FIQ, an abstruse architectural feature, seen more frequently in the old 32-bit ARM days. Naturally, Linux kernel did not support delivering any interrupts via the FIQ path, so we had to add that."
https://news.ycombinator.com/item?id=25862077
TL;DR: No standard ARM interrupt controller, custom controller requires quirky architectural features
Who is Asahi Lina? Is that an actual person?
Apple doesn't have Linux drivers. It would be great if they wrote some, but it's never going to happen.
> Who is Asahi Lina? Is that an actual person?
The virtual persona of an actual person who has chosen to remain anonymous (hence the name, which would be a crazy coincidence otherwise).
They are Canadian born, currently studying in Japan, so that explains some of the cultural mix.
Man... if I was a conspiracy theorist who believed Apple was genuinely evil, what if Asahi Lina is an Apple employee? ;)
Asahi Linux has been upstreaming, but of course it's ongoing. The GPU driver in particular depends on some Rust-in-the-kernel bits which aren't in the mainline kernel yet. The 6.1 kernel has some Rust bits, 6.2 will have more, but I don't believe that will be enough for the GPU driver ... yet.
Apple's drivers are upstreamed, in Darwin. I'm not aware of any reason to believe that Apple has any Linux drivers that they could upstream.
To what upstream project?