Most? You still haven't proved that. So most Rust programs mostly use GC, yet it's not a GC language; those are some very mind-contorting definitions.
> The laws of physics absolutely do not predict that the relative cost of CPU to RAM will decrease substantially.
The laws of physics absolutely do tell you that more computation means more heat, and approaching the size of atoms is another hard limit. That's why chip densities have stalled and have been kept on life support via chip stacking and gate redesigns. The "2nm" process is mostly a marketing term (https://en.wikipedia.org/wiki/2_nm_process); the actual gate is around 45x20nm.
Not to mention that at such small scales, the random nature of atoms produces small irregularities, and small irregularities mean low yields.
These effects put a soft cap on any exponential curve, and a hard cap by way of a literal singularity.
> I don't know how reasonable it is to think that.
Why not? With modern collections (std::vector, std::span, and std::string) and modern pointers (std::unique_ptr, std::shared_ptr), you get decent memory safety.
> Because it's both explicit and simple.
Being a simple language doesn't guarantee simplicity in the programs written in it (see Brainfuck). The real question is how much language complexity buys you program simplicity. C++ of course has neither, because it started with a backwards-compatibility goal (which did get abandoned at some point).
By Zig's explicitness, do you mean that everything is public? I've seen that backfire spectacularly: without any encapsulation, you get maximum coupling.