That's one of the major things I've seen break old builds: old binaries stop being hosted, forcing you to rebuild them from old source, which no longer builds under current toolchains. That leaves you either downgrading the toolchain, which can itself be tricky to set up, or upgrading the library, which starts a cascade of dependency upgrades.
It's not like Node projects are distributed with their deps vendored; there's too much stuff in node_modules.
Yes it does, and that's the whole point. You can still go and install the first version of express ever put on npm, from 12 years ago. You can also install any of the 282 releases of it that have been published since then. That's the whole point of a registry: it wouldn't be useful if things just disappeared at some random point in time.
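If you want to sanity-check that yourself, the registry will happily list and serve any of those historical releases; roughly like this (the exact version below is just an example, any published one works):

    # list every version of express ever published to npm
    npm view express versions

    # install one specific old release into the current project
    npm install express@2.5.11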
The only packages that get removed are malware and the like, plus packages that the publisher themselves manually unpublish [0]. The latter is subject to a bunch of rules to ensure packages that are actually in use don't get removed; please see the link below.
Yes, you can find edge cases with problems. Using them as an argument for "breaks 3 times per week" does not hold up.
(Also note that outside the web/mobile space, projects that haven't been updated in a year are still young, not old. "Old" is more like 5+ years.)
The two things are related. If your typical project has a dependency DAG of 1000+ packages, a bug or CVE fix somewhere will typically set off a cascade of potentially breaking updates that plays out over several days before everything stabilizes. This creates pressure for everyone to always stay on the bleeding edge; with version churn like this, there are only so many old (in the calendar sense) package dists that people are willing to cache.
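To get a sense of that scale on a concrete project, something like the following works (flag spellings assume npm 7+; older versions used --depth=Infinity instead of --all):

    # count every installed package, including transitive dependencies
    npm ls --all --parseable | wc -l

    # list known CVEs pulled in anywhere in that tree
    npm audit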
This used to be a common experience some years back. Like many others, I gave up on the ecosystem because of its extreme fragility. If it's not like that anymore, I'd love to be corrected.
It used to, prior to npm 5, when lockfiles were introduced (yarn introduced lockfiles earlier).
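For context, a lockfile records the exact resolved version and integrity hash of every transitive dependency, so later installs reproduce the same tree instead of silently picking up newer releases. A rough sketch of the workflow on a current npm:

    # normal install: resolves version ranges and writes/updates package-lock.json
    npm install

    # reproducible install (e.g. in CI or when rebuilding an old project):
    # installs exactly what the lockfile says, and fails rather than re-resolving
    # if package.json and package-lock.json disagree
    npm ci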