I'm assuming by "system" you mean the OS, which is a terrible, terrible idea. The dev stack and system libs should not coexist: system libs should be vetted by the OS vendor, and you can't ask the vendor to do that for dev libs.
> I have to create a new conda environment for almost every ML paper that comes out
That's how it's supposed to work: one env per project.
As for the rest, it says more about the C/C++ community building the things below the Python wrappers.
That leaves 50 copies of the exact same version of a 1 GB library on my system, all obtained from the same authority (PyPI). I literally have 50 copies of the entire set of CUDA libraries, because every conda environment installs PyTorch and PyTorch bundles its own CUDA.
I'm not asking the OS to maintain this; rather, the package manager ("npm" or "pip" or similar) should do so on a system-wide basis. "python" and "pip" should allow one copy per officially released version of each package to live on the system, with multiple officially released versions coexisting in /usr/lib. If a dev version is being used, or any version that deviates from what is on PyPI, then that should live within the project.
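A rough way to measure the duplication described above, sketched under the assumption that the envs live in the default ~/miniconda3/envs location; ENVS_DIR and the 50 MB threshold are placeholders to adjust for your setup:

```python
#!/usr/bin/env python3
"""Rough audit of duplicated shared libraries across conda envs (a sketch, not
a real tool). Assumes the default ~/miniconda3/envs layout."""
import hashlib
from collections import defaultdict
from pathlib import Path

ENVS_DIR = Path.home() / "miniconda3" / "envs"   # assumption: default conda env location
MIN_SIZE = 50 * 1024 * 1024                      # only bother hashing big libraries

def sha256(path, chunk=1 << 20):
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

copies = defaultdict(list)   # (filename, digest) -> list of paths with that content
seen_inodes = set()          # skip hardlinked duplicates: those already share storage

for so in ENVS_DIR.rglob("*.so*"):
    if not so.is_file() or so.is_symlink():
        continue
    st = so.stat()
    if st.st_size < MIN_SIZE or (st.st_dev, st.st_ino) in seen_inodes:
        continue
    seen_inodes.add((st.st_dev, st.st_ino))
    copies[(so.name, sha256(so))].append(so)

wasted = 0
for (name, _), paths in sorted(copies.items()):
    if len(paths) > 1:
        size = paths[0].stat().st_size
        wasted += size * (len(paths) - 1)
        print(f"{name}: {len(paths)} identical copies, {size / 2**30:.2f} GiB each")

print(f"redundant space across envs: {wasted / 2**30:.2f} GiB")
```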
Actually, conda creates hardlinks for the packages it manages. I found this out a few weeks ago when I tried migrating my envs to another system with an identical directory hierarchy and ended up with a broken mess.
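This is easy to verify: two paths are hardlinks of the same file iff they share a device and inode number. A minimal check (the paths in the usage note are hypothetical; point it at a file inside an env and the matching file under conda's pkgs/ cache):

```python
"""Check whether two paths are hardlinks to the same underlying file.

Usage (paths are placeholders):
    python check_hardlink.py \
        ~/miniconda3/envs/proj-a/lib/python3.11/site-packages/numpy/version.py \
        ~/miniconda3/pkgs/numpy-<version>/lib/python3.11/site-packages/numpy/version.py
"""
import sys
from pathlib import Path

a, b = (Path(p).stat() for p in sys.argv[1:3])
same = (a.st_dev, a.st_ino) == (b.st_dev, b.st_ino)
print(f"hardlinked: {same}  (link count of first path: {a.st_nlink})")
```

(`ls -i` or `stat` from the shell gives the same answer.)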
> but rather the package manager ("npm" or "pip" or similar) should do so on a system-wide basis.
I basically agree with this, with the caveat that programs should not use any system search paths and packages should be hardlinked into the project directory structure from a centralized cache. This also means a dev version looks identical to a centralized one - both are just directories within the project.
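A minimal sketch of that model, assuming a made-up cache layout of `<name>/<version>/` under a central directory and a project-local `vendor/` directory; none of these names correspond to a real tool:

```python
"""Sketch: materialize a package into a project by hardlinking every file from
a central, version-addressed cache. The project sees ordinary directories, but
each released version exists once on disk.

CACHE, the <name>/<version> layout, and the "vendor" directory are all made-up
names for illustration."""
import os
from pathlib import Path

CACHE = Path.home() / ".pkg-cache"              # one directory per (package, version)

def link_package(name: str, version: str, project: Path) -> None:
    src_root = CACHE / name / version
    dst_root = project / "vendor" / name
    for src in src_root.rglob("*"):
        dst = dst_root / src.relative_to(src_root)
        if src.is_dir():
            dst.mkdir(parents=True, exist_ok=True)
        else:
            dst.parent.mkdir(parents=True, exist_ok=True)
            if not dst.exists():
                os.link(src, dst)               # same inode as the cache copy: no extra space

if __name__ == "__main__":
    link_package("example-lib", "1.2.3", Path("my-project"))
```

This is roughly the model pnpm uses for node_modules (content-addressed store plus hardlinks); a dev version is then just the same project directory populated by hand instead of from the cache.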