With git it's rare for a project that's actually in use to go completely memory-holed, since every contributor effectively has a local copy of the repository.
Using git (generally GitHub) repositories for dependency management is, IMO, a hack, so it's not surprising that it often breaks. I like the way buildroot handles it (I'm sure they're not the only ones, but it's the project I'm most familiar with):
- The buildroot buildbot fetches third-party package dependencies and archives them.
- When you build your buildroot image locally, it attempts to fetch from the third party directly. If the file doesn't exist anymore, it falls back to the buildroot cache instead.
You could also easily add your own caching layer in there if you wanted to. I think that's distributed computing at its best: simple and robust, with a clear and easily understandable architecture. No blockchain-based proof-of-stake distributed storage, just a series of wget calls. And since everything is authenticated with a strong hash, it's always perfectly safe.
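The fallback scheme above can be sketched in a few lines of Python. This is my own illustrative sketch, not buildroot's actual implementation: the function names are made up, and `fetch` stands in for whatever actually downloads a URL (a wget wrapper, urllib, etc.).

```python
import hashlib

def fetch_with_fallback(urls, expected_sha256, fetch):
    """Try each source in order; return the first payload whose SHA-256
    matches, else raise.

    `urls` is ordered: upstream first, then cache mirrors.
    `fetch` is any callable url -> bytes that raises on failure.
    """
    for url in urls:
        try:
            data = fetch(url)
        except Exception:
            continue  # upstream gone or unreachable: try the next source
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data
        # hash mismatch: treat as corrupt or tampered, keep trying
    raise RuntimeError("no source produced a file with the expected hash")

# Simulated usage: upstream has deleted the release, the cache still has it.
blob = b"foo-1.2.3 tarball contents"
good_hash = hashlib.sha256(blob).hexdigest()

def fake_fetch(url):
    if "upstream" in url:
        raise IOError("404: release deleted upstream")
    return blob

data = fetch_with_fallback(
    ["https://upstream.example/foo-1.2.3.tar.gz",
     "https://cache.example/foo-1.2.3.tar.gz"],
    good_hash,
    fake_fetch,
)
assert data == blob
```

Because the payload is verified against a known hash, it doesn't matter which mirror actually served it, which is what makes the dumb "try sources in order" strategy safe.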