When's the last time ls, cat, date, tar, etc. needed to be updated on your Linux system? Probably almost never. And composing them together always works. This set of Linux tools (call it sbase, ubase, the plan9 tools, etc.) is one version of a metapackage. How often does a very large package need to be updated for bug fixes, security patches, or new versions?
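To make the composition point concrete, here's a minimal sketch (in Python, only because the thread has no code of its own) of driving a tar-then-gzip pipeline from a script; the directory and output names are made up:

    import subprocess

    # Equivalent of `tar -cf - somedir | gzip > somedir.tar.gz`: two small,
    # stable tools composed through a pipe. "somedir" is a hypothetical path.
    tar = subprocess.Popen(["tar", "-cf", "-", "somedir"], stdout=subprocess.PIPE)
    with open("somedir.tar.gz", "wb") as out:
        gz = subprocess.Popen(["gzip"], stdin=tar.stdout, stdout=out)
        tar.stdout.close()  # so tar sees SIGPIPE if gzip exits early
        gz.wait()
        tar.wait()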
Bad example: http://www.slackware.com/security/viewer.php?l=slackware-sec...
They find stuff like this fairly often in GNU coreutils even to this day. It's the main reason there's a Rust coreutils effort.
coreutils: 17 results
linux kernel: 6752 results
x11: 184 results
qt: 152 results
gtk: 68 results
docker: 340 results
rust: 455 results
python: 940 results
node: 110 results
javascript: 5657 results
firefox: 3268 results
chrome: 3763 results
safari: 1465 results
webkit: 1346 results
The large monolithic codebases have a lot more CVEs. I'd also argue that patching code made up of small, modular parts is much easier, and much lower-hanging fruit for any casual developer to submit a PR for a fix.
Who would've guessed. The older ones also got more CVEs than the newer ones, even if they aren't that big.
I don't think you understand my comment, because you are asking the wrong question again. It is not how often you actually need to update one dependency that matters, but how often you need to check for updates. That has to be done frequently no matter what, and must be automated. E.g. despite the low number of CVEs in coreutils, you have to check it very often, because the impact can be very high. Once you have this process in place, there's no advantage to using micro-libraries. I'd actually expect that in a micro-library environment most breaking changes happen when a library becomes unsupported and you need to choose an alternative, making the migration process more complicated than in the case of a bigger framework that just changed its API.
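As a rough sketch of what "check frequently and automate it" can look like (assuming a Python project, purely for illustration), something like this run from cron or CI is enough to surface pending updates:

    import json
    import subprocess

    def outdated_packages():
        # `pip list --outdated --format=json` reports installed packages that
        # have a newer release available on the package index.
        raw = subprocess.run(
            ["pip", "list", "--outdated", "--format=json"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [(p["name"], p["version"], p["latest_version"]) for p in json.loads(raw)]

    for name, current, latest in outdated_packages():
        print(f"{name}: {current} -> {latest}")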
All of those larger programs can mess things up in different ways, opening up all sorts of attack vectors and bugs. And a smaller utility that's much more easily auditable and easier to contribute fixes to is arguably less likely to have attack vectors in the first place.
I'm not sure you have to check large libraries less often. I would argue at least as often, if not more often, because it's harder for people to audit larger codebases and to contribute fixes to them. Far fewer devs can (or could, if they had the bandwidth/time) understand a large self-contained codebase than a very small one.
I think that if a larger library is made of many smaller, modular, self-contained parts that are composed together, and you can swap out any building block for something that fits your use case (inversion of control), then that's a really good way to write a large library. Sloppier code might find its way in due to time/resource constraints, though, or the authors might not notice that some of the code isn't entirely modular/composable/swappable.
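A minimal sketch of what I mean by swappable building blocks (all names invented for illustration): the bigger component depends on a small interface instead of a concrete implementation, so any part can be replaced without touching the rest.

    import gzip
    from typing import Protocol

    class Compressor(Protocol):
        def compress(self, data: bytes) -> bytes: ...

    class GzipCompressor:
        def compress(self, data: bytes) -> bytes:
            return gzip.compress(data)

    class NullCompressor:
        def compress(self, data: bytes) -> bytes:
            return data

    class Archiver:
        # The "larger library" piece receives its building block instead of
        # hard-coding one (inversion of control).
        def __init__(self, compressor: Compressor):
            self.compressor = compressor

        def archive(self, data: bytes) -> bytes:
            return self.compressor.compress(data)

    # Either small part can be swapped in without changing Archiver:
    Archiver(GzipCompressor()).archive(b"payload")
    Archiver(NullCompressor()).archive(b"payload")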
The same. The frequency of checks does not depend on the library size; it depends on the risk profile of your application, so library size cannot be considered an advantage with regard to updates. In most cases an upgrade to a newer version can be done automatically, because it is an unrealistic expectation that developers will review and understand the code of every dependency. Breaking changes likely occur with about the same frequency (rarely), albeit for different reasons, and the impact can be on the same scale for tiny and large dependencies.