So not asking rhetorically, if we had all the insight and knowledge we have now, how would you make it different?
If you want answers, state clearly what _specific_ problem you're trying to solve. Whatever the solution to it might be, vague and fuzzy questions—while magnets for chatter since they can stand in for whatever someone wants to read out of them—are not the way to get there.
(You could say that this is needlessly tedious because everyone already knows what we're talking about, but that this isn't true is exactly my position. It's a safe bet that half the people reading, thinking, and writing have one thing in mind, the other half have another, and the third half are thinking about something different from either of those. We're also programmers, so dealing with tedium and the constraints of having to be explicit should be second nature.)
This is a great line. If HN had a quote of the month or something, this should be nominated for that.
HTML, a crappy, defective XML implementation, refuses to grow up. JS, while great for little HTML tweaks, isn't adopting any of the useful features found in popular npm packages; it feels like it was actively developed for two weeks. Ripping off its head (Node.js) gave us poor sailor jargon, but without the boats!
Therefore there is nothing wrong with npm; she is a fine ship. The harbor doesn't want to take its much-desired cargo, so it must sail the seven seas forever, mon capitaine!
When you wonder whether to add a dependency, you should ask yourself: what are the upsides and downsides of adding this dependency? One downside is always that by adding a dependency, you add a potential security problem, a potential point of breakage, and more complexity.
There are situations where a dependency is well justified: if it is stable, from a trustworthy source, and provides functionality that you cannot quickly implement yourself. But if you include a dependency that is effectively one line of code, the question answers itself: the cost of adding it is completely unreasonable. If your list of dependencies grows into the hundreds, you're doing something wrong.
When you combine those together you end up with a situation where "normal" JS code not from a library can't be trusted on the front end because it won't work for x% of your users, and offers a clumsy API on the backend that you'd prefer be wrapped in a helper. Developers learnt that they should reach for a library to e.g. deal with localStorage (because on Safari in private mode the built-in calls throw an error instead of recording your data and discarding it after the tab closes) or make an HTTP request (because Node doesn't support fetch yet and you don't want to use the complicated streams and callbacks API from the standard lib), and they propelled that culture forward until everyone was doing it.
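The localStorage case above is essentially a one-screen wrapper. A minimal sketch of the kind of thing those libraries do; the `safeStorage` name and shape are made up for illustration, not any real package's API:

```javascript
// Hedged sketch: wrap localStorage so that environments where the
// built-in calls throw (as Safari's private mode once did) or where
// localStorage is absent fall back to an in-memory Map instead.
const memory = new Map();

function storageAvailable() {
  try {
    const probe = "__storage_probe__";
    globalThis.localStorage.setItem(probe, probe);
    globalThis.localStorage.removeItem(probe);
    return true;
  } catch {
    return false; // quota error, privacy mode, or no localStorage at all
  }
}

const safeStorage = storageAvailable()
  ? {
      get: (key) => globalThis.localStorage.getItem(key),
      set: (key, value) => globalThis.localStorage.setItem(key, value),
    }
  : {
      get: (key) => (memory.has(key) ? memory.get(key) : null),
      set: (key, value) => memory.set(key, String(value)),
    };

safeStorage.set("theme", "dark");
```

A couple dozen lines, yet whole packages exist for it; that's the culture the comment is describing.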
This feels like a bit of a strawman, since sorting is already in the standard library and there aren’t in fact popular sorting packages for each framework (that would in fact be ridiculous).
If you want to start a real debate though, bring up date/time pickers.
There are multiple date picker, time picker, and datetime picker packages for each framework, and there are debates, with good points on all sides, about whether the browser-provided pickers are sufficient or whether this is an area that needs some level of customization; what that level is keeps changing as people discover new ways of designing date/time pickers and new use cases arise that require different tradeoffs. It's both really frustrating and kind of understandable.
That said, you can still have a core set of “blessed” packages that serve the common needs.
When analyzing dependencies to display in the npm web UI: once a package exceeds 40 direct or transitive dependencies, abort the analysis and highlight that package in red for having excessive dependencies.
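A sketch of that proposed check, assuming a simple map from package name to its direct dependencies; the graph shape is invented for illustration and the 40 threshold comes from the comment, none of it is npm's actual API:

```javascript
// Walk a dependency graph (package -> array of direct deps) and flag
// packages whose transitive closure exceeds a threshold.
const LIMIT = 40;

function transitiveDeps(graph, root) {
  const seen = new Set();
  const stack = [...(graph[root] ?? [])];
  while (stack.length) {
    const dep = stack.pop();
    if (seen.has(dep)) continue;
    seen.add(dep);
    if (seen.size > LIMIT) break; // abort early, as the comment proposes
    stack.push(...(graph[dep] ?? []));
  }
  return seen;
}

function isExcessive(graph, pkg) {
  return transitiveDeps(graph, pkg).size > LIMIT;
}
```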
If installing locally, you get what you get; don't install random or crazy packages, stick to well-known, high-quality, minimal-dependency packages. Node.js includes file reading and writing, an HTTP server, an HTTP client, JSON parsing... that will take you pretty far. Master the basics before getting too fancy. And remember, you don't need some company's client package just to make a few HTTP requests to their API.
Basically, you would need to start accepting that you are responsible for any dependencies you choose to include. Any upstream changes you would need to evaluate and bring in or patch yourself.
Definitely an impossible task given how broad and deep modern package dependencies are, but at least you'd start feeling the insanity of having all of them in the first place :P.
While it's ridiculous to expect that people will audit every single dependency and sub-dependency, it's not ridiculous to expect tooling to do the same.
Packages should be given an overall quality rating (which honestly might be great for an ecosystem as large, diverse, and welcoming-to-beginners as JS/TS), with part of the score coming from the number of distinct dependencies and sub-dependencies: a social package score, if you will. If a package causes the dependency graph to explode, give a warning before installing it.
Then, if you're NPM, you don't need all of these convoluted and exploitable policies around un-publishing.
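One way that scoring idea could look; the formula, the thresholds, and the package names are invented for illustration and are not any real npm feature:

```javascript
// Hypothetical "social package score": penalize a package by the size
// of its transitive dependency set, and warn below a cutoff.
function dependencyScore(transitiveCount) {
  // Full marks at zero deps, dropping toward 0 as the graph explodes.
  return Math.max(0, 100 - 2 * transitiveCount);
}

function installWarning(name, transitiveCount) {
  const score = dependencyScore(transitiveCount);
  return score < 40
    ? `warning: ${name} pulls in ${transitiveCount} packages (score ${score})`
    : null; // no warning needed
}
```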
It's not ridiculous at all. Professional programmers should answer for the dependencies they bring into their projects.
Our devops guys scream in seething pain whenever they have to debug some pile of shit that decides it won't build unless all the runes are aligned precisely and all the RAM in the universe is available on the build runners. And pushing this onto the developers results in importing more packages, thus adding to the tyre fire.
And after several hours of builds and 9000 layers of packages you wake up one morning and in that 50 meg chunk of javascript that is excreted from the process, someone managed to inject a "Slava Ukraini!" banner into your web app.
Over-reliance on third party dependencies is a choice. One could argue that it's unreasonable not to do it if you want to stay competitive but good luck changing human nature then. If there are shortcuts, they will be taken.
Make the cost of reusing software non-zero again.
It doesn't have to be as painful as C++ without package managers, but it should cost every developer about 5–10 minutes of labor for each direct dependency they add, or a minute for each new package in the dependency closure.
(If you don't want Google to see what packages you're fetching, you can also turn this off with an environment variable.)
the community shouldn’t need to write a bunch of tiny utility packages to do common things.
in other words, make it easier to avoid the deeply nested dependency mess that js encourages.
Have smarter users. If your package breaks because it depends on trivial code which got deleted, you shouldn't have depended on that in the first place.
Preventing people from deleting their code -- always, or even just sometimes -- was never the right solution.
Not having a registry is neat, but I'm also unsure of what is going to happen over time as dependencies may be moved or removed. You can see that with old Maven pom.xml where some dependencies do not resolve anymore.
Once the referencing packages are updated, deleted, or modified, the shadow versions can be dropped.
> It was removed, but then reemerged under a different scope with over 33,000 sub-packages. It's like playing whack-a-mole with npm packages!
> This whole saga is more than just a digital prank. It highlights the ongoing challenges in package management within the npm ecosystem. For developers, it's a reminder of the cascading effects of dependencies and the importance of mindful package creation, maintenance, and consumption.
> As we navigate the open source world, incidents like the everything package remind us of the delicate balance between freedom and responsibility in open-source software.
Source: have done a bunch of AI-assisted writing to develop my own skills and the tics and specific turns of phrases really pop out to me.
Ironically, the most common place I read the tic of ending a piece of persuasive writing with a deliberate, unconnected conclusion that doesn't persuade and instead equivocates or states a trivialism ... is in student papers or similarly graded, assignment-like rote work.
Could be that there's a lot of that out there such that it's heavily represented in training data. Could just be a person doing a not-great writing job.
"accidentally broke NPM and all I got was this sweet permanent banner all over my GitHub (that's impossible to remove, since they probably had to code it up last minute before removing the org/repo)"
When I was consulting for an R&D lab at eBay, we open sourced a bunch of our work in a GitHub org. It was sanctioned by eBay's OSPO; they even linked to it from their main open source directory.
7 years later, long after the team disbanded, someone in eBay's current legal team decided that the (now archival) org violated eBay's trademarks. For the last year+, every time I've opened GitHub, I've been met with the same undismissable banner.
Since the only choice they give you is to contact support, I did. Unfortunately, their support team is not responsive, and has a completely separate notifications system. It took an inordinately long time for them to respond. (I have poor reception here so I can't check, but I think it was months.) Since I'm not in the habit of checking GitHub Support for new messages, when they eventually replied, I missed it. I had to start a whole new ticket. That too was months ago, and I still haven't heard back.
So because I did some work for a skunkworks eBay team in 2015, the top 150px of my GitHub are unusable, and there's apparently nothing I can do about it until some call center decides to write me back.
You're right that a package that depends on literally everything would absolutely have a score of 0 in our system.
'everything' blocks devs from removing their own NPM packages - https://news.ycombinator.com/item?id=38873944 - Jan 2024 (102 comments)
Has no one thought of that? It seems like it should have been obvious that such an absolute rule could be easily abused to troll the system at scale.
Not sure if it's a problem though, perhaps all unpublishing requests should be reviewed by someone at the registry (and granted only when it makes sense).
Is npm specifically vulnerable to this kind of thing? Or is it just a cultural element of npm that there are more micro-packages?
At some point, Russ Cox got the Fear about this, and now https://proxy.golang.org/ is an on-by-default caching proxy in the middle. You can still delete your packages whenever you want to, though.
"Just install the everything package, then you will be sure to have the right package"
> First, just want to apologize about any difficulties this package has caused.
No rationale. No shame. Just the word “apologize” in a sentence.
Who downloaded it though? Surely as a dev if you download such a package it’s on you?
It’s the world of worse is better and they’re going for the widest possible area of effect. Should we crucify these guys? 100% not. Part of this is on npm’s design and implementation. Part of it is cultural.
But these guys owe the people who were needlessly “inconvenienced” a little more than just the word “apologize”. Not their first born but some rationale which justifies or reveals that they realise it was a bit pointless or stupid.
The rule that a wildcard ("any version") dependency prevents unpublishing is clearly flawed. The "everything" package folks had no malicious intentions, and nobody would benefit from a long-winded, ashamed apology. If not for npm's flawed unpublish policy, the everything team would've unpublished to resolve the issue.
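For context, the mechanism needed nothing exotic: just an ordinary wildcard range in `package.json` (names here are illustrative), combined with npm's policy that a package other packages depend on cannot be unpublished:

```json
{
  "name": "everything-chunk-illustrative",
  "version": "1.0.0",
  "dependencies": {
    "some-other-package": "*"
  }
}
```

One `"*"` entry per registry package, spread across the sub-packages, was enough to pin every package on the registry against unpublishing.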
I just think it would have been good to give the “I was hoping to investigate X, I did not expect Y, I can see now that it was irresponsible to do X.”
I don’t think that’s particularly long winded.
Upon rereading the article I can see that the word “unintended” is actually not Patrick’s but the author of the recap’s word.
Beyond that, you seem to be ascribing benign intent. Reading it from the horse's mouth [1], it doesn't seem like they had any intent other than trying to find out if it could be done. In a world of worse is better, creating the largest possible area of effect for your experiment seems to be a pretty easy way to amp up the consequences of your actions regardless of the risk.