Which adds support for ES modules: https://medium.com/@nodejs/announcing-core-node-js-support-f...
However, the exports syntax requires a relative URL, e.g. './index.mjs', not 'index.mjs'. The fix is here: https://github.com/then/is-promise/pull/15/commits/3b3ea4150...
Not that long, but my issue with this release snafu is that:
- the build didn't pass CI in the first place
- the CI config wasn't updated to reflect the most recent LTS release of node
- the update happened directly to master (although that's up to how the maintainer wants to run their repo; it's been my experience that it's much easier to revert a squashed PR than most other options)
- it took two patch versions to revert (where it may have only taken one if the author could have pressed "undo" in the PR)
Another way is to pin down the specific versions without ~ or ^ in the package.json so your updates don't break stuff.
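As a hedged illustration (the package name and version here are arbitrary), exact pins in package.json drop the range prefix entirely:

```json
{
  "dependencies": {
    "is-promise": "2.1.0"
  }
}
```

`npm install --save-exact` (or setting `save-exact=true` in your npm config) writes entries this way instead of the default `^` range.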
Of course, such a situation can't last forever. If the idea is good enough, eventually someone will come along and, as Linux did to Unix, kill the parent and hollow out its corpse for a puppet, leaving the vestiges of the former ecosystem to carve out whatever insignificant niche they can. Now the major locus of incompatibility in the "Unix" world is in the differences between various distributions, and what of that isn't solved by distro packagers will be finally put to rest when systemd-packaged ships in 2024 amid a flurry of hot takes about the dangers of monoculture.
Bringing it back at last to the subject at hand, Deno appears to be trying to become the Linux of Javascript, through the innovative method of abandoning the concept of "package" entirely and just running code straight from wherever on the Internet it happens to live today. As a former-life devotee of Stack Overflow, I of course applaud this plan, and wish them all the luck they're certainly going to need.
The impetus behind "lol javascript trash amirite" channer takes today is exactly that behind the UNIX-Haters Handbook of yore. I have a printed copy of that, and it's still a fun occasional read. But those who enjoy "javascript trash lol" may do well to remember the Handbook authors' stated goal of burying worse-is-better Unix in favor of the even then senescent right-thing also-rans they favored, and to reflect on how well that played out for them.
The UHH is a fun read, yes, but the biggest real-world problem with the Unix Wars was cross-compatibility. Your Sun code didn't run on Irix didn't run on BSD and god help you if a customer wanted Xenix. OK, you can draw some parallel here between React vs. Vue vs. Zeit vs. whatever.
But there was also the possibility, for non-software businesses, to pick a platform and stick to it. You run Sun, buy Sun machines, etc. That it was "Unix" didn't matter except to the software business selling you stuff, or what kind of timelines your in-house developers gave.
There is no equivalent in the JS world. If you pick React, you're not getting hurt because Vue and React are incompatible, you're getting hurt because the React shit breaks and churns. Every JavaScript community and subcommunity has the same problem, they keep punching themselves in the face, for reasons entirely unrelated to what their "competitors" are doing. Part of this is because the substrate itself is not good at all (way worse than Unix), part is community norms, and part is the piles of VC money that caused people to hop jobs and start greenfield projects every three months for 10 years rather than face any consequences of technical decisions.
Whatever eventually hollows out the mess of JS tech will be whatever figures out how to offer a stable developer experience across multiple years without ossifying. (And it can't also happen until the free money is gone, which maybe has finally come.)
I agree that VC money is ultimately poison to the ecosystem and the industry, but that's a larger problem, and I could even argue that it's one which wouldn't affect JS at all if JS weren't fundamentally a good tool.
(To your edit: granted, and React, maybe and imo ideally plus Typescript, looks best situated to be on top when the whole thing shakes out, which I agree may be very soon. The framework-a-week style of a lot of JS devs does indeed seem hard to sustain outside an environment with ample free money floating around to waste, and React is both easy for an experienced dev to start with and supported by a strong ecosystem. Yes, led by Facebook, which I hate, but if we're going to end up with one de facto standard for the next ten years or so, TS/React looks less worse than all the other players at hand right now.)
> The UHH is a fun read, yes, but the biggest real-world problem with the Unix Wars was cross-compatibility. Your Sun code didn't run on Irix didn't run on BSD and god help you if a customer wanted Xenix. OK, you can draw some parallel here between React vs. Vue vs. Zeit vs. whatever. But

You made your point, proved yourself wrong, and then went ahead ignoring the fact that you proved yourself wrong.

POSIX is a set of IEEE standards that have been around in one form or another since the 80s; maybe JavaScript could follow Unix's path there.
Vanilla -> jQuery -> Angular.js -> Angular 2+, React pre-Redux existence -> modern React -> Vue (and hobby apps in Svelte + bunch of random stuff: Mithril, Hyperapp, etc)
I have something to say on the topic of:
> "If you pick React, you're not getting hurt because Vue and React are incompatible, you're getting hurt because the React shit breaks and churns."
I find it completely absurd that front-end has an ecosystem fragmented by different frameworks. We have Webcomponents, which are framework-agnostic and run in vanilla JS/HTML, and nobody bothers to use them.
Most frameworks support compiling components to Webcomponents out-of-the-box (React excepted, big surprise).
https://angular.io/guide/elements
https://cli.vuejs.org/guide/build-targets.html#web-component
https://svelte.dev/docs#Custom_element_API
If you are the author of a major UI component (or library of components), why would you purposefully choose to restrict your package to your framework's ecosystem? The amount of work it takes to publish a component that works in a static index.html page, with your UI component loaded through a <script> tag, is trivial for most frameworks.
I can't tell people how to live their lives, and not to be a choosy beggar, but if you build great tooling, don't you want as many people to be able to use it as possible?
Frameworks don't have to be a limiting factor; we have a spec for agnostic UI components that are interoperable, but nobody bothers to use them, and it's infuriating.
You shouldn't have to hope that the person who built the best "Component for X" did it in your framework of choice (which will probably not be around in 2-3 years anyway, or will have changed so much it doesn't run anymore unless updated).
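As a sketch of how small a framework-agnostic component can be (the element name and attribute here are made up; `HTMLElement` and `customElements` are browser globals, so the snippet is guarded to stay runnable elsewhere):

```javascript
// Framework-agnostic Web Component sketch. In a browser, any plain index.html
// could then use <hello-badge name="world"></hello-badge> with no framework at all.
const Base = typeof HTMLElement !== "undefined" ? HTMLElement : class {};

class HelloBadge extends Base {
  connectedCallback() {
    // Runs when the element is attached to the DOM.
    this.textContent = `Hello, ${this.getAttribute("name") || "world"}!`;
  }
}

// Register the custom element only where the registry exists (i.e. in a browser).
if (typeof customElements !== "undefined") {
  customElements.define("hello-badge", HelloBadge);
}
```

Any framework that can render plain HTML elements can consume a component like this, which is exactly the interoperability argument above.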
---
Footnote: The Ionic team built a framework for the singular purpose of making framework-agnostic UI elements that work with everything, and it's actually pretty cool. It's primarily used for design systems in larger organizations and cross-framework components. They list Apple, Microsoft, and Amazon as some of the companies using it in production.
Deno always sounded more like "the Plan 9 of JavaScript" to me, to be honest. It seems to be better (yay for built-in TypeScript support! though I have my reservations about the permission management, but that's another discussion), yet perhaps not better enough (at least just yet) to gain significant traction.
This, exactly this. Young me thought this was a point of the whole thingy we call Internet.
And exactly that is what I like about QML from Qt. Just point to a file and that's it.
I really like Deno for this reason. Importing modules via URL is such a good idea, and apparently it even works in modern browsers with `<script type="module">`. We finally have a "one true way" to manage packages in JavaScript, no matter where it's being executed, without a centralized package repository to boot.
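As a sketch of what that looks like in a browser (the module URL below is a placeholder, not a real endpoint):

```html
<script type="module">
  // Hypothetical URL; any ES module served with a JavaScript MIME type works.
  import { greet } from "https://example.com/greet.js";
  greet();
</script>
```

Deno uses the same URL-specifier style in its own imports, which is what makes it feel like "one true way" across runtimes.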
So I'm not sure how much everything-used-to-be-great-nostalgia is justified here.
Every package manager and ecosystem has its pain points:
- no standards, API connection issues (different programming styles and connection overhead)
- minor version issues (just this 1-hour 0-day bug)
- major SDK issues (iOS deprecating OpenGL)
- source package differences (Ubuntu/CentOS/QubesOS each need different magic to use the same packages)
- overhead by default everywhere, which produces multiple issues
There are trade-offs, absolutely. Waiting on a vendor to fix a problem _for months_, while sending them hefty checks, is far inferior to waiting 3 hours on a Saturday for a fix, where the actual issue only affects new installations of a CLI tool used by developers and can trivially be sidestepped. If anything, it's a chance to teach my developers about dep management!
I'm positive my stack includes `is-promise` about 10 times. And I have no problem with that. If you upgrade deps (or don't) in any language, and don't have robust testing in place, the sysadmin in me hates you - I've seen it in everything from Go to PHP. There is no silver bullet except pragmatism!
Sadly, I dream of doing this very thing every day. I'm at that notch on the thermometer just before "burned out". I love creating a working app from scratch. However, I'm so sick of today's tech. The app stores are full of useless apps that look like the majority of other apps whose sole purpose is to gather the user's personal data for monetizing. The web is also broken with other variations of constant tracking. I'm of an age where I remember time before the internet, so I'm not as addicted as younger people.
Making upstream changes indeed would be very, very hard. But I never have to make upstream changes because they’ve spent quite a large amount of effort on stability.
Pragmatism - do programming to solve real-life problems rather than creating a broken ecosystem that requires constant changes (and constant learning just to stay on top of them) to fix a bad design
I think the snark is obscuring the point of this comment.
    function isPromise(obj) {
      return !!obj && (typeof obj === 'object' || typeof obj === 'function') && typeof obj.then === 'function';
    }

A function like this should be a package. Or, really, part of standard JS, maybe.
A) The problem it solves is real. It's dumb, but JS has tons of dumb stuff, so that changes nothing. Sometimes you want to know "is this thing a promise", and that's not trivial (for reasons).
B) The problem it solves is not straightforward. If you Google around you'll get people saying "Anything with a .then is a promise" or other different ways of testing it. The code being convoluted shows that.
Should this problem be solved elsewhere? Sure, again, JavaScript is bad and no one's on the other side of that argument, but it's what we have. Is "just copy-paste a wrong answer from SO and end up with 50 different functions in your codebase to check something", like other languages that make package management hard, so much better? I don't think so.
At work, our big webapp depended at some point indirectly on "isobject", "isobj", and "is-object", which were all one-liners (some of them even had dependencies themselves!!). Please let's all just depend on lodash; it will actually eventually reduce space and bandwidth usage.
    const isFalsy = require("is-falsy");
    const isObject = require("is-object");
    const isFunction = require("is-function");
    const hasThen = require("has-then");

    function isPromise(obj) {
      return !isFalsy(obj) && (isObject(obj) || isFunction(obj)) && hasThen(obj);
    }

Just because the code line is more than 50 characters doesn't mean that we need a new library for that.

I think this would be the solution. I feel like a lot of the NPM transitive dependency explosion just comes from the fact that JavaScript is a language with a ton of warts and a lack of solid built-ins compared to e.g. Python. Python also has packages and dependencies, but the full list of dependencies used by a REST service I run in production (including a web framework and ORM) is a million times smaller than any package-lock.json I've seen.
    x instanceof Promise

It works for standard promises; sure, there are non-standard promises, ancient stuff, that to me shouldn't be used (and a library that uses them should be avoided). So why do you need that code in the first place?

Also, that isPromise function will not work well with TypeScript. Imagine you have a function that takes something that can be a promise or not (which is also bad design in the first place), but then you want to check if the argument is a Promise. With `instanceof` the compiler knows what you are doing; otherwise it doesn't.

Also, look at the repo: a ton of files for a 1-line function? Really? It takes less time to write that function yourself than to include that library. But you shouldn't have to write that function in the first place.
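For contrast with the package's duck-typing check, a small sketch of where `instanceof Promise` does and doesn't say yes:

```javascript
// Native promises pass the instanceof check.
const p = Promise.resolve(1);
console.log(p instanceof Promise); // true

// A plain "thenable" fails it, even though `await` happily accepts thenables.
const thenable = { then(resolve) { resolve(2); } };
console.log(thenable instanceof Promise); // false

// instanceof can also fail across realms (iframes, vm contexts),
// where each realm has its own Promise constructor -- one reason
// duck-typing checks exist at all.
```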
One-liners without dependencies like this should live as a function in a utility file. If justification is needed, there should be a comment with a link to this package's repo.
At the very least, W3, or Mozilla Foundation, or something with some kind of quasi-authority should release a "JS STD" package that contains a whole bunch of helper functions like this. Or maybe a "JS Extras" package, and as function usage is tracked across the eco-system, the most popular/important stuff is considered for addition into the JS standard itself.
Having hundreds of packages that each contain one line functions, simply means that there are hundreds of vectors by which large projects can break. And those can in turn break other projects, etc.
The reason, cynically, that these all exist as separate packages is that the person who started this fiasco wanted as high a download count as possible on his resume for packages he maintains. Splitting everything up into multiple packages means extra cred for doing OSS work. Completely stupid, and I'm annoyed nobody has stepped up with a replacement for all this yet.
In properly designed languages, values have either a known concrete type, or the interfaces that they have to support are listed, and the compiler checks them.
Even in JavaScript/TypeScript, if you are using this, you or a library you are using are doing it wrong, since you should know whether a value is a promise or not when writing code.
It's a part of node, at least: https://nodejs.org/docs/latest-v12.x/api/util.html#util_util...
See Promise.resolve https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
And is there a reason to use '!!' inside a conditional? Wouldn't `obj &&` do basically the same thing?
The predictable explosion in dependency trees has caused the predictable problems like this one. I feel I much prefer the C/C++ way of "modules are easy to make, but difficult to use".
    class World { then () { return 0; } }
    isPromise(new World) // true
If there really isn't a safe and better way to tell if an object is an instance of Promise…then color me impressed.
    (async () => ({
      then() {
        console.log("Called")
      }
    }))()

I think the real evil here is that by default npm does not encourage pinned dependency versions.
If I `npm install is-promise` I'll get something like "^1.2.1" in my package.json, not the exact "1.2.1". This means that the next time someone installs my CLI I don't know exactly what code they're getting (unless I shrinkwrap, which is uncommon).
In other stacks having your dependency versions float around is considered bad practice. If I want to go from depending on 1.2.1 to 1.2.2 there should be a commit in my history showing when I did it and that my CI still passed.
I think we miss the forest for the trees when we get mad about Node devs taking small dependencies. If they had pinned their version it would have been fine.
The whole point of semantic versioning is to guarantee breaking changes are expressed through major versions. If you break your package’s compatibility and bump the version to 1.2.1 instead of 2.0.0 then people absolutely should be upset.
Yes, this is by design. If this weren't the case, the ecosystem would be an absolute minefield of non-updated transitive dependencies with unpatched security issues.
And it's even worse in cargo, because specifying "1.2.1" means the same thing as "^1.2.1".
Unless you mean Create React App should pin all of their (transitive) dependencies and release new versions multiple times a day with one of those dependencies updated.
There's a reason companies stick with old COBOL solutions, modern alternatives simply aren't stable enough.
Is someone going to fix that?
Probably not. There is too much code in the wild, and NPM owns the entire JS ecosystem, and there has been too much investment in that ecosystem and its culture at this point for a change in course to be feasible.
The JS universe is stuck with this for the foreseeable future.
The problem is that when you try to level criticism at this culture, a loud chorus of people shows up to assert that somehow tiny deps are good despite these glaring issues (a big one just being security vulns). And funnily enough, the usual suspects are precisely the people publishing these one-liner libs. Then people regurgitate these thoughts and the cargo cult continues.
So there's no "fix" for NPM (not even sure what that would mean). I mean, anyone can publish anything. People just have to decide to stop using one-liner libs just because they exist.
Thinking of the package as a black box, if the implementation for left-pad or is-promise was 200 lines would it suddenly be ok for so many other packages to depend on it? Why? The size of the package doesn't make it less bug-prone.
I see plenty of people who are over-eager to always be up-to-date, when there really isn't any point to it if your system works well, and so they don't pin their versions. This will break big applications when one-line packages break, but also when 5000-line packages break. Dependencies are part of your source; don't change them for the sake of changing them, and don't change them without reviewing them.
Of course it does. It's more bug-prone just by being a package. More code is more bugs and more build-system annoyance is more terror (=> more bugs). If I only need one line of functionality I will just copy and paste that line into my project instead of dealing with npm or github.
> Dependencies are part of your source
I agree. If you see news about broken packages like this and you don't just shrug your shoulders your build-system might be shit.
This was an honest oversight, and even somewhat inevitable with so many supported ways to import/export between CJS, MJS, AMD, UMD, etc. It will happen again.
And when it happens the next time, if it ruins your life again, take issue with yourself for not pinning your dependency versions, rather than with the package maintainers trying to make it all happen.
Dependency management is not as simple as you seem to think.
Pinning isn't meant to be a forever type of commitment. You're just saying, "all works as expected with this particular permutation of library code underneath." And the moment your dependencies release their hot new versions, you can retest and repin. Otherwise you're flying blind, and this type of issue will arise without fail.
Users of Debian Stable missed Heartbleed entirely. It simply never impacted them.
I agree with the parent that it’s important to lock to avoid surprises (in Ruby, we commit the Gemfile.lock for this reason), but it’s equally as important to stay up to date.
1. https://nimbleindustries.io/2020/01/31/dependency-drift-a-me...
Github even bought Dependabot last year, so it's now free.
Works for existing apps, but people using create-react-app and angular CLI can't even start a new project.
Without doing that bit of diligence, this type of issue should be 100% expected.
It's also just not needed. Simply specifying an exact version ("=2.5.2") will avoid this problem. The code for a version specified in this manner does not change.
And then to see "npm detected 97393 problems" or whatever the message exactly is.
When you want to upgrade your dependencies, then go ahead and do that, on your own schedule, with time and space to fix whatever issues come up, update your tests, QA, etc.
    const aPromise = Promise.resolve(1);
    const notAPromise = 2;
    Promise.resolve(aPromise).then((x) => console.log(x));
    Promise.resolve(notAPromise).then((y) => console.log(y));
    // Logs:
    // 1
    // 2

> why a library like this is even necessary?
Do you know how to determine whether something is a Promise?
Wrong. Also the first few StackOverflow answers are wrong or incomplete.
You know what's better? Using the same library 3.4 million repos depend on, that is tested and won't break if you use a package-lock.
> Can't you just wrap everything and treat it like a promise?
Maybe. Maybe not. Treating everything as a Promise means you have to make your function asynchronous even if not necessary.
https://journal.stuffwithstuff.com/2015/02/01/what-color-is-...
(red is async, blue is sync)
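A tiny sketch of that cost in JavaScript: once a value goes through `Promise.resolve`, consumers only see it in a later microtask, even if it was ready synchronously.

```javascript
// Wrapping a plain value in Promise.resolve defers its consumer,
// even though the value required no async work at all.
const log = [];
Promise.resolve(2).then((v) => log.push(v));
log.push(1);
// At this point log is [1]; the .then callback runs in a later
// microtask, after which log becomes [1, 2].
```

This is the "coloring" tax the linked article describes: treating sync values as promises forces everything downstream to become async too.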
You trust that minor version upgrades won't break the system, or that malicious code won't be introduced. But we're human... things break.
This can happen in any ecosystem, but npm is particularly vulnerable because of its huge dependency trees, which are only possible due to the low overhead of creating, including, and resolving packages.
That's why npm has the "package-lock" file, which takes a snapshot of the entire dependency tree, allowing a truly reproducible build. Not using this is a risk.
Things like this are so not worth a package, ever. It's something where, when you see it, you go "oh yeah, that's the obvious, easy way of doing this." It's not a package, it's a pattern. I can promise you, this was only ever added to packages because people wrongly assumed that since it's about "promises" (spooooky) it must be complex and worthy of packaging.
As someone who doesn't do front-end work regularly, but also sank about 3 consecutive weeks (~6-8 hours/day) in the last year into understanding generators, yielding, and promises... I can tell you, the actually scary part about all of this, is pretty much no one just reads the fucking docs or the code they're adding.
Moral of the story, especially in the browser: the reward of reading the code before adding it is enormous, you'd be surprised how often the thing you want is just a simple pattern. Taking that pattern and applying it to your specific use case, instead of imposing that pattern on your use case will give you giant wins.... Learn the patterns and you're set for life.
This is manageable when you are using Packagist, this is manageable when you are using Maven, where all dependencies are flat. When compatibility issues arise they have to be dealt with upstream.
This is NOT manageable when you are using NPM, which will go fetch 30 different versions of the same package because of its crazy dependency resolution.
This is not a JS issue like people claim here, this is 100% a NPM issue because whoever designed this was too busy being patronizing on Twitter rather than making sensible design decisions.
Sometimes I also take a look at the code.
And I've chosen the one with less dependencies often enough.
FWIW, I also won't add something to my project if I see it has a ton of dependencies on stupid shit. Literally, I gave up on react after realizing `create-react-app` is what the community recommends. I'm glad I did too. It's an insane amount of bloat, for nothing included but a view renderer, and if that's how that community rolls... I'm gonna have to pass.
If the code is in your codebase it does not need a test.
In fact, there are alternatives to React.
The idea is that pinning major versions lets you get non-breaking improvements from package authors who use semver properly, and pinning exact known-good versions lets you avoid surprises in your CI builds.
It works pretty well when you start from a known good state and vet your dependencies reasonably well. The trouble here seems to be largely that CRA is designed, among other purposes, to serve people just getting into the ecosystem of which it's a part, and those people are unlikely to be familiar enough with the details I've described to be able to effectively respond.
The comparison with left-pad is easy, but this isn't at all on the same scale. It's a bad day for newbies and a minor annoyance for experienced hands. And, of course, cause for endless spicy takes about how Javascript is awful, but such things are as inevitable as the sunrise and merit about the same level of interest.
But as we're still doing human versioning one way or another in package management, there will always be cases where a package doesn't perfectly follow its versioning scheme or otherwise behaves unexpectedly because of a change. It's almost like we need new ways of programming where the constructs and behavior of the program/library are built up via content-addressing, so you can version it down to its exact content.
For example:
Running `yarn why is-promise` in a CRA app:
`Hoisted from "react-scripts#react-dev-utils#inquirer#run-async#is-promise"`
Currently, running a `yarn upgrade-interactive --latest` doesn't indicate there are any updates, so presumably, this is still a problem upstream.
Also, if anyone's in a pinch right now, luckily enough, I made this yesterday, for an interview I had only a couple hours ago. I lucked out! But if anyone else might need it, maybe it'll help someone:
https://github.com/cryptoquick/demo-cra-ts
Oh, and, uh, pardon the pun... :/
https://classic.yarnpkg.com/en/docs/selective-version-resolu...
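For anyone applying the selective-resolution escape hatch right now, a sketch (the pinned version is an assumption; check the package's releases for the actual known-good one):

```json
{
  "resolutions": {
    "**/is-promise": "2.1.0"
  }
}
```

Yarn classic applies this to every transitive occurrence in the tree, regardless of what intermediate packages request.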
For many who are hell-bent on entering these companies, yet have no known packages under their belt, they very well might fire off a one line package that actually gets some downloads, to be better "prepared" when screened.
return !!obj && (typeof obj === 'object' || typeof obj === 'function') && typeof obj.then === 'function';
https://github.com/then/is-promise/blob/master/index.js
This is insane
This is much less of an issue when using a lockfile, at least for existing packages/projects.
With optional chaining I would however use this check:

    typeof x?.then === 'function'

Or if I was code golfing: x?.then?.call

The first case does not account for built-in prototype extensions, and the second has false positives with certain data structures.

So the function in is-promise should be available as Promise.isThenable or Promise.is.
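The optional-chaining check above, as a runnable sketch:

```javascript
// One-line thenable check using optional chaining.
const isThenable = (x) => typeof x?.then === "function";

console.log(isThenable(Promise.resolve())); // true
console.log(isThenable({ then() {} }));     // true  (any thenable passes)
console.log(isThenable(null));              // false (?. short-circuits on nullish)
console.log(isThenable({ then: 1 }));       // false (property exists but is not callable)
```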
The package devs clearly violated semver guidelines and npm puts a lot of faith in individual packages to take semver seriously. By default it opts every user into semver.
If you need semver to be explained to you bottom up (lists of 42 things that require a major bump) then you don't get semver. All you have to do is think: will releasing this into a world full of "^1.0.0" break everyone's shit?
This and left-pad are extreme examples. But any maintainer with a package.json who tries to do right by `npm audit` knows that there is an endless parade of suffering at the hands of semver misuse. Most of it doesn't make the news.
    npx @angular/cli new hello-world-project

and that worked. I have remote Angular training on Monday and didn't want to do a global install.

If deps are immutable, then nothing anyone does in any other package (short of having the package repository take the code down) should be able to break your future builds.
If that were true, TFA would not be news.
Why are these threads filled with people who know nothing about node?
npm and yarn both have lockfiles for this purpose. Vendoring only bloats your repos.
That's a quite bad assumption on your part, based on almost no information.

I don't know about the rest of the thread, but I'm personally quite familiar with node. A lock file doesn't fix the same issues vendoring does. The lock file gives you an explicit list of the versions used; vendoring saves exact copies of the dependencies alongside the rest of your code.

By vendoring, anyone working on the project is using the exact same version of a dependency, AND you don't have to care about an external provider (the registry being up, etc.; that's way easier for your CI too), AND you can review dependency upgrades via git as if they were your code.
Of course that’s a mess when the JavaScript ecosystem has an infinite amount of dependencies for a hello world.
I regularly extract features from my apps into new npm packages. This way they can be reused by other apps.
Troglodytes can keep copy-pasting code between apps while npm users publish once and update everywhere.
Why NPM?? What is the point?
No point at all.
A waste of time and a yawning security hole
Maybe not to this extent, but if X (where X is whatever you are thinking of) had a similar number of people using it (especially junior people), this would happen there as well.
Other languages don't publish/import packages that are one line of code. I have never seen an issue like this with any other language that I've worked with.
Any sane developer that needed a one-liner like this would just manually implement it.
Not to mention that these sorts of functions are unnecessary in languages with a good stdlib or statically typed languages like rust, etc.
As a comparison, Django, a large Python web framework, has only three dependencies (pytz, sqlparse, and asgiref), which don't have dependencies themselves.
    declare function isPromise<T, S>(obj: Promise<T> | S): obj is Promise<T>;

This is, indeed, the only line of exported code in the entire package.

I genuinely don't understand the NPM world.
EDIT: I wasn't dissing the developers. They had a regression; this was just an accident. I was stating that it is important. My bad (too late to delete).
See https://github.com/then/is-promise/blob/master/.travis.yml (missing v11, v12, v13, v14)
return !!obj && (typeof obj === 'object' || typeof obj === 'function') && typeof obj.then === 'function'