Looks like they intend to merge back with node mainline... ?
EDIT: Found this: http://blogs.windows.com/buildingapps/2015/05/12/bringing-no...
They're doing this to be able to run Node.js apps on Windows 10 ARM (on which V8 supposedly doesn't run?)
"We will be submitting a pull request to Node.js after stabilizing this code, fixing key gaps and responding to early community feedback."
"Going forward, we plan to work closely with the Node Foundation, the Node.js Technical Committee(s), IO.js contributors and the community to discuss and participate in conversations around creating JavaScript engine agnostic hosting APIs for Node.js, which provide developers a choice of JavaScript engine that they would want to use in their Node.js workflow"
Looks like the pull request will consist mostly of exposing new hooks to integrate with Chakra and other JS engines, and won't involve pulling any Chakra code into Node.js (which would be unlikely to be merged). It might lead to a SpiderMonkey version of Node.js at some point, too. Nice to see IO.js mentioned. Looks like a very positive initiative (assuming it doesn't complicate Node core too much).
I worked on that 4 years ago :) <http://zpao.com/posts/about-that-hybrid-v8monkey-engine/>. The Node community at the time wasn't a huge fan, though it's effectively the same thing that MS just did (build a minimal V8 API shim on top of another JS engine). I guess everybody is ok with a little fragmentation now. Our intention was also to try to get this upstreamed, however with low interest and other things to do, we didn't follow through.
I'm excited to see this, and especially to have the MS folks involved with the TC. I'd love to see an engine-agnostic API, but realistically I don't think it'll happen, at least not anytime soon. Right now Node itself relies pretty heavily on the V8 APIs. Those APIs can be abstracted for the most part (even if each engine just shims the relevant parts of the V8 API), but the other problem is the longer tail of binary npm modules. Right now they have the full V8 API to work with. If Node adopts a shim layer, it will come at a cost for every vendor except V8, plus the ongoing cost of maintaining that layer as V8 changes its APIs. If you go the engine-agnostic API route instead, you need coordination between engine vendors, and that opens the door to a multitude of problems.
I'm not sure exactly where I sit on this one. In many regards I think I like it: allowing developers to use the right engine for the right job, for example. Each engine has its own strengths, and it may be that V8 isn't the engine for you or your workload. It's just that this could hurt as much as it helps.
V8 definitely has an ARM runtime, so maybe this is a result of the restrictions on what's allowed to run on the platform? (e.g. iOS and Windows Phone don't allow JIT compilers except the ones provided by the platform itself)
Though JXCore already does this job.
I was experimenting a couple of days ago with heavy-duty, extreme DOM-node crunching, and FF's SpiderMonkey blew Chrome's V8 out of the water, with a 7-9x gain in performance measured by time elapsed to complete the operations.
Chrome's V8 engine at this point is so overrated.
Sorry, but unless you have deep knowledge of how both engines and browsers work (knowing how `appendChild` is actually implemented, for starters), you simply cannot write a working benchmark. Even then, it's very hard and tedious.
If you don't have time to obtain such expertise, you could take a shortcut and compare realistic end-to-end benchmarks. E.g. if your game runs at 210-270 fps in Firefox but only at 30 fps in Chrome, then you could claim that "Firefox blows Chrome out of the water".
It's very easy (just look at 80%+ of jsperfs) to construct benchmarks that don't look completely broken to the untrained eye but actually are. The common theme is the benchmark missing many aspects of realistic code and being reduced to measuring irrelevant optimizing-compiler features. For example, the benchmark could end up measuring only how thorough the engine's dead-code-elimination pass is, even though what you wanted to benchmark was string-concatenation performance.
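A toy sketch of that dead-code-elimination trap (function names and iteration counts are mine, purely illustrative, not from any real jsperf):

```javascript
// Hypothetical microbenchmark sketch. The "broken" version computes a
// string that nothing ever reads, so an optimizing JIT is free to
// eliminate the loop body and you end up timing little more than an
// empty loop.
function brokenConcatBench(iterations) {
  var start = Date.now();
  for (var i = 0; i < iterations; i++) {
    var s = "prefix" + i; // dead store: a prime candidate for DCE
  }
  return Date.now() - start;
}

// This version feeds every result into an observable value, so the
// engine actually has to perform the concatenation being measured.
function saferConcatBench(iterations) {
  var start = Date.now();
  var sink = 0;
  for (var i = 0; i < iterations; i++) {
    sink += ("prefix" + i).length; // result escapes into `sink`
  }
  // Returning the sink alongside the time keeps it live.
  return { ms: Date.now() - start, sink: sink };
}
```

Even the "safer" version is only a sketch; a real benchmark also has to contend with warm-up, JIT tiering, and GC pauses, which is part of the parent's point about how hard this is.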
This likely has nothing to do with the JS engines themselves and everything to do with the browsers they were running in. To actually benchmark something like that, you'd need to simulate the DOM with something like https://github.com/tmpvar/jsdom
IMO it's not too surprising: it's relatively simple to fast-forward to io.js from where it is now, whereas reverse-engineering it backwards to Node compatibility would be mayhem.
https://msdn.microsoft.com/en-us/library/aa365247(VS.85).asp...
This criticism is a bit misplaced here, since the whole reason for this limitation is due to backward compatibility with older versions of Windows (and software written for older versions).
It's not like this is a new change in Windows; it's been there for ages.
https://github.com/Microsoft/node/tree/ch0.12.2/deps/chakras...
Edge (and particularly IE) are fairly heavily tied to the OS in a bunch of places. IE, for example, can do weird FTP and Windows Explorer stuff. The infamous "Internet Settings" dialog and the way IE deals with things like proxy servers is only sort-of part of IE. IE's network stack is largely dependent on the bits and pieces available in the OS below (consider that IE11 can only use SPDY on Windows 8). I wouldn't be surprised if open-sourcing the browser wholesale would start unraveling a lot of things that MS doesn't intend to be public.
You know it includes multiple code parts under various licenses; Wikipedia lists the BSD license, MIT License, LGPL, MS-PL, and MPL/GPL/LGPL tri-licensed code ( http://en.wikipedia.org/wiki/Chromium_(web_browser) )
There is a reason why major open source projects like Linux, etc. choose licenses like GNU GPL v2+. http://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish and http://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt
Yeah, because the tech world didn't have enough problems with projects being immature, unreliable, stale 30+-year designs, abandoned, incompatible, not provided by a specific distribution, conflicting, patented, and 100 other issues to consider.
It just had to also add 200 legal distinctions about what you can and cannot do, and how you can link stuff and under what circumstances.
To the GP, though: I don't immediately see how linking to Chakra in this way would be a license issue. The more important thing here is the license information for Node, not Chromium: https://github.com/joyent/node/blob/master/LICENSE (some overlap, but quite a bit that doesn't)
[x] embrace
[x] extend
[ ] extinguish
It's actually exactly the opposite. Their API abstraction work will only increase competition, especially since they aren't trying to run a competing fork. Despite their history, those in favor of a more open Node.js platform should commend this.
That said, there are plenty of examples of the extinguish phase not working out or resulting in less of a bang and more of a whimper. It's always been more of an infected blanket than nuclear warheads.