I think the cause of computing freedom is likely better served by building high-quality wasm disassemblers (radare has an open ticket, for instance) and by making sure that wasm code is so tightly sandboxed that DRM can't work, i.e., that you have the equivalent of an "analog hole" because you can write a browser extension / patch that taps all the data and the inside code can't tell. Hoping that technologies won't get developed has historically not been a productive approach for software freedom; the folks who want to take our freedom have enough resources that they'll do it whether or not there's a standards process involved.
Even if it isn't, it's not like the browser itself is just some binary blob these days. Taking the source for Firefox or Blink or WebKit and compiling your own version with slight changes is not only possible, it's already done in many instances. What's Mozilla's response when you take their browser and provide a fork whose sole change is to provide more freedom and choice? Not that it even matters, as you don't need mindshare from the general public for this: developers who want to see the source will actively search for and find solutions, or make their own. There are simple extensions to bypass CORS controls for most (all?) browsers. If they didn't exist, browser variants disabling those same security mechanisms would.
It's all opt-in, gentlemen's-agreement-style security. Both ends have to buy in for it to work, and you control your end...
Most likely these tools will just be built-in, the same way http/2 doesn't have a text-based protocol but you don't notice anything different when using devtools to see network requests.
Many websites today have readable JavaScript because that's the natural thing to do; you just send down the JavaScript source in the original form and it runs.
Many native applications today have unreadable source because that's the natural thing to do; you compile your C or C++ code, and you only need to ship the binary. Your binary even gets smaller if you remove debugging symbols.
You can do otherwise in both cases (obfuscators in the former, providing source in the latter), but it requires an active decision. Far fewer software authors consciously decide a priori whether they want their source code to be readable or unreadable. Same with server- vs. client-side development: you can easily hide all your source by keeping it server-side, but for the sake of some technical goal people will decide to move parts client-side, and decide that having it be world-readable is okay.
OP is advocating for a world where people continue to default to providing their source, not one where people are compelled to.
If you render to a canvas rather than generate plain text, then I have to use screen readers with built-in OCR to perform "copy," which is a pain. It doesn't protect you, but it makes my experience worse.
What we're learning from music and movies is that any movement to try to restrict users just leads to user flight; any movement that opens up and enables users to have a great experience with your IP, leads to user delight.
If anything, the WebAssembly form might be more readable.
"With this, developers can start shipping WebAssembly code. For earlier versions of browsers, developers can send down an asm.js version of the code. Because asm.js is a subset of JavaScript, any JS engine can run it. With Emscripten, you can compile the same app to both WebAssembly and asm.js. Even in the initial release, WebAssembly will be fast. But it should get even faster in the future, through a combination of fixes and new features."
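The fallback path described in that quote can be sketched in a few lines of loader code. This is a hedged sketch, not Emscripten's actual glue code; the file names "app.wasm" and "app.asm.js" are placeholders.

```javascript
// Sketch: pick the build to load based on what the engine supports.
// Engines without WebAssembly fall back to the asm.js build, which is
// plain JavaScript and runs anywhere.
function chooseBuild() {
  const hasWasm = typeof WebAssembly === "object" &&
                  typeof WebAssembly.instantiate === "function";
  return hasWasm ? "app.wasm" : "app.asm.js";
}

console.log(chooseBuild());
```

In practice Emscripten generates this kind of dispatch for you; the point is only that the same source can serve both targets.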
And it doesn't make sense to use it for physics because GPUs can do that much faster (so that would be WebGL, which, like everything else, is JS-only).
Even still, it's great to see that things are still moving on smoothly (and the new logo looks really nice!).
Interestingly enough, the "clunky" Emscripten compilation path is quite a bit faster than the WASM backend at the moment, because it bypasses all the cruft in the LLVM backend, which can be pretty slow.
I think this could be much more useful if at least some APIs didn't have to go through JS. Not saying that's easy.
Part of the problem is the whole idea that every program is supposed to test for the existence of every feature it might need. I think that's ridiculous. I suggested on GitHub that what actually needs to happen eventually is to decompose the web platform into a bunch of semantically versioned modules. One big problem with that is that modularization is not really first-class in C++ because of its legacy worldview.
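For concreteness, here is the per-feature probing pattern being criticized: each capability is tested individually at runtime before use, instead of the program importing a versioned module and knowing up front what it gets. The specific features probed here are just illustrative examples.

```javascript
// The ad hoc feature-detection pattern: probe each global before using it.
function detectFeatures() {
  return {
    wasm: typeof WebAssembly === "object",
    sharedMemory: typeof SharedArrayBuffer === "function",
    bigInt: typeof BigInt === "function",
  };
}

console.log(detectFeatures());
```

Every program that needs these features repeats some variant of these checks, which is the boilerplate a versioned-module scheme would eliminate.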
So if I wanted to support a standard that prioritizes easy, accessible exchange of information, openness, and user control with my server, where would I look?
A part of the web has simply become an application distribution system. That's not necessarily a bad thing; many other important websites are still very much open and accessible, like Wikipedia for example.
The web just became bigger than it was before 2001.
One potential issue:
"If you have lots of back-and-forth between WebAssembly and JS (as you do with smaller tasks), then this overhead is noticeable."
As far as I'm aware, asm.js code does not have an issue with this, as it is just JS code. Is this correct?
(edit: I should have mentioned that I'm primarily interested from an electron.js point of view at the moment, where Firefox asm.js optimizations are unavailable)
Memory allocation is particularly painful in asm.js, for example.
It's not, because asm.js is treated specially in some cases. For instance, in Firefox asm.js calls to JS have to go through an FFI, IIRC.
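To make that JS↔wasm boundary concrete, here is a minimal hand-encoded WebAssembly module exporting a single `add` function. Each call to `instance.exports.add` crosses the JS-to-wasm boundary, which is exactly the per-call overhead the comments above are discussing (the module bytes are a standard minimal example, not tied to any particular engine's FFI internals).

```javascript
// Minimal wasm binary: one function, add(i32, i32) -> i32, exported as "add".
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic + version
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // function section
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section header
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));

// Every one of these calls transitions from JS into wasm and back.
console.log(instance.exports.add(2, 3)); // 5
```

If you make millions of such small cross-boundary calls, the transition cost can dominate; batching work on one side of the boundary is the usual mitigation.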
It looks like Mozilla's asm.js implementation used to have the exact same problem: https://hacks.mozilla.org/2015/03/asm-speedups-everywhere/#c... I'm not familiar enough with asm.js to know if that's still a problem.
The web browser is still a second-rate user interface toolkit compared to native toolkits, but at least this gives us a slight step forward. Whether that's enough, or whether most application development will move to native toolkits in walled gardens, remains to be seen.
[1] https://github.com/kripken/emscripten/wiki/New-WebAssembly-B...
-s WASM=1 -s USE_SDL=2
for WebAssembly + SDL2.
The test_sdl2* tests in emscripten's test suite have working samples in them,
https://github.com/kripken/emscripten/blob/master/tests/test...
A full SDL1 testcase is in the tutorial,
http://kripken.github.io/emscripten-site/docs/getting_starte...
not a compiler? ok.