So much website JavaScript these days is just poor design, tracking, and bloat.
And it will get worse with WASM. At least now we can see the entirety of the code and even patch it if required, and WASM might make that way harder.
I’d argue that having a sandbox that can run binaries with a limited and customizable feature set is actually a good thing for the web. I think there are more technically competent solutions, but the fact that WASM is available on virtually every machine and OS makes it pretty powerful.
If implemented right, WASM might speed up our web apps, keep the browser sandbox (which is actually quite nice), and run on pretty much any machine. If sites open-sourced their code, that’d be even better.
Between minified JS and WASM, I think I’d take WASM (I can’t understand minified JS anyway). Between a pure HTML site and WASM, I think I’d take the pure HTML site (but I don’t think we will be living in that world anytime soon).
The difference between minified JS and WASM is that you can un-minify the former with relatively good results, whereas decompiling WASM is like decompiling normal binaries: pretty hard to read. This means that even experienced users can’t really understand or change WASM binaries.
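To make that concrete, here’s a toy illustration (all names invented): minification throws away identifiers and whitespace but keeps the control flow and the high-level operations, so a formatter plus some guesswork at renaming recovers most of the meaning.

```typescript
// Minified, as shipped: the names are gone, but the structure
// and the high-level operations survive.
const f=(a,b)=>a.filter(x=>x.id!==b).map(x=>({...x,n:x.n+1}));

// Un-minified by hand (formatter plus guessed names): readable again.
const incrementOthers = (items: { id: number; n: number }[], excludedId: number) =>
  items
    .filter((item) => item.id !== excludedId)
    .map((item) => ({ ...item, n: item.n + 1 }));
```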
For WASM you can probably use tools like Ghidra to decompile and read it.
Minified JS is not a lot better than raw ASM; single-letter names and crazy optimisation patterns will make your life hell. Patching either is out of the question, I think; maybe you can just inject some new JS that interacts with the DOM.
I’ve done a bit of reverse engineering on binaries in my life, and I’ve also spent too much time reading YouTube’s minified JS. Both are hard as hell.
> For WASM you can probably use tools like Ghidra to decompile and read it.
Sure; as I said, it’s similar to decompiling normal binaries, and the output is hard to read (even when you’re used to it).
> Minified JS is not a lot better than raw ASM; single-letter names and crazy optimisation patterns will make your life hell. Patching either is out of the question, I think; maybe you can just inject some new JS that interacts with the DOM.
I’m not talking about reading minified JS. I’m saying: un-minifying JS gets you a way more readable result than decompiling native binaries does. I’ve done both more than often enough to know this difference well.
I’ve written mods and patches for dozens of minified sites, and it’s never been too hard. I’ve written mods and patches for native applications, and it’s waaaay harder: even just finding free space in the binary where you can inject your code and jump to/from is annoying, let alone actually writing your changes in ASM. All of those problems simply disappear with JS, even minified JS.
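To give a flavour of what I mean (a hypothetical sketch: the site, selectors, and names are all made up), a typical “patch” for a website is just a few lines of injected JS against the DOM:

```typescript
// Hypothetical userscript-style patch; every selector here is invented.
// Goal: remove a nag banner and re-enable a button the site disables.
function applyPatch(): void {
  document.querySelector(".signup-nag")?.remove();
  const button = document.querySelector<HTMLButtonElement>("#download-btn");
  if (button) button.disabled = false;
}

// Re-apply whenever the page re-renders (common on SPA-style sites).
new MutationObserver(applyPatch).observe(document.body, {
  childList: true,
  subtree: true,
});
applyPatch();
```

Doing the equivalent against a native binary means hunting for a code cave and hand-assembling a detour; the DOM gives you a stable, high-level surface to hook instead.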
Hmm, I guess I just haven’t spent enough time trying to parse un-minified JS.
I still would think, though, that if the code is simple enough to understand once you un-minify the JS, equivalent code should be similarly simple to understand if it’s WASM passed through IDA.
You lose way more information during compilation than you do during minification. This makes reversing the latter much easier than the former.
Remember that JS is much, much higher level than WASM. Each language has its own special behaviours and constructs when compiled to WASM, so a reversed algorithm can look completely different depending on the source language and environment.
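As a hand-written illustration of the information loss (this is an approximation, not real decompiler output): the same loop, as its author wrote it versus roughly what a WASM decompiler gives back, where arrays have become offsets into linear memory and every name is synthetic.

```typescript
// Source: the intent is obvious.
function sum(values: number[]): number {
  let total = 0;
  for (const v of values) total += v;
  return total;
}

// Approximation of decompiler output for the WASM build (hand-written,
// not from a real tool): synthetic names, raw loads from linear memory,
// pointer arithmetic instead of array access.
const memory = new DataView(new ArrayBuffer(1024)); // stand-in for WASM linear memory
const loadF64 = (addr: number) => memory.getFloat64(addr, true);

function func42(p0: number, p1: number): number {
  let l0 = 0;
  let l1 = 0;
  while (l1 < p1) {
    l0 = l0 + loadF64(p0 + (l1 << 3)); // 8 bytes per f64 element
    l1 = l1 + 1;
  }
  return l0;
}
```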
Ya, okay, that is understandable.
To be honest, I have never tried a WASM reversing challenge. I may need to give it a shot.
The problem with sandboxes is that there isn’t a perfect prison. Eventually, ways will be found to break out of it, and there will be bad actors that take advantage of them.
I completely agree.
However, I still would rather have all the websites I visit go through my browser’s API than be making straight syscalls.
It’s not perfect security, but I think it’s a good line of defense.
I’ll grant that COM, ActiveX, and Adobe/Shockwave Flash turned out to be security nightmares.
But maybe it’ll be fine this time…/s
It’s technically possible that widespread use of hallucination-prone AI code-assist is the quality control tool that was missing in the several previous attempts…
See: the web pyramid, from The Website Obesity Crisis.
33% down, 67% to go.
33% down, 100% to go you mean?
TL;DR, from what I can tell: Dropbox was using a JS bundler that didn’t support code-splitting or tree-shaking (y’know, the staples of modern JS bundling) and swapped to one that does. Not that there aren’t plenty of sub-optimal components in code I work on, at home and at work, but there’s nothing revolutionary going on here.
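For anyone who hasn’t run into these terms: tree-shaking means the bundler drops exports nothing imports, and code-splitting usually amounts to swapping a static import for a dynamic one so the bundler can emit a separate, lazily loaded chunk. A minimal sketch (module and function names are hypothetical):

```typescript
// Before: "./chart" is bundled into the main chunk and loaded up front.
// import { renderChart } from "./chart";

// After: bundlers that support code-splitting emit "./chart" as its own
// chunk, fetched only when the user actually opens the dashboard.
async function showDashboard(canvas: HTMLCanvasElement): Promise<void> {
  const { renderChart } = await import("./chart"); // hypothetical module
  renderChart(canvas);
}
```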
I did this in like 2017 on my first React app. Thought this would be standard practice by now…