Without nonlinearities like optimization, type checking, etc., the runtime is basically just the time to loop through the bytes. Make the input a quarter the size and you have a quarter the bytes to loop through. College classes will say "oh, it's all O(N) anyway, no big deal". Industry, on the other hand, says "why make tens of millions of people wait 4 seconds for this to start when we could make them wait 1 second just as easily".
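To make the linear scaling concrete, here's a minimal sketch of a byte-scan benchmark (the buffer sizes and the Node-style global `performance.now()` timer are illustrative assumptions, not anything from the original): with no nonlinear passes, a buffer 4x the size takes roughly 4x as long to walk.

```ts
// Sketch: when the work is a straight pass over the input,
// cost is proportional to byte count. Sizes are illustrative.
function scan(buf: Uint8Array): number {
  let sum = 0;
  for (let i = 0; i < buf.length; i++) sum = (sum + buf[i]) | 0;
  return sum;
}

function timeScan(buf: Uint8Array): number {
  const start = performance.now();
  scan(buf);
  return performance.now() - start;
}

const small = new Uint8Array(16 * 1024 * 1024);     // e.g. the quarter-size input
const large = new Uint8Array(4 * 16 * 1024 * 1024); // the same input, 4x bigger

// Warm up so both measurements hit JIT-optimized code.
timeScan(small);
timeScan(large);

console.log(`small: ${timeScan(small).toFixed(1)} ms`);
console.log(`large: ${timeScan(large).toFixed(1)} ms`); // ~4x the small time
```

Same asymptotic class, very different wall-clock time, which is the whole point of the industry framing.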
The product’s source maps are publicly available in a way the debugger understands, and the original code is freely accessible, so there’s no real downside to shipping the minified build.
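As a sketch of what "publicly available in a way the debugger understands" can look like in practice, here's a hypothetical esbuild-based build script (the entry point and output paths are assumptions): esbuild's real `sourcemap: true` option emits a `.map` file next to the bundle and appends the standard `//# sourceMappingURL=` pragma that browser devtools follow.

```ts
// Hypothetical build script; paths are illustrative.
import * as esbuild from "esbuild";

await esbuild.build({
  entryPoints: ["src/app.ts"],
  bundle: true,
  minify: true,       // ship the small, fast-to-parse artifact
  sourcemap: true,    // emit dist/app.js.map and the sourceMappingURL pragma
  outfile: "dist/app.js",
});
```

Host the `.map` file alongside the bundle and the debugger reconstructs the original names and line numbers on demand; users pay only for the minified bytes.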