I didn't note timing numbers, but rendering digits in hex was basically instantaneous, even when doing a million (decimal) digits.
Rendering the digits in decimal caused a significant delay (around 40 seconds for a million digits). That's why it just shows the hex until the very end, and why it reports how long the decimal conversion takes.
Are you sure you don't mean converting digits to hex rather than rendering? In my testing, continuously rendering the hex digits as the work progresses causes a ~3x slowdown for 100k digits (see my other comment in this tree).
Maybe you're right. I know displaying in hex was way faster than displaying in decimal and seemed fast enough, so I stopped there. I didn't test rendering speed specifically.
Doing a million digits in the console (with no status updates) took about an hour and doing it in that page took just over two hours.
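For anyone curious why the hex/decimal gap is so large: converting a big integer to hex just reads 4 bits per digit, while base-10 conversion needs repeated division (or a divide-and-conquer scheme), so it scales much worse. A rough Python sketch (not from the project above; the 100k-digit value is just an illustration):

```python
import sys
import timeit

# CPython 3.11+ caps int-to-decimal-string conversion by default;
# power-of-two bases like hex are exempt. Lift the cap for this demo.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(0)

# An integer with roughly 100,000 decimal digits.
n = 10 ** 100_000 - 1

t_hex = timeit.timeit(lambda: format(n, "x"), number=1)  # base 16: ~linear in size
t_dec = timeit.timeit(lambda: str(n), number=1)          # base 10: much costlier

print(f"hex: {t_hex:.6f}s  decimal: {t_dec:.6f}s")
```

On my understanding this is also why progress updates in hex are cheap: each update is close to a memory copy, whereas a decimal snapshot would redo the expensive conversion every time.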