1152 qubits sounds like the D-Wave chips. So does that mean 6000 D-Wave chips?
Even if you reverse the calculation, that would be 60,000 minutes on 1 chip, which is only about 42 days. Quantum too good to be true.
Hype aside - the largest number factored using Shor on a physical device is 21 (and it's unclear whether they actually used the result of the factoring to design the circuits, like they did with 15).
Back then I modelled the quantum circuit as a set of unitaries (parametrized through their generators) that operate on one or two qubits, set a limit on the number of steps and the number of controlled gates, and then threw different optimization algorithms at it. I got the best performance using simple dense neural networks. What's cool is that I could generate a training set really quickly: I could just randomly build tensor products of unitary matrices to create billions of unitaries of up to 7 qubits in minimal time, and then see how close I could get given a fixed length for the quantum circuit and a fixed number of control gates.
I really liked this approach and it was fun to work on. However, it was ultimately limited, as the size of the matrices scales exponentially with the number of qubits.
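The training-set generation step described above can be sketched roughly like this: draw Haar-random 1- and 2-qubit unitaries and tensor them together into an n-qubit target. The function names and the closeness measure are my own illustrative choices, not the original code.

```python
import numpy as np

def haar_random_unitary(dim, rng):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the distribution is Haar-uniform

def random_product_unitary(n_qubits, rng):
    """Tensor product of random 1- and 2-qubit unitaries acting on n_qubits total."""
    u = np.eye(1, dtype=complex)
    remaining = n_qubits
    while remaining > 0:
        k = int(rng.integers(1, min(2, remaining) + 1))  # each block acts on 1 or 2 qubits
        u = np.kron(u, haar_random_unitary(2 ** k, rng))
        remaining -= k
    return u

def closeness(u, v):
    """|Tr(U† V)| / dim: equals 1.0 iff U and V agree up to a global phase."""
    return abs(np.trace(u.conj().T @ v)) / u.shape[0]

rng = np.random.default_rng(0)
target = random_product_unitary(3, rng)  # an 8x8 unitary, cheap to generate in bulk
```

Since each sample is just a Kronecker product of small matrices, generating millions of labeled targets is fast; the expensive part is the circuit whose size (2^n x 2^n matrices) is what ultimately caps this at a handful of qubits.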
https://arxiv.org/abs/1905.09749 | How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits
Crypto does not, for a lot of reasons, but the biggest I can think of is that hashing is still one-way, and public keys stay hidden until they are used (which is why it is important to expose your public key only when spending funds).
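To make the "hidden until used" point concrete, here is a minimal sketch of why an unspent address is quantum-safe: only a hash of the public key appears on-chain, and Shor's algorithm needs the public key itself. The key bytes below are made up, and plain SHA-256 stands in for Bitcoin's actual HASH160 (RIPEMD160 of SHA256), since RIPEMD-160 is not always available in hashlib.

```python
import hashlib

# Hypothetical 33-byte compressed public key; a real chain derives this from
# a private key via elliptic-curve point multiplication.
pubkey = b"\x02" + b"\x11" * 32

# What gets published on-chain as an address is a hash of the public key, not
# the key itself. Hashing is still one-way even for a quantum attacker
# (Grover only gives a quadratic speedup), so until the funds are spent and
# the public key is revealed, there is nothing for Shor's algorithm to attack.
address = hashlib.sha256(pubkey).hexdigest()
```

Once a transaction spends from the address, the public key goes on-chain in the signature script, and from that point a sufficiently large quantum computer could in principle recover the private key.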
When a viable ECC attack vector appears, it will not take much effort to migrate to a more mature PQC scheme. Better to wait as long as possible; maybe even launch a cryptocurrency built on PQC to field-test it with money on the line. A few billion in market cap goes a long way toward incentivizing attempts to break the crypto involved.
No quantum computer has ever been used for that purpose in real life, however.