> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric. So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise. Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts. A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
Even 21 was only possible by cheating (optimizing away the difficult part using prior knowledge of the results) [1]. Craig Gidney has a blog post that shows the actual quantum circuit for factoring 21 which is far beyond the capabilities of current quantum computers [2].
*ends as soon as practical quantum computers, something which might never happen, exist.
The author mentions:
> RSA-2048: ~4096 logical qubits, 20-30 million physical qubits
> 256-bit ECC: ~2330 logical qubits, 12-15 million physical qubits
For reference, we are at ~100 physical qubits right now. There is a bit of nuance in the logical to physical correlation though.
Scepticism aside, the author does mention that it might be a while in the future, and it is probably smart to start switching to quantum resistant cryptography for long-running, critical systems, but I'm not a huge fan of the fear-mongering tone.
And no clear quantum Moore law emerging for the yearly increase in qbits (https://arxiv.org/abs/2303.15547)... The quantum panic pushes people to deploy immature solutions, and the remedy sure sometimes looks worse than the illness...
Yeah, I was taking the author's numbers there, and there is a lot of nuance to the logical vs physical qubits relationship. Not super up to date on the latest work there, you got any links?
Given some of the comments in this thread, I would like to link this here:
https://gagliardoni.net/#20250714_ludd_grandpas
An excerpt:
> "but then WHAT is a good measure for QC progress?" [...] you should disregard quantum factorization records.
> The thing is: For cryptanalytic quantum algorithms (Shor, Grover, etc) you need logical/noiseless qubits, because otherwise your computation is constrained [...] With these constraints, you can only factorize numbers like 15, even if your QC becomes 1000x "better" under every other objective metric. So, we are in a situation where even if QC gets steadily better over time, you won't see any of these improvements if you only look at the "factorization record" metric: nothing will happen, until you hit a cliff (e.g., logical qubits become available) and then suddenly scaling up factorization power becomes easier. It's a typical example of non-linear progress in technology (a bit like what happened with LLMs in the last few years) and the risk is that everyone will be caught by surprise. Unfortunately, this paradigm is very different from the traditional, "old-style" cryptanalysis handbook, where people used to size keys according to how fast CPU power had been progressing in the last X years. It's a rooted mindset which is very difficult to change, especially among older-generation cryptography/cybersecurity experts. A better measure of progress (valid for cryptanalysis, which is, anyway, a very minor aspect of why QC are interesting IMHO) would be: how far are we from fully error-corrected and interconnected qubits? [...] in the last 10 or more years, all objective indicators in progress that point to that cliff have been steadily improving
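To make concrete why the factorization-record metric stalls at numbers like 15 and 21: the only quantum part of Shor's algorithm is order (period) finding; everything else is classical number theory. Below is a minimal Python sketch of that reduction, with the order found by brute force where a real run would need thousands of error-corrected logical qubits. This is an illustration of the reduction, not a quantum implementation.

```python
# Illustrative only: the classical reduction at the heart of Shor's algorithm.
# The quantum speedup lives entirely in the order-finding step, which is done
# here by brute force; running it coherently for 2048-bit numbers is the part
# that needs logical (error-corrected) qubits.
from math import gcd
from random import randrange

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n):
    """Factor n via the order-finding reduction (classical stand-in)."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d, n // d       # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2:
            continue               # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue               # trivial square root of 1, retry
        p = gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor_reduction(15))  # e.g. (3, 5)
print(shor_reduction(21))  # e.g. (3, 7)
```

Swapping the brute-force `find_order` for a quantum period-finding circuit is the whole game, and that swap is exactly what today's noisy hardware cannot do at cryptographic sizes.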
The largest number factored by Shor's algorithm is 21.
https://en.wikipedia.org/wiki/Integer_factorization_records
Even 21 was only possible by cheating (optimizing away the difficult part using prior knowledge of the result) [1]. Craig Gidney has a blog post that shows the actual quantum circuit for factoring 21, which is far beyond the capabilities of current quantum computers [2].
[1] https://www.nature.com/articles/nature12290
[2] https://algassert.com/post/2500
That article is likely LLM-generated. It has the typical signs and a Grok-like pseudo-casual tone.
*ends as soon as practical quantum computers, something which might never happen, exist.
The author mentions:

> RSA-2048: ~4096 logical qubits, 20-30 million physical qubits

> 256-bit ECC: ~2330 logical qubits, 12-15 million physical qubits
For reference, we are at ~100 physical qubits right now. There is a bit of nuance in the logical-to-physical qubit relationship, though.
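For a rough feel of where the "millions of physical qubits" figures come from, here is a back-of-the-envelope surface-code sketch. The physical error rate, threshold, prefactor, and error budget below are illustrative assumptions, not the author's numbers.

```python
# Back-of-the-envelope surface-code overhead, with ASSUMED parameters.
# Rule of thumb: logical error rate per round ~ A * (p / p_th) ** ((d + 1) / 2),
# and one surface-code logical qubit uses roughly 2 * d**2 physical qubits.

p      = 1e-3      # assumed physical error rate
p_th   = 1e-2      # assumed surface-code threshold
A      = 0.1       # assumed prefactor
target = 1e-12     # assumed per-logical-qubit error budget for a long Shor run

d = 3
while A * (p / p_th) ** ((d + 1) / 2) > target:
    d += 2         # code distance is odd

physical_per_logical = 2 * d ** 2
logical_qubits = 4096                     # figure quoted above for RSA-2048
print(d, physical_per_logical, logical_qubits * physical_per_logical)
# With these assumptions d lands in the low twenties and the data-qubit total
# alone is a few million; magic-state factories, routing, and runtime push
# published estimates higher still.
```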
Scepticism aside, the author does mention that it might be a while off, and it is probably smart to start switching to quantum-resistant cryptography for long-running, critical systems, but I'm not a huge fan of the fear-mongering tone.
And there is no clear quantum Moore's law emerging for the yearly increase in qubits (https://arxiv.org/abs/2303.15547)... The quantum panic pushes people to deploy immature solutions, and the remedy sometimes looks worse than the illness...
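To make the point concrete, this is the kind of naive extrapolation the old key-sizing mindset would do; the doubling times are openly made up, and the spread in the answers is the point, not any particular date.

```python
# Naive "quantum Moore's law" extrapolation with ASSUMED doubling times.
# This is exactly the old key-sizing reasoning the linked post warns against:
# the answer swings by decades depending on an unknowable doubling time.
from math import log2

current_qubits = 100          # rough figure quoted above
needed_qubits  = 20_000_000   # low end of the RSA-2048 estimate quoted above

for doubling_years in (1, 2, 4):   # assumed, not measured
    doublings = log2(needed_qubits / current_qubits)
    print(f"doubling every {doubling_years}y -> ~{doublings * doubling_years:.0f} years")
# ~18, ~35, ~70 years respectively: the spread, not the number, is the point.
```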
SOTA for RSA-2048 is <1 million physical qubits
Yeah, I was taking the author's numbers there, and there is a lot of nuance to the logical-vs-physical qubit relationship. I'm not super up to date on the latest work there; do you have any links?
You mean it will come right when AGI comes?
Fusion powered AGI!
Fusion powered Quantum AGI! (on the blockchain?) ;-)
No, no, at that point they'll be busy figuring out an actually quantum-proof blockchain (but powered by AGI)
In your flying car, no less.
That drives/flies itself
...and those ~100 qubits are quite useless too, with insane error rates and extremely short coherence times.