Written by: Stephen Hsu
Primary Source: Information Processing
I received an email from a physicist colleague suggesting that we might be near a “tipping point” in quantum computation. I’ve sort of followed quantum computation and quantum information as an outsider for about 20 years now, but haven’t been paying close attention recently because it seems that practical general-purpose quantum computers are still quite distant. Furthermore, I am turned off by the constant hype in the technology press…
But perhaps my opinion is due for an update? I know some real quantum computing people read this blog, so I welcome comments.
Here’s part of what I wrote back:
I’m not sure what is meant by “tipping point” — I don’t think we know yet which qubit technology can be scaled to the point of making Shor’s Algorithm feasible. The threat to classical cryptography is still very far off — you need millions* of qubits, and the adversary can always just increase the key length; the tradeoffs are likely to remain in favor of the classical method for a long time.
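The key-length tradeoff can be made concrete with a rough back-of-envelope sketch. Classical factoring via the General Number Field Sieve has sub-exponential heuristic cost, while Shor’s algorithm scales only polynomially (roughly cubically) in the key size — but the constant overhead from imperfect qubits and error correction is enormous. The formulas below are the standard textbook asymptotics, not a precise resource estimate:

```python
import math

def gnfs_cost(bits):
    """Heuristic GNFS cost for an n-bit modulus N:
    exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))."""
    ln_n = bits * math.log(2)  # ln N for an n-bit number
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def shor_gate_count(bits):
    """Rough logical gate count for Shor's algorithm, O(n^3) in key size
    (ignoring the large physical-qubit overhead of error correction)."""
    return bits ** 3

# Doubling the RSA key size from 2048 to 4096 bits:
classical_ratio = gnfs_cost(4096) / gnfs_cost(2048)   # huge increase
quantum_ratio = shor_gate_count(4096) / shor_gate_count(2048)  # just 8x
print(f"classical cost ratio ~ {classical_ratio:.2e}")
print(f"quantum gate ratio  ~ {quantum_ratio:.0f}")
```

The asymmetry cuts both ways: once a fault-tolerant machine exists at all, longer keys are cheap for the quantum attacker (polynomial) but expensive for a classical one; until then, the defender’s cheap key-length increases keep classical cryptography comfortably ahead.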
Noisy quantum simulators of the type Preskill talks about might be almost possible (first envisioned by Feynman in the Caltech class he gave in the 1980s: Limits to Computation). These are scientifically very interesting but I am not sure that there will be practical applications for some time.
* This is from distant memory, so it might not be quite right. The number of ideal qubits needed would be a lot less, but with imperfect qubits/gates, quantum error correction, etc., I seem to remember a result like this. Perhaps millions is the number of gates, not qubits? (See here.)
Here’s a summary of current and near-term hardware capability: