I found the May paper that put two different qubit chips up against software and hardware solvers on a very specific class of problem, and the result was that on a problem best suited to the chip it found a solution in half a second, several thousand times faster than the best traditional methods.
There were a lot of ifs, buts, and exceptions, but the V5 and V6 chips tested, when given the right kind of problem (an annealing problem), were indeed able to solve it in a grand flash.
A clear-eyed summary of that May paper and the D-Wave devices is here:
http://spectrum.ieee.org/computing/hardware/dwaves-year-of-computing-dangerously

I've no doubt that quantum computing is going to be an arms race, and over time it will be able to solve more problems in parallel. Whether that includes searching for keys or cryptography I've no idea, but if it does you CAN BET THAT THE NSA WILL NOT TELL US ABOUT IT.
One thing I have pointed out in the past (and likely will need to continue to point out for a very long time) is that D-Wave's system implements (or is stated to implement; there is some controversy over exactly what is happening) a quantum annealing algorithm, which, while it does (or may) have some quantum properties, is not the same thing as a general purpose quantum computer.
To attack public key encryption requires a general purpose quantum computer (GPQC) capable of implementing Shor's algorithm (or one built to implement only Shor's algorithm, a quantum "ASIC" if you will). Even if D-Wave's computer had a quadrillion qubits it would be absolutely useless for attacking public key encryption.
Successfully factoring a larger number using a GPQC and Shor's algorithm would bring global recognition; it is a pretty big deal. To date nobody has publicly demonstrated the ability to factor a number larger than 143 (an 8 bit number). Progress has also been agonizingly slow (a toy sketch of the classical side of Shor's reduction follows the timeline):
2001
First successful demonstration of Shor's algorithm.
Computer size: 7 qubits
Number factored: 15 (into the factors 5 & 3)
Equivalent RSA keysize: 4 bit

2011
First successful factorization of a three digit number (143)
Computer size: 8 qubits (actually 4 qubits using an iterative process)
Number factored: 143 (into the factors 11 & 13)
Equivalent RSA keysize: 8 bit
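
For some intuition on what these demonstrations actually do, here is a toy, purely classical sketch of the reduction Shor's algorithm exploits: find the period r of a^x mod N (the one step a quantum computer actually accelerates), then pull the factors out with a couple of gcd computations. Run on 15 it reproduces the 2001 result above.

Code:
from math import gcd

def toy_shor_classical(N, a):
    """Classical toy of Shor's reduction: factor N via the period of a^x mod N.

    A real quantum computer finds the period r using the quantum Fourier
    transform; here it is found by brute force, which is only feasible for
    toy numbers like 15.
    """
    g = gcd(a, N)
    if g != 1:
        return g, N // g                  # lucky guess: a already shares a factor

    # Brute-force the period r such that a^r = 1 (mod N)
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1

    if r % 2 == 1:
        return None                       # odd period: retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None                       # trivial square root: retry with a different a
    return gcd(x - 1, N), gcd(x + 1, N)

print(toy_shor_classical(15, 7))          # (3, 5), matching the 2001 demonstration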
So the bit strength of the "encryption" broken (I use this term loosely because one could crack 8 bit encryption with pencil and paper) has roughly doubled after a decade of research. Even if that growth continues, it would be 80+ years before even current RSA keys could be compromised using quantum computing, and moving to larger keys is always possible.

ECDSA is a little different from RSA in that it doesn't involve factoring large numbers; instead it relies on the properties of elliptic curves. In general, ECC based keys have a higher bit strength than equivalently sized RSA keys, which is one reason ECC was used over RSA for Bitcoin. To achieve 128 bit security (which is considered beyond brute force for classical computing) requires a 256 bit ECDSA key but a 3,072 bit RSA key. Using RSA would be just as secure, but transactions would be up to 12x as large.
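
To make the size trade-off concrete, here is a small illustrative sketch using the standard RSA/ECC key size equivalences (the same comparison the paper quoted below relies on); the 12x figure is just 3,072 / 256.

Code:
# Standard RSA/ECC key size equivalences at common classical security levels
# (illustrative only; sizes follow the usual NIST-style comparison tables).
EQUIVALENT_KEY_SIZES = {
    # security bits: (RSA modulus bits, ECC key bits)
    80:  (1024, 160),
    112: (2048, 224),
    128: (3072, 256),   # Bitcoin's secp256k1 keys sit at this level
    192: (7680, 384),
    256: (15360, 512),
}

for security, (rsa_bits, ecc_bits) in EQUIVALENT_KEY_SIZES.items():
    print(f"{security:>3}-bit security: RSA {rsa_bits:>5} bits vs ECC {ecc_bits:>3} bits "
          f"(~{rsa_bits // ecc_bits}x larger)")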
Still, both RSA and ECC can in theory be broken faster than is possible with classical computing by using a large enough (and fast enough) quantum computer implementing Shor's algorithm. I have only found one paper to date which compares the qubit cost of implementing Shor's algorithm against both RSA and ECC:
6.3 Comparison with the quantum factoring algorithm

One of the main points of this paper is that the computational quantum advantage is larger for elliptic curve discrete logarithms than for the better known integer factoring problem. With our proposed implementation we have in particular achieved similar space and time requirements. Namely the number of qubits needed is also of O(n) and the number of gates (time) of order O(n^3), although in both cases the coefficient is larger. Note that the input size n is also the key size for RSA resp. ECC public key cryptography. Because the best known classical algorithms for breaking ECC scale worse with n than those for breaking RSA, ECC keys with the same computational security level are shorter. Below is a table with such key sizes of comparable security (see e.g. [25]). The column to the right roughly indicates the classical computing resources necessary in multiples of C, where C is what's barely possible today (see e.g. the RSA challenges [26] or the Certicom challenges [27]). Breaking the keys of the last line [15,360 bit RSA or 512 bit ECDSA] seems to be beyond any conceivable classical computation, at least if the presently used algorithms can't be improved.
Summary:
Breaking 256 bit ECDSA (128 bit key strength) requires 1800 logical qubits and a time of 6.0*10^9 operations.
Breaking 512 bit ECDSA (256 bit key strength) requires 3600 logical qubits and a time of 5.0*10^9 operations.
Where f(n) and f'(n) are as in section 6.2 with ε = 10. The time for the quantum algorithms is listed in units of 1-qubit additions, thus the number of quantum gates in an addition network per length of the registers involved. This number is about 9 quantum gates, 3 of which are the (harder to implement) Toffoli gates (see e.g. [5]). Also it seems very probable that for large scale quantum computation error correction or full fault tolerant quantum computation techniques are necessary. Then each of our logical qubits has to be encoded into several physical qubits (possibly dozens) and the logical quantum gates will consist of many physical ones. Of course this is true for both quantum algorithms and so shouldn't affect the above comparison. The same is true for residual noise (on the logical qubits) which will decrease the success probability of the algorithms. The quantum factoring algorithm (RSA) may have one advantage, namely that it seems to be easier to parallelise.
I bolded the important portions. So we are looking at a general purpose quantum computer which needs at least 1,800 logical qubits. Nobody (AFAIK) has even done research on the amount of error correction required for a commercial general purpose quantum computer (because none exists), but let's take the authors' estimate of "dozens of physical qubits per logical qubit" and use 24x as a low end. That gives 1,800 * 24 = 43,200 physical qubits. To put that into perspective, the largest (AFAIK; please correct me if you find a larger example) general purpose quantum computer capable of implementing Shor's algorithm is ... 7 qubits.
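
As a rough sanity check on that arithmetic, here is a small sketch; the 24x, 50x, and 100x overheads are assumptions, since nobody knows what a production error-correction scheme will actually require.

Code:
# Back-of-the-envelope physical qubit requirements for breaking 256-bit ECDSA,
# using the 1,800 logical qubits quoted above and a range of assumed
# error-correction overheads per logical qubit.
LOGICAL_QUBITS = 1800          # from the quoted paper, for 256 bit ECC keys
LARGEST_SHOR_GPQC = 7          # largest publicly demonstrated Shor-capable machine

for overhead in (24, 50, 100):  # physical qubits per logical qubit (assumptions)
    physical = LOGICAL_QUBITS * overhead
    print(f"{overhead:>3}x overhead -> {physical:>7,} physical qubits "
          f"(~{physical // LARGEST_SHOR_GPQC:,}x the largest demonstrated machine)")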
Still, this may happen within our lifetime. There are a couple of things which can be done. The simplest would be to implement a new address type which uses a larger ECC curve: 512 bit keys double the required qubits and 1,024 bit keys quadruple them. If GPQCs prove viable this would buy the network some time for a longer term solution. I don't know if standardized 1,024 bit ECC curves even exist, simply because 256 bit keys are considered beyond brute force and 512 bit keys are beyond "beyond brute force". A longer term and more complex solution would be moving to address schemes based on post-quantum algorithms (http://en.wikipedia.org/wiki/Post-quantum_cryptography). The largest issue with these systems, other than the fact that they have no widespread deployment and thus may have unknown flaws, is that they generally involve much, much larger public keys. The bandwidth and storage requirements of Bitcoin are highly correlated with the size of the public key, and we are talking about public keys in the kilobytes (notice the plural) range, so 80x to 200x larger than current 256 bit public keys. Now it is entirely possible that in the future the effect of Moore's law on available storage and bandwidth will mean that 50,000 bytes in 2048 is no larger, on a relative basis, than 256 bytes today.
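
To put the key size concern in numbers, here is a rough sketch; the post-quantum key sizes are assumptions picked to land in the 80x to 200x range mentioned above, not figures from any specific scheme.

Code:
# Rough size impact of hypothetical post-quantum public keys versus a 256-bit
# ECDSA public key. The post-quantum sizes below are assumptions chosen to fall
# in the 80x-200x range discussed above, not figures from any real scheme.
CURRENT_PUBKEY_BYTES = 32      # 256 bits, the figure used in the comparison above
                               # (a compressed secp256k1 key is 33 bytes on the wire)

hypothetical_pq_pubkeys = {
    "smaller post-quantum scheme (assumed)": 2_560,
    "larger post-quantum scheme (assumed)": 6_400,
}

for name, size in hypothetical_pq_pubkeys.items():
    print(f"{name}: {size:,} bytes, roughly {size // CURRENT_PUBKEY_BYTES}x today's key size")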