No, I mean literally in terms of qubits.
Thanks for the clarification - I was making a lazy assumption. But I do think that, in the end, we come back to this:
the problems have more to do with phenomena such as decoherence, error rates and the near-absolute-zero operating temperature than with the raw number of qubits. They're discrete engineering problems to be overcome rather than a matter of scaling up processing power. Tremendously challenging problems, yes, but a different sort of problem from what we're used to thinking of in terms of computer advancement.
It's not really adding extra qubits that's the problem; it's ensuring that the quantum system is stable enough to sustain entanglement for long enough. And whilst the challenges obviously
do grow as the number of qubits increases, if the problems affecting low-qubit systems can be overcome, then adding extra qubits and scaling up the processing power is much less of an obstacle and is likely to happen very quickly.
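To make the "error rates, not qubit count" point concrete, here's a back-of-envelope sketch (my own illustration, not anything from this thread): if each gate succeeds with probability (1 - p), a circuit of n gates succeeds with roughly (1 - p)^n, so the per-gate error rate, not the number of qubits, dominates how deep a useful computation can run.

```python
def circuit_success_probability(error_rate: float, gate_count: int) -> float:
    """Probability that every gate in a circuit executes without error,
    assuming independent errors at a fixed per-gate rate (a rough model)."""
    return (1.0 - error_rate) ** gate_count

# Even a seemingly small per-gate error rate collapses quickly with depth:
for p in (0.01, 0.001):
    for gates in (100, 1000):
        prob = circuit_success_probability(p, gates)
        print(f"p={p}, gates={gates}: success ~ {prob:.3f}")
```

At p = 0.01 a 1,000-gate circuit almost never runs cleanly, which is why improving stability and error rates on small systems matters more than simply bolting on qubits.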