QC scaling as 2^n is a common misconception. As n grows, the system scales worse and worse. At a certain point, already for n < 50, the noise dominates.
No, it's not a misconception: the number of potential classical outcomes that a QC can assess does scale as 2^n. That is inherently true based on how QCs work.
A classical bit can be 0 or 1, either/or. A qubit, because of quantum superposition, is in a sense partially both values, a probability smear across the two, until it is measured, when it resolves to a definite classical 0 or 1 outcome. In a system of multiple entangled qubits, the number of values covered grows as 2^n. Two entangled qubits cover 2^2 = 4 possibilities: 00, 01, 10, 11. Three entangled qubits cover 2^3 = 8: 000, 001, 010, 011, 100, 101, 110, 111. And so on.
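To make the counting concrete, here is a minimal numpy sketch (my own illustration, not taken from any particular QC library) showing that a classical description of an n-qubit state needs 2^n amplitudes:

```python
import numpy as np

# |0> as a 2-component amplitude vector.
ket0 = np.array([1.0, 0.0])

# Hadamard gate: puts one qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def n_qubit_superposition(n):
    """Full state vector of n qubits, each in (|0>+|1>)/sqrt(2), combined via tensor product."""
    state = H @ ket0
    for _ in range(n - 1):
        state = np.kron(state, H @ ket0)
    return state

for n in (1, 2, 3, 10):
    psi = n_qubit_superposition(n)
    # The classical description needs 2^n amplitudes; the squared amplitudes sum to 1.
    print(f"n={n}: {psi.size} amplitudes (2^{n} = {2**n}), total probability {np.sum(np.abs(psi)**2):.3f}")
```

The state vector doubles in length with every added qubit, which is exactly the 2^n bookkeeping described above.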
Having said that, I absolutely understand and agree with your main point that the number of qubits isn't everything; it's merely a headline figure, which can be misleading. 2^n means nothing if there is a high rate of error in the final result. Decoherence - the loss/corruption of quantum information - is the fundamental obstacle to achieving large-scale functioning quantum computers. Adding and entangling additional qubits is not what is stopping QCs today; it is, as you say, the increased noise as the number of qubits grows. But that doesn't change the 2^n scaling that makes QCs so efficient at, for example, integer factorisation and the discrete logarithm problem.
The scaling is an inherent truth due to immutable physical laws. The noise is an engineering problem.
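To illustrate why the noise, rather than the 2^n state space, is what bites first, here is a deliberately crude toy model (my own simplification, not a real decoherence calculation), assuming each gate fails independently with a fixed probability:

```python
# Toy model: if each gate fails independently with probability p, an
# error-free run of a circuit with G gates has probability (1 - p)**G.
# A crude stand-in for real decoherence, just to show the trend.

def p_error_free(n_qubits, depth, p_gate):
    gates = n_qubits * depth          # rough gate count (my own simplification)
    return (1 - p_gate) ** gates

for n in (10, 50, 100):
    print(f"{n:>3} qubits, depth 100, 0.1% gate error: P(no error) ~ {p_error_free(n, 100, 1e-3):.2e}")
```

Even at a 0.1% gate error rate, the chance of an error-free run collapses well before the qubit count gets large, which is in the same spirit as the n < 50 remark above.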
Let me express some doubt.
A qubit is not simultaneously 0 and 1; it is probably 0 or 1, and eventually - when measured - certainly 0 or 1. That's why 2^n is wrong. A system of n qubits is not simultaneously in 2^n states; it is probably in one of them. Altering it via any constraints reduces the probability that the system is in a certain state. But the system was already in a certain state, and could transition to the more favorable one forced by the constraints. So it has to travel to the correct state. But if there's a reasonable algorithm for this, no quantum stuff is needed.
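For what it's worth, here is a small sketch of that purely classical picture (the target string and cost function are my own toy choices for illustration): a definite bit string hops toward the state favored by the constraints, with nothing quantum involved.

```python
import random

# The system is always in one definite bit string and "travels" toward a
# state favored by the constraints. TARGET and cost() are toy choices.

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]

def cost(state):
    # Number of bits violating the "constraint", i.e. disagreeing with TARGET.
    return sum(s != t for s, t in zip(state, TARGET))

def classical_search(n_bits, steps=10_000):
    state = [random.randint(0, 1) for _ in range(n_bits)]
    for _ in range(steps):
        if cost(state) == 0:
            break                                 # reached the favored state
        candidate = state[:]
        candidate[random.randrange(n_bits)] ^= 1  # flip one randomly chosen bit
        if cost(candidate) <= cost(state):        # keep moves that don't increase cost
            state = candidate
    return state

print(classical_search(len(TARGET)))  # converges to TARGET, no quantum mechanics involved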
Or let's say that somehow a system of n qubits is in most of the 2^n states simultaneously. Then constraints are placed, and some of the states become "forbidden". The favorable states would have lower energy. Either there's a transition to lower energy, which has to be released, or an influx of energy to make up for the unfavorable states. At the end we finally arrive at the correct single state. All this energy has to go somewhere. All 2^n bits of energy. Would the solar system survive such an energy blast? Would the Milky Way?
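To put even a rough number on "all this energy": assuming Landauer's limit of kT ln 2 per bit erased, and reading the argument as literally erasing 2^n bits (my own interpretation, not a standard result), a quick calculation gives:

```python
import math

# Back-of-envelope only: Landauer's bound puts the minimum energy cost of
# erasing one bit at k*T*ln(2) joules at temperature T.

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

for n in (50, 100, 200, 300):
    energy_joules = (2 ** n) * k_B * T * math.log(2)
    print(f"n = {n:>3}: ~{energy_joules:.2e} J")
```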
There's a lot of wishful thinking in quantum physics. It is belief-based, and it attracts all kinds of believers. Something is fishy. It looks a lot like the geocentric model of the solar system in the Middle Ages: circles upon circles, very complex stuff, but never correct (except for a few isolated cases). So I would say it is fundamentally wrong, and that is obvious. After all, quantum physics is statistics, and statistics is never reality. Sometimes, many times, very useful, but still wrong.