QC scaling as 2^n is a common misconception. As n grows, the system scales worse and worse. At a certain point, for n < 50, the noise dominates
No, it's not a misconception: the number of potential classical outcomes that a QC can assess does scale as 2^n; that is inherent to how QCs work.
A classical bit can be 0 or 1, either/or. A qubit, because of quantum superposition, is in a sense partially both values, a probability smear across the two, until it is measured, at which point it resolves to a definite classical 0 or 1 outcome. In a system of multiple entangled qubits, the number of values covered grows as 2^n. Two entangled qubits cover 2^2 = 4 possibilities: 00, 01, 10, 11. Three entangled qubits cover 2^3 = 8: 000, 001, 010, 011, 100, 101, 110, 111. And so on.
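To make the doubling concrete, here's a minimal sketch (assuming Python with numpy, neither of which is part of the discussion above) that builds an n-qubit superposition state and prints how the number of amplitudes grows:

```python
import numpy as np

# A single qubit in an equal superposition of |0> and |1>.
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Combining qubits corresponds to a tensor (Kronecker) product of their
# state vectors, so the number of amplitudes doubles with every qubit
# added: 2, 4, 8, ... 2^n.
for n in range(1, 6):
    state = plus
    for _ in range(n - 1):
        state = np.kron(state, plus)
    print(f"{n} qubit(s): {state.size} amplitudes = 2^{n}")
```

Describing n qubits classically takes 2^n amplitudes, which is exactly the state space the comment above is talking about.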
Having said that, I absolutely understand and agree with your main point that the number of qubits isn't everything; it's merely a headline figure, and it can be misleading. 2^n means nothing if there is a high rate of error in the final result. Decoherence - the loss/corruption of quantum information - is the fundamental obstacle to building large-scale functioning quantum computers. Adding and entangling additional qubits is not what is stopping QCs today; it is, as you say, the increased noise as the number of qubits grows. But that doesn't change the 2^n scaling that makes QCs so efficient at, for example, integer factorisation and the discrete logarithm problem.
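As a rough illustration of why noise, not qubit count, is the bottleneck (a back-of-envelope model for this comment, not how real devices are characterised): if each gate fails independently with probability p, a circuit of G gates runs cleanly with probability (1 - p)^G, and G grows with the number of qubits.

```python
# Back-of-envelope model (an assumption for illustration only): every gate
# fails independently with probability gate_error, so a circuit of G gates
# completes without any error with probability (1 - gate_error)^G.
def clean_run_probability(num_gates: int, gate_error: float = 1e-3) -> float:
    return (1 - gate_error) ** num_gates

# Assume, purely hypothetically, that a useful circuit on n qubits needs
# on the order of n^2 gates.
for n in (10, 30, 50, 100):
    gates = n ** 2
    print(f"{n} qubits, {gates} gates: P(no error) ~ {clean_run_probability(gates):.3f}")
```

Under those assumed numbers the clean-run probability collapses somewhere around the 50-qubit mark, which is the sense in which noise dominates long before the 2^n state space stops growing.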
The scaling is an inherent truth due to immutable physical laws. The noise is an engineering problem.