...
The UTXO constraint may never be solved in an acceptable (sub)linear way, or the solution(s) could, for political reasons, never be implemented in BTC.
...
Almost certainly 'never' by any realistic definition of those terms.
...
Solving 'the UTXO problem' would require what is, by most definitions, 'magic'. Perhaps some future quantum-effect storage, communications, and processing schemes could 'solve' the problem, but I'm not expecting to pick up such technology at Fry's by the next holiday season (Moore's law notwithstanding).
A comment from chriswilmer got me thinking:
The UTXO set is actually bounded. The total number of satoshis that will ever exist is
(21x10^6) x (10^8) = 2.1 x 10^15 = 2.1 "peta-sats"
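The bound follows from a simple observation: every unspent output must hold at least one satoshi, so there can never be more outputs than satoshis. A quick sanity check of the arithmetic:

```python
# The UTXO set is bounded: an output holds at least one satoshi,
# so the number of outputs can never exceed the total satoshi supply.
BTC_SUPPLY = 21_000_000    # maximum coins per the issuance schedule
SATS_PER_BTC = 10**8

total_sats = BTC_SUPPLY * SATS_PER_BTC
print(total_sats)          # 2100000000000000, i.e. 2.1e15 "peta-sats"
```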
...
...
OK, now let's be reasonable! Let's assume that 10 billion people on earth each control about 4 unspent outputs on average. That's a total of 40 billion outputs, or
(40 x 10^9) x (65 bytes) = 2.6 terabytes
With these assumptions, it now only takes about 20 of those SD cards to store the UTXO set:
(2.6 x 10^12) / (128 x 10^9) = 20.3,
or, three 1-terabyte SSDs, for a total cost of about $1,500.
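The storage estimate is easy to verify; a quick sketch using the same assumptions quoted above (10 billion users, ~4 outputs each, ~65 bytes per serialized output, 128 GB SD cards):

```python
# Rough storage estimate for the "reasonable" future UTXO set
# described above. All figures are the thread's assumptions.
users = 10 * 10**9           # 10 billion people
outputs_per_user = 4         # average unspent outputs per person
bytes_per_output = 65        # approximate serialized output size

total_bytes = users * outputs_per_user * bytes_per_output
print(total_bytes / 10**12)  # 2.6 -> about 2.6 terabytes

sd_card = 128 * 10**9        # one 128 GB SD card
print(total_bytes / sd_card) # 20.3125 -> about 20 cards
```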
...
I have thought about this bounding (mostly in the context of the current rather awkward/deceptive 'unspendable dust' settings). I think there are currently, and probably will be for quite some time, some big problems with this rosy picture:
- The UTXO set is changing in real time across its entire population. As I understand things, this currently necessitates a rather more sophisticated data structure than something mineable like the blockchain. The UTXO set lives in RAM under LevelDB (the database that replaced BDB) because seeks, inserts, and deletes are bus intensive and, again, in constant flux. Trying to do this even on high-ish-end secondary storage architecture is a fail. Large players use what amounts to a clustered ramdisk to achieve scale. Yes, building such clusters is within the realistic grasp of at least mid-sized companies, but it may never be so for lesser players at commodity scale due to market dynamics (no real demand).
- The picture painted (each individual controlling four unspent outputs) is pretty optimistic. To achieve it there would need to be constant optimization at the individual level, which further adds to the messaging load, and messaging is what pretty much everyone sees as the bottleneck.
- The picture painted (each individual controlling four unspent outputs) is also just a snapshot. Even if each person makes only one spend per day, a high percentage of the UTXO data in the database would effectively be recycled daily. Anyone who has tried even a relatively simple operation on a modest-sized file (say, an md5 or even a wc on a gigabyte file) will recognize that passing data from storage to a processor can be a real drag. Some data structures are conducive to multi-processing, and whatever NoSQL databases might hold the UTXO set are probably in that category, but again, one needs a cluster to do such things.
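The bullets above describe a key-value map under constant random churn. A minimal, purely illustrative sketch of that access pattern (the function and key layout are my own simplification, not Bitcoin Core's actual structures; the 10-billion and 40-billion figures are the thread's assumptions):

```python
# Sketch of the UTXO access pattern: a map keyed by (txid, output
# index), where every transaction deletes its inputs and inserts
# its outputs. Illustrative only, not Bitcoin Core's real layout.
utxo = {}  # (txid, vout) -> (value_sats, script_bytes)

def apply_tx(txid, spent_inputs, new_outputs):
    """Spend inputs (random deletes) and create outputs (random inserts)."""
    for outpoint in spent_inputs:
        del utxo[outpoint]                  # each spend is a delete
    for i, (value, script) in enumerate(new_outputs):
        utxo[(txid, i)] = (value, script)   # each output is an insert

# Daily churn implied by the snapshot assumptions: 10 billion people
# making one spend per day against a ~40 billion entry set means a
# quarter of the entries (at minimum) are touched every day.
spends_per_day = 10 * 10**9
set_size = 40 * 10**9
print(spends_per_day / set_size)  # 0.25 -> at least 25% daily churn
```

Each spend typically consumes at least one input and creates more than one output (payment plus change), so the 25% figure is a floor, not a ceiling.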
The risk of centralization toward players who are prone to come under pressure is significant. I may be willing to retract my assertion that, within the current 'bounds', 'magic' would be required, but I stand by my contention that it is far from practical for Bitcoin's support framework to remain distributed at these kinds of scales. Edit: ...but I would be interested to see a proof-of-concept, simulator, prototype, etc.