I feel these debates have been going on for years. We just have wildly different ideas of what counts as affordable.
I don't think the most fundamental debate is about how high the limit should be. I made some estimates of how high it would have to be for worldwide usage, which is quite a wild guess, and I suppose any estimate of what is achievable with either today's or tomorrow's technology is also a wild guess. We can only hope that what is needed and what is possible will somehow continue to match.
But the most fundamental debate is about whether it is dangerous to (effectively) disable the limit. These are some ways to effectively disable the limit:
- actually disabling it
- making it "auto-adjusting" (so it can increase indefinitely)
- making it so high that it won't ever be reached
I think the current limit will have to be increased at some point, requiring a "fork". I can imagine you don't want to set the new value too low, because that would force yet another fork later. Since it's hard to know what the right value is, I can imagine you want to develop an "auto-adjusting" system, similar to how the difficulty is "auto-adjusting". However, if you don't do this extremely carefully, you could end up effectively disabling the limit, with all the potential dangers discussed here.
You have to carefully choose the goal you want to achieve with the "auto-adjusting", and you have to carefully choose how you measure your "goal variable", so that the system can steer it towards the desired value (similar to how the difficulty adjustment steers towards 10 minutes/block).
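To make the analogy concrete, here is a minimal sketch of what a difficulty-style retargeting rule would look like if applied to the block size limit. Everything here is an illustrative assumption on my part (the function name, the 50% fullness target, the clamp factor); it is not part of any actual Bitcoin implementation, and it uses "average block fullness" as the goal variable rather than anything as subtle as decentralization:

```python
# Hypothetical sketch: difficulty-style retargeting applied to the block
# size limit. All names and constants are illustrative assumptions.

RETARGET_INTERVAL = 2016   # blocks per adjustment period (as for difficulty)
TARGET_FULLNESS = 0.5      # assumed goal: blocks ~50% full on average
MAX_STEP = 2.0             # clamp per period, akin to difficulty's 4x clamp

def retarget_limit(current_limit, recent_block_sizes):
    """Steer average block fullness towards TARGET_FULLNESS."""
    avg_size = sum(recent_block_sizes) / len(recent_block_sizes)
    fullness = avg_size / current_limit
    factor = fullness / TARGET_FULLNESS
    # Clamp the step so one period cannot swing the limit wildly.
    factor = max(1.0 / MAX_STEP, min(MAX_STEP, factor))
    return int(current_limit * factor)
```

Note how much is hidden in the two constants: choose the target or the clamp badly and the limit can ratchet upward forever, which is exactly the "effectively disabled" failure mode described above.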
One possible "goal variable" is the number of independent miners (a measure of decentralization). But how do you measure it? Maybe you can offer miners a reward for proving "non-independence": by accepting the reward, they prove that their different mining activities (e.g. different blocks they mined) are controlled by a single party; the reward would have to exceed the profit they could gain from further centralizing Bitcoin. This is just a vague idea; naturally it would have to be thought out extremely carefully before anyone even considers implementing it.
Of all the posts, this one makes the most sense to me - a layman. I'm aware this means practically nothing, aside from my not being willing to download a new version of Bitcoin-Qt if I don't like the hard-fork rules. A carefully chosen auto-adjusting block size limit - one that keeps block space scarce, encourages fees, keeps mining reasonably open to competition, and still solves the scalability issue - seems like a good compromise.
But how many of all transactions should, on average, fit into a block? 90%? 80%? 50%? Can anyone come up with predictions or estimates of how various auto-adjusting rules could play out?
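To get a rough feel for that question, here is a toy simulation I put together. All the parameters (demand, growth rate, fullness target) are made-up assumptions, and it models only one naive rule: retarget the limit so blocks stay about half full.

```python
# Toy simulation (all parameters are made-up assumptions) of how a naive
# fullness-targeting rule plays out when transaction demand keeps growing.

def simulate(periods=10, demand=600_000, growth=1.3,
             limit=1_000_000, target_fullness=0.5, max_step=2.0):
    """Each period: `demand` bytes of transactions arrive, at most `limit`
    bytes fit into the block, then the limit retargets toward
    `target_fullness`. Returns (limit, fraction of demand served) pairs."""
    history = []
    for _ in range(periods):
        included = min(demand, limit)
        served_fraction = included / demand  # share of demand that fit
        history.append((limit, served_fraction))
        factor = (included / limit) / target_fullness
        factor = max(1.0 / max_step, min(max_step, factor))
        limit = int(limit * factor)
        demand = int(demand * growth)
    return history
```

Running this, the limit simply grows in lockstep with demand and 100% of transactions always fit - in other words, a plain fullness target effectively disables the limit, which is precisely the danger raised earlier in this thread. The interesting question is what goal variable would *not* behave this way.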