An 8MB block size pulled out of thin-air future extrapolations is as dumb as any other blocksize BIP proposal.
...
The 1MB block size purposely prevents spam, and it is working just fine.
I'm glad you support keeping the current 1MB limit in place for long enough; so do I.
The 8MB number comes from the idea that changes to the consensus rules need to be rare, discrete events; otherwise the internal competition in the system would tear the whole idea of consensus apart fairly quickly. From that perspective the limit needs to be high enough that we don't have to bother changing it again for a long time, while also being able to counter potential external competition in this space. But because the new limit is so much higher than the current one, we need to be careful about the timing for the change to take effect.
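To make "a long time" concrete, here is a rough back-of-envelope sketch; the current utilization and growth rate are my own assumed numbers, not anything from the proposal, so treat the output as illustrative only:

```python
import math

# Assumed inputs (not from the proposal): blocks ~half full today,
# demand growing 50% per year, compounding steadily.
current_usage_mb = 0.5
cap_mb = 8.0
annual_growth = 1.5

# Years until steady compound growth fills the assumed cap.
years = math.log(cap_mb / current_usage_mb) / math.log(annual_growth)
print(f"~{years:.1f} years until an 8MB cap would be hit")  # ~6.8 years
```

Different growth assumptions obviously move that number a lot, which is exactly why the chosen limit needs headroom rather than a snug fit to today's demand.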
Well, suppose that Bitcoin is, in fact, growing and being adopted as we speak. While I'm obviously not too bothered by the idea of a fee market, I still think a fee market is less desirable than no fee market. So, once we have maximized efficiency and minimized spam (we have not), I think addressing throughput directly is the next logical step.
So, to the extent that we can increase the block size limit without forgoing or jeopardizing key tenets of Bitcoin (consensus, decentralization, network security), I think we should do that. But that necessarily entails a very conservative approach, with increases we can actually test (i.e., roughly in line with actual demand).
8MB plus exponential scaling = straight-up reckless.
8MB is indeed more of a throughput suggestion than anything else. There are concerns about the super-linearity of block verification time (meaning that a single 8MB block might take longer to verify than eight 1MB blocks), so those need to be taken into account when the actual solution is implemented.
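For a rough sense of where that super-linearity comes from, here is a toy sketch (not Bitcoin Core code; the per-input size and overhead constants are assumed) of the legacy sighash behavior, where each input re-hashes roughly the whole transaction, so a single worst-case transaction filling an 8MB block implies far more hashing than eight worst-case transactions each filling a 1MB block:

```python
# Toy model: bytes hashed to verify one transaction under the legacy
# (pre-SegWit) sighash scheme, where every input hashes ~the whole tx.
TX_OVERHEAD = 10        # bytes, assumed fixed transaction overhead
BYTES_PER_INPUT = 150   # bytes, assumed average size of one input

def sighash_bytes(tx_size_bytes: int) -> int:
    """Approximate total bytes hashed to verify all inputs of one tx."""
    n_inputs = (tx_size_bytes - TX_OVERHEAD) // BYTES_PER_INPUT
    # n inputs, each hashing ~the whole transaction once -> O(n^2) bytes.
    return n_inputs * tx_size_bytes

MB = 1_000_000
one_big = sighash_bytes(8 * MB)          # one tx filling an 8MB block
eight_small = 8 * sighash_bytes(1 * MB)  # eight txs, each filling 1MB

print(f"8MB single tx : {one_big / 1e9:.1f} GB hashed")
print(f"8 x 1MB txs   : {eight_small / 1e9:.1f} GB hashed")
print(f"ratio         : {one_big / eight_small:.1f}x")
```

Under these assumptions the single 8MB transaction hashes roughly eight times as many bytes, which is why a larger block size would likely need to be paired with some cap on per-transaction signature-hashing work.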
The fee market is another interesting aspect of the system: competition is supposed to prevent the economic pressure from building up too high, but the network effect and the cost of switching should work in favor of keeping each particular system pressurized. In the idealistic scenario I keep in mind, a few healthy coins representing the core values of the original idea might work like cylinders in a combustion engine, where each system is pressurized to the max at different moments in time and decides to raise its bandwidth target before the pressure moves on to another one. We will see how much of that actually manifests.