I'm not sure if I missed something, but why isn't the block size limit defined dynamically, based on previous usage plus some safety margin?
Would it be impossible to implement a self-regulating block size limit, similar to the way difficulty is adjusted, that allows the limit to be increased and decreased based on "demand"?
I imagine that a dynamic mechanism would be much better at encouraging responsible (resource-preserving) use of the network.
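To make the idea concrete, here is a minimal sketch of what such a mechanism could look like. This is purely hypothetical, not any actual proposal: every retargeting interval, the limit is set to the median size of recent blocks times a safety margin, with the rate of change clamped the way difficulty retargeting clamps its adjustment factor. All constants and names here are made up for illustration.

```python
from statistics import median

# Hypothetical parameters, chosen only for illustration.
ADJUSTMENT_INTERVAL = 2016   # blocks, mirroring difficulty retargeting
SAFETY_MARGIN = 1.5          # headroom above observed demand
MAX_STEP = 2.0               # limit may at most double or halve per interval
FLOOR = 1_000_000            # hard minimum limit in bytes

def next_block_size_limit(recent_sizes, current_limit):
    """Return the limit for the next interval, given recent block sizes (bytes)."""
    demand = median(recent_sizes)          # median resists manipulation by outlier blocks
    target = demand * SAFETY_MARGIN
    # Clamp how fast the limit can move, like difficulty adjustment does.
    lower = current_limit / MAX_STEP
    upper = current_limit * MAX_STEP
    return int(max(FLOOR, min(upper, max(lower, target))))

# Example: sustained ~900 KB blocks under a 1 MB limit would raise the
# limit to 1.35 MB; sustained low usage lets it shrink back toward the floor.
print(next_block_size_limit([900_000] * ADJUSTMENT_INTERVAL, 1_000_000))
```

Using the median rather than the mean is deliberate: a few miners stuffing their own blocks to the maximum could otherwise drag the limit upward, whereas the median only moves if a majority of blocks are actually full.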
I'm very sceptical of a fixed-percentage increase, because there is zero assurance that Moore's "law" will hold in the future. As you know, past performance is no indicator of future results, and in storage, for example, we are quickly approaching the atomic scale. Decentralization should be preserved by all means possible, because it is the very core that ensures the safety, and thereby the value, of Bitcoin.