...
We agree on the problem; we disagree on the solution. Treating it as a 'production server' replacement would be the wrong approach.
How many times do you want to replace this 'production server' for the same reason?
If we are going to move to a dynamic limit, it should be one that won't need to change later, one we can be confident is fit for purpose, and one that doesn't open up new vulnerabilities.
The problem with that... it isn't simple.
If the limit were, say, 10x the average size of the last 1,000 blocks, it would still provide anti-spam protection and keep node distribution from becoming too centralized.
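(For concreteness, that rule in a rough Python sketch, where recent_sizes is assumed to be a list of the byte sizes of the last 1,000 blocks; nothing here is an actual implementation:)

def proposed_limit(recent_sizes, multiplier=10):
    # recent_sizes: byte sizes of the most recent 1,000 blocks (assumed input)
    # the proposed cap: a fixed multiple of their average size
    return multiplier * (sum(recent_sizes) / len(recent_sizes))

# e.g. if the last 1,000 blocks average 400,000 bytes, the cap works out to 4,000,000 bytes (4 MB)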
Not necessarily.

My only point is that a fixed 1MB limit is a bad idea long term. So increasing it makes a lot of sense.
What is the formula for the sweet spot? Beats me.

We have a year or two to figure that out.
The thing is, I don't think there will ever be mass adoption of the system as it exists today. The 'average Joe' will be using some payment processor rather than the blockchain directly, so the blockchain will carry transactions from payment processors and from early adopters/enthusiasts. The rest will be in closed systems and/or sidechains that can solve a lot of the volume issues.
But still, 1 MB is not enough.
We're probably in agreement on this then. I don't like the 'exponential best guess' approach. I'd favor either a new static limit, maybe 8MB to give a bit more time for a real solution, or... a real solution.
A real solution would be a limit that right-sizes itself as blocks are added: one that prevents abuse, allows for transaction growth so we don't get a queue of transactions that are chipping in reasonable fees, and accounts for days destroyed.
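To make that concrete (and this is only a sketch in Python, with every threshold and input name a placeholder, not a worked-out proposal), something along these lines:

def retarget_limit(current_limit, recent_sizes, fee_paying_backlog_bytes,
                   fullness_threshold=0.9, step=1.05, floor=1000000):
    # Hypothetical retargeting rule, re-evaluated as blocks are added.
    # All names and thresholds here are placeholders.
    avg = sum(recent_sizes) / len(recent_sizes)
    # Grow only when recent blocks are nearly full AND transactions paying a
    # reasonable fee are queuing up; the small, bounded step is the anti-abuse part.
    if avg > fullness_threshold * current_limit and fee_paying_backlog_bytes > 0:
        return current_limit * step
    # Otherwise drift back down, never below the current 1 MB floor.
    # (How days destroyed should weight the backlog is left open here.)
    return max(floor, current_limit / step)

The point is only the shape of the rule: growth is tied to real demand from fee-paying transactions, and the bounded step is what keeps abuse in check.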
We aren't anywhere close to mass adoption.