Yes, a geometric increase is what I meant. Gavin's current proposal is for the limit to double every two years, meaning it grows by a factor of about 1.41 (the square root of 2) every year, which gives it a reasonably high chance of staying below consumer bandwidth growth (and thus keeping the network highly decentralized).
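To make the arithmetic concrete, here's a small sketch of how doubling every two years translates into an annual factor, projected from an assumed 1 MB starting point (the starting size is illustrative, not part of the proposal text above):

```python
# Doubling every two years implies an annual growth factor of
# sqrt(2) ~= 1.414, since factor**2 must equal 2.
annual_factor = 2 ** 0.5

size_mb = 1.0  # assumed starting limit, for illustration only
for year in range(1, 5):
    size_mb *= annual_factor
    print(f"year {year}: {size_mb:.2f} MB")
# After 4 years the limit has doubled twice: 4.00 MB.
```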
So to answer your question, yes I would support something along the lines of what you're proposing. I think 5 MB would be much better than 2 MB though, because Bitcoin has a tendency to see sudden spurts in adoption, and so it'd be nice for it to have some room to grow quickly in the coming years.
Thanks for posting.
And I think one thing that often gets lost in the discussion on the hard limit, and that I can't stress enough:
There are other ways to stop bloat besides a hard limit in the protocol. A protocol limit is the crudest and most inflexible way to counter spam/bloat. If a 40-50% increase per year ends up outpacing connection speed growth, it is very likely that other means will be found to limit block sizes.
Yes, we have effective spam/bloat countermeasures. That's why at present most blocks aren't full.
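One example of such a non-protocol countermeasure is a miner-side soft limit: miners can cap the blocks they build below the protocol hard limit and prefer high fee-rate transactions, squeezing out low fee-rate bloat. The sketch below is a toy greedy packer, not Bitcoin Core's actual block assembly; all sizes, fees, and the `build_block` helper are made up for illustration:

```python
# Toy miner policy: pack transactions by fee rate up to a soft cap
# that sits below the protocol hard limit. All numbers are hypothetical.
HARD_LIMIT = 1_000_000   # protocol max block size, bytes (illustrative)
SOFT_LIMIT = 250_000     # miner's own policy cap, bytes (illustrative)

def build_block(mempool, cap=SOFT_LIMIT):
    """Greedily include the highest fee-rate txs that fit under the cap."""
    block, used = [], 0
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["size"], reverse=True):
        if used + tx["size"] <= cap:
            block.append(tx)
            used += tx["size"]
    return block, used

mempool = [
    {"size": 100_000, "fee": 50_000},  # high fee rate
    {"size": 200_000, "fee": 20_000},  # medium fee rate
    {"size": 160_000, "fee": 1_000},   # low fee-rate "bloat" tx
]
block, used = build_block(mempool)
print(used)  # only the high fee-rate tx fits under the soft cap: 100000
```

The point of the sketch: the hard limit never binds here; the miner's own policy does, and it can be adjusted without a protocol change.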
And Bitcoin certainly sees sudden spurts in adoption. Thus my concern with the ultimate form of bloat:
widespread actual usage.

We need to understand how the system reacts to heavy actual usage.
Will anything break, or rather, what will degrade or break first? How will the markets react? What can be optimized or substituted, given proper incentives such as the removal of free riders and their subsidized blockchain space?
It's nice we agree on a geometric increase, but wouldn't it be great to have actual data on which to better determine the optimum initial increase and eventual rate of increase?
Before changing the max_blocksize constant, we should know what happens to the BTC function at (and over) the 100% limit of the tx/block variable.
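As a crude illustration of what "over the 100% limit" might look like, here's a toy queue model: when transaction demand exceeds block capacity, the unconfirmed backlog grows without bound until demand, fees, or usage patterns adjust. This is a thought experiment with made-up numbers, not a claim about actual Bitcoin behavior:

```python
# Toy backlog model: txs arrive faster than blocks can clear them.
# All figures are illustrative assumptions, not measured data.
capacity_per_block = 2000   # txs a block can hold (assumed)
demand_per_block = 2500     # txs arriving per block interval (assumed)

backlog = 0
for interval in range(10):
    backlog += demand_per_block
    backlog -= min(backlog, capacity_per_block)
print(backlog)  # grows by 500 each interval: 5000 after 10 blocks
```

Real data from sustained full blocks would tell us how fees, confirmation times, and user behavior actually respond, which is exactly the point being made above.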