It doesn't make sense to guess at this. Any guess is bound to be wrong.
If, after picking the low-hanging fruit, there is still an issue here (and there may be), it ought not be resolved by a guess when there is data within the block chain that would be useful for determining the max block size.
Just as difficulty adjustment is sensitive to data within the block chain, this could be too.
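For reference, that retarget rule already works from nothing but data in the chain (the timestamps of recent blocks). A simplified Python sketch of it, assuming a plain difficulty number rather than the compact target encoding the real client uses:

    # Simplified sketch of Bitcoin's difficulty retarget. It uses nothing
    # but data already in the block chain: the timestamps of the last
    # 2016 blocks. The real code operates on compact target encodings.
    RETARGET_INTERVAL = 2016           # blocks between adjustments
    TARGET_SPACING = 10 * 60           # desired seconds per block
    EXPECTED_TIMESPAN = RETARGET_INTERVAL * TARGET_SPACING

    def next_difficulty(old_difficulty, actual_timespan):
        """Scale difficulty so the next 2016 blocks take about two weeks."""
        # Clamp the measured timespan to a factor of 4, as the client does.
        actual_timespan = max(EXPECTED_TIMESPAN // 4,
                              min(actual_timespan, EXPECTED_TIMESPAN * 4))
        return old_difficulty * EXPECTED_TIMESPAN / actual_timespan

A max block size rule could be driven the same way, from block sizes instead of timestamps.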
I don't know what the right answer is any more than Gavin does, but making an estimate would not be the best way to solve this in any case.
.....
One example of a better way would be to use a sliding window of x blocks, 100+ deep, and base the max allowed size on some percentage over the average while dropping anomalous outliers from that calculation. Using a method that is sensitive to reality as it may exist in the unpredictable future gives some assurance that we won't just be changing this whenever circumstances change.
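To make that concrete, here is a rough sketch in Python. Every constant in it (window size, outlier cutoff, headroom percentage, floor) is an illustrative assumption, not a proposal for the actual values:

    # Illustrative only: derive the next max block size from a sliding
    # window of recent block sizes, trimming outliers and allowing some
    # fixed headroom over the trimmed average.
    WINDOW = 2016         # how many recent blocks to consider (assumed)
    TRIM_FRACTION = 0.1   # drop the smallest/largest 10% as anomalous (assumed)
    HEADROOM = 1.5        # allow 50% over the trimmed average (assumed)
    FLOOR = 1_000_000     # never fall below the current 1 MB limit

    def next_max_block_size(recent_sizes):
        """Max block size derived from actual recent usage."""
        sizes = sorted(recent_sizes[-WINDOW:])
        k = int(len(sizes) * TRIM_FRACTION)
        trimmed = sizes[k:len(sizes) - k] if k > 0 else sizes
        avg = sum(trimmed) / len(trimmed)
        return max(FLOOR, int(avg * HEADROOM))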
Do it right, do it once.
There isn't a way to predict what networks will look like in the future, other than to use the data of the future to do just that.
Is this 50% per year intended to be a hardcoded rule like the block reward?
That's not how I interpreted Gavin's report. It sounded more like a goal that the developers thought was attainable.
That said, 50% per year does seem aggressive. At some point the opportunity cost of including more transactions is going to exceed the transaction fee value, certainly as long as the block reward exists, so the block size cannot increase indefinitely. And so what if there is little room in the block chain? Not every single tiny transaction needs to be recorded indefinitely. Since the cost of increasing the block size is (I expect) increased centralization, shouldn't the developers be hesitant to make such a commitment without allowing for discretion?
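Just to put numbers on how quickly 50% per year compounds, assuming a 1 MB starting point (the current limit):

    # 50% per year compounded from a 1 MB starting point (illustrative).
    size_mb = 1.0
    for year in range(1, 21):
        size_mb *= 1.5
        if year in (5, 10, 15, 20):
            print("year %2d: ~%.0f MB" % (year, size_mb))
    # year  5: ~8 MB
    # year 10: ~58 MB
    # year 15: ~438 MB
    # year 20: ~3325 MB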
I also wonder what the best approach will be, way out in the future, when the block reward is near zero. Can there be an equilibrium transaction fee if the difficulty is allowed to continue to fall? A simple, kludgy solution might be to fix the difficulty at some level, allowing the block rate to depend on the accumulated bounty of transaction fees.
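A toy model of that kludge, where every constant is an assumption pulled out of thin air: if the difficulty were frozen, hash power (and so the block rate) would float with the fee bounty on offer instead of being pinned near ten minutes by retargeting.

    # Toy model (all constants assumed): with difficulty frozen, rational
    # hash power scales with the pending fee bounty, so blocks arrive
    # faster when more fees have accumulated and slower when few have.
    DIFFICULTY_HASHES = 3e20   # expected hashes per block at the frozen difficulty (assumed)
    HASHES_PER_BTC = 1e18      # hash rate the market supplies per BTC of bounty (assumed)

    def expected_interval(pending_fees_btc):
        """Seconds until the next block if hash power tracks the bounty."""
        hashrate = pending_fees_btc * HASHES_PER_BTC   # hashes per second
        return DIFFICULTY_HASHES / hashrate

    for fees in (0.05, 0.5, 5.0):
        print("%.2f BTC pending -> ~%.0f s per block" % (fees, expected_interval(fees)))
    # 0.05 BTC pending -> ~6000 s per block
    # 0.50 BTC pending -> ~600 s per block
    # 5.00 BTC pending -> ~60 s per block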
Though I'm sure some new kind of proof-of-work/proof-of-stake approach could solve this problem better and make the network both more secure and cheaper.