Please see Gavin's writeup below:
https://bitcoinfoundation.org/2014/10/a-scalability-roadmap/

I think this is a very good and interesting read with some fantastic points; however, it is likely to be considered controversial by some in this community. In particular, a fixed schedule for increasing the block size limit over time is a significant proposal.
Is Gavin saying this should grow at 50% per year because bandwidth has been increasing at this rate in the past? Might it not be safer to choose a rate lower than historical bandwidth growth? Also, how do we know this high growth in bandwidth will continue?
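To illustrate how sensitive the schedule is to the chosen rate, here is a minimal sketch of how the limit compounds over time. The 1 MB starting point is the current protocol limit; the 20% alternative rate and the year horizons are purely illustrative assumptions, not figures from the roadmap.

# Sketch: compounding of the block size limit under different annual growth rates.
# 1 MB is the current protocol limit; the 20% rate and the year horizons are
# illustrative assumptions only.
START_MB = 1.0

for rate in (0.50, 0.20):  # 50% as in the roadmap vs. a hypothetical lower rate
    print(f"annual growth {rate:.0%}:")
    for year in (5, 10, 20):
        print(f"  year {year:2d}: {START_MB * (1 + rate) ** year:10,.1f} MB")

After 20 years the two schedules differ by almost two orders of magnitude (roughly 3,300 MB versus 38 MB), which is why the choice of rate deserves scrutiny.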
Gavin mentioned that this is "similar to the rule that decreases the block reward over time"; however, the block reward decreases by 50%, and an increase of 50% is somewhat different. A 50% fall every 4 years implies that there will never be more than 21 million coins, whereas 50% growth in the block size limit implies exponential growth forever. Perhaps once 21 million coins are reached Bitcoin will stop growing, so if one wants to make the comparison, the block size limit's growth rate could halve every 4 years, effectively reaching zero growth around the time the 21 million coin supply is reached. I do not know the best solution to this, though. Can anyone explain why exponential growth is a good idea?
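To make the asymmetry concrete: a geometric series with a decaying term converges, while one with a growing term diverges. The sketch below checks both, using the protocol's actual subsidy schedule for the first part; the halving-growth-rate schedule in the second part is only the hypothetical comparison suggested above.

# The block subsidy starts at 50 BTC and halves every 210,000 blocks, so the
# total supply is bounded by 50 * 210,000 * (1 + 1/2 + 1/4 + ...) = 21 million.
# (The real schedule rounds down to whole satoshis, so the true total is
# fractionally below 21 million.)
supply = sum(50 * 210_000 / 2 ** i for i in range(64))
print(f"total coin supply: ~{supply:,.0f} BTC")

# A fixed 50% annual growth rate has no such bound, but a growth rate that
# itself halves every 4 years (the hypothetical schedule above) does converge
# to a finite limit, in the same way the coin supply does.
limit_mb, rate = 1.0, 0.50
for year in range(1, 41):
    limit_mb *= 1 + rate
    if year % 4 == 0:
        rate /= 2
print(f"limit after 40 years under a halving growth rate: ~{limit_mb:.0f} MB")

Under these assumptions the limit levels off at roughly 32 MB, mirroring the way the coin supply levels off at 21 million.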
In my view, should volumes increase above the 7 transactions per second level in the short term, a quick fix like doubling the block size limit should be implemented. A more long-term solution, like an annual increase in the block size limit, could require more research into transaction fees and the impact this could have on incentivising miners. Ultimately we may need a more dynamic system, where the block size limit is determined in part by transaction volume, the network difficulty and transaction fees in some way, as well as potentially a growth rate; one very rough sketch of such a rule follows below. A more robust theoretical understanding of this system may be required before we reach that point, though.
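Purely as an illustration of the dynamic idea, and not a concrete proposal, the sketch below adjusts the limit from recent demand alone: it caps each block at a multiple of the median size of the last N blocks. The window, multiplier and floor are invented parameters, and inputs like difficulty and fees are deliberately left out for brevity.

# Hypothetical demand-driven block size limit: cap the next block at a
# multiple of the median size of recent blocks. All parameters here are
# illustrative assumptions, not part of any actual proposal.
from statistics import median

WINDOW = 2016            # look-back window in blocks (one difficulty period)
MULTIPLIER = 2.0         # how far a block may exceed recent median demand
FLOOR_BYTES = 1_000_000  # never drop below the current 1 MB limit

def next_block_size_limit(recent_block_sizes):
    """Return the size limit for the next block, in bytes."""
    window = recent_block_sizes[-WINDOW:]
    return max(FLOOR_BYTES, int(MULTIPLIER * median(window)))

# Sustained ~600 kB blocks would lift the limit to 1.2 MB; a quieter
# network with ~300 kB blocks would stay at the 1 MB floor.
print(next_block_size_limit([600_000] * 2016))  # 1200000
print(next_block_size_limit([300_000] * 2016))  # 1000000

A rule of this shape lets the limit track real usage while the floor preserves today's behaviour, though, as noted above, the fee and mining incentive effects would need proper research first.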
Many thanks