http://sourceforge.net/p/bitcoin/mailman/message/34090559/

"I don't think 1MB is magic; it always exists relative to widely-deployed
technology, sociology, and economics. But these factors aren't a simple
function; the procedure I'd prefer would be something like this: if there
is a standing backlog, we-the-community of users look to indicators to
gauge if the network is losing decentralization and then double the
hard limit with proper controls."
This is Greg Maxwell talking about the blocksize increase.
He advocates raising the limit only if there is a "standing backlog"
(and apparently also only if some "indicators say we're losing decentralization")
Not only does that sound dangerous for adoption, but it's probably
not as easy as he makes it sound (he posted that in May);
we're seeing firsthand how challenging hard-fork consensus is.
Also, Mike Hearn has suggested that waiting for a standing backlog could cause technical problems (nodes crashing).
Greg's other concern seems to be about fees.
While fees matter, I don't think they should be a major consideration this early:
the coinbase subsidy will likely exceed total fees for
at least a decade, so they're not a reason to forestall an increase.
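To put rough numbers on that claim, here's a minimal sketch of the subsidy halving schedule against an assumed average of 0.25 BTC in total fees per block. The fee figure and the mid-2015 starting height are illustrative assumptions, not measurements; the point is just that the subsidy stays well above that level through several more halvings.

```python
# Rough sketch: coinbase subsidy schedule vs an assumed average fee total
# per block. ASSUMED_FEES_PER_BLOCK and the starting height are
# hypothetical values chosen for illustration only.

HALVING_INTERVAL = 210_000        # blocks between subsidy halvings
INITIAL_SUBSIDY = 50.0            # BTC at block 0
ASSUMED_FEES_PER_BLOCK = 0.25     # BTC, illustrative assumption
BLOCKS_PER_YEAR = 6 * 24 * 365    # ~52,560 blocks at one per 10 minutes
START_HEIGHT = 360_000            # roughly mid-2015

def subsidy_at_height(height: int) -> float:
    """Coinbase subsidy in BTC at a given block height."""
    return INITIAL_SUBSIDY / (2 ** (height // HALVING_INTERVAL))

for years_ahead in range(0, 12, 2):
    height = START_HEIGHT + years_ahead * BLOCKS_PER_YEAR
    s = subsidy_at_height(height)
    print(f"+{years_ahead:2d}y  subsidy {s:6.3f} BTC  vs assumed fees {ASSUMED_FEES_PER_BLOCK} BTC/block")
```

Even ten years out, the subsidy in this sketch is still a multiple of the assumed per-block fees, which is the basis for saying fee pressure isn't an urgent argument against an increase.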