Board: Development & Technical Discussion
Topic: Re: Funding of network security with infinite block sizes
by acoindr on 05/04/2013, 18:59:24 UTC
Anyway, ultimately this will be decided by Gavin and so far he's been saying he wants to raise the block size limit.

That gives us pretty much zero information. I'm sure 99% of us "want to raise the block size limit". The question is how. Do we raise it to 2MB, 10MB, or make it infinite? Do we raise it now? If not now, when? Do we raise it once, or dynamically? If dynamically, based on observed data or on preset parameters? Do we consider hard-fork risks in the decision?

There are many ways to raise the limit, and all have different ramifications. No matter the precise course of action, someone will be dissatisfied.

Actually, what Gavin said, quoting directly, is this:

A hard fork won't happen unless the vast super-majority of miners support it.

E.g. from my "how to handle upgrades" gist https://gist.github.com/gavinandresen/2355445

Quote
Example: increasing MAX_BLOCK_SIZE (a 'hard' blockchain split change)

Increasing the maximum block size beyond the current 1MB per block (perhaps changing it to a floating limit based on a multiple of the median size of the last few hundred blocks) is a likely future change to accommodate more transactions per block. A new maximum block size rule might be rolled out by:

New software creates blocks with a new block.version
Allow greater-than-MAX_BLOCK_SIZE blocks if their version is the new block.version or greater and 100% of the last 1000 blocks are new blocks. (51% of the last 100 blocks if on testnet)
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different (maybe something like block.timestamp is after 1-Jan-2015 and 99% of the last 2000 blocks are new-version), since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.
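
To make that concrete, here is a rough sketch of my own (not Gavin's code, and not how the reference client actually works) of how such a rollout rule might be written; the new version number, the 1000-block window, and the 2x-median multiple are just placeholders taken from the quoted examples:

Code:
// Sketch only: illustrates the version-supermajority trigger and the
// floating median-based limit described in the gist. All names and
// constants here are hypothetical.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

static const uint32_t NEW_BLOCK_VERSION  = 3;        // assumed new block.version
static const size_t   OLD_MAX_BLOCK_SIZE = 1000000;  // current 1MB limit

// True once enough recent blocks signal the new version,
// e.g. 1000 of the last 1000 on mainnet, 51 of the last 100 on testnet.
bool SupermajorityReached(const std::vector<uint32_t>& recentVersions,
                          size_t window, size_t required)
{
    if (recentVersions.size() < window)
        return false;
    size_t count = 0;
    for (size_t i = recentVersions.size() - window; i < recentVersions.size(); ++i)
        if (recentVersions[i] >= NEW_BLOCK_VERSION)
            ++count;
    return count >= required;
}

// Floating limit: a multiple of the median size of the last few hundred blocks.
size_t FloatingMaxBlockSize(std::vector<size_t> recentSizes, size_t multiple)
{
    if (recentSizes.empty())
        return OLD_MAX_BLOCK_SIZE;
    std::sort(recentSizes.begin(), recentSizes.end());
    size_t median = recentSizes[recentSizes.size() / 2];
    return std::max(OLD_MAX_BLOCK_SIZE, median * multiple);
}

// A block larger than the old 1MB limit is acceptable only if it signals the
// new version AND the supermajority condition has already been met.
bool AcceptBlockSize(size_t blockSize, uint32_t blockVersion,
                     const std::vector<uint32_t>& recentVersions,
                     const std::vector<size_t>& recentSizes)
{
    if (blockSize <= OLD_MAX_BLOCK_SIZE)
        return true;
    if (blockVersion < NEW_BLOCK_VERSION)
        return false;
    if (!SupermajorityReached(recentVersions, 1000, 1000))  // 100% of last 1000
        return false;
    return blockSize <= FloatingMaxBlockSize(recentSizes, 2);  // e.g. 2x median
}

The key property is the last check: no oversized block appears on the main chain until the network has overwhelmingly signalled readiness, and once the first one does appear, anyone still running old software is kicked off, which is exactly why Gavin calls the 100%-of-1000 figure a straw-man.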


I think this shows great consideration and judgement. In particular, I note and emphasize the following:

Quote
100% of the last 1000 blocks is a straw-man; the actual criteria would probably be different ...  since this change means the first valid greater-than-MAX_BLOCK_SIZE-block immediately kicks anybody running old software off the main block chain.

What I find most valuable in Gavin's approach is that it incorporates data from the field. He's not unilaterally saying the new size will be X, deal with it. Instead, he essentially says the new size can be X only if Y and Z are also true. He appears to respect the market's ability to decide. Indeed, no change remains an option, and is actually the default.

I think Jeff Garzik's post on the issue is apropos, particularly his last point:

Thanks for that link. I hadn't seen that post and I think it's brilliant. It probably aligns with my views 99.999%. Ironically, it's his last point that I disagree with most:

Quote
Just The Thing people are talking about right now, and largely much ado about nothing.

I completely disagree. Think how easily this issue could have been solved if, back in 2009, Satoshi had implemented a rule such as the one Jeff suggests here:

Quote
My off-the-cuff guess (may be wrong) for a solution was:  if (todays_date > SOME_FUTURE_DATE) { MAX_BLOCK_SIZE *= 2, every 1 years }  [Other devs comment: too fast!]  That might be too fast, but the point is, not feedback based nor directly miner controlled.
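
For illustration only, here is how such a calendar-driven schedule might be computed. This is my own sketch of Jeff's off-the-cuff idea, not his code, and the trigger date and starting size are assumptions:

Code:
// Sketch only: after SOME_FUTURE_DATE the limit doubles every year,
// with no feedback loop and no miner vote. All constants are hypothetical.
#include <cstdint>

static const uint64_t BASE_MAX_BLOCK_SIZE = 1000000;     // 1MB today
static const int64_t  SOME_FUTURE_DATE    = 1420070400;  // e.g. 1-Jan-2015 (unix time)
static const int64_t  SECONDS_PER_YEAR    = 31536000;    // 365 days

uint64_t MaxBlockSizeAt(int64_t now)
{
    if (now <= SOME_FUTURE_DATE)
        return BASE_MAX_BLOCK_SIZE;
    // Whole years elapsed since the trigger date.
    int64_t years = (now - SOME_FUTURE_DATE) / SECONDS_PER_YEAR;
    if (years > 40)            // cap the illustration to avoid overflow
        years = 40;
    uint64_t limit = BASE_MAX_BLOCK_SIZE;
    for (int64_t i = 0; i < years; ++i)
        limit *= 2;            // doubles every year
    return limit;
}

The point of a rule like this is that every node can compute the limit from the clock alone, so it is neither feedback-based nor directly miner-controlled, exactly as Jeff says.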

I think a rule like Jeff's could be a great solution (though I tend to agree it might be too fast). However, implementing it now will meet resistance from anyone who feels it overlooks their views. If Satoshi had implemented it back then, it wouldn't be an issue now; we would simply be dealing with it, with the market working around it. Now, however, there is a lot of money tied up in protocol changes and many more views about what should or shouldn't be done. That will only increase, meaning the potential economic/financial damage from ungraceful changes increases as well.

I also note that early in his post Jeff says he reversed his earlier stance; my point being that people are not infallible. I actually agree with his updated views, but what if they too are wrong? Who is to say? The same could apply to Gavin. That's why I think it's wise that he appears to include a response from the market in any change, with no change as the default.