Re: How a floating blocksize limit inevitably leads towards centralization
Board: Development & Technical Discussion
by misterbigg on 20/02/2013, 22:45:23 UTC
Quote
I don't think many here are rooting for a change where all scarcity is removed

I beg to differ. There are plenty who feel that there should be no limit, so that Bitcoin will "scale" to accommodate any amount of transaction volume. It should be clear that, storage and bandwidth issues aside, this will result in fees trending to zero.

Quote
There are however methods to retain some scarcity without having too much of it.

Agreed, and I posted a simple method of adjusting MAX_BLOCK_SIZE based on measured scarcity.
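
For concreteness, here is a minimal sketch of what such an adjustment could look like, assuming "scarcity" is measured as the fraction of recent blocks that were nearly full. The function name, thresholds, and 10% step are illustrative placeholders, not the exact method from my earlier post.

Code:
#include <cstdint>
#include <vector>

// Recompute the limit once per adjustment period from recent block sizes.
uint64_t NextMaxBlockSize(uint64_t currentMax,
                          const std::vector<uint64_t>& recentBlockSizes)
{
    // Treat a block at 90% or more of the limit as "full".
    size_t fullBlocks = 0;
    for (uint64_t size : recentBlockSizes)
        if (size * 10 >= currentMax * 9)
            ++fullBlocks;

    double fullFraction = recentBlockSizes.empty()
        ? 0.0
        : static_cast<double>(fullBlocks) / recentBlockSizes.size();

    // Mostly full blocks -> space is scarce, raise the limit slightly.
    // Mostly empty blocks -> plenty of slack, lower it slightly.
    if (fullFraction > 0.75)
        return currentMax + currentMax / 10;
    if (fullFraction < 0.25)
        return currentMax - currentMax / 10;
    return currentMax;
}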

Quote
My favorite so far is the solution where we give more freedom to the miners, they can basically decide the block size.

It seems obvious that all miners will converge on the same algorithm for producing blocks: include every transaction that pays a fee. This maximizes mining revenue, but only in the short term. Once ALL miners start doing this, we have the equivalent of no limit and my earlier comment applies: fees will trend towards zero. Giving "freedom" to miners isn't really a choice; we already know what strategy they will use, because it is the one that maximizes their fees, and anyone who doesn't follow it will go bankrupt.

Quote
At the same time we would put hard limits on block validation

At which point, miners will again converge on the same algorithm: attempt to fill each block up to the hard size limit, choosing the transactions that offer the highest fee per kilobyte.
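
Here is a rough sketch of that converged policy, using a simplified stand-in Transaction type (just a fee and a serialized size) rather than the real client's data structures:

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

struct Transaction {
    uint64_t fee;        // in satoshis
    uint64_t sizeBytes;  // serialized size
};

// Greedily fill the block with the highest fee-per-byte transactions
// until the hard size limit is reached.
std::vector<Transaction> SelectTransactions(std::vector<Transaction> mempool,
                                            uint64_t maxBlockSize)
{
    // Sort by fee density: a.fee/a.size > b.fee/b.size, cross-multiplied
    // to avoid floating point.
    std::sort(mempool.begin(), mempool.end(),
              [](const Transaction& a, const Transaction& b) {
                  return a.fee * b.sizeBytes > b.fee * a.sizeBytes;
              });

    std::vector<Transaction> block;
    uint64_t used = 0;
    for (const Transaction& tx : mempool) {
        if (used + tx.sizeBytes > maxBlockSize)
            continue;  // doesn't fit, try the next (smaller or cheaper) one
        block.push_back(tx);
        used += tx.sizeBytes;
    }
    return block;
}

Every rational miner ends up running something equivalent to this, which is why the supposed "freedom" collapses into a single policy.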

Quote
I believe the hard limit for block size needs to be...lifted entirely

Which goes against what you agreed to earlier, that scarcity is a requirement for fees to reach a non-zero equilibrium. See, there ARE some people who are rooting for limits to be removed!  Grin

Quote
The soft limit is another issue

Keep in mind that the soft limit is just an artificial barrier. The instant that it becomes profitable for miners to comment out that piece of code and substitute it with the "winning" strategy (fill the block with transactions having the highest fees per kilobyte), they will do so.

Quote
if the average speed of a full node is the main factor we use to determine this...

Any scheme for dynamically adjusting the block size should be based ONLY on information contained in the block chain, and not on outside information such as the "speed" of nodes or, more importantly, attributes of the memory pool of pending transactions.
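
As one hypothetical example of a rule that satisfies this constraint, the next limit could be derived from nothing but the sizes of recent blocks; the multiplier and floor below are arbitrary placeholders, and the point is only that every node can recompute the result from the chain alone:

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

// Derive the next limit purely from on-chain data: the sizes of the
// last N blocks. No mempool or node-speed measurements are involved.
uint64_t LimitFromChainOnly(std::vector<uint64_t> lastNBlockSizes,
                            uint64_t floorLimit)
{
    if (lastNBlockSizes.empty())
        return floorLimit;

    // Median size of the recent blocks.
    std::sort(lastNBlockSizes.begin(), lastNBlockSizes.end());
    uint64_t median = lastNBlockSizes[lastNBlockSizes.size() / 2];

    // Allow twice the median, but never drop below a fixed floor.
    return std::max(floorLimit, median * 2);
}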