Board: Bitcoin Discussion
Re: New video: Why the blocksize limit keeps Bitcoin free and decentralized
by gmaxwell on 17/05/2013, 23:00:38 UTC
Quote
gmaxwell, it is always a relief to read your level-headed analysis of a problem after a serious amount of arm-waving and hyperbole.

Quote
The video is completely dishonest from the point where data centers are mentioned. It makes the false case that up to 1MB blocks allow for decentralization and anything larger needs PayPal-like server farms for each node. It may be that the network would hum along fine with 2MB or 5MB blocks right now. We just don't know.

At the same time, it's a video made in an environment where some people are saying that it would be totally fine to _completely_ uncap it or leave it up to some hash-power majority— some even going so far as to argue that people who urge caution have dishonest motivations.  It's outright toxic at times.
(I'd provide some citations— they're easy enough to find... but I think it might be a little unfair, because in places where the debate has become heated some people have made arguments that I don't think they would have made outside the heat of the argument, and I don't think they ought to be held personally accountable for them... the fact that the discussion turns acrimonious so easily is the problem, not the people)

I'm personally torn between the feeling that the "controversy" is creating needless drama and the fear that the video exaggerates some points, and, at the same time, thankfulness that someone has taken an extreme position that moves the middle and maybe makes thoughtful dialog _easier_, because careful discussion about the very real tradeoffs might suffer less from people blowing it off as a non-issue. If you want to argue it's a non-issue, first you must duke it out with the one-meggers. In the meantime, people who want to think instead of fight are free to find the middle path.  I was also happy to see that the website linked from the video seemed to have a more even-handed presentation.

I certainly think the nuance here is far more easily explained as a tension between two completely legitimate engineering objectives which we— as a community— have to carefully compromise over, rather than just some right vs wrong binary question of "is a bigger blocksize bad".

From one perspective— bitcoin as an unstoppable, unregulatable, absolutely trustworthy e-gold— practically any block size is bad... smaller is pretty much always better. People are already using SPV wallets in large numbers because of the current cost of validating the chain, and people are choosing centralized pools over P2Pool because of the cost of dealing with the chain. There is no doubt in my mind that this is already hurting the decentralization of the system to an unacceptable degree: hacking or kidnapping just _two_ people (or hacking one and DoS-attacking one or two others) is enough to do enormous chain rewrites right now. For untrusted high-value transactions people should be waiting 10-20 confirmations now— or more: ASICMINER claims it plans to have a significant multiple of the whole network's hash rate in a few months. What if they already have it now?   Centralization creeps in easily, and it really undermines our security model... but at least being fully decentralized is still _possible_; the centralization we see now is an artifact of the path of least resistance rather than a requirement at our scaling level.
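
To put rough numbers on why confirmation depth matters against a well-resourced miner, here is a small sketch of the catch-up probability from section 11 of the Bitcoin whitepaper. This is my own illustration; the attacker hash-rate shares below are just assumed values, not estimates of anyone's actual share.

Code:
from math import exp, factorial

def attacker_success(q, z):
    """Probability that an attacker with hash-rate share q eventually
    overtakes a chain that is z confirmations ahead (honest share p = 1 - q),
    per the Bitcoin whitepaper, section 11."""
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker always catches up eventually
    lam = z * (q / p)  # expected attacker progress while z honest blocks are found
    prob = 1.0
    for k in range(z + 1):
        poisson = exp(-lam) * lam ** k / factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

for q in (0.10, 0.30, 0.45):
    row = ", ".join("z=%d: %.4f" % (z, attacker_success(q, z)) for z in (1, 6, 10, 20))
    print("q=%.2f -> %s" % (q, row))

With a 10% attacker, six confirmations already push the success probability well below 0.1%; with a 45% attacker, even twenty confirmations still leave it uncomfortably large, which is the point about waiting longer when the hash-power distribution is in doubt.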

And at the same time— from another perspective— Bitcoin as a practical unit for common everyday payments, cheaply available to as many people as possible— practically any block size restriction is bad. It's also completely clear to me that transaction costs— even insubstantial ones— have already turned some people off from using Bitcoin. A tiny sub-bitcent fee is _infinitely_ worse than zero by some metrics. If we can first just accept that each of these views is a valid conclusion from different objectives for the system... then after that we can have a polite discussion about where on the compromise spectrum the system delivers the best value to the most people.
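
For a sense of scale on that tradeoff, here is a back-of-envelope sketch. The numbers are entirely my own arithmetic: the ~500-byte average transaction size and the ten-minute block interval are assumptions for illustration, not measurements.

Code:
BLOCKS_PER_DAY = 24 * 6   # roughly one block every ten minutes (assumption)
AVG_TX_BYTES = 500        # assumed average transaction size

def capacity(block_mb):
    """Rough throughput and worst-case chain growth for a given block-size cap."""
    block_bytes = block_mb * 1000000
    tx_per_sec = block_bytes / AVG_TX_BYTES / 600.0
    gb_per_year = block_bytes * BLOCKS_PER_DAY * 365 / 1e9
    return tx_per_sec, gb_per_year

for mb in (1, 2, 5, 20):
    tps, growth = capacity(mb)
    print("%2d MB blocks: ~%.1f tx/s, ~%.0f GB/year of chain growth (worst case)" % (mb, tps, growth))

At 1MB the cap works out to only a few transactions per second, which is why even small fees start to matter; at 20MB the worst-case chain growth alone is on the order of a terabyte per year, which is what feeds the node-cost worry above.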

An interesting question this raises is how we measure the decentralization impact. Right now the best I have is basically a "gmaxwell-test"— what is my personal willingness to run a node given a certain set of requirements? Given that I'm an exceptional sample in many regards— including having hundreds of cores and tens of terabytes of storage at home— if some scale level would dissuade me, it's probably a problem.  Whatever qualities or flaws that criterion might have as an engineering objective, though, one way it completely fails is that it's not persuasive to other people. It doesn't form arguments that can act as a consensus mechanism, except insofar as other people might apply their personal version of it and get the same results that I do.
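
If someone wanted to make that test even slightly more mechanical, it might look like the toy sketch below. To be clear, the resource model and the thresholds are made-up illustrations, not a proposal; the punchline is that every person plugs in their own limits, which is exactly why it doesn't work as a consensus mechanism.

Code:
from dataclasses import dataclass

@dataclass
class NodeBudget:
    """The resources one particular person is willing to spend on a full node."""
    max_storage_gb_per_year: float
    max_upload_gb_per_month: float

def willing_to_run(block_mb, budget):
    """Toy version of the test: would this budget still cover a node if every block were full?"""
    blocks_per_year = 144 * 365
    storage_gb = block_mb / 1000.0 * blocks_per_year   # chain growth per year
    upload_gb = block_mb / 1000.0 * 144 * 30 * 8       # assume each block relayed to ~8 peers
    return (storage_gb <= budget.max_storage_gb_per_year and
            upload_gb <= budget.max_upload_gb_per_month)

hobbyist = NodeBudget(max_storage_gb_per_year=100, max_upload_gb_per_month=250)
for mb in (1, 5, 20):
    print("%d MB blocks: hobbyist node still viable? %s" % (mb, willing_to_run(mb, hobbyist)))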