Board: Bitcoin Discussion
Topic: Re: Bitcoin 20MB Fork
Post by NewLiberty on 17/02/2015, 20:58:43 UTC
Quote from: NewLiberty
This is where my conversation with Gavin fell apart.  He was not able to acknowledge the concept of a too-high limit.  His reasoning was that, since the limit is only one-sided (blocks above the size limit are rejected), it couldn't be too high.

Quote from: Gavin
Huh what?

I am not proposing infinitely sized blocks, so I obviously acknowledge the concept of a too-high limit as being plausible.

If you want to continue the conversation, please be very explicit about what problem you think needs solving, and how whatever solution you're proposing solves that problem.

We might agree or disagree on both of those points, but we won't have a productive conversation if you can't say what problem you are trying to solve.

To summarize my position: I see one big problem that needs solving:

Supporting lots (millions, eventually billions) of people transacting in Bitcoin.
  Ideally at as low a cost as possible, as secure as possible, and in the most decentralized and censorship-resistant way possible.

It is hard to get consensus on HOW to solve that problem, because no solution is obviously lowest cost, most secure, and most decentralized all at the same time, and different people assign different weights to the importance of those three things.

My bias is to "get big fast" -- I think the only way Bitcoin thrives is for lots of people to use it and to be happy using it. If it is a tiny little niche thing then it is much easier for politicians or banks to smother it, paint it as "criminal money", etc. They probably can't kill it, but they sure could make life miserable enough to slow down adoption by a decade or three.

"Get big fast" has been the strategy for a few years now, ever since the project became too famous to fly under the radar of regulators or the mainstream press.

The simplest path to "get big fast" is allowing the chain to grow. All the other solutions take longer or compromise decentralization (e.g. off-chain transactions require one or more semi-trusted entities to validate those off-chain transactions). I'm listening very carefully to anybody who argues that a bigger chain will compromise security, and those concerns are why I am NOT proposing an infinite maximum block size.

There is rough consensus that the max block size must increase. I don't think there is consensus yet on exactly HOW or WHEN.


Thank you for this additional insight into your thinking on the matter.
My impression was that you were not proposing "no block size limit" only because that would not be likely to attract the preponderance of consensus support, and that you were instead proposing a limit as high as you thought you could get agreement upon.  I like the goal, I hate the method.

We (you and I and whomever else agrees) share the bolded goal you articulated above.  Where we diverge may be on the value of the "get big fast" bias, and maybe on the ordering of the criteria within that goal.  Though I would be delighted to see "big fast" happen, if it comes at the cost of security, decentralization, or censorship resistance, or at higher transaction/network cost (which is also a type of censorship), then the cost of growth is too high.

My view is that mass adoption isn't a short-term goal but an eventual fait accompli.  Getting the max block size out of the way as an impediment would be greatly beneficial.

I like JustusRanvier's thought work on the matter, but as a proposal it is pretty far from where we are today and likely unreachable within Bitcoin itself.  If miners are induced to 'tip out' to nodes somehow, it may happen outside the protocol through groups like chain.com or apicoin.

Where the current proposals fail, in my humble opinion, is on the risk/reward tradeoff.

Going from a static block size limit to one over an order of magnitude larger, and adding a dynamic limit (with exponential growth) on top of that, carries node-centralization risks.  It invites spam transactions, which impose node-maintenance costs in perpetuity.
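
To see why the exponential part worries me, consider how such a schedule compounds.  The numbers below are only stand-ins (this thread doesn't pin down an exact starting size or growth rate), but the shape of the curve is the point:

Code:
# Purely illustrative: how an exponential max-block-size schedule compounds.
# The 20 MB starting point and 40% annual growth rate are assumed stand-ins,
# not figures taken from this thread.
start_mb = 20.0
annual_growth = 0.40
blocks_per_year = 52_560  # roughly one block every 10 minutes

for year in (0, 5, 10, 15, 20):
    limit_mb = start_mb * (1 + annual_growth) ** year
    # Worst case if every block were full: data a node must relay and store.
    worst_case_gb_per_year = limit_mb * blocks_per_year / 1024
    print(f"year {year:2d}: limit {limit_mb:9.1f} MB, "
          f"worst-case growth {worst_case_gb_per_year:9.0f} GB/yr")

Whatever the real schedule turns out to be, node costs grow along that curve in perpetuity, not just once.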

Bitcoin has fewer than 10K nodes operating.  If running a node were made 20x more expensive (in bandwidth and in verification/storage/search), there would be fewer nodes, not more.  That is a loss in security, in decentralization, and in cost (and where it loses decentralization and raises cost, it also creates censorship risk).  So the cost-of-risk for that growth is too high.  There is also the risk that the limit ends up too low.  Maybe some event in the world brings people to Bitcoin in droves.  Either too high or too low, and we get a crisis-induced change and are back to where we are today.

Other advancements may mitigate this somewhat.  Fractional/shard nodes may help in the long term by making the cost more granular (what have we decided to call these?), but in the short term the total node count may fall as some currently full nodes become shards.

Where does that leave me?
I'd support lower risk proposals:
1) A static but higher max block size.  This just kicks the can down the road until we can achieve...
2) A dynamic max block size that increases or decreases based on people's actual need to use Bitcoin, not on a current best guess of what the distant future may look like (as good as that guess may be).  And ultimately...
3) No block size limit, because the economics are in place to make it unnecessary

Lots of us have had ideas on how to get to (2), but the pieces are not all in place yet.  What we really need is a strategy to get from where we are now to (2).  This is the discussion I would like to see happening, and to assist with to the extent possible.  A proposal to get the Bitcoin protocol through the next handful of decades, rather than the Bitcoin software through the next handful of years.  (3) may be a problem for the next generation to handle; I'd like to give them the chance to get there.
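
Purely as a strawman to anchor that discussion, a rule in the spirit of (2) might look something like the sketch below.  Every constant in it (the thresholds, step size, and floor) is an arbitrary placeholder of mine, not a concrete proposal:

Code:
# Hypothetical sketch of option (2): a max block size that follows real usage.
def next_max_block_size(recent_block_sizes, current_limit,
                        floor=1_000_000, step=1.1):
    """Return the max block size (bytes) for the next adjustment period."""
    sizes = sorted(recent_block_sizes)
    median = sizes[len(sizes) // 2]

    if median > 0.75 * current_limit:
        # Blocks are mostly full: allow modest growth.
        return int(current_limit * step)
    if median < 0.25 * current_limit:
        # Blocks are mostly empty: shrink, but never below the floor.
        return max(floor, int(current_limit / step))
    return current_limit

# Example: a period of nearly-full 1 MB blocks nudges the limit up about 10%.
print(next_max_block_size([990_000] * 2016, 1_000_000))

The point is only that the limit reacts to what people are actually doing on the chain, rather than to anyone's forecast of the distant future.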