Board: Bitcoin Discussion
Re: Bitcoin XT - Officially #REKT (also goes for BIP101 fraud)
by poeEDgar on 04/09/2015, 18:17:26 UTC
You cannot simulate or test the social and economic impact of 8 MB blocks on testnet; that's the key difficulty.

No one can see the future of a complex, decentralized social and economic system. The safest way is to make each change as small as possible, one small step at a time.

Exactly. My thinking is that allowing exponential scaling in this fashion basically turns the blockchain into a testnet for the next two decades. There is too much money at stake to naively assume that nothing will ever go [very, very] wrong on the path to an 8000x increase in the block size limit when we have never scaled past 1x.
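The 8000x figure is simple arithmetic to check. A minimal sketch, assuming BIP 101's published parameters (an initial 8 MB cap, doubling every two years for ten doublings, ending around 2036):

```python
def bip101_caps(initial_mb=8, doublings=10):
    """Return the block size cap (in MB) at each two-year doubling step,
    per BIP 101's proposed schedule (8 MB initial, doubling ten times)."""
    return [initial_mb * 2 ** n for n in range(doublings + 1)]

caps = bip101_caps()
print(caps[-1])                  # 8192 MB, i.e. 8 GB
print(caps[-1] // 1)             # roughly the "8000x" multiple of today's 1 MB limit
```

So the end state is 8192 MB (8 GB) per block, about 8000 times the current 1 MB limit, which is where the figure in the argument comes from.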

An incremental approach is the only responsible approach. As I said above, we can probably safely run a 2 MB block regime coming from a ~0.5 MB environment -- but an 8 GB block regime from a ~0.5 MB environment? That's simply irresponsible. Everything that can go wrong will go wrong.

I do like that idea, but it means we would have to hard fork on a regular basis. I do not think that is practical, or even possible, without splitting Bitcoin; as more people become involved, it will only get harder to reach consensus. It would be better if we did not have to debate this again a few years from now.

I keep hearing this. People need to accept that Bitcoin is, and has always been, a work in progress. This is absolutely not the last contentious debate the community will have (it will likely pale in comparison to future issues). And it is very, very silly to have the mindset that "this is the fork to end all forks." Irrational fear of controversy, and of hard forking in the future (i.e., of making decisions), is not an adequate reason to push a reckless regime of exponential scaling.

Have you considered that it's technically much better to have generous limits, which you can reduce via soft forks if needed, than to set the limit too low and then be forced to do another hard fork to raise it?

Also, we've already had a 32 MB limit in Bitcoin's history, and it created no problems. And Satoshi's stated idea was to remove the block size limit entirely once light clients became available.

The eventual solution will be to not care how big it gets.

So you'd rather create arbitrarily large limits now, so we can fork them back down after serious lapses in network security? (Can you guarantee network security in an 8 GB environment? How?) As one example, is waiting for mass orphaning of blocks to appear preferable to waiting for transaction volume to warrant raising the limit?

What did Nick Szabo call that -- the Mark Karpeles formula?

You're taking Satoshi out of context. Yes, that may be the eventual solution. That doesn't mean everything should be decided with one hard fork in 2015.

Applying this patch will make you incompatible with other Bitcoin clients.
+1 theymos.  Don't use this patch, it'll make you incompatible with the network, to your own detriment.

We can phase in a change later if we get closer to needing it.

Need alone doesn't justify an 8000x scaling regime when we have never scaled beyond 1x.