Board Bitcoin Discussion
Re: Bitcoin 20MB Fork
by
NxtChg
on 19/02/2015, 09:37:07 UTC
...since it is not just about going to 20 MB blocks, but about code that ramps the block size up bigger and bigger over time; by 2020 we are looking at 100 MB, and it keeps going up from there. Actually, looking at the spreadsheet linked here https://docs.google.com/spreadsheets/d/1CuOEM9uwO5w-RwWGCCZpVGVFwhHHHegxJZqTP5KyapI/edit?pli=1#gid=0 from Gavin's blog, eventually a couple dozen blocks will be the size of what the entire blockchain was two years ago.

The algorithm proposed by Gavin is very simple.

"Let's kick the can far enough each time so it's always far ahead of us" Smiley

There are probably reasons for that. Maybe adding the block size into the header is too difficult, or there are other concerns. I haven't yet found a detailed explanation of why this particular algorithm should be selected, but that's probably an argument from ignorance :)

Granted, anything even remotely more complex has a very slim chance of being implemented and agreed upon. So we probably won't be able to put any additional pressure on miners and will have to hope that everything eventually works out somehow.

But are there really any good reasons to prefer the exponential curve to the simple sliding window algorithm, which stays reasonably ahead of the current block size?
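A minimal sketch of what such a sliding-window rule could look like, assuming hypothetical parameters (a window of the last 2016 blocks, a cap of 2x the median observed size, and a 1 MB floor); none of these numbers come from an actual proposal:

```python
# Hypothetical sliding-window cap: the limit tracks recent usage
# instead of following a fixed exponential curve.
# Assumed parameters: cap = 2x the median of recent block sizes,
# with a 1 MB floor -- illustrative only.

from statistics import median

def sliding_window_cap(recent_sizes_mb, multiple=2.0, floor_mb=1.0):
    """Next block's size limit, derived from a window of recent sizes."""
    if not recent_sizes_mb:
        return floor_mb
    return max(floor_mb, multiple * median(recent_sizes_mb))

# If recent blocks average ~0.4 MB, the cap stays at the 1 MB floor;
# sustained 0.9 MB blocks would push it up toward 1.8 MB.
print(sliding_window_cap([0.4] * 2016))
print(sliding_window_cap([0.9] * 2016))
```

The point is that a rule like this stays only "reasonably ahead" of demand: miners who want more room must actually fill blocks first, whereas the exponential curve grants the room unconditionally.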