Re: using Shannon's information to measure proof-of-work
by grondilu on 10/07/2011, 07:25:21 UTC
Satoshi's algorithm is much better.  If you switch from weighting by the sum of difficulties to the sum of the logarithms of difficulties, you introduce a new vulnerability: the block chain becomes easy to take over.  Someone could make a very long chain of difficulty-1 blocks, and those could quickly be made to outweigh blocks created at the current difficulty n, because the latter are n times more difficult to create but weigh only log n, which becomes a vanishingly small fraction of n as n grows.  Meanwhile, difficulty-1 blocks weigh the most in proportion to their actual cost of creation.  So this idea is a bunch of nonsense; it merely adds unwarranted complication at the expense of security.
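
To make that attack concrete, here is a minimal sketch (mine, not from the thread) comparing the two weighting rules.  It assumes a difficulty-d block costs about d * 2^32 expected hashes, as in Bitcoin, so its Shannon information is roughly 32 + log2(d) bits; the values of n and honest_blocks are hypothetical.

```python
from math import log2

def info_bits(difficulty):
    # Assumption: a difficulty-d block is met by roughly 1 in d * 2**32
    # hashes, so finding one carries about 32 + log2(d) bits of information.
    return 32 + log2(difficulty)

n = 1_000_000           # hypothetical current network difficulty
honest_blocks = 100     # honest chain: 100 blocks at difficulty n

# With the same expected total hashing work, an attacker can instead
# mine honest_blocks * n blocks at difficulty 1.
attack_blocks = honest_blocks * n

# Satoshi's rule: chain weight ~ sum of difficulties (expected work).
honest_work = honest_blocks * n
attack_work = attack_blocks * 1

# Log rule: chain weight = sum of per-block information.
honest_info = honest_blocks * info_bits(n)   # ~ 100 * 51.9 bits
attack_info = attack_blocks * info_bits(1)   # ~ 1e8 * 32 bits

print(honest_work == attack_work)   # True: equal work, equal weight
print(attack_info / honest_info)    # ~ 6e5: log rule vastly favors the attacker
```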

Making a long chain of small-difficulty blocks requires time, since that information has to be computed sequentially.  That is the whole point, actually.  The other way of producing the information is to compute it in parallel, which would require much more power.  Shannon information is easier to accumulate by cutting it into small successive parts, i.e. into a long chain of successive events, which is what time is all about.
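
As a rough sketch of that sequential-versus-parallel trade-off (my own illustration, with hypothetical parameters K and b): accumulating K bits of information as a chain of small blocks is exponentially cheaper in total hashes than producing it as one event, but each block must reference the previous one, so the chained route costs real, unparallelizable time.

```python
# Hypothetical parameters, for illustration only.
K = 64   # total Shannon information wanted, in bits
b = 8    # bits of information per small block

blocks = K // b                      # 8 sequential blocks
hashes_sequential = blocks * 2**b    # 8 * 256 = 2048 expected hashes
hashes_one_shot   = 2**K             # ~1.8e19 expected hashes

print(blocks, hashes_sequential, hashes_one_shot)
# The one-shot route can be parallelized arbitrarily across machines;
# the chained route cannot, because block i commits to block i-1.
```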

I actually think there is no qualitative difference between my algorithm and Satoshi's, though I don't understand Satoshi's algorithm well enough to be sure.  Anyway, I think my version is just more "pure", and that it could, for instance, allow getting rid of the six-blocks-per-hour limit.