The estimated next difficulty in my app projects what the next change will be IF the blocks/time solved since the last diff change remains constant.
The problem is that in a time of hashrate being added to the network (which has been happening nearly constantly over the last several years), the average solve time since the last diff change will decidedly NOT remain constant, but will keep dropping. Perhaps Bitcoinwisdom has over-built this idea into their algorithm, so in times like the present, where hashrate seems to be leveling off, it is less accurate than during times of rising hashrate. As praeluceo said, any predictor is inherently wrong because the future is unknowable, but I have found Bitcoinwisdom to be a better predictor than any other I've seen. It certainly converges to the true value as the blocks lead up to the difficulty change.
I'm not talking about using the overall blocks/time to project the next change; I'm talking about only the blocks/time since the last change.
In other words, we know the target is 6 blocks an hour, or one every 10 minutes. We know how many blocks have been solved since the last change, and we know when the last change was. With those values we can surmise that we're currently solving one every X minutes ... and that value compared to 10 will tell us what the next change will be, assuming we stay at that solve rate.
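For concreteness, here's a minimal sketch of that projection in Python. The 600-second target and the proportional retarget ratio are from the protocol; the function and variable names are just mine:

```python
import time

TARGET_SECONDS_PER_BLOCK = 600  # protocol target: one block every 10 minutes

def project_next_difficulty(current_difficulty, blocks_since_change,
                            last_change_timestamp, now=None):
    """Project the next difficulty, assuming the solve rate observed since
    the last retarget holds for the rest of the 2016-block period."""
    now = now if now is not None else time.time()
    elapsed = now - last_change_timestamp
    seconds_per_block = elapsed / blocks_since_change  # solving one every X
    # Faster than the 600-second target -> difficulty rises by that ratio;
    # slower -> it falls.
    return current_difficulty * TARGET_SECONDS_PER_BLOCK / seconds_per_block
```

For example, 250 blocks in the 40 hours since the last change works out to one every 9.6 minutes, which projects roughly a 4% increase.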
The downside to this is that immediately after the difficulty change the number of blocks solved is so small that the estimate varies quite a bit. On the flip side, of course, the closer we get to the next change, the more accurate it becomes.
That's what I do, and it must be what Bitcoinwisdom is now doing. (I think we're saying the same thing...)
M
(sorry for going so far off-topic in the Eligius thread)
1) Average over how many blocks? The graphs at Bitcoinwisdom show averages over the last 2016, 1008, and 504 blocks, as well as the last 16 blocks on a different graph. Of course the average over 2016 blocks varies less (and is ultimately what determines the difficulty change), but the averages over 1008 or 504 blocks might be more reflective of trends in hashrate. If you are averaging ONLY over the blocks since the last difficulty change, you're throwing out useful information, because...
2) Taking the average time to solve (however you determine that average) and doing some math with the current difficulty level lets you back into an estimate of network hashrate; roughly, hashrate = difficulty × 2^32 / average block time in seconds, as sketched below. This is the more stable number, since of course the time to solve changes at every difficulty reset. (In other words, the hashpower is more stable than the block time. In the real world the block time is determined by the network hashpower, but since we have no way of knowing the true hashpower of the network due to its decentralization, we have to work backwards mathematically and derive the hashpower from the block time.) The Bitcoinwisdom graphs show that the network has reached roughly 300,000 TH/s several times on both the 504-block and 1008-block lines, and has come close on the 2016-block line.
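Here's a minimal sketch of that back-calculation. The 2^32 factor comes from the difficulty-1 target (a difficulty-1 block takes about 2^32 hashes on average to find); the function name and the example numbers are mine:

```python
def estimate_hashrate(difficulty, avg_seconds_per_block):
    """Estimated network hashrate in hashes per second: a block at
    difficulty D takes about D * 2**32 hashes to find on average."""
    return difficulty * 2**32 / avg_seconds_per_block

# Example: at a difficulty around 40 billion with 10-minute blocks, this
# gives ~2.9e17 H/s, i.e. about 290,000 TH/s -- the same ballpark as the
# ~300,000 TH/s lines on the Bitcoinwisdom graphs.
print(estimate_hashrate(40e9, 600))
```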
So the upshot is that you can and should use the block times prior to the latest difficulty reset. By doing the calculation with the difficulty that was in effect at the earlier time, you can estimate the network hashrate, which has a tendency to increase monotonically (i.e., it may go up or stay flat, but it will rarely go down for any sustained period).
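Putting the two ideas together, here's a sketch of a predictor that averages hashrate across difficulty eras, so a 504- or 1008-block window that straddles a reset still yields one consistent number. The era tuples and the block-weighted averaging are my own assumptions, not Bitcoinwisdom's actual algorithm:

```python
TARGET_SECONDS_PER_BLOCK = 600

def estimate_hashrate(difficulty, avg_seconds_per_block):
    return difficulty * 2**32 / avg_seconds_per_block

def project_from_eras(eras):
    """eras: list of (difficulty, blocks_solved, elapsed_seconds) tuples,
    possibly spanning one or more resets. Each era's hashrate estimate is
    weighted by the number of blocks it contributes, since every block is
    one sample of the solve process."""
    total_blocks = sum(blocks for _, blocks, _ in eras)
    hashrate = sum(
        estimate_hashrate(diff, elapsed / blocks) * blocks
        for diff, blocks, elapsed in eras
    ) / total_blocks
    # The difficulty at which this hashrate would produce 10-minute blocks:
    return hashrate * TARGET_SECONDS_PER_BLOCK / 2**32
```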