Here's a fun thing:
Pick a random point in time, then:
A) the average amount of time from that point to the next CLAM block is 1 minute
B) the average amount of time from that point to the previous CLAM block is also 1 minute
C) the average time between CLAM blocks is also 1 minute
Wouldn't you expect A + B = C? Yet A, B, and C are all 1 minute.
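One way to sanity-check the numbers is a small simulation. This is a minimal sketch (not the actual script), and it assumes block gaps are exponentially distributed with a 1-minute mean, which is only an idealization of CLAM's real block timing; the constants and variable names are just for illustration:

```python
import bisect
import random
from statistics import fmean

random.seed(1)

MEAN_GAP = 60.0       # assumed average block interval, in seconds (1 minute)
N_BLOCKS = 500_000    # number of simulated block gaps
N_POINTS = 100_000    # number of random observation points

# Assumption: block gaps are exponentially distributed (Poisson arrivals)
# with a 60 s mean -- an idealization of CLAM's actual block timing.
gaps = [random.expovariate(1.0 / MEAN_GAP) for _ in range(N_BLOCKS)]

# Cumulative block timestamps.
times = []
t = 0.0
for g in gaps:
    t += g
    times.append(t)

to_next, since_prev = [], []
for _ in range(N_POINTS):
    p = random.uniform(times[1], times[-2])   # random point inside the span
    i = bisect.bisect_left(times, p)          # index of the next block at or after p
    to_next.append(times[i] - p)              # A: wait until the next block
    since_prev.append(p - times[i - 1])       # B: time since the previous block

print(f"A (to next block):          {fmean(to_next) / 60:.3f} min")
print(f"B (since previous block):   {fmean(since_prev) / 60:.3f} min")
print(f"C (average gap):            {fmean(gaps) / 60:.3f} min")
print(f"A+B (gap around the point): {(fmean(to_next) + fmean(since_prev)) / 60:.3f} min")
```

Under that exponential assumption, A, B, and C all come out close to 1 minute, while the gap that happens to contain the random point averages about 2 minutes, because a random point is more likely to land in a long gap than in a short one.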
Are you sure you used the same point in time to calculate the first two values? If C is correct, then on average a random point should always land inside a 1-minute window. Judging A and B from that same point, the averages would have to be fractions of a minute in each direction, because if C is correct the averages for A and B can't both be higher. I think that goes against probability.
Can you check that your script really has no errors?
Edit: I see, you already know the result? Then I guess I'll wait to see where you deceived us.
