If there is a 50.8% chance of heads and a 49.2% chance of tails, then you can calculate the entropy like this:
H(X) = - P(heads) * log2(P(heads)) - P(tails) * log2(P(tails)) = - 0.508 * log2(0.508) - 0.492 * log2(0.492) = 0.999815327
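If you want to check it numerically, here is a minimal Python sketch of that formula (shannon_entropy is just a name for this post, not a function from any particular library):

[code]
import math

def shannon_entropy(probs):
    # H(X) = -sum(p * log2(p)), in bits; zero-probability outcomes are skipped
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.508, 0.492]))  # ~0.999815 bits per flip
print(shannon_entropy([0.5, 0.5]))      # exactly 1.0 bit for a fair coin
[/code]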
I read it, but I can't believe it. That would mean a coin that lands heads 3.25% more often than tails would only lose about 0.02% of its entropy over 256 flips.
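(If I follow the arithmetic, that figure would come from 256 * 0.999815 ≈ 255.953 bits, i.e. only about 0.047 bits, or roughly 0.018%, short of the 256 bits a perfectly fair coin would give.)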
I know it's been a while since I studied statistics, so bear with me. Let's say we have a 3-sided coin, with 2 sides heads and 1 side tails.
Using your formula, that gives:
- 0.6667 * log2(0.6667) - 0.3333 * log2(0.3333) = 0.9183
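To double-check my arithmetic, a quick Python sketch (nothing authoritative, just the same formula spelled out):

[code]
import math

p_heads, p_tails = 2/3, 1/3
h = -(p_heads * math.log2(p_heads) + p_tails * math.log2(p_tails))
print(h)  # ~0.9183 bits per flip, about 8% below a fair coin's 1 bit
[/code]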
I would expect this 2-to-1 coin to produce much less entropy. Please [i]convince[/i] me you used the correct formula.