Board: Development & Technical Discussion
Re: Entropy, how to calculate it from series of outcome
by theymos_away on 24/10/2018, 20:36:06 UTC
https://en.bitcoin.it/wiki/Passphrase_generation#Generating_keys.2C_seeds.2C_and_random_numbers_.28Advanced.29

On Linux, that's basically how /dev/{,u}random works anyway. It does something like sha1(past_randomness + new entropy from keyboard etc.) repeatedly in order to produce endless random data. (This is a slight simplification, but it's more-or-less like this.)
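As a rough illustration of that hash-based pool idea (a minimal sketch, not the actual kernel code), here's what the "sha1(past_randomness + new entropy)" loop might look like in C, assuming OpenSSL's SHA1() is available (link with -lcrypto):

Code:
/* Minimal sketch of a hash-based entropy pool: mix fresh entropy into a
 * small pool, then hash the pool repeatedly to produce output.  Purely
 * illustrative; do not use as an actual RNG. */
#include <stdio.h>
#include <string.h>
#include <openssl/sha.h>

#define POOL_SIZE SHA_DIGEST_LENGTH   /* 20 bytes */

static unsigned char pool[POOL_SIZE]; /* "past randomness" */

/* Mix new entropy (e.g. keyboard timings) into the pool. */
static void mix_in(const unsigned char *entropy, size_t len) {
    unsigned char buf[POOL_SIZE + 256];
    size_t n = len < 256 ? len : 256;
    memcpy(buf, pool, POOL_SIZE);
    memcpy(buf + POOL_SIZE, entropy, n);
    SHA1(buf, POOL_SIZE + n, pool);
}

/* Produce the next output block and ratchet the pool state forward. */
static void next_output(unsigned char out[POOL_SIZE]) {
    SHA1(pool, POOL_SIZE, out);   /* output block      */
    SHA1(out, POOL_SIZE, pool);   /* advance the state */
}

int main(void) {
    const char *sample = "keyboard timing: 1729384"; /* made-up entropy */
    mix_in((const unsigned char *)sample, strlen(sample));

    unsigned char out[POOL_SIZE];
    for (int i = 0; i < 2; i++) {
        next_output(out);
        for (int j = 0; j < POOL_SIZE; j++) printf("%02x", out[j]);
        printf("\n");
    }
    return 0;
}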

Modern CPUs also offer a hardware randomness instruction (RDRAND on x86), but Linux doesn't rely on it alone because people don't trust it. The kernel treats the CPU itself as deterministic and gathers entropy from elsewhere.
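For what it's worth, you can call that instruction directly through a compiler intrinsic. A small demo, assuming a CPU and compiler that support it (build with something like gcc -mrdrnd); this only shows the instruction itself, it says nothing about whether you should trust it:

Code:
/* Read one 64-bit value from the x86 RDRAND instruction. */
#include <stdio.h>
#include <immintrin.h>

int main(void) {
    unsigned long long value;
    /* _rdrand64_step returns 1 on success, 0 if no value was ready. */
    if (_rdrand64_step(&value)) {
        printf("RDRAND output: %016llx\n", value);
    } else {
        printf("RDRAND did not return a value (retry or fall back)\n");
    }
    return 0;
}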

You can analyze the quality of random data to *some* extent using e.g. ent (http://www.fourmilab.ch/random/), but it is logically impossible to know whether some data is truly random. For example, the output of a secure hash function will test as essentially perfectly random, indistinguishable from perfect quantum randomness, even if the input was something trivial like "1234". OTOH, highly ordered-looking data can sometimes come out of a true random source.
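If you want to see what a tool like ent is measuring, here's a rough C sketch of the simplest of its statistics, the Shannon entropy per byte of its input (link with -lm); the details are mine, not ent's:

Code:
/* Read bytes from stdin, count frequencies, and report the Shannon
 * entropy H = -sum p_i * log2(p_i) in bits per byte.  A high score only
 * means the data *looks* random; it cannot prove the data is random. */
#include <stdio.h>
#include <math.h>

int main(void) {
    unsigned long long counts[256] = {0};
    unsigned long long total = 0;
    int c;

    while ((c = getchar()) != EOF) {
        counts[c]++;
        total++;
    }
    if (total == 0) {
        fprintf(stderr, "no input\n");
        return 1;
    }

    double entropy = 0.0;
    for (int i = 0; i < 256; i++) {
        if (counts[i] == 0) continue;
        double p = (double)counts[i] / (double)total;
        entropy -= p * log2(p);
    }
    /* ~8.0 for data that looks uniformly random, much lower for ordered data. */
    printf("%.4f bits of entropy per byte over %llu bytes\n", entropy, total);
    return 0;
}

Piping a megabyte of /dev/urandom through it should report close to 8.0 bits per byte, while something highly repetitive (say, "yes | head -c 1000000") scores far lower. But as said above, a good score only means the data passes that particular test, not that it's truly random.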