Talk:Entropy Sourced Random Number Generation


Password Entropy

Entropy is a measure of what the password could have been, so it does not really relate to the password itself but to the selection process.

We define the entropy as the value $S$ such that the best guessing attack will require, on average, $S/2$ guesses. "Average" here is an important word. We assume that the "best attacker" knows all about which passwords are more likely to be chosen than others, and will run his guessing attack by beginning with the most probable passwords. The model is the following: we suppose that the password is generated with a program on a computer; the program is purely deterministic and uses a cryptographically strong PRNG as its source of randomness (e.g. /dev/urandom on a Linux system, or CryptGenRandom() on Windows). The attacker has a copy of the source code of the program; what the attacker does not have is a copy of the random bits that the PRNG actually produced.
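As an illustration of this model (not part of the original answer), here is a minimal Python sketch of such a generator: the program is deterministic and its source is public, and its only secret input is the randomness drawn from the operating system's CSPRNG through the secrets module. The short word list is a hypothetical placeholder standing in for a real 2000-word list.

 # Sketch of the model: deterministic program, public source code,
 # with the OS CSPRNG (/dev/urandom, CryptGenRandom) as the only unknown.
 import secrets
 
 # Hypothetical placeholder; a real list would hold e.g. 2000 words.
 WORDLIST = ["correct", "horse", "battery", "staple"]
 
 def generate_password(num_words=2):
     # secrets.choice draws uniformly using the OS CSPRNG
     return "-".join(secrets.choice(WORDLIST) for _ in range(num_words))
 
 print(generate_password())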

Entropy is easy to compute if the random parts of the selection process are uniform (e.g. with dice or a computer with a good PRNG -- as opposed to a human being making a "random" choice in his head). For instance, if you have a list of 2000 words and choose one among them (uniformly), then the entropy is $S = 2000$. Entropy is often expressed in bits: an entropy of $n$ bits is what you get out of a sequence of $n$ bits which have been selected uniformly and independently of each other (e.g. by flipping a coin for each bit); it is a simple logarithmic scale: "$n$ bits of entropy" means "entropy is $S = 2^n$" (and the attack cost is then $2^{n-1}$ on average).
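A quick way to check this logarithmic scale is to compute it directly; the sketch below simply restates the 2000-word example in Python:

 # Entropy of one uniform choice among N possibilities, in bits,
 # and the average cost of the best guessing attack (S/2 guesses).
 import math
 
 N = 2000
 bits = math.log2(N)      # about 10.97 bits of entropy
 average_guesses = N / 2  # S/2 = 1000 guesses on average
 print(f"{bits:.2f} bits, {average_guesses:.0f} guesses on average")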

If you think of a password as two halves chosen independently of each other, then the total entropy is the product of the entropies of each half; when expressed with bits, this becomes a sum, because that's what logarithms do: they transform multiplications into sums. So if you take two words, randomly and independently (i.e. never ruling out any combination, even if the two words turn out to be the same), out of a list of 2000, then the total entropy is $2000\cdot2000 = 4000000$. Expressed in bits, each word implies an entropy of about 11 bits (because $2^{11}$ is close to $2000$), and the total entropy is close to 22 bits (and, indeed, $2^{22}$ is close to $4000000$).
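Continuing the same toy calculation, the multiplication-versus-addition point can be checked numerically (nothing here beyond the figures already given above):

 # Two independent uniform choices from a 2000-word list:
 # the counts multiply, the bits add.
 import math
 
 N = 2000
 total = N * N                     # 4,000,000 possible two-word passwords
 print(f"{math.log2(total):.2f}")  # about 21.93, i.e. close to 22 bits
 print(f"{2 * math.log2(N):.2f}")  # same value, computed as a sum of ~11-bit halves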

This answers your question about digits: a decimal digit has entropy 10, as long as it is chosen randomly, uniformly, and independently from all other random parts of the password. Since $10 = 2^{3.321928...}$, each digit adds about 3.32 extra bits to the entropy.
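The same arithmetic for digits, with a four-digit PIN used purely as an illustrative assumption:

 # Each uniformly random decimal digit multiplies S by 10,
 # i.e. adds log2(10) bits of entropy.
 import math
 
 per_digit = math.log2(10)      # about 3.32 bits per digit
 print(f"{per_digit:.4f}")
 print(f"{4 * per_digit:.2f}")  # a hypothetical 4-digit random PIN: about 13.3 bits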

If a human being is involved in the selection process then calculating the entropy becomes much more difficult. For instance, if a human chooses two digits and the first digit is '4', then the probability that the second digit is '2' is considerably higher than $\frac1{10}$. It could be argued that this also makes things harder for the attacker: he will have more work to do to sort the potential passwords so that he begins with the most probable. But this becomes a psychological problem, where the attacker tries to model the thinking process of the user, and we try to model the thinking process of the attacker: it will be hard to quantify things with any decent precision.