Solving for probability given entropy

If a coin comes up heads with probability p and tails with probability 1-p, the entropy in the coin flip is

S = –p log2 p – (1-p) log2 (1-p).

It’s common to start with p and compute entropy, but recently I had to go the other way around: given entropy, solve for p.

It’s easy to come up with an approximate solution.

Entropy in this case is approximately quadratic

S ≈ 4p(1-p)

and so

p ≈ (1 ± √(1-S))/2.

This is a good approximation if S is near 0 or 1 but mediocre in the middle.
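The approximation above can be sketched in a few lines of Python. The function name `p_approx` is mine, not from the original post; it returns the root at or below 1/2, the other root being its mirror image 1 - p.

```python
import math

def p_approx(S):
    """Approximate p from entropy S by solving S = 4p(1-p):
    p = (1 - sqrt(1 - S)) / 2, the root at or below 1/2."""
    return (1 - math.sqrt(1 - S)) / 2
```

For example, `p_approx(1.0)` returns 0.5, and `p_approx(0.0)` returns 0.0, matching the exact solution at both extremes.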

You could solve for p numerically, say with Newton’s method, to get more accuracy if needed.
