# Re: Entropy of draw from normal or binomial distribution

*From*: unruh <unruh@xxxxxxxxxxxxxxxxxxxxxxx>
*Date*: Tue, 30 Aug 2011 22:18:57 GMT

On 2011-08-30, Jeffrey Goldberg <nobody@xxxxxxxxxxxx> wrote:

> Let me give a specific example of my question first.
>
> Suppose Alice flips a coin 100 times but doesn't record the sequence.
> All she has is the number of heads and tails. How many bits of entropy
> can we assign to the number of heads that show up?
>
> More generally, how do we calculate the bits of entropy that we can get
> from a drawing from a binomial distribution (as in the example above) or
> a normal distribution?

-sum_i p_i ln(p_i)

is the usual expression for the entropy of a distribution (use log base 2
instead of ln if you want the answer in bits). So in your case,

p_r = (1/2)^100 * 100! / (r! (100-r)!)

will be the probability of getting r heads (assuming a fair coin) in 100
tosses.
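That sum is easy to evaluate numerically. The following is a quick sketch (my own, not from the post; all names are made up) that plugs the binomial p_r above into the entropy formula, using log base 2 to get bits:

```python
# Sketch: entropy in bits of the number of heads in 100 fair-coin tosses,
# i.e. H = -sum_r p_r * log2(p_r) with p_r = C(100, r) / 2^100.
import math

n = 100
p = [math.comb(n, r) / 2**n for r in range(n + 1)]  # binomial probabilities

H = -sum(q * math.log2(q) for q in p if q > 0)
print(f"entropy of #heads in {n} tosses: {H:.3f} bits")
```

For comparison, the Gaussian approximation (1/2) * log2(2*pi*e*n/4) gives roughly 4.37 bits for n = 100, so a draw from this distribution carries far fewer than the 100 bits of the full coin-flip sequence.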

> Note that the distributions need to be modified so that there is only a
> finite number of possible results from a draw. (Otherwise the entropy
> would be infinite, unless I'm mistaken.)

Nope. The entropy stays finite, because the probability drops off rapidly
for large values.

> So suppose we draw integers distributed normally with, say, mean 0 and
> standard deviation 100. How many bits of entropy do we get out of a
> single draw? (I assume that only the s.d. matters. I can't see why the
> mean would play a role in the answer.)
>
> I'm happy with answers of RTF[MAPB] (Manual, Article, Paper, Book) as
> long as you point me toward those resources.
>
> Cheers,
>
> -j
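For the normal-draw question above, a numerical sketch (my own construction, not from the post) makes both points concrete: a discretized normal has finite entropy despite its infinite support, because the tails decay so fast, and shifting the mean changes nothing:

```python
# Sketch: entropy in bits of an integer draw with probability proportional
# to exp(-(k - mu)^2 / (2*sigma^2)), a discretized normal distribution.
import math

def discrete_normal_entropy_bits(mu, sigma, span=10):
    # Truncate at mu +/- span*sigma; the tail mass beyond that is
    # negligible, which is why the entropy is finite in the first place.
    lo = math.floor(mu - span * sigma)
    hi = math.ceil(mu + span * sigma)
    w = [math.exp(-(k - mu) ** 2 / (2 * sigma ** 2)) for k in range(lo, hi + 1)]
    Z = sum(w)  # normalizing constant
    return -sum((x / Z) * math.log2(x / Z) for x in w if x > 0)

H0 = discrete_normal_entropy_bits(0, 100)
H1 = discrete_normal_entropy_bits(50, 100)  # shifted mean, same entropy
print(f"{H0:.3f} bits, {H1:.3f} bits")
```

For sigma much larger than 1, this matches the differential entropy of the normal expressed in bits, (1/2) * log2(2*pi*e*sigma^2), which is about 8.69 bits for sigma = 100; the mean drops out, confirming the poster's guess that only the s.d. matters.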

**Follow-Ups**:
- **Re: Entropy of draw from normal or binomial distribution**, *From:* Jeffrey Goldberg

**References**:
- **Entropy of draw from normal or binomial distribution**, *From:* Jeffrey Goldberg
