Entropy of a draw from a normal or binomial distribution



Let me give a specific example of my question first.

Suppose Alice flips a coin 100 times but doesn't record the sequence.
All she has is the number of heads and tails. How many bits of entropy
can we assign to the number of heads that show up?
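
To make my question concrete, here is the sort of calculation I have in
mind, assuming that the right quantity is simply the Shannon entropy of
the distribution of the head count (a plain-Python sketch of my own, not
anything authoritative):

    from math import comb, log2

    def binomial_entropy(n, p=0.5):
        # Shannon entropy, in bits, of the number of successes in n trials:
        # -sum over k of P(k) * log2(P(k)) for the binomial pmf
        h = 0.0
        for k in range(n + 1):
            pk = comb(n, k) * p**k * (1 - p)**(n - k)
            if pk > 0:
                h -= pk * log2(pk)
        return h

    print(binomial_entropy(100))   # I get roughly 4.37 bits for 100 fair flips

If that is the right way to think about it, Alice's head count is worth
far fewer bits than the 100 bits the full sequence would carry.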

More generally, how do we calculate the bits of entropy that we can get
from a draw from a binomial distribution (as in the example above) or
from a normal distribution?

Note that the distributions need to be discretized so that there are
only finitely many possible results from a draw. (Otherwise the entropy
would be infinite, unless I'm mistaken.)
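
To be explicit about what I would compute: for any distribution with
only finitely many possible outcomes, I take the answer to be the
Shannon entropy of its probability mass function. A minimal sketch (the
six-sided die is just my own illustration):

    from math import log2

    def entropy_bits(pmf):
        # pmf: dict mapping each possible outcome to its probability
        # returns -sum of p * log2(p) over outcomes with nonzero probability
        return -sum(p * log2(p) for p in pmf.values() if p > 0)

    # a fair six-sided die should give log2(6), about 2.585 bits per draw
    print(entropy_bits({face: 1/6 for face in range(1, 7)}))

Please correct me if that is not the quantity I should be computing.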

So suppose we draw integers distributed normally with, say, mean 0 and
standard deviation 100. How many bits of entropy do we get out of a
single draw? (I assume that only the standard deviation matters; I can't
see why the mean would play a role in the answer.)
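
Here is a sketch of what I mean, assuming that "integers distributed
normally" means a normal draw rounded to the nearest integer (my own
reading of it):

    from math import erf, sqrt, log2, pi, e

    def discretized_normal_entropy(sigma, mean=0.0):
        # entropy in bits of a normal(mean, sigma) draw rounded to the
        # nearest integer
        def cdf(x):
            return 0.5 * (1.0 + erf((x - mean) / (sigma * sqrt(2.0))))
        h = 0.0
        # sum over integers within 10 standard deviations of the mean;
        # the probability mass outside that range is negligible
        lo, hi = int(mean - 10 * sigma), int(mean + 10 * sigma)
        for k in range(lo, hi + 1):
            pk = cdf(k + 0.5) - cdf(k - 0.5)
            if pk > 0:
                h -= pk * log2(pk)
        return h

    print(discretized_normal_entropy(100))    # roughly 8.69 bits
    print(0.5 * log2(2 * pi * e * 100**2))    # differential-entropy
                                              # approximation, about the same

Running it with a different mean gives the same number, which is
consistent with my guess that only the standard deviation matters.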

I'm happy with answers of the form RTF[MAPB] (Manual, Article, Paper,
Book), as long as you point me toward those resources.

Cheers,

-j


--
Jeffrey Goldberg http://goldmark.org/jeff/
I rarely read HTML or poorly quoting posts
Reply-To address is valid