# Entropy of draw from normal or binomial distribution

Let me give a specific example of my question first.

Suppose Alice flips a coin 100 times but doesn't record the sequence.
All she has is the number of heads and tails. How many bits of entropy
can we assign to the number of heads that show up?
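(For the binomial case, the Shannon entropy can be computed directly from the probability mass function: H = -sum_k p_k log2 p_k, where p_k = C(100, k) / 2^100 is the chance of seeing exactly k heads. A minimal sketch, using only the Python standard library; the function name is mine:

```python
from math import comb, log2

def binomial_entropy_bits(n, p=0.5):
    """Shannon entropy (in bits) of a single Binomial(n, p) draw."""
    total = 0.0
    for k in range(n + 1):
        # probability of exactly k successes
        pk = comb(n, k) * p**k * (1 - p)**(n - k)
        if pk > 0.0:
            total -= pk * log2(pk)
    return total

# For n = 100 fair flips this comes out near the large-n
# approximation 0.5 * log2(2*pi*e*n*p*(1-p)), i.e. roughly 4.4 bits,
# far less than the 100 bits the full sequence would carry.
print(binomial_entropy_bits(100))
```

So recording only the count of heads discards almost all of the entropy in the flip sequence.)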

More generally, how do we calculate the bits of entropy that we can get
from a drawing from a binomial distribution (as in the example above) or
a normal distribution?

Note that the distributions need to be modified so that there is only a
finite number of possible results from a draw. (Otherwise the entropy
would be infinite, unless I'm mistaken.)

So suppose we draw integers distributed normally with, say, mean 0 and
standard deviation 100. How many bits of entropy do we get out of a
single draw? (I assume that only the s.d. matters; I can't see why the
mean would play a role in the answer.)
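(Here is one way this could be computed, under assumptions of my own: the integer draw is taken to have P(k) = Phi(k + 0.5) - Phi(k - 0.5), i.e. a normal rounded to the nearest integer, and the sum is truncated at +/- 10 standard deviations, where the remaining mass is negligible. The helper name is mine:

```python
from math import erf, log2, sqrt

def discrete_normal_entropy_bits(sigma, mu=0.0):
    """Entropy in bits of a normal draw rounded to the nearest integer."""
    def cdf(x):
        # standard normal CDF, shifted and scaled
        return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))
    total = 0.0
    lo = int(mu - 10 * sigma)
    hi = int(mu + 10 * sigma)
    for k in range(lo, hi + 1):
        # probability that the draw rounds to the integer k
        pk = cdf(k + 0.5) - cdf(k - 0.5)
        if pk > 0.0:
            total -= pk * log2(pk)
    return total

# With sigma = 100 the bin width (1) is tiny relative to sigma, so the
# result is close to the differential entropy 0.5 * log2(2*pi*e*sigma^2),
# about 8.7 bits; and shifting mu by an integer leaves it unchanged.
print(discrete_normal_entropy_bits(100.0))
```

Consistent with the guess above, the mean drops out: shifting mu by an integer just relabels the outcomes, which cannot change the entropy.)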

I'm happy with answers of RTF[MAPB] (Manual, Article, Paper, Book) as
long as you point me toward those resources.

Cheers,

-j

--
Jeffrey Goldberg http://goldmark.org/jeff/
I rarely read HTML or poorly quoting posts
