# Re: Entropy of draw from normal or binomial distribution

"Jeffrey Goldberg" <nobody@xxxxxxxxxxxx> wrote in message
news:9c7vr5Ff2oU1@xxxxxxxxxxxxxxxxxxxxx
On 11-08-31 5:37 PM, Scott Fluhrer wrote:
"Scott Fluhrer" <sfluhrer@xxxxxxxxxxxxx> wrote in message
news:1314738299.225600@xxxxxxxxxxxxxxxxxxx

However, note that in cryptography, we're
sometimes interested in the "minentropy" (which is essentially the
entropy
in the worse case), rather than the standard definition of entropy.
Minentropy is defined as:

minentropy = -log2( p )

where p is the probability of the most common case [...]

I'm pretty sure no one cares,

I do.

but I just noticed something nonintuitive; the
minentropy actually increases if the coin is slightly biased (!). In
particular, if the probability of "heads" is 0.5049505, then the highest
probability is shared between 50 heads, and 51 heads, with each having a
probability 0.07920007. This gives us a minentropy of 3.6583545 bits,
which
is slightly higher (all this assuming, of course, that the flips are
still
independent).
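The quoted figures are easy to check numerically. A minimal sketch using only the standard library (and assuming, as stated, that the flips are independent):

```python
from math import comb, log2

def minentropy_binomial(n, p):
    """Minentropy of the number of heads in n independent flips
    with per-flip heads probability p: -log2 of the probability
    of the single most likely head count."""
    pmax = max(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))
    return -log2(pmax)

fair   = minentropy_binomial(100, 0.5)        # about 3.6513 bits
biased = minentropy_binomial(100, 0.5049505)  # about 3.6584 bits -- higher!
```

The slight bias spreads the peak across two equally likely counts (50 and 51 heads), lowering the maximum probability and so raising the minentropy, just as described above.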

I wasn't able to construct such an excellent example, but it did bother
me that minentropy depends on only a single point of the distribution.

Most mappings from a probability distribution to a single real number will
discard almost all the information about the distribution [1] :-) I suspect
that, in this case, some of the information that's being discarded "feels"
relevant.
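To illustrate how much gets discarded (my own construction, not from the thread): two distributions with the same peak probability have identical minentropy, no matter how differently their remaining mass is spread:

```python
from math import log2

def minentropy(probs):
    # Minentropy depends only on the single most probable outcome.
    return -log2(max(probs))

# Same peak probability (0.25), wildly different shapes:
narrow = [0.25] * 4                      # uniform over 4 outcomes
wide   = [0.25] + [0.75 / 1000] * 1000   # long thin tail over 1001 outcomes

minentropy(narrow)  # 2.0 bits
minentropy(wide)    # also 2.0 bits
```

Their Shannon entropies differ by several bits, but minentropy cannot tell them apart.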

I suspected that if one could construct a distribution with a flatter top
but a smaller variance, then something like this would happen.

For practical matters it isn't an issue, because minentropy is already
more conservative than classic entropy. But the definition of
"minentropy" had a quick-and-dirty feel to it, and now we see why.

It's not that it's quick and dirty; that's simply what it is designed to do.
Minentropy is designed to answer the question "if I take a sample, how much
entropy am I guaranteed to get from that sample?" If that's not the question
you're asking, then perhaps you need to look for a better answer.

[1] Yes, I'm aware that there are only aleph_1 continuous distributions, and
hence there really are injective (information preserving) mappings between
continuous distributions and real numbers; don't mess up my rhetoric with
such trivia...

--
poncho

