Re: Entropy of draw from normal or binomial distribution

"Jeffrey Goldberg" <nobody@xxxxxxxxxxxx> wrote in message
On 11-08-31 5:37 PM, Scott Fluhrer wrote:
"Scott Fluhrer" <sfluhrer@xxxxxxxxxxxxx> wrote in message

However, note that in cryptography, we're
sometimes interested in the "minentropy" (which is essentially the entropy
in the worst case), rather than the standard definition of entropy.
Minentropy is defined as:

minentropy = -log2( p )

where p is the probability of the most common case [...]
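That definition is simple enough to state in a few lines of code. A quick sketch in Python (the helper name is mine, not from the post):

```python
import math

def min_entropy(probs):
    """Min-entropy in bits: -log2 of the probability of the most likely outcome."""
    return -math.log2(max(probs))

# A fair coin has exactly 1 bit of min-entropy.
print(min_entropy([0.5, 0.5]))  # -> 1.0
```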

I'm pretty sure no one cares,

I do.

but I just noticed something nonintuitive: the
minentropy actually increases if the coin is slightly biased (!). In
particular, if the probability of "heads" is 0.5049505, then the highest
probability is shared between 50 heads and 51 heads, with each having a
probability of 0.07920007. This gives us a minentropy of 3.6583545 bits,
which is slightly higher (all this assuming, of course, that the flips are
independent).
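Those figures are easy to check with a short script (a sketch, assuming 100 independent flips; p = 51/101 ≈ 0.5049505 is exactly the bias at which the binomial pmf peaks equally at 50 and 51 heads):

```python
import math

def binom_pmf(n, k, p):
    """Probability of exactly k successes in n independent trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

n = 100
fair_peak = binom_pmf(n, 50, 0.5)   # most likely outcome, fair coin

p = 51 / 101                        # ~0.5049505: pmf(50) == pmf(51)
biased_peak = binom_pmf(n, 50, p)

print(biased_peak)                  # ~0.07920007
print(-math.log2(fair_peak))        # ~3.6513 bits (fair coin)
print(-math.log2(biased_peak))      # ~3.6584 bits -- slightly *higher*
```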

I wasn't able to construct such an excellent example, but it did bother
me that minentropy was discarding almost all information about the
distribution.
Most mappings from a probability distribution to a single real number will
discard almost all the information about the distribution [1] :-)  I suspect
that, in this case, some of the information that's being discarded "feels"
like it should matter.
I suspected that if it were possible to construct a
distribution with a flatter top but a smaller variance, then something
like this would happen.

For practical matters it isn't an issue, because minentropy is already
more conservative than classic entropy. But the definition of
"minentropy" had a quick and dirty feel to it, and now we see why.

It's not that it's quick and dirty; it's doing exactly what it was designed to do.
Minentropy is designed to answer the question "if I take a sample, how much
entropy am I guaranteed to get from that sample?" If that's the question
you're asking, minentropy is a good answer. If you're asking a different
question, then perhaps you need to look for a better answer.
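To make the contrast concrete, here's a small sketch (the distribution is my own made-up example, not from the thread): Shannon entropy averages the surprise over all outcomes, while minentropy only credits you with the surprise of the single most likely outcome, so it's always the more conservative figure.

```python
import math

def shannon_entropy(probs):
    """Average entropy in bits over the whole distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Worst-case entropy in bits: only the most likely outcome counts."""
    return -math.log2(max(probs))

# A skewed 4-outcome distribution: one outcome dominates.
probs = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(probs))   # ~1.357 bits on average
print(min_entropy(probs))       # ~0.515 bits guaranteed per sample
```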

[1] Yes, I'm aware that there are only aleph_1 continuous distributions, and
hence there really are injective (information preserving) mappings between
continuous distributions and real numbers; don't mess up my rhetoric with
such trivia...