# Re: Entropy of draw from normal or binomial distribution

*From*: "Scott Fluhrer" <sfluhrer@xxxxxxxxxxxxx>
*Date*: Wed, 31 Aug 2011 22:44:02 -0400

"Jeffrey Goldberg" <nobody@xxxxxxxxxxxx> wrote in message news:9c7vr5Ff2oU1@xxxxxxxxxxxxxxxxxxxxx

> On 11-08-31 5:37 PM, Scott Fluhrer wrote:
>
>> "Scott Fluhrer" <sfluhrer@xxxxxxxxxxxxx> wrote in message news:1314738299.225600@xxxxxxxxxxxxxxxxxxx

>>> However, note that in cryptography, we're sometimes interested in the "minentropy" (which is essentially the entropy in the worst case), rather than the standard definition of entropy. Minentropy is defined as:
>>>
>>>     minentropy = -log2( p )
>>>
>>> where p is the probability of the most common case [...]
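[The definition above can be sketched in a few lines of Python; the function name is my own, not from the thread:]

```python
from math import log2

def min_entropy(probs):
    """Min-entropy of a discrete distribution: -log2 of its most probable outcome."""
    return -log2(max(probs))

# A fair coin carries exactly one bit of min-entropy per flip:
print(min_entropy([0.5, 0.5]))  # 1.0
```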

>> I'm pretty sure no one cares,
>
> I do.

>> but I just noticed something nonintuitive; the minentropy actually increases if the coin is slightly biased (!). In particular, if the probability of "heads" is 0.5049505, then the highest probability is shared between 50 heads and 51 heads, each having a probability of 0.07920007. This gives us a minentropy of 3.6583545 bits, which is slightly higher (all this assuming, of course, that the flips are still independent).
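[The biased-coin numbers above check out numerically; here is a sketch over 100 flips using the binomial pmf. Note that p = 51/101 ≈ 0.5049505 is the exact point where outcomes 50 and 51 tie for most probable:]

```python
from math import comb, log2

def binomial_min_entropy(n, p):
    # Min-entropy of the number of heads in n independent flips:
    # -log2 of the probability of the single most likely count.
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    return -log2(max(pmf))

fair = binomial_min_entropy(100, 0.5)         # ~3.651 bits (peak at 50 heads)
biased = binomial_min_entropy(100, 51 / 101)  # ~3.658 bits (peak shared by 50 and 51)
print(fair, biased)
```

So a slight bias really does raise the min-entropy of the count, even though it lowers the per-flip Shannon entropy.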

> I wasn't able to construct such an excellent example, but it did bother me that minentropy was discarding almost all information about the distribution.

Most mappings from a probability distribution to a single real number will discard almost all the information about the distribution [1] :-) I suspect that, in this case, some of the information that's being discarded "feels" relevant.

> I suspected that if it were possible to construct a distribution with a flatter top but a smaller variance, then something like this would happen.

> For practical matters it isn't an issue, because minentropy is already more conservative than classic entropy. But the definition of "minentropy" had a quick-and-dirty feel to it, and now we see why.

It's not that it's quick and dirty; it's a matter of what it is designed to do. Minentropy is designed to answer the question "if I take a sample, how much entropy am I guaranteed to get from that sample?" If that's the question you're asking, minentropy is a good answer. If you're asking a different question, then perhaps you need to look for a better answer.

[1] Yes, I'm aware that there are only aleph_1 continuous distributions, and hence there really are injective (information-preserving) mappings between continuous distributions and real numbers; don't mess up my rhetoric with such trivia...

--

poncho


**Follow-Ups**:
- **Re: Entropy of draw from normal or binomial distribution** *From:* Jeffrey Goldberg

**References**:
- **Re: Entropy of draw from normal or binomial distribution** *From:* Scott Fluhrer
- **Re: Entropy of draw from normal or binomial distribution** *From:* Jeffrey Goldberg
