Re: A question on Shannon entropy
From: M. Boreale (boreale_at_dsi.unifi.it)
Date: 6 Jul 2004 04:19:51 -0700
firstname.lastname@example.org (David Wagner) wrote in message news:<email@example.com>...
> M. Boreale wrote:
> >Is there any result relating Shannon entropy to collision probability?
> Sounds like you want to relate Shannon entropy to Renyi entropy
> of order 2.
> >I.e., to be more precise, suppose:
> >- X is a discrete r.v. with prob. distribution p1,...,pN;
> >- IC(X) is the probability of collision for X, Sum_i pi^2 (i.e. the
> >probability of getting the same outcome twice in two independent
> >draws);
> >- H(X) is Shannon entropy of X, -Sum_i pi*log(pi);
> Yup. The Renyi entropy of order 2, usually written H_2(X),
> is - lg IC(X).
> Cachin's thesis includes a proof that H(X) >= H_2(X). When X
> is close to the uniform distribution on N items, then both are
> close to lg N. See the above URL for a pointer.
Many thanks for these references.
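For anyone following along, here is a small Python sketch (my own illustration, not from Cachin's thesis) computing the three quantities above and checking H(X) >= H_2(X) = -lg IC(X) on a sample distribution:

```python
import math

def shannon_entropy(p):
    # H(X) = -Sum_i pi * lg(pi), with lg = log base 2; skip zero-probability terms
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def collision_prob(p):
    # IC(X) = Sum_i pi^2, the probability of two independent draws colliding
    return sum(pi * pi for pi in p)

def renyi2_entropy(p):
    # H_2(X) = -lg IC(X), the Renyi entropy of order 2
    return -math.log2(collision_prob(p))

# Example distribution p1,...,pN
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))   # 1.75 bits
print(renyi2_entropy(p))    # about 1.54 bits
assert shannon_entropy(p) >= renyi2_entropy(p)
```

On the uniform distribution over N items both entropies equal lg N, as noted above; on skewed distributions H_2(X) drops below H(X).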