# Re: A question on Shannon entropy

**From:** M. Boreale (*boreale_at_dsi.unifi.it*)

**Date:** 07/06/04

**Next message:** Francois Grieu: "Re: Verifying CRCs"

**Previous message:** Gregory G Rose: "Re: hash of multiple strings"

**In reply to:** David Wagner: "Re: A question on Shannon entropy"

**Messages sorted by:** [ date ] [ thread ] [ subject ] [ author ] [ attachment ]

Date: 6 Jul 2004 04:19:51 -0700

daw@taverner.cs.berkeley.edu (David Wagner) wrote in message news:<cccahn$4ss$1@agate.berkeley.edu>...

> M. Boreale wrote:
> > Is there any result relating Shannon entropy to collision probability?
>
> Sounds like you want to relate Shannon entropy to Renyi entropy
> of order 2.
>
> > I.e., to be more precise, suppose:
> > - X is a discrete r.v. with prob. distribution p1,...,pN;
> > - IC(X) is the probability of collision for X, Sum_i pi^2 (i.e. the
> > probability of getting twice the same outcome in two independent
> > experiments);
> > - H(X) is Shannon entropy of X, -Sum_i pi*log(pi);
>
> Yup. The Renyi entropy of order 2, usually written H_2(X),
> is -lg IC(X).
> http://www.cs.berkeley.edu/~daw/my-posts/entropy-measures
>
> Cachin's thesis includes a proof that H(X) >= H_2(X). When X
> is close to the uniform distribution on N items, then both are
> close to lg N. See the above URL for a pointer.

Many thanks for these references.
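For concreteness, the relations quoted above are easy to check numerically. The sketch below (in Python; the function names are mine, not from any of the referenced sources) computes H(X), IC(X), and H_2(X) = -lg IC(X) for a skewed and a uniform distribution, confirming H(X) >= H_2(X) and that both equal lg N in the uniform case:

```python
import math

def shannon_entropy(p):
    """H(X) = -Sum_i p_i * log2(p_i), skipping zero-probability outcomes."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def collision_prob(p):
    """IC(X) = Sum_i p_i^2: probability of the same outcome in two independent draws."""
    return sum(pi * pi for pi in p)

def renyi2_entropy(p):
    """Renyi entropy of order 2: H_2(X) = -log2 IC(X)."""
    return -math.log2(collision_prob(p))

# Skewed distribution: the inequality H(X) >= H_2(X) is strict here.
p = [0.5, 0.25, 0.125, 0.125]
assert shannon_entropy(p) >= renyi2_entropy(p)

# Uniform distribution on N items: both entropies equal log2(N).
N = 8
u = [1.0 / N] * N
assert abs(shannon_entropy(u) - math.log2(N)) < 1e-12
assert abs(renyi2_entropy(u) - math.log2(N)) < 1e-12
```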

cheers,

- Michele
