Re: Computational secure entropy extraction

From: Ernst Lippe (ernstl-at-planet-dot-nl_at_ignore.this)
Date: 10/28/04


Date: Thu, 28 Oct 2004 22:18:15 +0200

On Thu, 28 Oct 2004 07:45:17 -0700, Anton Stiglic wrote:

> Ernst Lippe <ernstl-at-planet-dot-nl@ignore.this> wrote in message
> news:<pan.2004.10.27.11.56.00.42480@ignore.this>...
>> Somewhere hidden in the /dev/random thread was a nice discussion about
>> distilling entropy from an unknown distribution. One of the main topics
>> was whether there existed some universal entropy distiller that could
>> be used on all input distributions. With an information theoretical
>> definition of security such an animal does not exist. But the situation
>> becomes a lot more interesting when we have two different and
>> independently distributed inputs and use a computational security
>> approach.
>>
>> One of the questions that David Wagner asked was whether it was
>> possible to show that there existed a universal entropy distiller with
>> two independent inputs that was computationally secure. I believe that
>> under certain conditions the answer is "yes".
>>
>> An m-bit distiller D(k,x) takes strings k and x, of fixed lengths n1
>> and n2 respectively, as inputs and produces an m-bit string.
>>
>> D is (h1,h2,t,epsilon)-secure if given that k is drawn from any
>> distribution with min-entropy h1, and x is drawn independently from any
>> distribution with min-entropy h2, no attack running in time at most t
>> can distinguish D(k,x) from uniform with advantage better than epsilon.
>>
>> I will only consider the case where k contains maximum entropy, i.e.
>> where h1 = n1. In a certain sense this is the optimal case: if it is
>> not possible to securely extract entropy under these conditions,
>> "secure entropy distillation" is certainly not possible.
>
> There has been a lot of work done in this area, notably by David
> Zuckerman, with information theoretical results. Are you aware of these
> results?
I have tried to read some of his papers, but from what I can
see his approach uses an information theoretical perspective (because
it considers all possible input distributions, instead of only
the ones that could effectively be found by an adversary).
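
To make the definition under discussion concrete: the distinguishing
experiment behind (h1,h2,t,epsilon)-security can be sketched in a few
lines of Python (the function names and interfaces here are mine,
purely illustrative):

  import secrets

  def experiment(D, sample_k, sample_x, adversary, m):
      # Flip a fair coin; in the real world the adversary sees
      # D(k,x) with k and x drawn independently from the two weak
      # sources, in the ideal world a uniform m-bit string.
      real = secrets.randbits(1) == 1
      y = D(sample_k(), sample_x()) if real else secrets.randbits(m)
      # D is secure if, for every pair of samplers with min-entropy
      # h1 and h2, no adversary running in time at most t guesses
      # the world with advantage 2*(Pr[correct] - 1/2) > epsilon.
      return adversary(y) == real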

> If you have a long enough random string, you can always use it to pick
> a universal hash function and hash the other distribution that does not
> have full entropy (IIRC this is related to the leftover hash lemma).
>
> What I would like to see is an extractor that doesn't need any true
> randomness, and that can smooth out a distribution that has n bits of
> min-entropy to a uniform distribution of n bits or something close to
> that. A computational setting would be fine (it doesn't have to be
> information theoretical...), as long as the assumptions are reasonable
> (not just saying that SHA1 acts like a random oracle...).

It seems that you are objecting to my requirement that k should have
maximal entropy (i.e. h1 = n1).
It turns out that this requirement was completely unnecessary for the
simple analysis below (I needed it for another analysis that
I have not posted yet).
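
As an aside, the universal-hash construction you mention can be
sketched in a few lines of Python. This is only a toy illustration
(the Carter-Wegman family and all parameter choices are mine, and it
is not the distiller analyzed below):

  P = (1 << 61) - 1   # a Mersenne prime; x is reduced mod P
  M = 16              # number of output bits

  def distill(k, x):
      # k selects a member (a, b) of the approximately 2-universal
      # family h(x) = ((a*x + b) mod P) mod 2^M, which then hashes
      # the weak input x down to M bits.
      a = (k & ((1 << 61) - 1)) % (P - 1) + 1   # a in [1, P-1]
      b = (k >> 61) % P                         # b in [0, P-1]
      return ((a * (x % P) + b) % P) % (1 << M)

By the leftover hash lemma the output is close to uniform when k is
uniform, which is exactly the "true randomness" requirement you would
like to avoid.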

The adversary constructs a set Tk of k values and a set Tx of x values,
where |Tk| = 2^h1 and |Tx| = 2^h2.

Suppose that the adversary has examined A combinations in the cartesian
product Tx x Tk and that H of them are hits (in the sense of my
previous post). Then the advantage is

  epsilon = (H - A 2^(-m)) 2^(-h1 - h2)

Because A >= H we have

  epsilon <= H (1 - 2^(-m)) 2^(-h1 - h2)

and consequently

  H >= epsilon 2^(h1 + h2) / (1 - 2^(-m))

But an examined pair is a hit with probability 2^(-m), so in order to
get H hits the adversary has to examine at least H 2^m elements (in
expectation), and therefore

  t >= epsilon 2^(h1 + h2 + m) / (1 - 2^(-m))

This is the same result that I got in my previous post.
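
To get a feeling for the numbers, here are a few lines of Python that
evaluate this bound (the parameter values below are just
illustrations):

  from math import log2

  def work_lower_bound(h1, h2, m, eps):
      # t >= epsilon 2^(h1+h2+m) / (1 - 2^(-m)), as derived above
      return eps * 2.0 ** (h1 + h2 + m) / (1.0 - 2.0 ** (-m))

  # 128-bit inputs at full entropy, a 128-bit output and a target
  # advantage of 2^-40 already force about 2^344 steps of work:
  print(log2(work_lower_bound(128, 128, 128, 2.0 ** -40)))  # ~344.0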

Ernst Lippe


