# Re: What is distribution ensemble ?

On Sep 30, 1:12 pm, Sergei <silent...@xxxxxxxxx> wrote:
On Sep 29, 7:36 pm, Ilmari Karonen <usen...@xxxxxxxxxxxxxx> wrote:

On 2008-09-29, crypter <crypte...@xxxxxxxxx> wrote:

On Sep 29, 10:12 am, Ilmari Karonen <usen...@xxxxxxxxxxxxxx> wrote:
On 2008-09-26, crypter <crypte...@xxxxxxxxx> wrote:

By definition: it is an infinite sequence of random variables!!
Can anyone give a more elaborate example and an easier-to-understand explanation?

Are you familiar with the concept of a random variable?  If so, I
don't see what would be so hard to understand about taking an infinite
sequence of them.

Thanks. I picked up the definition from Wikipedia itself. So, is
my example right?:

For example, rolling a die:
r.v_1 = outcome of one die roll {1, 2, 3, 4, 5, 6}
r.v_2 = avg. of rolling a die n times {1, 2, 3, 4, ....}
r.v_3 = sum of rolls is even or odd {1, 0}
etc....
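The three random variables above can be written out as sampling procedures. This is a minimal Python sketch; the function names `rv1`/`rv2`/`rv3` are just my labels for the poster's r.v_1, r.v_2, r.v_3, not standard notation:

```python
import random

def rv1():
    # r.v_1: the outcome of a single die roll, a value in {1, ..., 6}
    return random.randint(1, 6)

def rv2(n):
    # r.v_2: the average of n die rolls (a value between 1 and 6)
    return sum(random.randint(1, 6) for _ in range(n)) / n

def rv3(n):
    # r.v_3: 1 if the sum of n rolls is even, 0 if it is odd
    return 1 if sum(random.randint(1, 6) for _ in range(n)) % 2 == 0 else 0
```

Any such collection of random variables, indexed by k = 1, 2, 3, ..., forms an ensemble in the general sense.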

Sure, that counts as a distribution ensemble -- it's a rather general
concept.  It's not the kind of ensemble one would usually deal with in
cryptography, but it is an ensemble.

But then I'm trying to understand the indistinguishability between two
distribution ensembles ... why is it required ? If a distribution is
indistinguishable from another distribution, we should be done ... but
then why do we require all the random variables to be taken into
account ?

Because it's a weaker property: we don't require the individual
distributions to be identical, we merely require that they become more
and more similar (in a specific rigorously defined sense) as you get
further along the sequence.

For example (and someone please correct me if I'm talking nonsense,
since I don't actually _know_ this stuff; I'm just working from what
it says on Wikipedia), let X = <X_k> and Y = <Y_k>, where k = 1, 2, 3,
..., be sequences of biased coin tosses, such that Pr[X_k = "heads"] =
Pr[Y_k = "tails"] = 1/2 + exp(-k).  Clearly none of the distributions
are identical, but the statistical difference Delta(X_k, Y_k) tends to
zero faster than the reciprocal of any positive polynomial function.
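A quick numeric check of that claim, assuming the usual total-variation definition of statistical difference (for a binary variable it reduces to the absolute difference of the two "heads" probabilities):

```python
import math

def delta(k):
    # Statistical difference between X_k and Y_k from the example above:
    # Pr[X_k = heads] = 1/2 + exp(-k), Pr[Y_k = heads] = 1/2 - exp(-k),
    # so Delta(X_k, Y_k) = |p_x - p_y| = 2 * exp(-k).
    p_x = 0.5 + math.exp(-k)
    p_y = 0.5 - math.exp(-k)
    return abs(p_x - p_y)

# 2*exp(-k) shrinks faster than 1/k^c for any fixed c, e.g. c = 3:
for k in (10, 20, 40):
    print(k, delta(k), 1 / k**3)
```

So no two X_k, Y_k are identically distributed, yet the difference vanishes faster than any inverse polynomial, which is exactly the weaker asymptotic property.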

--
Ilmari Karonen

Not quite sure that this is correct. As far as I understand, the
definition of indistinguishability of distribution ensembles has
nothing to do with convergence of these sequences. It is just a
statement that the random variables X_k and Y_k (with the same k, of
course) are indistinguishable. The ensembles are introduced to work
with cryptographic primitives without a fixed output length
(e.g. 1, ..., n, n+1, etc.).

So, let's say we have distribution ensembles X = {X_1, X_2, ...} and
Y = {Y_1, Y_2, ...} and they are computationally
indistinguishable; then X_1 is computationally indistinguishable from
Y_1, X_2 from Y_2, etc.
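Concretely, the standard computational definition (as I understand it from the literature) compares, for each index, how often an efficient distinguisher D outputs 1 on samples from each side: the ensembles are indistinguishable if |Pr[D(X_n) = 1] - Pr[D(Y_n) = 1]| is negligible in n for every such D. A rough empirical sketch of a distinguisher's advantage at one index (the function names here are illustrative, not from any library):

```python
import random

def advantage(sample_x, sample_y, distinguisher, trials=10_000):
    # Empirically estimate |Pr[D(X)=1] - Pr[D(Y)=1]| for one index of
    # the two ensembles, by sampling each side `trials` times.
    hits_x = sum(distinguisher(sample_x()) for _ in range(trials))
    hits_y = sum(distinguisher(sample_y()) for _ in range(trials))
    return abs(hits_x - hits_y) / trials

# Example: two identical fair-coin distributions give advantage near zero.
fair = lambda: random.random() < 0.5
adv = advantage(fair, fair, lambda b: 1 if b else 0)
```

The estimate `adv` stays close to zero here because both samplers are the same distribution; for distinguishable ensembles some D would push it noticeably above zero.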

Sergei

Guys, thanks... I'm still not clear about it. By the way Ilmari defines
it, if I toss a coin once, do I get an infinite sequence of random
variables? I don't think that works!

I'm looking for something simple, illustrative and clear. As it is, I'm
not that much of a geek.