Re: Compression leads to encryption NEW COMPRESSION METHOD!

Einstein wrote:
Compression White Paper #5

Lossless compression by Michael Harrington.

I have created a means to compress random binary data. It involves eight
main stages and half a dozen or so minor stages. It incorporates a
revolutionary new means of using data in computer-language format, and
this is the key to its success.

Stage one is a simple means of altering the odds of occurrence in a binary
sequence. Using an old Shannon-Fano-style layout, it creates a 40%
occurrence of 1's to 60% of 0's, statistically. This does come at a
small size increase.
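
The paper never publishes its code table, so the following is only a minimal sketch of the idea as stated: a fixed prefix-code substitution that skews the 0/1 statistics at the cost of a small expansion. The table below is invented, and it skews harder than the stated 60/40 split (to roughly two zeros per one); it shows the mechanism, not the author's actual layout.

# Sketch of stage one, assuming a prefix-code substitution over 2-bit pairs.
# The table is invented; the paper's real layout is not published.
import random

ENCODE = {"00": "1", "01": "01", "10": "001", "11": "000"}

def skew_bits(bits: str) -> str:
    """Expand each 2-bit pair into a prefix-free code. On uniform random
    input the output is ~12.5% longer but ~67% zeros instead of 50%."""
    assert len(bits) % 2 == 0, "even-length input assumed"
    return "".join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

src = "".join(random.choice("01") for _ in range(10000))
out = skew_bits(src)
print(len(out) / len(src), out.count("0") / len(out))  # ~1.125, ~0.67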

Stage two involves a complex means of identifying the most likely 16-bit
sequence to occur and replacing it with the most compressible (via my
methods) 16-bit sequence.
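
The ranking behind "most compressible" is never specified. Purely as an illustration, assume it means the 16-bit words with the fewest 1 bits; a sketch of the swap, kept invertible by mapping pairs in both directions, might then look like this:

from collections import Counter

def stage_two(words):
    """Speculative sketch of stage two: swap the most frequent 16-bit
    words with the 'most compressible' ones. 'Most compressible' is
    assumed here to mean lowest popcount; the paper never defines it.
    Pair swaps keep the mapping invertible."""
    by_freq = [w for w, _ in Counter(words).most_common()]
    by_cost = sorted(range(1 << 16), key=lambda w: (bin(w).count("1"), w))
    table = {}
    for frequent, cheap in zip(by_freq, by_cost):
        if frequent not in table and cheap not in table:
            table[frequent] = cheap
            table[cheap] = frequent  # both directions, so decoding can undo it
    return [table.get(w, w) for w in words]

Note that a real decoder would need the swap table (or the frequency counts it was built from) shipped as side data, which eats into any savings.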

Stage three involves using a simple 1-bit switch to tell us whether the
sequence is in the best half or the worst half of the 16-bit codes. The
switch is temporarily stored elsewhere. If the sequence is in the
worst-half group, it is further switched with the best 16-bit sequence in
proper order.
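
Taking "best half" to mean the 32,768 words ranked most compressible under the same assumed popcount ordering, the flag-and-switch step could be sketched as below; the flags returned separately are the switches "stored elsewhere":

# Sketch of stage three under an assumed compressibility ranking
# (lowest popcount first). Worst-half words are switched with the
# best-half word of the same rank; the 1-bit flags make it reversible.
RANKED = sorted(range(1 << 16), key=lambda w: (bin(w).count("1"), w))
RANK = {w: i for i, w in enumerate(RANKED)}
HALF = 1 << 15

def stage_three(words):
    flags, out = [], []
    for w in words:
        r = RANK[w]
        if r < HALF:              # already in the best half
            flags.append(0)
            out.append(w)
        else:                     # switch with the best-half word of the same rank
            flags.append(1)
            out.append(RANKED[r - HALF])
    return out, flags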

Stage four involves an utterly new method of making three distinct code
groups out of the one. The first code group is where 25% of the savings
occurs; more on that later. The other two code groups are almost
'clones', and between them we see the remaining 75% of the savings. Here
we use a simple 2-bit conversion that gives us a significant amount of
'play' with our now severely altered odds of occurrence (starting from
random binary data, 0's occur so much more often than 1's in the sequence
at this stage that it is laughable).
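
The "2 bit conversion" is not defined either. One reading that does yield three groups, one of them a heavily skewed flag stream and the other two near-clones of each other, is sketched below; this is entirely an assumption on my part:

def stage_four(bits: str):
    """Speculative sketch of stage four: read the stream two bits at a
    time. Group 1 gets one flag per pair ('was it 00?'); if 0s dominate
    after the earlier stages, this stream is heavily skewed. For non-00
    pairs, groups 2 and 3 each take one of the two bits, which makes
    them the 'almost clones' the paper describes. Reversible given g1."""
    g1, g2, g3 = [], [], []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a == "0" and b == "0":
            g1.append("0")
        else:
            g1.append("1")
            g2.append(a)
            g3.append(b)
    return "".join(g1), "".join(g2), "".join(g3)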

Stage five takes the four types of saved data and formats two of them a
little further, resulting in the savings for the 25% area and a means
to adequately monitor the whole.

Net result on random binary data: 82% (rounded up) of the original size.

However, to achieve this compression ratio we had to use every iteration
of 16 bits from the onset, so in this case it is approximately 1 MB of
data being compressed. In theory it can go as low as about 10 KB and
still get 5 to 8 cycles before we can no longer compress, though this can
be on previously compressed data. It should be explained, however, that
the smaller the file gets, the less likely it is to compress, as per a
bell curve. It is possible to get a file as low as 10 bytes, but in
practice this would be time-consuming and not worth attempting (the
number of repetitions required, and the number of alterations to the code
layout via 'cycling' which ones are altered in stage #1, would become
extreme). However, this works in reverse as well. It IS possible to get
the entire world's knowledge onto a DVD. I don't want to be around for
the time/processing power it would take to do this, but it is possible.
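
Whatever one makes of the claimed ratio, the "cycling" described here is just repeated application of the whole pipeline until a pass stops shrinking the data. As a driver loop it would look like the sketch below, where compress_once is a hypothetical stand-in for stages one through five:

def cycle(data: bytes, compress_once, floor: int = 10_000):
    """Apply one full pass per 'cycle', stopping when a pass no longer
    shrinks the data or the size floor is reached. compress_once is
    hypothetical. The cycle count must be stored alongside the output
    so decompression knows how many passes to undo."""
    cycles = 0
    while len(data) > floor:
        smaller = compress_once(data)
        if len(smaller) >= len(data):  # no gain this pass: stop
            break
        data, cycles = smaller, cycles + 1
    return data, cycles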

My website covers more details.

Additional notes: It should be possible to combine this method with
existing encryption techniques to make literally unbreakable encryption.
No chance of the NSA decrypting the data before our sun explodes,
regardless of the time span, at the cost of a common word.

This paper is copyrighted; permission to use it as a news article is
granted, as is the right to have it posted on The Data Compression News
Blog. All other rights reserved. Copyright April 10th, 2006.

Is it April 1 again? And I had such great plans for next year.

If you're serious, how about I send you some random data and you let us all know how well it compresses?
