Re: Compression - is this it, or is there more to come?



"Dave -Turner" <admin@xxxxxxxxx> writes:
Compression algorithms have obviously come a long way, but now it seems
they're all standing at the end gates of what's possible. With encryption we
can continue to make things harder and harder to crack, but with compression
it seems there's only so much you can do to crunch data ... are we reaching
the limits of compression or is there still a lot more to come?

We're reaching the limits of the almost-general-purpose lossless
schemes, yes. By their nature, they have to work out a suitable model
for the data as they go, based on what they've already encountered.
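To make "working out a model as you go" concrete, here's a toy sketch (plain Python of my own, not taken from any real compressor): an adaptive order-0 byte model with Laplace smoothing. Each byte is charged the ideal arithmetic-coding cost, -log2 p, under the probability estimated from the bytes seen so far, and only then is the count updated — so the model improves as it reads.

```python
import math
from collections import defaultdict

def adaptive_code_length(data: bytes) -> float:
    """Ideal code length (bits) under an adaptive order-0 model:
    each byte is coded with the probability estimated from the
    bytes seen so far (Laplace smoothing), then counts are updated."""
    counts = defaultdict(lambda: 1)   # Laplace: every symbol starts at count 1
    total = 256                       # sum of the 256 initial counts
    bits = 0.0
    for b in data:
        bits += -math.log2(counts[b] / total)  # cost of coding b right now
        counts[b] += 1                         # learn from what we just saw
        total += 1
    return bits

# Repetitive data gets cheaper as the model adapts:
print(adaptive_code_length(b"abababababababab"))    # well under 8 bits/byte
print(adaptive_code_length(bytes(range(16))))       # stays around 8 bits/byte
```

No real coder hits the ideal cost exactly, but a decent arithmetic coder gets within a fraction of a bit of it, which is why the model is where the action is.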

However, there's nothing stopping us from coming up with far better
models for specific types of data. Natural language text in particular
seems to be screaming for better compression ratios. But alas,
it's hard to teach computers what to expect from human languages.
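As a crude illustration of "better model, better ratio": give the same adaptive scheme some context. Below, model_bits (a throwaway name of mine, not from any library) conditions each byte's probability on the previous `order` bytes; on repetitive English-like input the conditioned models beat the memoryless one. This is only a sketch — real text compressors in the PPM family do this far more cleverly, with escape symbols and context blending.

```python
import math
from collections import defaultdict

def model_bits(data: bytes, order: int) -> float:
    """Ideal coding cost (bits) of an adaptive model that conditions
    each byte on the previous `order` bytes, with Laplace-smoothed
    counts kept separately per context."""
    counts = defaultdict(lambda: defaultdict(lambda: 1))
    totals = defaultdict(lambda: 256)
    bits = 0.0
    ctx = b"\x00" * order                      # dummy context to start
    for b in data:
        bits += -math.log2(counts[ctx][b] / totals[ctx])
        counts[ctx][b] += 1
        totals[ctx] += 1
        ctx = (ctx + bytes([b]))[-order:] if order else b""
    return bits

english = b"the quick brown fox jumps over the lazy dog " * 500
for k in (0, 1, 2):
    print(k, round(model_bits(english, k) / len(english), 2))  # bits per byte
```

The point of the exercise: the data never changes, only the model does, and the bits-per-byte figure falls anyway. A model that knew about words, grammar, or meaning would fall further still — that's the hard part.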

Phil


--
I find the easiest thing to do is to k/f myself and just troll away
-- David Melville on r.a.s.f1