In case any of you are still in doubt about this "new compression scheme", I encourage you to read this discussion of exactly the same matter: http://www.faqs.org/faqs/compression...section-8.html
A quote from this document:
Quote:
It is mathematically impossible to create a program compressing without loss *all* files by at least one bit (see below and also item 73 in part 2 of this FAQ). Yet from time to time some people claim to have invented a new algorithm for doing so. Such algorithms are claimed to compress random data and to be applicable recursively, that is, applying the compressor to the compressed output of the previous run, possibly multiple times. Fantastic compression ratios of over 100:1 on random data are claimed to be actually obtained.
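
To make the impossibility concrete, here is a small Python sketch (my own illustration, not taken from the FAQ) of the pigeonhole-style counting argument: for any length n there are 2^n distinct files of exactly n bits, but only 2^n - 1 bit strings strictly shorter than n bits, so no lossless (i.e. invertible) compressor can map every n-bit file to a shorter output.

Code:
# Pigeonhole counting: 2**n files of exactly n bits, but only
# 2**n - 1 distinct bit strings of length 0 .. n-1. A lossless
# compressor must be injective, so it cannot shrink *every* file.

def count_files_of_length(n: int) -> int:
    """Number of distinct bit strings of exactly n bits."""
    return 2 ** n

def count_shorter_outputs(n: int) -> int:
    """Number of distinct bit strings strictly shorter than n bits."""
    return sum(2 ** k for k in range(n))  # 2**0 + ... + 2**(n-1) = 2**n - 1

for n in range(1, 9):
    inputs = count_files_of_length(n)
    outputs = count_shorter_outputs(n)
    print(f"n={n}: {inputs} inputs, only {outputs} shorter outputs -> "
          f"at least {inputs - outputs} file(s) cannot shrink")

The same counting applies at every length, which is also why "recursive" compression of already-compressed (effectively random) output cannot keep gaining.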