#14, 08-09-2004, 15:32
Alan Anderson
Software Architect
FRC #0045 (TechnoKats)
Team Role: Mentor
 
Join Date: Feb 2004
Rookie Year: 2004
Location: Kokomo, Indiana
Posts: 9,113
Re: New compression method

Quote:
Originally Posted by ErichKeane
In a real file, there are enough failsafes that it may take as much as 30% to make a file unrecoverable. The same cannot be said about a compressed file.
Assuming the compressed files are under 1 KB, it's no big deal to add "failsafes" such as error-correcting codes or even brute-force redundancy after compression. That will of course increase the resulting file size, but if an original megabyte of raw data still ends up as only a few kilobytes of damage-tolerant compressed data, it's a major net win.
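To make that concrete, here is a minimal sketch of the crudest possible failsafe (my own Python example, not anything from the proposed compressor): compress with zlib, store three copies, and recover each byte by majority vote. A real error-correcting code such as Reed-Solomon would be far more space-efficient, but even this blunt approach survives heavy damage to any single copy while staying tiny next to the original data.

Code:
import zlib

def protect(raw: bytes) -> bytes:
    """Compress, then store three copies of the compressed payload."""
    compressed = zlib.compress(raw)
    return compressed * 3

def recover(protected: bytes) -> bytes:
    """Majority-vote each byte across the three copies, then decompress."""
    n = len(protected) // 3
    a, b, c = protected[:n], protected[n:2 * n], protected[2 * n:3 * n]
    # If a byte agrees with either other copy it wins; otherwise b and c agree.
    voted = bytes(x if x == y or x == z else y for x, y, z in zip(a, b, c))
    return zlib.decompress(voted)

if __name__ == "__main__":
    original = b"a megabyte of raw data " * 1000
    stored = protect(original)
    # Corrupt the first copy heavily; the other two still outvote it.
    damaged = bytearray(stored)
    for i in range(0, len(stored) // 3, 7):
        damaged[i] ^= 0xFF
    assert recover(bytes(damaged)) == original
    print("recovered", len(original), "bytes despite corruption")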

Given a sufficiently restricted set of possible input files and a sufficiently large shared database, I can achieve miraculous compression too. For example, I can "encode" any static data currently on the World Wide Web into a short string of characters: just reference it by URL. But arbitrary multi-megabyte files compressed to 500-odd bytes? To say I am skeptical would be an understatement.
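If anyone doubts that a big enough shared database makes "compression" trivial, here is a toy Python sketch of the URL trick. The keys and contents are invented for illustration; the point is that the short "compressed" string carries almost no information by itself, because all of it lives in the shared table both sides already have.

Code:
# Toy illustration: "compression" by reference into a shared database.
SHARED_DATABASE = {
    "http://example.com/a": b"the complete works of Shakespeare ..." * 100,
    "http://example.com/b": b"several megabytes of match telemetry ..." * 100,
}

def compress(data: bytes) -> str:
    """Return a short key whose 'decompression' is just a lookup."""
    for key, stored in SHARED_DATABASE.items():
        if stored == data:
            return key
    raise ValueError("not in the shared database, so no miracle available")

def decompress(key: str) -> bytes:
    return SHARED_DATABASE[key]

if __name__ == "__main__":
    original = SHARED_DATABASE["http://example.com/a"]
    key = compress(original)
    print(len(key), "byte key stands in for", len(original), "bytes:",
          decompress(key) == original)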