Post #22, 08-09-2004, 13:24
ErichKeane
Registered User
FRC #3210
Team Role: Mentor
 
Join Date: Nov 2003
Rookie Year: 2004
Location: Hillsboro, OR
Posts: 113
Re: New compression method

I'm calling BS too. As I said before, even if you could compress data that well, it wouldn't be worth the time.

Processing power is expensive--a processor fast enough to run an iterative compression method in any reasonable amount of time costs a ton of money.

Storage, on the other hand, is relatively cheap. You said nightly backups would be the way to go--but what happens if a nightly backup takes, say, 16 hours? That makes it not worth it, especially once you factor in transfer time. It's just as cheap to set up a RAID array and live with an extra hard drive.
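The transfer-time point is easy to sanity-check with back-of-the-envelope arithmetic. The numbers below (500 GB at 10 MB/s) are purely illustrative, not from the post:

```python
def backup_hours(data_gb: float, rate_mb_per_s: float) -> float:
    """Hours needed to move data_gb gigabytes at rate_mb_per_s MB/s."""
    seconds = (data_gb * 1024) / rate_mb_per_s
    return seconds / 3600

# e.g. a 500 GB nightly backup over a 10 MB/s link:
print(round(backup_hours(500, 10), 1))  # about 14.2 hours
```

At those rates the backup window barely fits in a night, before compression CPU time is even counted--which is the whole objection.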

Compression was big a few years ago, but it really isn't that important right now. Most corporations won't even trust compression for backups, because compressed archives lack the failsafes that preserve the integrity of the data. With a compression algorithm, a relatively small amount of loss (slightly above 1 byte) can corrupt the entire file. With the "method" described by the thread starter, I would assume a loss of 4 bits could screw the file to hell.
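A quick sketch of that fragility, using Python's zlib as a stand-in for whatever algorithm is involved (the thread starter's "method" is unknown, so this only illustrates the general point):

```python
import zlib

original = b"the quick brown fox jumps over the lazy dog " * 200
packed = bytearray(zlib.compress(original))

# Flip every bit in a single byte of the compressed stream.
packed[10] ^= 0xFF

try:
    zlib.decompress(bytes(packed))
    print("stream survived the corruption")
except zlib.error as err:
    # One damaged byte typically breaks the DEFLATE decode or the
    # Adler-32 check, taking all 8800 original bytes with it.
    print("whole stream lost:", err)
```

Because compressed data carries almost no redundancy, a localized error doesn't stay localized: everything decoded after the damaged point depends on it.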

In a raw file there is enough redundancy that it may take as much as 30% corruption to make the file unrecoverable. The same cannot be said about a compressed file.
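That asymmetry can be demonstrated directly. This sketch (with zlib again standing in for any real compressor, and made-up log data) destroys 30% of a raw buffer versus a fraction of a percent of a compressed copy:

```python
import random
import zlib

random.seed(0)  # deterministic corruption for the demo
raw = b"2004-09-08 13:24 backup job completed OK\n" * 400

def corrupt(buf: bytes, fraction: float) -> bytes:
    """Flip every bit in a random `fraction` of the buffer's bytes."""
    out = bytearray(buf)
    k = max(1, int(len(out) * fraction))
    for i in random.sample(range(len(out)), k):
        out[i] ^= 0xFF
    return bytes(out)

# 30% of the raw bytes destroyed: 70% of the file is still readable.
damaged = corrupt(raw, 0.30)
intact = sum(a == b for a, b in zip(raw, damaged)) / len(raw)
print(f"raw file intact: {intact:.0%}")  # 70%

# A fraction of a percent of the compressed bytes destroyed: total loss.
packed = zlib.compress(raw)
try:
    zlib.decompress(corrupt(packed, 0.005))
    print("compressed file survived")
except zlib.error:
    print("compressed file unrecoverable")
```

The raw copy degrades gracefully--most lines are still there to salvage--while the compressed copy is all-or-nothing.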