#5   08-09-2004, 11:45
ErichKeane
Registered User
FRC #3210
Team Role: Mentor
 
Join Date: Nov 2003
Rookie Year: 2004
Location: Hillsboro, OR
Posts: 113
Re: New compression method

A few things: newer compression methods are not judged purely on how small they can make a file. Even if you COULD do what you claim, which I highly doubt, it would not be very worthwhile.

Right now, hard drive space is cheap. Time is not. So if your algorithm took, say, 2 hours to turn my 1 GB file into a 500-byte file, while WinRAR turns it into 400 MB in 3 minutes, what is the advantage? Sure, I saved about 400 MB, but I lost 2 hours! Also, you can get a hard drive for a buck a gig, so what's the point?
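
To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python. The file sizes, times, and the buck-a-gig disk price are the hypothetical figures from this paragraph, not measurements.

[code]
# Space-vs-time tradeoff, using the hypothetical figures above:
# a 1 GB file, a claimed 500-byte result in 2 hours, WinRAR's ~400 MB
# in ~3 minutes, and disk at roughly $1/GB.
claimed_gb     = 500 / 1e9   # claimed 500-byte output
claimed_hours  = 2.0         # claimed 2 hours to compress
winrar_gb      = 0.4         # ~400 MB out of WinRAR
winrar_hours   = 3 / 60.0    # ~3 minutes
dollars_per_gb = 1.0         # disk at about a buck a gig

extra_gb_saved   = winrar_gb - claimed_gb       # what the claim buys over WinRAR
extra_hours_lost = claimed_hours - winrar_hours

print(f"Extra space saved over WinRAR: {extra_gb_saved:.2f} GB "
      f"(about ${extra_gb_saved * dollars_per_gb:.2f} worth of disk)")
print(f"Extra time lost: {extra_hours_lost:.2f} hours")
[/code]

The extra space is worth about 40 cents of disk; the extra time is nearly two hours.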

Also, most people who care about large files have broadband. I personally downloaded the entire Mandrake Linux pack a few years ago on a 1 Mbit connection and just let it run overnight. Consider this: the ISO files totaled about 2 GB and took overnight to download, so no big deal. Using WinRAR, I could maybe have gotten those 2 GB down to about 1.3 GB, but compressing them would have taken 20 minutes or so.
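
The download arithmetic works out roughly like this (a sketch in Python using the approximate sizes and link speed above, assuming the link runs at its ideal rate):

[code]
# Transfer time for the Mandrake ISOs: ~2 GB raw vs ~1.3 GB after WinRAR,
# over a 1 Mbit/s link running flat out.
def transfer_hours(size_gb, link_mbit):
    """Hours to move size_gb gigabytes over a link_mbit megabit/s connection."""
    return size_gb * 8e9 / (link_mbit * 1e6) / 3600

raw    = transfer_hours(2.0, 1.0)   # ~4.4 hours
packed = transfer_hours(1.3, 1.0)   # ~2.9 hours
print(f"Raw ISOs: {raw:.1f} h, compressed: {packed:.1f} h, "
      f"saving {raw - packed:.1f} h before the ~20 min spent compressing")
[/code]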

Now, since you say the algorithm grows exponentially with file size, I have no problem assuming this compression/decompression would take longer than it took to download the whole file on my DSL connection.

NOW, send that file across my 10 Mbit connection at college, and compression becomes all but useless. The order of priorities for compression used to be: size, speed, packageability. Now the order is: packageability, size, speed.
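
The same point as a minimal break-even check in Python: compressing only pays off if the transfer time it saves beats the time spent packing. The sizes, the 20-minute compression time, and the 10 Mbit campus link are the figures from this thread.

[code]
# Is it worth compressing before sending, given how long the packing takes?
def worth_compressing(size_gb, packed_gb, link_mbit, pack_minutes):
    wire_seconds_saved = (size_gb - packed_gb) * 8e9 / (link_mbit * 1e6)
    return wire_seconds_saved > pack_minutes * 60

# 2 GB -> 1.3 GB in 20 minutes, sent over a 10 Mbit connection:
print(worth_compressing(2.0, 1.3, 10.0, 20))   # False: the wire only saves ~9 minutes
[/code]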

So unless your "algorithm" can turn those 2 GB of files into a 500-byte file within 20 minutes, it would not be worthwhile.