#10 | 08-09-2004, 15:08
Aalfabob | Registered User | #0201 (FEDS)
Re: New compression method

OK, about these speed issues.
This is the information I have compiled about the compressor so far:
It is able to run the compression routine on 7.5 MB in 1 second (it is actually a lot faster than this, but I'm leaving headroom in case I have any more ideas for the test version). Now, if the file were highly in the compressor's favor, it could reach around a 10.9% gain in that one second for those 7.5 MB. That means it would shorten the file from 7.5 MB to around 6.67 MB in close to 1 second, depending on your CPU and memory clock speeds. This test was taken on a 1.47 GHz Athlon XP with 133 MHz memory.
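To make that arithmetic concrete, here is a minimal sketch in C (chosen since the compressor already leans on qsort) that just reproduces the best-case numbers above. The 7.5 MB and 10.9% figures come from this post; everything else is illustration only.

[code]
#include <stdio.h>

int main(void)
{
    /* Figures quoted above: 7.5 MB processed in about 1 second,
       with a best-case gain of 10.9% in a single pass. */
    double input_mb  = 7.5;
    double best_gain = 0.109;

    double output_mb = input_mb * (1.0 - best_gain); /* roughly 6.68 MB */
    double saved_mb  = input_mb - output_mb;         /* roughly 0.82 MB */

    printf("after one best-case pass: %.2f MB (saved %.2f MB)\n",
           output_mb, saved_mb);
    return 0;
}
[/code]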

Another thing about the time growing exponentially:
Even though it has to run more passes for each small gain, the data being compressed gets smaller with each pass, which keeps the change over time fairly constant. If you look at the graphs I've posted earlier, you can see that they include no time at all, only passes vs. file size.

By the way, a file being in the compressor's 'favor' does not have a very good chance of happening with random files, but it still can occur. On files with characters near each other (text files, logs, etc.), it will come very close to its maximum gain per pass (10.9%). On average, my last build, which wasn't programmed very well, came to around 1.5% to 4% per pass on a .rar file, but the current build has been heavily modified to get higher compression per pass. A short sketch of how those per-pass percentages compound follows below.
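For a feel of how a per-pass gain compounds, here is a small illustrative loop. The 3% figure is just an assumed value inside the 1.5%-4% range mentioned above, not a measured one.

[code]
#include <stdio.h>

int main(void)
{
    double size_mb = 7.5;   /* starting size from the earlier example          */
    double gain    = 0.03;  /* assumed average gain per pass (1.5%-4% range)   */

    /* Each pass works on the already-shrunken output of the previous one,
       so the saving in megabytes per pass keeps getting smaller. */
    for (int pass = 1; pass <= 10; pass++) {
        size_mb *= (1.0 - gain);
        printf("pass %2d: %.3f MB\n", pass, size_mb);
    }
    return 0;
}
[/code]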

Oh, and here is another graph showing that a file can sometimes actually get larger for a single run, but in the end the compression method still trends toward getting smaller. (I have tested many files of different sizes and types; this is just one.)

edit - If you think these times are impossible, they come from the file being loaded into memory, and of course larger files are taken in chunks if memory is needed. The speed also comes from the compression process consisting of a couple of if-thens, the qsort routine, and then the bytes being put back into memory. I expect that a professional programmer would be able to cut these times down by a lot using assembly or faster libraries.
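Since only the building blocks are named here (a few if-thens, qsort, bytes written back, chunked reads for large files), the following is a hypothetical skeleton of that pipeline, not the actual method. compress_chunk() is a placeholder that only shows the shape of the steps; the real transform is not published and is not reproduced here.

[code]
#include <stdio.h>
#include <stdlib.h>

/* qsort comparator for single bytes */
static int byte_cmp(const void *a, const void *b)
{
    return (int)*(const unsigned char *)a - (int)*(const unsigned char *)b;
}

/* Placeholder for the real per-chunk work: a couple of if-then tests,
   a qsort call, and bytes going back into memory. */
static void compress_chunk(unsigned char *buf, size_t len)
{
    if (len < 2)                    /* nothing worth doing on tiny chunks */
        return;
    qsort(buf, len, 1, byte_cmp);   /* the qsort step mentioned in the post */
    /* ...the real method would now emit a reversible, shorter encoding... */
}

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    /* Take the file in fixed-size chunks so large files do not
       have to fit in memory all at once (1 MB is an assumed size). */
    enum { CHUNK = 1 << 20 };
    unsigned char *buf = malloc(CHUNK);
    if (!buf) { fclose(f); return 1; }

    size_t n;
    while ((n = fread(buf, 1, CHUNK, f)) > 0)
        compress_chunk(buf, n);

    free(buf);
    fclose(f);
    return 0;
}
[/code]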
Attached: Graph3.jpg (54.2 KB)

Last edited by Aalfabob : 08-09-2004 at 15:15.