Re: New compression method
Quote:
I can give you build 1 of the compressor and a Word document of how it works if you like. The document pretty much shows how it can reverse and how it compresses/uncompresses. I'll try to send it to you by tonight because I have to leave in 30 minutes for a couple of hours, but when I come back I should have plenty of time.
Re: New compression method
...and the plot thickens...
Re: New compression method
An update:
Last night, Aalfabob sent me his compressor along with a short description of how it works. I then proceeded to take 5 files from my computer (1 text file, 1 JPG image, 1 exe, 1 rar file, and 1 file of random data) and run them through his compressor. All files ended up being right around 800 bytes. I then emailed the compressed files back to him, and am currently waiting for him to decompress them and send them back to me. I'll let you guys know if they come back the same as the originals as soon as I get them. -Rob
Re: New compression method
Alright, there's something going on with my C++.
A variable is changing after a certain amount of time even though I'm not referring to it anywhere in the whole program except at the start. I've checked to see if I misplaced a line somewhere and I haven't; the compiler says the variable doesn't even come up in the area where it's changing. Any ideas? It seems very odd that it's doing this, unless another program is somehow editing the memory while it's running.
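For what it's worth, a classic cause of this exact symptom in C++ is an out-of-bounds array write silently landing on a neighboring variable, so the variable changes without ever being named near the code that changes it. A minimal sketch of that failure mode, with hypothetical names (this may or may not be what is happening in the compressor):

```cpp
#include <iostream>

int main() {
    int buffer[4];    // valid indices are 0..3
    int counter = 0;  // may sit right next to buffer in memory (layout-dependent)

    // Off-by-one bug: i <= 4 writes buffer[4], one element past the end.
    // That write is undefined behavior; on many compilers it lands on
    // top of `counter`, changing it even though `counter` never appears here.
    for (int i = 0; i <= 4; ++i) {
        buffer[i] = 99;
    }

    std::cout << "counter = " << counter << '\n';  // may print 99 instead of 0
    return 0;
}
```

Because the stray write happens through `buffer`, the compiler correctly reports that `counter` is never referenced in that part of the program, which matches the symptom described.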
Re: New compression method
RBayer checked over my code a little and found a small problem in it. In one of my if statements I put an = instead of an ==, which threw off the results. With this fixed it seems that the files are getting larger. But in another build I've made without this error they seem to get smaller. There is probably an error in that one too, but I'll check it another day just in case it can get some files to really small sizes.
The problem wasn't really with the ability to uncompress the data to its normal state, but with the way the data comes out each time it's compressed. Say you compress it once and it gains compression; the way it is put back into the file makes it impossible to gain any more compression, and it will usually lose some. I don't know how this actually happens, but there must be a law that keeps random data from being compressed (not the one stated before with the n-1 or whatever it was). And as stated before, it is impossible to do this, lol. If anyone wishes to take a look at how I tried to do this, just leave me your email and I'll send you a .doc, or I could just post it here if anyone wishes. Hopefully no one has wasted too much time with this thread.
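For anyone who hasn't been bitten by this one: in C++, `if (x = 0)` assigns instead of compares, so the condition quietly overwrites the variable and then evaluates the assigned value. A minimal sketch of the bug (hypothetical variable name, not the actual compressor code):

```cpp
#include <iostream>

int main() {
    int matches = 5;

    // Bug: a single = assigns 0 to `matches`, and the condition evaluates
    // to that assigned value (0 is false), so the branch never runs and
    // the variable has been silently changed.
    if (matches = 0) {
        std::cout << "never reached\n";
    }
    std::cout << matches << '\n';  // prints 0, not 5

    // Intended comparison: == just tests the value.
    matches = 5;
    if (matches == 0) {
        std::cout << "still not reached, but matches is untouched\n";
    }
    std::cout << matches << '\n';  // prints 5
    return 0;
}
```

This would also explain the earlier "variable changing on its own" mystery: the if statement itself was writing to it. Most compilers will warn about an assignment used as a condition (e.g. GCC/Clang with -Wparentheses) if warnings are turned on.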
Re: New compression method
Post it here please!
Re: New compression method
Quote:
The "law" is pretty simple: you need a certain number of bits in order to represent a certain amount of information. If a file has redundancy, or repeating patterns, it contains less information than if it had no such patterns. The redundancy can be removed and the file can be represented in a smaller form. Text tends to exhibit statistical patterns. Executable programs and pictures and audio have other patterns. Truly random data by definition has no redundancy, and thus can't be compressed without losing information. |
Re: New compression method
Quote:
The reason we can sort of decide whether data is random is that we can easily detect some types of patterns and use those as our "compression scheme."
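As a concrete example of spotting one easy pattern and using it as a compression scheme, here is a minimal run-length encoder sketch (my own illustration, not aalfabob's method). It shows the same asymmetry discussed above: data with an obvious pattern shrinks, and data without one grows, because the run counts are pure overhead:

```cpp
#include <cstddef>
#include <iostream>
#include <string>

// Minimal run-length encoding: each run of identical bytes becomes a
// (count, byte) pair; runs are capped at 255 so the count fits in a byte.
std::string rle_encode(const std::string& data) {
    std::string out;
    for (std::size_t i = 0; i < data.size();) {
        std::size_t run = 1;
        while (i + run < data.size() && data[i + run] == data[i] && run < 255) {
            ++run;
        }
        out.push_back(static_cast<char>(run));  // count byte
        out.push_back(data[i]);                 // value byte
        i += run;
    }
    return out;
}

int main() {
    std::string patterned(1000, 'a');  // one long run: highly redundant
    std::string mixed = "abcdefgh";    // no runs: no redundancy RLE can see

    std::cout << "patterned: " << patterned.size() << " -> "
              << rle_encode(patterned).size() << " bytes\n";  // 1000 -> 8
    std::cout << "no runs:   " << mixed.size() << " -> "
              << rle_encode(mixed).size() << " bytes\n";      // 8 -> 16
    return 0;
}
```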