Re: New compression method
Look, the math behind why this doesn't work is quite simple...
Let me scale it down to 4 bits to keep things simple. Say I make a program and claim that it can compress things to 4 bits. That means there are only 16 possible output files. It is mathematically impossible to losslessly compress more than 16 distinct files into 4 bits with the same procedure, no matter how many times you run it. (In the case of recursion, you'd have to keep track of how many times you ran it, which would immediately add to the file size.) The same concept applies when you claim to be able to compress any 1 megabyte file into 500 bytes. It is simply absurd. Compression algorithms only work because they are good at finding patterns and exploiting them, which is why you will almost never be able to compress random data. Thanks for playing; hit the books and come back with something that's possible.
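A minimal sketch of that counting argument in Python (nothing assumed beyond the pigeonhole principle):

```python
from itertools import product

# There are exactly 2**4 = 16 distinct 4-bit files.
four_bit_files = list(product("01", repeat=4))
assert len(four_bit_files) == 16

# A lossless compressor must be invertible, so no two inputs may share an
# output. Try to squeeze 17 distinct inputs into the 16 possible outputs:
outputs = [i % 16 for i in range(17)]
# At least two inputs collide, so at least one file cannot be recovered.
assert len(set(outputs)) < 17
```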
Re: New compression method
I'd be interested to test out the decompressor as well. My email is fizixman@vgamp.com so let me know when... and send out a non-disclosure agreement or something or other if you're worried.
EDIT: Let's not forget that "Jack" here on Chiefdelphi is his friend... so Aalfabob, you should get together with Jack, show him your program working, and then Jack can testify for you :D
Re: New compression method
I'd be interested in seeing this work as well. You can contact me at DubreuilM@wit.edu.
I'm most interested in seeing how the test will handle a 5MB dump from /dev/random on a UNIX machine.
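For anyone wanting to reproduce that test file, a sketch (assuming GNU coreutils `dd`; the filename is made up, and `/dev/urandom` is used to avoid blocking on systems where `/dev/random` does):

```shell
# Write 5 MB of effectively incompressible random bytes to random5mb.bin
dd if=/dev/urandom of=random5mb.bin bs=1M count=5
```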
Re: New compression method
Mike!! I had no idea anyone else from here went to WIT! I'm a freshman on campus, Tudbury mid. Such a small world, eh?
Anyway, I think this is going to be one of those situations where someone supposedly has something solved in their mind, but then just "forgets" one small, simple thing that sinks the whole boat.
Re: New compression method
Quote:
I am for the most part skeptical till I see something work in front of me, but I love to dream that it could. Don't rain on this person's parade until you absolutely see it not work (and even then, add ways that it may be able to work). By the way, I have come up with a great idea for a perpetual motion machine, but until I can get a prototype made, I won't reveal it. One reason is that I don't want anyone else to take the credit for it; the second is that if it does not work in real life like it does on paper, I don't want the kind of responses that Aalfabob is receiving and the embarrassment of letting myself down in public. Remember those two words, people: Gracious Professionalism. Make your grandmas proud. Don't blindly say that something will or will not work until you see the proof that it does or does not.
Re: New compression method
lolk, I love trying to invent perpetual motion machines! If you ever want someone to take a fresh look at your machine (because I find a fresh pair of eyes will often spot that little flaw that is so obvious... yet so hidden), make me sign a confidentiality agreement or something.
Though lol, sometimes I feel bad for saying, "Umm... you realize blah blah blah, so it won't work..." ;_;
Re: New compression method
Alright, here's an update on how build 5 is going:
I was going to try a new method I came up with about a day ago because I thought it would be a little better. After running a few tests on it, I found that only some files did better than the past method, and sometimes the files even got larger. I haven't finished fully testing this program, because when a file is changed through a pass it ends up with a totally different make-up inside, which means it could actually regain the lost bytes and compress below the original file. Although the first method is a little more complex, I should still be able to meet the deadline I gave before if nothing 'pops up' (life stuff). And if something does happen, it won't delay the test by much. I'm putting as much time as I can into programming it right now and don't have work for the next couple of days. I am taking a break from that method and moving back to the first, which I thought was going to be very slow, but I redesigned most of it to make it actually faster than the new method. I've checked a couple of the files by hand after a couple of runs to see if the data is coming out right, and everything seems fine. That's about it for now.
edit - Some good news: I had already finished the compressor for the first method, so I can basically copy most of the functions over, and with some changes in file setup it should be good to go.
Re: New compression method
So Aalfa, do you take certified cheques/money order or have paypal? roffle
Re: New compression method
Please correct me if I'm wrong, but I think this can help prove it a little better:
Say I have 512 bytes of data I'm compressing. This can contain 3.74 * 10^693 different combinations, correct? Now say I add a byte to that (513 bytes); the combinations that creates are 6.16 * 10^693, or twice as much. So say I have one byte as a header that counts the number of times the data has been compressed. That header can hold 256 values. So depending on which way those bytes come out (3.74 * 10^693), the counter can hold 256 different values, so wouldn't adding that 1 counting byte actually give the file 9.57 * 10^695 combinations (256 * (3.74 * 10^693))? That is a lot more combinations available for the same amount of data. Hopefully I did that right.
Data:
512 byte combinations = 3.74 * 10^693
513 byte combinations = 6.16 * 10^693
513 with one byte being a counter = 9.57 * 10^695
If I'm correct, I think this can prove that that many pieces of randomly generated code can fit in that space. Plus, I'm using a 2 byte main header which can count 65536 runs.
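For what it's worth, using the standard count of distinct byte strings (values ** length, i.e. 256^n for n bytes), a counter byte buys exactly as much capacity as any ordinary byte would, and no more — a quick sketch:

```python
def combos(n_bytes):
    # Number of distinct files of length n_bytes: 256 possible values
    # per byte, so 256 ** n_bytes in total.
    return 256 ** n_bytes

with_counter = 256 * combos(512)   # 512 data bytes plus a 1-byte counter
plain = combos(513)                # an ordinary 513-byte file

# The counter contributes a factor of 256 -- exactly what any other byte
# contributes. It adds no capacity beyond a plain 513th byte.
assert with_counter == plain
```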
Re: New compression method
Any chance of a Mac OS X compile? Interestingly enough, I'm also collaborating with a friend in producing a new compression scheme that uses recursive compression. Don't ask me about it, though. I'm just building the interface.
mrtoast@gmail.com
MrToast
[edit] 121st post! Go Rhode Warriors![/edit]
Re: New compression method
Quote:
Also, I _love_ math and arguing about math, etc., but _please_ use 2^x instead of 10^x. With binary stuff it makes it SO much easier to understand what's actually going on.
Rob
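A small illustration of Rob's point — in powers of two, the byte counting becomes transparent:

```python
# Each byte is 8 bits, so a 512-byte file holds 8 * 512 = 4096 bits.
assert 256 ** 512 == 2 ** (8 * 512)           # i.e. 2**4096 distinct files
# Appending one byte multiplies the count by 2**8 = 256:
assert 2 ** (8 * 513) == 256 * 2 ** (8 * 512)
```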
Re: New compression method
Some of you seem to be judging the algorithm an automatic failure because it cannot possibly compress all possible combinations of files to 515 bytes. But the algorithm doesn't have to be equally successful at compressing every class of file, and the files don't all have to be reduced to 515 bytes to be considered successfully compressed. The most "successful" compression technology becomes so by specializing in one class of file or another, e.g., imagery, text, object, and executable files. Our friend here hasn't yet had the opportunity to run rigorous tests on the wide variety of files within even one class to discover the limitations of his algorithm.
This is a learning experience in developing an algorithm. Everyone can help as independent testers, with developing test procedures, mathematical proofs, test sets, critique, etc. Just keep it positive. The mathematical reasoning posted above is very good experience for Aalfabob, and for any of us, for the type of proof that will be required whenever you develop a commercially viable algorithm. Don't be discouraged to uncover shortcomings of your method.
Re: New compression method
Isn't the formula combinations = bytes ^ values,
so a 512 byte piece of data ^ 256 possible characters = 3.742e+693 combinations?
edit - Never mind, got the wrong formula from someone.
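For the record, the two formulas compared — the swapped one reproduces the 3.74 * 10^693 figure quoted earlier in the thread, while the standard count is vastly larger:

```python
import math

correct = 256 ** 512   # values ** length: each of 512 bytes takes 256 values
swapped = 512 ** 256   # length ** values: the formula retracted above

print(round(512 * math.log10(256)))  # 1233 -> correct is about 10**1233
print(round(256 * math.log10(512)))  # 694  -> swapped is about 3.7 * 10**693
assert correct > swapped
```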
Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
Copyright © Chief Delphi