#1
Re: New compression method
Quote:
Another argument he made was that some of these methods use groups of bits to show compression. Most of those methods are badly flawed because there was no proof that they could ever be reversed. But I do know for a fact that a computer can tell whether a group of bits starts with 1, 00, or 01, which makes the groups easy to separate. There is also another method I made that separates the bytes a different way, but I'll explain that in a post when my program is finished (it was thrown away because I found a better, faster way).

If this is really just about whether I am lying, then just give me the 7 days I mentioned earlier and the new program will be finished. Just because some guy writes a paper on whether this is possible does not mean he knows 100% what he's talking about. I am positive he has not tested every single way you can compress a file, which makes his argument that there is no possible way invalid. Every once in a while something *impossible* happens, and I'm sure you can think of hundreds of examples, but please just give me the time to prove this. I think a short wait of only 7 days isn't going to kill you. And if you still really think it's so impossible that there is no chance at all, you can leave this thread and forget all about it.

edit - By the way, when I finish the program I will post a link to a video of it running. If that's not enough, because some people will say "ah, that can be easily faked," I'll post some other way it can be proven, unless I somehow get the money for a temporary patent, in which case I'll just put up the source and the method for everyone.

Last edited by Aalfabob : 07-09-2004 at 22:02.
#2
Re: New compression method
Quote:
I'm not really sure what I'm talking about and I may be completely wrong, but could this possibly fall under copyright, which is a much easier process? I think there is even some sort of implied copyright on everything that you don't actually have to file for, though it won't hold up in court as well as an official one. I looked into this at one point a long time ago, but I don't remember much.

Last edited by Rickertsen2 : 07-09-2004 at 23:04.
#3
Re: New compression method
I understand exactly what you are saying and I would feel the same way. The reason I came to this site is because I was having trouble finding an actual forum on file compression. So I talked to one of my friends and he said that this would be a good site so I made a post.
The only way to give everyone hard proof of this would be to give someone the decompressor, and have them provide me with the file that needs compressing. This is the only fair way I can think of that would prove that it works while keeping how it works a secret until I can actually make the idea *safe*. Stay patient and I will have a fast and working program ready for this.
#4
Re: New compression method
Quote:
Putting a copyright notice on is easy, at least. Just stick "Copyright 2004 by Alphabob (real name, obviously)" on it. And the best part is, you can do that for free. No registration, nothing.

Last edited by Ryan M. : 08-09-2004 at 06:45.
#5
Re: New compression method
A few things: newer compression methods are not judged purely on how much they can compress. Even if you COULD do what you claim, which I highly doubt, it would not be very worthwhile.

Right now, hard drive space is cheap. Time is not. So if your algorithm takes, say, 2 hours to turn my 1 GB file into a 500-byte file, and WinRAR turns it into 400 MB in 3 minutes, what is the advantage? Sure, I saved about 400 MB, but I lost 2 hours. Also, you can get a hard drive for about a buck a gig, so what's the point? And most people who care about large files have broadband. I personally downloaded the entire Mandrake Linux pack a few years ago on a 1 Mbit connection; I just let it run overnight. Consider this: the ISO files totaled about 2 GB and took overnight to download, so no big deal. Using WinRAR, I possibly could have gotten those 2 GB down to about 1.3 GB, but it would have taken 20 minutes or so. Now, given that you say the algorithm's run time grows exponentially, I would have no problem assuming the compression/decompression would take longer than downloading the whole file did on my DSL connection. Put that file across my 10 Mbit connection at college, and compression becomes useless.

It used to be that the order of priorities for compression was: size, speed, packageability. Now the order is: packageability, size, speed. So unless your "algorithm" can turn a 2 GB file into a 500-byte file within 20 minutes, it would not be worthwhile.
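To put rough numbers on that trade-off, here is a back-of-the-envelope sketch in Python. The compression times, file sizes, and 10 Mbit link speed are the hypothetical figures from the post above, not measurements of any real program:

    # Total time to deliver a file = time to compress it + time to transfer it.
    # All numbers are the hypothetical figures from the post, not benchmarks.

    def total_minutes(compress_min, size_mb, link_mbit_per_s):
        transfer_min = (size_mb * 8.0) / link_mbit_per_s / 60.0
        return compress_min + transfer_min

    link = 10.0  # Mbit/s, the college connection mentioned above

    # A 2-hour "miracle" compressor producing a 500-byte (~0.0005 MB) file:
    print(round(total_minutes(120, 0.0005, link), 1), "minutes total")
    # A 3-minute WinRAR-style run producing a 400 MB file:
    print(round(total_minutes(3, 400, link), 1), "minutes total")

With these assumed numbers the ordinary compressor wins by a wide margin (about 8 minutes versus 2 hours), which is the point the post is making.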
#6
Re: New compression method
Sure it would be worthwhile... I mean, imagine being able to store entire hard drives on, say, your USB pen, or on a floppy. Set your computer to make weekly (or nightly) backups overnight to a 1 KB file. And as processors keep evolving, compression times will keep dropping. You probably wouldn't even need to compress your files down to 500 bytes; bringing them down to, say, 10 MB would be just as ideal (or 1+ MB for your floppy), or to 700 MB to toss on a CD.

It would naturally change many industries too, especially if the processing time can be reduced under professional development. Transfer full-length feature films to theatres for digital real-time screening... not to mention all those 56k'ers out there. Mmmm, piracy.
#7
Re: New compression method
This claim is total BS. Iterative compression simply isn't possible, as it will *increase* file size instead of reducing it.

Regards,
Werner Bergmans
Webmaster, http://www.maximumcompression.com/
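To illustrate the point, here is a small Python sketch using zlib as a stand-in for a general-purpose compressor (it is not the method being discussed in this thread): re-compressing a compressor's own output quickly stops paying off.

    import zlib

    # The first pass shrinks ordinary, redundant data a lot. After that, the
    # output looks essentially random, and each further pass typically adds a
    # few bytes of overhead instead of removing anything.
    data = b"The quick brown fox jumps over the lazy dog. " * 2000
    sizes = [len(data)]
    for _ in range(5):
        data = zlib.compress(data, 9)
        sizes.append(len(data))
    print(sizes)  # e.g. a large drop on pass 1, then the sizes stop shrinking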
#8
Re: New compression method
I'm calling BS too. As I said before, even if you could compress stuff really well, it wouldn't be worth the time.

Processing power is expensive: it takes a TON of money to get a processor that could run an iterative compression method in any reasonable amount of time. Storage, on the other hand, is relatively cheap. You said that making nightly backups would be the way to go; what happens if a nightly backup takes, say, 16 hours? It makes it not worth it, especially once you factor in transfer time. It is just as cheap to set up a RAID array and just live with an extra hard drive around.

Compression was big a few years ago, but it really isn't that important right now. Many corporations won't even trust compression, because compressed files lack the failsafes that maintain the integrity of the data. In a compressed file, it is possible to ruin the entire file with a relatively small amount of corruption (little more than a byte). In the "method" described by the thread starter, I would assume a loss of 4 bits could wreck the file completely. In an uncompressed file, there is enough redundancy that it may take as much as 30% corruption to make the file unrecoverable. The same cannot be said about a compressed file.
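As a small illustration of the fragility point, here is a Python sketch using zlib as a stand-in for any real compressor (not the scheme discussed in this thread): flipping a single byte in the compressed stream typically makes the entire payload unrecoverable, whereas the same flip in the raw text would damage only one character.

    import zlib

    original = b"The quick brown fox jumps over the lazy dog. " * 100
    packed = bytearray(zlib.compress(original))

    packed[len(packed) // 2] ^= 0xFF  # corrupt one byte in the compressed stream

    try:
        restored = zlib.decompress(bytes(packed))
        print("stream still parsed; identical to original:", restored == original)
    except zlib.error as exc:
        print("decompression failed after a single corrupted byte:", exc)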
#9
Re: New compression method
I have to argue from the other side: compression is good. For instance, I have a 120 GB HD and a 40 GB HD in my computer right now (for the lazy out there, that's 160 GB of total storage). However, I still don't appreciate the 9 GB that 4 Fedora Core, 3 Mandrake, 3 Red Hat, 3 BSD, and a few different live CD ISOs are taking up.

Also, consider dialup users who pay by the minute. If you had the choice between a 100 MB download and a 0.5 KB download containing exactly the same data, which would you go for? Sure, the decompression takes time, but you're not paying for the time you would spend online downloading the larger file. This would also be great for amateur developers and webmasters who use cheap or free web hosting with bandwidth limits.
#10
Re: New compression method
OK, about these speed issues.

This is the information I have compiled about the compressor so far. It is able to run the compression routine on 7.5 MB in 1 second (it is actually a lot faster than this, but I'm leaving headroom in case I have more ideas for the test version). Now, if the file were highly in the compressor's favor, it could reach around a 10.9% gain in that one second for 7.5 MB. That means it would shorten the file from 7.5 MB to around 6.67 MB in close to 1 second, depending on your CPU and memory clock speed. This test was taken on a 1.47 GHz Athlon XP with 133 MHz memory.

Another thing about the time being exponentially larger: even though it takes more passes for a small gain, the data being compressed is much smaller each time, which gives a pretty constant change versus time. If you look at the graphs I posted earlier, you can see that they show no time, only passes versus file size. By the way, a file being in the compressor's 'favor' does not have a very great chance of happening with random files, but it still can occur. On files with characters near each other (text files, logs, etc.) it will be very close to its maximum gain per pass (10.9%). On average, my last build, which wasn't programmed very well, came to around 1.5% to 4% per pass on a .rar file, but the current build has been heavily modified to get higher compression per pass. Oh, and here is another graph showing that sometimes a file can actually get larger for a run, but in the end the method tends toward getting smaller. (I have tested many files of different sizes and types; this is just one.)

edit - If you think these times are impossible, they come from the file being loaded into memory, and of course larger files are taken in chunks if memory is needed. The speed also comes from the compression process consisting of a couple of if-thens, the qsort routine, and then the bytes being put back into memory. I expect a professional programmer could cut these times down by a ton using assembly or faster libraries.

Last edited by Aalfabob : 08-09-2004 at 15:15.
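Taking the figures in the post above at face value (10.9% best-case reduction per pass, 7.5 MB processed per second), here is a quick Python estimate of the number of passes and the total processing needed to shrink a 7.5 MB file to about 500 bytes. This is only arithmetic on the stated numbers, not a test or endorsement of the method:

    import math

    start_bytes = 7.5 * 1024 * 1024    # starting file size claimed in the post
    target_bytes = 500
    gain_per_pass = 0.109              # best-case per-pass reduction quoted above
    rate_bytes_per_s = 7.5 * 1024 * 1024

    # Each pass multiplies the size by (1 - 0.109), so solve 0.891**p <= target/start.
    passes = math.ceil(math.log(target_bytes / start_bytes) / math.log(1 - gain_per_pass))

    # Each pass re-reads the (shrinking) file, so total work is a geometric series.
    total_bytes = start_bytes * (1 - (1 - gain_per_pass) ** passes) / gain_per_pass
    print(passes, "passes, about", round(total_bytes / rate_bytes_per_s), "seconds of processing")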
#11
Re: New compression method
Quote:
I'll make you a deal. Compress one of the test files on my website (http://www.maximumcompression.com/) and send me the compressed file plus the compiled decoder. I will then decompress the file on my computer. If, after decompressing, the original file is restored (binary identical to the original) and the size of the decoder plus compressed file is less than 95% of the best compressor listed for that test, you get $10,000 from me! Deal?

PS: Before decompressing, I'm allowed to change the name of the compressed file to anything I like.

Last edited by D.J. Fluck : 08-09-2004 at 17:58. Reason: language
#12
Re: New compression method
Quote:
#13
Re: New compression method
Quote:
You said compressor + compressed file equals less than today's best compressor. Well, the .exe tested on that site is only 3,870,784 bytes. The top-scoring compressor achieved a final size of 953,785 bytes. You want me to get that down to a size of 47,689 bytes (47 KB at 95%). OK, so even if my compression could knock it down to 515 bytes, that leaves 47,174 bytes for the program. I don't really know how you expect me to get my program that small without spending hundreds of hours in assembly just to shrink the decompressor. Right now, at a little over 100 lines, I'm at 626,747 bytes. Unless I'm reading your challenge wrong, it seems a little impossible...

Last edited by D.J. Fluck : 08-09-2004 at 18:10. Reason: Language Editing in Quote
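For reference, a quick check of the arithmetic using the figures quoted above: 95% of the best published result is roughly 906 KB for decoder plus compressed file together, while the 47,689-byte figure corresponds to 5% of it:

    best_result = 953785  # bytes, best compressor result quoted above

    print("95% of best:", int(best_result * 0.95), "bytes")  # about 906,095
    print(" 5% of best:", int(best_result * 0.05), "bytes")  # about 47,689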
#14
Re: New compression method
Quote:
Given a sufficiently restricted set of possible input files and a sufficiently large shared database, I can achieve miraculous compression too. For example, I can "encode" any static data currently on the World Wide Web into a short string of characters: just reference it by URL. But arbitrary multi-megabyte files compressed to 500-odd bytes? To say I am skeptical would be an understatement.
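Here is a toy Python sketch of that "shared database" point (the key names and data are made up for illustration): when both sides already hold the data, any file in the database can be "compressed" to a short key, but nothing has actually gotten smaller; the bytes simply live somewhere else.

    # A dict standing in for the enormous shared database (e.g. the Web).
    shared_database = {
        "doc-001": b"imagine many megabytes of data both sides already have",
    }

    def compress_by_reference(payload):
        """Return a short key if the payload is already in the shared database."""
        for key, value in shared_database.items():
            if value == payload:
                return key
        raise ValueError("not in the shared database; no miracle available")

    def decompress_by_reference(key):
        return shared_database[key]

    blob = shared_database["doc-001"]
    key = compress_by_reference(blob)
    assert decompress_by_reference(key) == blob
    print("'compressed'", len(blob), "bytes down to a", len(key), "character key")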
#15
Re: New compression method
Well, I guess we'll find out in less than a week's time, eh?

PERSONALLY, and I think I speak for everyone here, I'd hope that this is the real deal. I wonder, though, if it would be possible for you to give us a quick rundown of the logic behind your algorithm... I mean, for example, how you manage to get around the counting problem (the 2^n - 1 thingamajiggy). Are you somehow incorporating non-base-10 mathematics or something?
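For readers who have not seen it, the counting problem mentioned above is the standard pigeonhole argument, sketched here in Python; it holds regardless of number base or encoding, and it is not tied to any particular method in this thread:

    # There are 2**n distinct n-bit files, but only 2**n - 1 files that are
    # strictly shorter (lengths 0 through n-1, counting the empty file).
    # A lossless compressor must map different inputs to different outputs,
    # so it cannot make every n-bit file shorter, whatever base it works in.

    def count_inputs(n):
        """Distinct bit strings of length exactly n."""
        return 2 ** n

    def count_shorter(n):
        """Distinct bit strings of length 0 .. n-1, including the empty string."""
        return sum(2 ** k for k in range(n))  # equals 2**n - 1

    for n in (1, 8, 64):
        assert count_shorter(n) < count_inputs(n)
        print(f"n={n}: {count_inputs(n)} possible inputs, only {count_shorter(n)} shorter outputs")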