#1   07-09-2004, 21:53
Aalfabob
Registered User
#0201 (FEDS)

Join Date: Sep 2004
Rookie Year: 2004
Location: Rochester, MI
Posts: 27
Re: New compression method

Quote:
Originally Posted by Max Lobovsky
In case any of you are still in doubt about this "new compression scheme", I encourage you to read this discussion of exactly this matter, (http://www.faqs.org/faqs/compression...section-8.html)

A quote from this document:
Wow, I had no clue anyone else was claiming they could do this. I've read the paper, and from what I understand my method does not fit his argument for the most part. It uses no fancy math and no special numbers, which I'm guessing most of those people try to work with because they're *special*, without any real proof or reason to think they would actually do something. I have to admit, a lot of the flawed processes he talks about I had thought about at one time, and I turned them down because it was easily noticed that they were flawed. I check my ideas from start to finish before I go around claiming that I invented something that works.

Another argument he made was that some of these methods used groups of bits to show compression. Most of those methods are badly flawed because there was no proof that they could ever be reversed. But I do know for a fact that a computer can tell if the start of a group of bits is 1, 00, or 01, which makes them easily separated. There is also another method I had made that is a way to separate the bytes, but I'll explain that in my post when my program is finished (it was thrown away after I found a better, faster way).
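
What's described here is, in effect, a prefix-free code: none of the codewords 1, 00, 01 is a prefix of another, so a decoder really can split a bit stream unambiguously. A minimal sketch of that separability (my own illustration in Python, not the poster's program; the symbol names are made up), with the caveat that separability alone gives no net compression:

[code]
# Decoding with the prefix-free codeword set {1, 00, 01}: because no
# codeword is a prefix of another, the groups separate unambiguously.
CODEWORDS = {"1": "A", "00": "B", "01": "C"}  # hypothetical symbols

def decode(bits: str) -> list[str]:
    symbols, i = [], 0
    while i < len(bits):
        # "1" is a complete codeword; anything starting "0" is two bits.
        # (Assumes a well-formed stream; a trailing lone "0" would fail.)
        width = 1 if bits[i] == "1" else 2
        symbols.append(CODEWORDS[bits[i:i + width]])
        i += width
    return symbols

print(decode("1000111"))  # ['A', 'B', 'C', 'A', 'A']
[/code]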

If this is truly just about whether I am lying about this, just give me 7 days as I posted earlier and the new program will be finished. Just because some guy writes a paper on whether this is possible does not mean he knows 100% what he's talking about. I am positive that he has not tested every single way you can compress a file, which makes his argument about there being no possible way invalid. Every once in a while something *impossible* happens; I'm sure you can think of hundreds of examples. Please just give me the time to prove this. I think the short wait of only 7 days isn't going to kill you. And if you still really think it's so impossible that there is no chance at all, you can exit this post and forget all about it.

edit - By the way, when I finish the program I will post a link to a video of it running, and if that's not enough, because some people are going to say "ahh, that can be easily faked", I'll post some other way it can be proven, unless I somehow get the money for a temporary patent; then I'll just put up the source and the method for everyone.

Last edited by Aalfabob : 07-09-2004 at 22:02.
#2   07-09-2004, 22:43
Rickertsen2
Umm Errr...
None #1139 (Chamblee Gear Grinders)
Team Role: Alumni

Join Date: Dec 2002
Rookie Year: 2002
Location: ATL
Posts: 1,421
Re: New compression method

Quote:
Originally Posted by Aalfabob
[post #1 quoted in full]
You have piqued my interest, but I won't believe anything till I see it. Honestly, just think about what you are claiming and give us a reason why we should believe you and not just think you're a nut. We have been given lots of empty promises but no hard evidence. One question... why did you choose these forums to post your *discovery*? Sure, we are all nerds here and many of us are interested in this sort of thing, but I am sure there are forums dedicated to compression. Why Chief Delphi? I truly would like to believe that your claims are real, but until I see proof you are a nut in my book. This reminds me of the human cloning people a while back who suddenly vanished into the ethersphere when asked to prove what they had done. I have seen too many things like this that all turn out to be nothing. If I were claiming what you are claiming, you would think I was nuts too, so you can't really blame us. I'll admit that many things that were once dismissed as utterly impossible are now integral parts of our everyday lives. Maybe this is one of them, but somehow I am doubtful. I sincerely hope you prove me wrong.

I'm not really sure what I'm talking about and I may be completely wrong, but could this possibly fall under a copyright, which is a much easier process? I think there is even some sort of implied copyright on everything that you don't actually have to file for, but it won't hold up in court as well as an official one. I looked into this at one point a long time ago but I don't really remember much.
__________________
1139 Alumni

Last edited by Rickertsen2 : 07-09-2004 at 23:04.
#3   07-09-2004, 22:56
Aalfabob
Registered User
#0201 (FEDS)

Join Date: Sep 2004
Rookie Year: 2004
Location: Rochester, MI
Posts: 27
Re: New compression method

I understand exactly what you are saying and I would feel the same way. The reason I came to this site is that I was having trouble finding an actual forum on file compression. So I talked to one of my friends, he said this would be a good site, and I made a post.

The only way to give everyone hard proof of this would be to give someone the decompressor and have them provide me with the file that needs compressing. This is the only fair way I can think of to prove that it works while keeping how it works a secret until I can actually make the idea *safe*. Stay patient and I will have a fast, working program ready for this.
#4   08-09-2004, 06:14
Ryan M.
Programming User
FRC #1317 (Digital Fusion)
Team Role: Programmer

Join Date: Jan 2004
Rookie Year: 2004
Location: Ohio
Posts: 1,508
Re: New compression method

Quote:
Originally Posted by Rickertsen2
I'm not really sure what I'm talking about and I may be completely wrong, but could this possibly fall under a copyright, which is a much easier process? I think there is even some sort of implied copyright on everything that you don't actually have to file for, but it won't hold up in court as well as an official one. I looked into this at one point a long time ago but I don't really remember much.
He could copyright any source code released (an open source license is essentially a detailed copyright) or any document detailing the algorithm, but I don't think it would actually protect the algorithm itself. I may be wrong about that.

Putting a copyright notice on is easy, at least. Just stick a "Copyright 2004 by Alphabob (real name, obviously)" on it. And, the best part is, you can do that for free. No registration, nothing.
Last edited by Ryan M. : 08-09-2004 at 06:45.
#5   08-09-2004, 11:45
ErichKeane
Registered User
FRC #3210
Team Role: Mentor

Join Date: Nov 2003
Rookie Year: 2004
Location: Hillsboro, OR
Posts: 113
Re: New compression method

A few things: newer compression tools aren't judged purely on raw ability to compress. Even if you COULD do what you claim, which I highly doubt, it would not be very worthwhile.

Right now, hard drive space is cheap. Time is not cheap. So if your algorithm took, say, 2 hours to turn my 1 GB file into a 500-byte file, and WinRAR turns it into 400 MB in 3 minutes, what is the advantage? Sure, I saved about 400 MB, but I lost 2 hours! Also, you can get a hard drive for a buck a gig, so what's the point?

Also, most people who care about large files have broadband. I personally downloaded the entire Mandrake Linux pack a few years ago on a 1 Mbit connection, so I let it run overnight. Consider this: the ISO files totaled about 2 GB and took me overnight to download; no big deal. Using WinRAR, I possibly could have gotten those 2 GB down to about 1.3 GB, but it would have taken 20 minutes or so.

Now, given that you say it is an exponentially growing algorithm, I would have no problem assuming this compression/decompression would take longer than it took to download the whole file on my DSL connection.

NOW, move that file across my 10 Mbit connection at college, and compression becomes useless. The order of priorities for compression used to be: size, speed, packageability. Now the order is: packageability, size, speed.

So unless your "algorithm" can turn a 2 GB file into a 500-byte file within 20 minutes, it would not be worthwhile.
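
The tradeoff he's pointing at can be made concrete: compression pays off only when compression time plus transfer time of the compressed file beats the alternatives. A rough sketch plugging in the figures from this post (illustrative numbers from the thread, not benchmarks):

[code]
# Total delivery time = compression time + transfer time.
def transfer_s(size_mb, mbit_per_s):
    return size_mb * 8 / mbit_per_s

LINK = 1.0  # 1 Mbit/s DSL, as in the post

raw     = transfer_s(1024, LINK)               # send the 1 GB as-is
winrar  = 3 * 60 + transfer_s(400, LINK)       # 3 min -> 400 MB
claimed = 2 * 3600 + transfer_s(0.0005, LINK)  # 2 h   -> ~500 bytes

print(f"raw:     {raw/60:6.1f} min")      # ~136.5 min
print(f"winrar:  {winrar/60:6.1f} min")   # ~ 56.3 min
print(f"claimed: {claimed/60:6.1f} min")  # ~120.0 min
# On a 10 Mbit/s college link the raw transfer drops to ~13.7 min,
# and the 2-hour compressor loses outright.
[/code]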
#6   08-09-2004, 12:34
FizMan
aboot, eh?
AKA: Chris Sinclair
#0783 (Mobotics)
Team Role: Alumni

Join Date: Feb 2004
Location: Toronto, Canada
Posts: 102
Re: New compression method

Sure it would be worthwhile... I mean, imagine being able to store entire hard drives on, say, your USB pen, or on a floppy. Set your computer to make weekly (or nightly) backups overnight to a 1 KB file. And as processors keep evolving, compression times will keep dropping too. And you probably wouldn't need to compress your files down to 500 bytes... bringing them down to, say, 10 megs would be just as ideal (or 1+ megs for your floppy), or to 700 megs to toss on a CD.

It would naturally change many industries too, especially if under professional development the processing time can be reduced. Transfer full-length feature films to the theatres for digital real-time screening... not to mention all those 56k'ers out there. Mmmm, piracy.
__________________
Joules per second! Watt? Joules per second! Watt? Jouls per second! Watt?
#7   08-09-2004, 12:35
Fulcrum2000
Registered User
no team

Join Date: Sep 2004
Location: Netherlands
Posts: 7
Re: New compression method

This claim is total BS. Iterative compression simply isn't possible, as it will *increase* file size instead of reducing it.

Regards,
Werner Bergmans
Webmaster http://www.maximumcompression.com/
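
The confidence behind this flat "impossible" is the counting argument: there are fewer short bit strings than long ones, so no lossless scheme can shrink every input, and compressed output looks essentially random. Feeding a real compressor its own output shows the size creeping *up*, exactly as stated; a quick demonstration with zlib (any compressor behaves the same way):

[code]
import os, zlib

# Random data stands in for "already compressed" data; each pass
# adds framing overhead instead of shrinking anything.
data = os.urandom(100_000)
for i in range(1, 5):
    data = zlib.compress(data, 9)
    print(f"pass {i}: {len(data):,} bytes")
# Typical output: pass 1 is already slightly larger than 100,000
# bytes, and every later pass grows by a few more bytes.
[/code]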
#8   08-09-2004, 13:24
ErichKeane
Registered User
FRC #3210
Team Role: Mentor

Join Date: Nov 2003
Rookie Year: 2004
Location: Hillsboro, OR
Posts: 113
Re: New compression method

I'm calling BS too. As I said before, even if you could compress stuff really well, it wouldn't be worth the time.

Processing power is expensive: it takes a TON of money to get a processor that could run an iterative compression method in any reasonable amount of time.

Storage, on the other hand, is relatively cheap. You said that making nightly backups would be the way to go, but what happens if nightly backups take, say, 16 hours? It makes it not worth it, especially once you factor in transfer time. It is just as cheap to set up a RAID array and live with an extra HD around.

Compression was big a few years ago, but it really isn't that important right now. Most corporations won't even trust compression, because compressed files lack the failsafes that maintain the integrity of the data. In a compressed file, it is possible to ruin the whole thing with a relatively small amount of loss (slightly above 1 byte). In the "method" described by the thread starter, I would assume that a loss of 4 bits could wreck the file completely.

In a real file, there are enough failsafes that it may take as much as 30% loss to make it unrecoverable. The same cannot be said about a compressed file.
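
That fragility is easy to demonstrate with any mainstream compressor (zlib below is a stand-in, not the thread starter's method): flip a single bit in the middle of a compressed stream and decompression typically fails outright.

[code]
import zlib

original = b"The quick brown fox jumps over the lazy dog. " * 200
packed = bytearray(zlib.compress(original))

packed[len(packed) // 2] ^= 0x01   # flip one bit mid-stream

try:
    zlib.decompress(bytes(packed))
except zlib.error as e:
    # Almost always lands here: either the deflate stream becomes
    # invalid or the Adler-32 check catches the corrupted output.
    print("decompression failed:", e)
[/code]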
#9   08-09-2004, 15:04
Ryan M.
Programming User
FRC #1317 (Digital Fusion)
Team Role: Programmer

Join Date: Jan 2004
Rookie Year: 2004
Location: Ohio
Posts: 1,508
Re: New compression method

I have to argue from the other side: compression is good. For instance, I have a 120 GB HD and a 40 GB HD in my computer right now. (For those lazy out there, that's 160 GB of total storage.) However, I still don't appreciate the 9 GB that 4 Fedora Core, 3 Mandrake, 3 Red Hat, 3 BSD, and a few different live CD ISOs are taking up.

Also, consider dialup users who pay by the minute. If you had the choice between a 100 MB download and a 0.5 KB download which are exactly the same thing, which would you go for? Sure, the decompression takes time, but you're not paying for the time online you would spend downloading a larger file. This would also be great for amateur developers/webmasters who use a cheap or free web server with bandwidth limits.
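
At dialup rates the arithmetic is stark; a quick back-of-the-envelope at a nominal 56 kbit/s, ignoring protocol overhead:

[code]
# Download time for 100 MB vs 0.5 KB on a 56 kbit/s dialup line.
def seconds(size_bytes, bits_per_s=56_000):
    return size_bytes * 8 / bits_per_s

print(f"100 MB: {seconds(100e6) / 3600:.1f} hours")  # ~4.0 hours
print(f"0.5 KB: {seconds(500):.2f} seconds")         # ~0.07 seconds
[/code]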
#10   08-09-2004, 15:08
Aalfabob
Registered User
#0201 (FEDS)

Join Date: Sep 2004
Rookie Year: 2004
Location: Rochester, MI
Posts: 27
Re: New compression method

OK, about these speed issues.
This is the information I have compiled about the compressor so far:
It is able to run the compression routine on 7.5 MB in 1 second (it is actually a lot faster than this, but I'm leaving headroom in case I have any more ideas for the test version). Now, if a file is highly in the compressor's favor, it can reach around a 10.9% gain in that one second for 7.5 MB. This means it would shorten the file from 7.5 MB to around 6.67 MB in close to 1 second, depending on your CPU and memory clock speed. This test was taken on a 1.47 GHz Athlon XP with 133 MHz memory.

Another thing, about the time being exponentially larger:
Even though it has more passes to run for a small gain, the data being compressed is much smaller, which gives a pretty constant change versus time. If you look at the graphs I've posted earlier, you can see that they include no time, only passes versus file size.

By the way, a file being in the compressor's "favor" does not have a very great chance of happening with random files, but it still can occur. On files with characters near each other (text files, logs, etc.) it will be very close to its max gain per pass (10.9%). On average, my last build, which wasn't programmed very well, came to around 1.5% to 4% per pass on a .rar file, but the current build has been heavily modified to get higher compression per pass.

Oh, and here is another graph that shows a file can actually get larger on a given pass, but in the end the method trends in favor of getting smaller. (I have tested many files of different sizes and types; this is just one.)

edit - If you think these times are impossible: they come from the file being loaded into memory, and of course it takes larger files in chunks if memory is needed. The speed also comes from the compression process consisting of a couple of if-thens, the qsort routine, and then the bytes being put back into memory. I expect that a professional programmer would be able to cut these times down by a ton using assembly or faster libraries.
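
Taking the claimed figures at face value, the per-pass model is easy to extrapolate. The sketch below projects how many passes it would take to reach the ~500-byte range people keep bringing up (these are extrapolations of the numbers in this post, not measurements):

[code]
import math

# Passes to shrink `start` bytes to `target` bytes at a fixed
# per-pass gain: start * (1 - gain)**n <= target.
def passes(start, target, gain):
    return math.ceil(math.log(target / start) / math.log(1 - gain))

for gain in (0.109, 0.04, 0.015):
    print(f"{gain:6.1%} per pass: {passes(7.5e6, 500, gain):4d} passes")
# 10.9% -> 84 passes, 4.0% -> 236, 1.5% -> 637. Since the data shrinks
# each pass, total work is a geometric series (~start/gain bytes, about
# 69 MB at 10.9%), i.e. under 10 s at the claimed 7.5 MB/s. So the
# running time isn't the sticking point; the counting argument is.
[/code]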
[Attachment: Graph3.jpg]

Last edited by Aalfabob : 08-09-2004 at 15:15.
#11   08-09-2004, 17:08
Fulcrum2000
Registered User
no team

Join Date: Sep 2004
Location: Netherlands
Posts: 7
Re: New compression method

Quote:
Originally Posted by Aalfabob
edit - If you think these times are impossible: they come from the file being loaded into memory, and of course it takes larger files in chunks if memory is needed.
No, I don't think these times are impossible, but I *know for sure* that your claims about achieved compression are utter <deleted>. Please stop spamming the forum with claims which are totally impossible.

I'll make you a deal. Compress one of the test files on my website (http://www.maximumcompression.com/) and send me the compressed file plus the compiled decoder. I will then decompress the file on my computer. If after decompression the original file is restored (binary identical to the original), and the size of the decoder + compressed file is less than 95% of the best compressor listed for that test, you get $10,000 from me. Deal?

PS Before decompressing I'm allowed to change the name of the compressed file to anything I like.

Last edited by D.J. Fluck : 08-09-2004 at 17:58. Reason: language
#12   08-09-2004, 17:24
Dave Flowerday
Software Engineer
VRC #0111 (Wildstang)
Team Role: Engineer

Join Date: Feb 2002
Rookie Year: 1995
Location: North Barrington, IL
Posts: 1,366
Re: New compression method

Quote:
Originally Posted by Fulcrum2000
No, I don't think these times are impossible, but I *know for sure* that your claims about achieved compression are utter {bleep}. Please stop spamming the forum with claims which are totally impossible.
I am finding this discussion interesting, but vulgar language is not appreciated by most of us here. And I find it a little ironic that someone who is completely new to this forum and obviously doesn't even understand our etiquette would accuse someone else of spamming.
#13   08-09-2004, 17:37
Aalfabob
Registered User
#0201 (FEDS)

Join Date: Sep 2004
Rookie Year: 2004
Location: Rochester, MI
Posts: 27
Re: New compression method

Quote:
Originally Posted by Fulcrum2000
[the $10,000 challenge in post #11, quoted in full]
OK, let me show you why this bet is near impossible...

You said decoder + compressed file has to be less than 95% of today's best compressor. Well, the .exe tested on that site is only 3,870,784 bytes. The top-scoring compressor reached a final size of 953,785 bytes. You want me to get that down to a size of 47,689 bytes (47 KB at 95%). OK, so even if my compression could knock it down to 515 bytes, that leaves 47,174 bytes for the program. Um, I don't really know how you want me to get my program that small without spending hundreds of hours in assembly just to save on the size of the decompressor. Right now, at a little over 100 lines, I'm at 626,747 bytes. Unless I'm reading your challenge wrong, it seems a little impossible...
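
For what it's worth, he appears to be reading the challenge backwards: "less than 95% of the best compressor" sets the target at 95% *of* the best result, not 95% *off* it. The two readings differ by an order of magnitude:

[code]
best = 953_785  # best listed result for that test file, in bytes

print(f"95% of the best:  {best * 95 // 100:,} bytes")  # 906,095 (actual target)
print(f"95% off the best: {best * 5 // 100:,} bytes")   #  47,689 (the reading above)
[/code]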

Last edited by D.J. Fluck : 08-09-2004 at 18:10. Reason: Language Editing in Quote
#14   08-09-2004, 15:32
Alan Anderson
Software Architect
FRC #0045 (TechnoKats)
Team Role: Mentor

Join Date: Feb 2004
Rookie Year: 2004
Location: Kokomo, Indiana
Posts: 9,112
Re: New compression method

Quote:
Originally Posted by ErichKeane
In a real file, there are enough failsafes that it may take as much as 30% loss to make it unrecoverable. The same cannot be said about a compressed file.
Assuming the compressed files are under 1 KB in size, it's no big deal to add "failsafes" such as error-correcting codes or even brute-force redundancy after the compression. That will of course increase the resulting file size, but if an original megabyte of raw data still ends up as less than a few kilobytes of damage-tolerant compressed data, it's a major net win.
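
A minimal sketch of the brute-force-redundancy option (my illustration; a real system would use a proper error-correcting code such as Reed-Solomon): keep three copies of the compressed blob and majority-vote each byte on read.

[code]
from collections import Counter

def protect(blob):                    # triple-modular redundancy
    return blob * 3

def recover(stored):
    n = len(stored) // 3
    copies = [stored[i * n:(i + 1) * n] for i in range(3)]
    # Per-byte majority vote across the three copies.
    return bytes(Counter(col).most_common(1)[0][0] for col in zip(*copies))

payload = b"\x42" * 500               # stand-in for a 500-byte "archive"
stored = bytearray(protect(payload))
stored[7] ^= 0xFF                     # corrupt one byte of one copy
assert recover(bytes(stored)) == payload
[/code]

Tripling a sub-kilobyte file still leaves it at a few kilobytes, which is the point being made here.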

Given a sufficiently restricted set of possible input files and a sufficiently large shared database, I can achieve miraculous compression too. For example, I can "encode" any static data currently on the World Wide Web into a short string of characters: just reference it by URL. But arbitrary multimegabyte files compressed to 500-odd bytes? To say I am skeptical would be an understatement.
#15   08-09-2004, 15:46
FizMan
aboot, eh?
AKA: Chris Sinclair
#0783 (Mobotics)
Team Role: Alumni

Join Date: Feb 2004
Location: Toronto, Canada
Posts: 102
Re: New compression method

Well, I guess we'll find out in less than a week's time, eh?

PERSONALLY, and I think I speak for everyone here, I hope this is the real deal.


I wonder, though, if you could give us a quick run-down on the logic behind your algorithm... I mean, for example, how do you manage to get around the counting problem (the 2^n - 1 thingamajiggy)? Are you somehow incorporating non-base-10 mathematics or something?
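
For readers who haven't seen it, the "counting problem" is this: the number of bit strings strictly shorter than n bits is 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1, which is one less than the 2^n distinct files of length n. So any scheme that shortens *every* n-bit file must send two different inputs to the same output, and no decompressor can tell them apart. The argument counts files rather than digits, so the number base makes no difference.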
__________________
Joules per second! Watt? Joules per second! Watt? Jouls per second! Watt?