#31, 08-09-2004, 15:32
Alan Anderson, Software Architect, FRC #0045 (TechnoKats), Mentor
Re: New compression method

Quote:
Originally Posted by ErichKeane
In a real file, there are enough failsafes that it may take as much as 30% damage to make a file unrecoverable. The same cannot be said about a compressed file.
Assuming that the compressed files are under 1k in size, it's no big deal to add "failsafes" such as error-correcting codes or even brute force redundancy after the compression. That will of course increase the resulting file size, but if an original megabyte of raw data still ends up less than a few kilobytes of damage-tolerant compressed data, it's a major net win.
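For illustration, here is a minimal Python sketch of that brute-force-redundancy idea: compress first, then store several copies of the small compressed payload and majority-vote each byte on the way back. (The repetition scheme and names are an illustration, not a specific proposal from this thread; a real design would use a proper error-correcting code such as Reed-Solomon.)

import zlib

def protect(data: bytes, copies: int = 3) -> bytes:
    """Compress, then store several copies of the compressed payload."""
    payload = zlib.compress(data)
    return len(payload).to_bytes(4, "big") + payload * copies

def recover(blob: bytes, copies: int = 3) -> bytes:
    """Majority-vote each byte position across the copies, then decompress."""
    n = int.from_bytes(blob[:4], "big")
    chunks = [blob[4 + i * n : 4 + (i + 1) * n] for i in range(copies)]
    voted = bytes(max(set(col), key=col.count) for col in zip(*chunks))
    return zlib.decompress(voted)

With three copies, any single corrupted copy at a given byte position is outvoted, and a megabyte that compresses to a few kilobytes still comes out as a few kilobytes.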

Given a sufficiently restricted set of possible input files and a sufficiently large shared database, I can achieve miraculous compression too. For example, I can "encode" any static data currently on the World Wide Web into a short string of characters: just reference it by URL. But arbitrary multimegabyte files compressed to 500-odd bytes? To say I am skeptical would be an understatement.
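The URL point is easy to make concrete: with a big enough database shared by both sides, the "compressed file" is just a lookup key. A toy Python sketch (names are hypothetical):

SHARED_DATABASE: dict[bytes, bytes] = {}  # both sides must already hold this

def encode(data: bytes) -> bytes:
    # "Compress" any file to a 4-byte key by smuggling the contents
    # into the shared database, which is where the information really lives.
    key = len(SHARED_DATABASE).to_bytes(4, "big")
    SHARED_DATABASE[key] = data
    return key

def decode(key: bytes) -> bytes:
    return SHARED_DATABASE[key]

No information was compressed; it just moved into state the decoder already has, which is why such schemes don't count as compression.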
#32, 08-09-2004, 15:46
FizMan (Chris Sinclair), #0783 (Mobotics), Alumni
Re: New compression method

Well, I guess we'll find out in less than a week's time, eh?

PERSONALLY, and I think I speak for everyone here, I'd hope that this is the real deal.


I wonder though if it would be possible for you to give us a quick rundown on the logic behind your algorithms... I mean, for example, how do you manage to get around the counting problem (the 2^n-1 thingamajiggy)? Are you somehow incorporating non-base-10 mathematics or something?
#33, 08-09-2004, 16:06
Ryan M., FRC #1317 (Digital Fusion), Programmer
Re: New compression method

Quote:
Originally Posted by FizMan
I mean, for example, how do you manage to get around the counting problem (the 2^n-1 thingamajiggy)? Are you somehow incorporating non-base-10 mathematics or something?
Counting problem? What do you mean?
#34, 08-09-2004, 16:10
Aalfabob, #0201 (FEDS)
Re: New compression method

Quote:
Originally Posted by FizMan
I wonder though if it would be possible for you to give us a quick rundown on the logic behind your algorithms... how do you manage to get around the counting problem (the 2^n-1 thingamajiggy)?
See, most of these claims of great compression ratios are made by asserting that you will always gain compression down to some so-called limit. I have my theorem set up so that it tends toward gaining compression. That doesn't mean it can't gain size either; it just means that in the long run the file gets smaller. And since my theorem uses headers to describe what has happened to each piece of data, it has to have a limit, because the headers take up space and so does the data they describe.

These headers are 2 bytes each and just carry predefined info that the decompressor already knows, so that part is 100% reversible. Next, the compressed bytes are either 7-bit, 8-bit, or 9-bit: a 7-bit code starts with 1 followed by 6 bits, an 8-bit code with 00 then 6 bits, and a 9-bit code with 01 then 7 bits. As you can see, that covers all 256 values of the ASCII set and is easily reversible. That is a pretty big part of how my compression decompresses. That's really all I think I can show right now.
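As best one can reconstruct it from that description, the bit-level part is a prefix code, something like this Python sketch (which byte values map to which code class is an assumption; the post doesn't say):

def encode_byte(b: int) -> str:
    # Assumed assignment of byte values to the three code classes:
    if b < 64:                            # 64 values -> '1' + 6 bits (7 bits)
        return "1" + format(b, "06b")
    if b < 128:                           # 64 values -> '00' + 6 bits (8 bits)
        return "00" + format(b - 64, "06b")
    return "01" + format(b - 128, "07b")  # 128 values -> '01' + 7 bits (9 bits)

def decode_bits(bits: str) -> bytes:
    # The prefixes '1', '00', '01' never collide, so decoding is unambiguous;
    # the "100% reversible" part of the claim checks out.
    out, i = [], 0
    while i < len(bits):
        if bits[i] == "1":
            out.append(int(bits[i + 1 : i + 7], 2)); i += 7
        elif bits[i + 1] == "0":
            out.append(64 + int(bits[i + 2 : i + 8], 2)); i += 8
        else:
            out.append(128 + int(bits[i + 2 : i + 9], 2)); i += 9
    return bytes(out)

assert decode_bits("".join(encode_byte(b) for b in range(256))) == bytes(range(256))

Reversibility isn't the sticking point, though; the question taken up a few posts below is whether the average output is actually shorter.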

edit - I have the whole process written out, and I have written it the way a computer would actually read the files and recompress them; this isn't some theorem I just wrote down on paper. I went step by step through the process and made sure it was 100% compressible and 100% reversible.

The more I read that document on why this is impossible, the more I'm finding out about people who claimed theirs worked when all they had was a mathematical problem they thought they could solve, before even thinking about programming it or how the computer could use it.

Last edited by Aalfabob : 08-09-2004 at 16:16.
#35, 08-09-2004, 17:08
Fulcrum2000, no team
Re: New compression method

Quote:
Originally Posted by Aalfabob
edit - If you think these times are impossible, they are due to the file being loaded into memory and of course it does take larger files in chunks if memory is needed.
No, I don't think these times are impossible, but I *know for sure* that your claims about achieved compression are utter <deleted>. Please stop spamming the forum with claims which are totally impossible.

I'll make you a deal. Compress one of the test files on my website (http://www.maximumcompression.com/) and send me the compressed file + compiled decoder. I will then decompress this file on my computer. If after decompressing the original file is restored (binary identical to the original) and the size of the decoder + compressed file is less than 95% of the best compressor mentioned for that test, you get $10,000 from me! Deal?

PS Before decompressing I'm allowed to change the name of the compressed file to anything I like.

Last edited by D.J. Fluck : 08-09-2004 at 17:58. Reason: language
#36, 08-09-2004, 17:09
Jack (Andrew Schenk), FRC #0201 (The FEDS), Alumni
Re: New compression method

Quote:
Originally Posted by Rickertsen2
One question... Why do you choose these forums to post your *discovery*? Sure, we are all nerds here and many of us are interested in this sort of thing, but I am sure there are forums dedicated to compression. Why Chief Delphi?
Just to answer this question...

He's a person who goes to my school whom I know.. and btw he's considering joining our robotics team too

He was talking to me about this, and since I really have no idea, I sent him to chiefdelphi because I knew there would be people here who understood better and could either prove or disprove his claim.

I have no idea if his method works, and honestly am as skeptical as the rest of you.. but eh.. if it does work.. it'd be cool

yea.. I know this is borderline chit-chat.. but it's the off-season still.. and it can always be moved to another forum if Brandon deems necessary..

anywho.. i guess we'll just have to wait for the program to see how good this idea really is..

jack
#37, 08-09-2004, 17:24
Dave Flowerday, Software Engineer, VRC #0111 (Wildstang)
Re: New compression method

Quote:
Originally Posted by Fulcrum2000
No, I don't think these times are impossible, but I *know for sure* that your claims about achieved compression are utter <deleted>. Please stop spamming the forum with claims which are totally impossible.
I am finding this discussion interesting, but vulgar language is not appreciated by most of us here. I find it a little interesting that someone who is completely new to this forum and obviously doesn't even understand our etiquette could accuse someone else of spamming.
#38, 08-09-2004, 17:33
Fulcrum2000, no team
Re: New compression method

Quote:
Originally Posted by Dave Flowerday
I am finding this discussion interesting, but vulgar language is not appreciated by most of us here. I find it a little interesting that someone who is completely new to this forum and obviously doesn't even understand our etiquette could accuse someone else of spamming.
Sorry for the strong words I used, but I hoped that kind of language would make clear that the claims made are totally impossible. I've seen these kinds of super-compression claims many, many times. Most of them were made to get some quick cash off people so they could 'finalize their invention and patent their discoveries'. After payment the people suddenly disappear...

I agree I may not be fully aware of the forum etiquette, but I know something about what's possible and impossible in compression land. That's why I'm giving him an opportunity to earn $10,000 real fast.

Regards,
Werner Bergmans
Eindhoven, Netherlands

Last edited by Fulcrum2000 : 08-09-2004 at 17:35.
#39, 08-09-2004, 17:37
Aalfabob, #0201 (FEDS)
Re: New compression method

Quote:
Originally Posted by Fulcrum2000
I'll make you a deal. Compress one of the test files on my website (http://www.maximumcompression.com/) and send me the compressed file + compiled decoder. [...] If the size of the decoder + compressed file is less than 95% of the best compressor mentioned for that test, you get $10,000 from me! Deal?
OK, let me show you why this bet is near impossible...

You said compressor + compressed file equals less than today's best compressor. Well, the .exe tested on that site is only 3,870,784 bytes. The top-scoring compressor achieved a final size of 953,785 bytes. You want me to get that down to a size of 47,689 bytes (47 KB @ 95%). OK, so even if my compression could knock it down to 515 bytes, that leaves 47,174 bytes for the program. Um, I don't really know how you expect me to get my program that small without spending hundreds of hours in assembly just to shrink the decompressor. Right now, at a little over 100 lines, I'm at 626,747 bytes. Unless I'm reading your challenge wrong, it seems a little impossible...

Last edited by D.J. Fluck : 08-09-2004 at 18:10. Reason: Language Editing in Quote
#40, 08-09-2004, 17:44
Fulcrum2000, no team
Re: New compression method

Quote:
Originally Posted by Aalfabob
Ok let me show you how this bet is near impossible...
You said compressor + compressed file equals less than today's best compressor. Well, the .exe tested on that site is only 3,870,784 bytes. The top-scoring compressor achieved a final size of 953,785 bytes. You want me to get that down to a size of 47,689 bytes (47 KB @ 95%).
No, 95% of 953,785 bytes is about 906 KB.
So the size of your decompressor + compressed file should be less than 906 KB.


Quote:
Originally Posted by Aalfabob
Right now, at a little over 100 lines, I'm at 626,747 bytes. Unless I'm reading your challenge wrong, it seems a little impossible...
I think you are OK: decompressor + compressed file would come in around 627 KB, comfortably under the 906 KB limit! But if you have a problem meeting the 95%, I will increase it to 98%, so the combined size should be under 934,709 bytes.
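For the record, the thresholds are simple arithmetic (a quick check; numbers from the posts above):

best = 953_785             # best reported compressed size for that test, bytes
print(int(best * 0.95))    # 906095 -> the 95% target, about 906 KB
print(int(best * 0.98))    # 934709 -> the relaxed 98% target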

Last edited by Fulcrum2000 : 08-09-2004 at 17:47.
#41, 08-09-2004, 18:06
ThomasTuttle, #0125 (NU-TRONS), Student
Re: New compression method

You said it can eventually compress any file to 508-515 bytes? Okay, let's assume you can do 512 for simplicity (it really doesn't matter). This can't hold true for every file, since with 512 bytes there are only so many files you can make: 256^512 = 2^4096 of them, a 1,234-digit number (about 1.04 x 10^1233), in fact.
So if you create a directory with every 513-byte file in existence (that's the above number times 256), it cannot compress all of them to 512 bytes or less; there simply aren't enough distinct short outputs to go around.
Furthermore, compressing random data is not feasible. Sure, you could try, and it might work sometimes, but overall, every possible string of bits will eventually appear, so you can't take advantage of a limited vocabulary in the file. For example, if you tried to compress 2 bytes into one, it wouldn't work: there are 65,536 possible 2-byte inputs, and distinguishing that many cases takes 2 bytes of output anyway.
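That counting argument is easy to check at toy scale; here is a quick Python sketch of the 2-bytes-into-1 case:

from itertools import product

# All 65,536 possible 2-byte files...
inputs = [bytes(p) for p in product(range(256), repeat=2)]

# ...and every possible output of at most 1 byte: 256 one-byte strings
# plus the empty string, 257 in all.
outputs = [b""] + [bytes([v]) for v in range(256)]

# A lossless compressor needs a distinct output for every input, otherwise
# two different files would decompress from the same output. Pigeonhole:
assert len(inputs) > len(outputs)
print(len(inputs), "inputs vs", len(outputs), "possible shorter outputs")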
In fact, I gave both gzip and bzip2 a chance on random data, and so far, up to files of 64 KB, neither has averaged even one byte less than the original--in fact the outputs are all larger!
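That experiment is easy to reproduce; a sketch using Python's zlib and bz2 modules in place of the gzip and bzip2 command-line tools:

import bz2, os, zlib

for size in (1024, 65536):
    data = os.urandom(size)  # random bytes: incompressible by construction
    print(size, "bytes ->",
          len(zlib.compress(data, 9)), "(zlib),",
          len(bz2.compress(data, 9)), "(bz2)")

On random input, both outputs come back slightly larger than the original, exactly as described.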
But if you want me to test it, I would be glad to (I won't give away your program, don't worry...). I would really find it interesting if you can prove me and others wrong...
#42, 08-09-2004, 18:09
Andy Baker, President, AndyMark, Inc., FRC #3940 (CyberTooth)
Re: New compression method

Quote:
Originally Posted by Fulcrum2000
Sorry for the strong words I used, but I hoped that kind of language would make clear that the claims made are totally impossible. I've seen these kinds of super-compression claims many, many times.

Regards,
Werner Bergmans
Eindhoven, Netherlands
While this is understood, please choose words that don't pollute this fine site. This discussion is interesting, and your words are valuable as long as profanity is omitted.

You will be more respected and listened to if you keep things clean and respectful.

Andy B.
#43, 08-09-2004, 18:37
ThomasTuttle, #0125 (NU-TRONS), Student
Re: New compression method

Quote:
Originally Posted by Aalfabob
Next, the compressed bytes are either 7-bit, 8-bit, or 9-bit: a 7-bit code starts with 1 followed by 6 bits, an 8-bit code with 00 then 6 bits, and a 9-bit code with 01 then 7 bits. As you can see, that covers all 256 values of the ASCII set and is easily reversible.
So, here's how it works?

First, you have two constant bytes, like the magic bytes at the start of a gzip or bzip2 file.

Then you have a bunch of 7-, 8-, or 9-bit codes. It works like this, I assume:
ASCII 00###### -> Compressed 1###### (8 bits to 7 bits, 25% of the time)
ASCII 01###### -> Compressed 00###### (8 bits to 8 bits, 25% of the time)
ASCII 1####### -> Compressed 01####### (8 bits to 9 bits, 50% of the time)

Given a random input file, each byte from 0-255 will appear the same number of times. Thus, the average size of a compressed byte, in bits, is:

(7 * 25%) + (8 * 25%) + (9 * 50%) = 1.75 + 2 + 4.5 = 8.25.

Thus, the file expands by 1/32 (an extra 0.25 bits per 8-bit byte).

So, unless your input files contain mostly bytes that translate to your 7-bit strings, you should be seeing the file *increase* in size by 1/32 each time, not *decrease* in size.
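A quick exhaustive check over all 256 byte values confirms the arithmetic (using the assumed byte-range mapping above):

# Code length in bits for each possible input byte under the scheme above.
lengths = [7] * 64 + [8] * 64 + [9] * 128

avg = sum(lengths) / 256
print(avg)          # 8.25 bits per input byte
print(avg / 8 - 1)  # 0.03125 = 1/32 expansion on uniformly random data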

If your program makes the files smaller when they shouldn't be, chances are it's either cheating, using very compressible input files, or losing data--have you written and tested the decompressor yet? ;-)

If I'm wrong, please correct me--I'm interested in seeing how it actually works if this isn't how it works.

See ya,

Tom
#44, 08-09-2004, 21:24
Aalfabob, #0201 (FEDS)
Re: New compression method

Quote:
Originally Posted by ThomasTuttle
So, unless your input files contain mostly bytes that translate to your 7-bit strings, you should be seeing the file *increase* in size by 1/32 each time, not *decrease* in size. [...] If I'm wrong, please correct me--I'm interested in seeing how it actually works if this isn't how it works.
Yes, this would have been a problem, but I solved it already, and it involves a little bit more than that. Trust me, I will have the program ready for someone to test out, or more than one tester if everyone wishes.

This is how the testing is going to be done. First, I'm going to send the decompressor to the tester. Next, they send me all the files they want me to test (please try to stay under 10 MB; right now it can handle up to 100 MB because I haven't put in the memory swapping yet, but since it's just a test I don't see the need for anything bigger. But if you wish to try bigger ones...). Then I will compress the files they sent and send them back to the tester to be decompressed and verified. This way I can't make a program that works for only that one file, so it should be pretty good proof.
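If the test happens, here is a minimal verification harness the tester could use to confirm the round trip is lossless (file names are hypothetical):

import hashlib, sys

def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: python verify.py original.bin roundtripped.bin
a, b = sys.argv[1], sys.argv[2]
print("identical" if sha256(a) == sha256(b) else "MISMATCH")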
#45, 08-09-2004, 21:28
ThomasTuttle, #0125 (NU-TRONS), Student
Re: New compression method

Quote:
Originally Posted by Aalfabob
This is how the testing is going to be done. First, I'm going to send the decompressor to the tester. [...] This way I can't make a program that works for only that one file, so it should be pretty good proof.
Okay, I'll be glad to help... I'm skeptical, but interested.