
MP3-Beating Compression

Started by April 06, 2000 01:58 PM
494 comments, last by kieren_j 24 years, 8 months ago
OOPS! Someone already posted the address I gave! Sorry, but I just read the first couple of pages

Oh well, still proves my point
"End Communication!", Kang from Rigel-4
*cough* From what I remember, 1 does not equal .99999. What you are probably referring to is that 1/3 = .3 repeating, 2/3 = .6 repeating; add them together and you get .9 repeating, but 1/3 + 2/3 is 1, so thus 1 = .9 repeating. This is true, but remember that you are working with two numbers which repeat to an infinite length. Thus, adding them together you get 1 (.9 repeating), not .99999. Feel free to correct me if there is some other proof that 1 equals something other than 1. (And, no, tricks such as dividing by 0 don't count. Most "proofs" of that sort tend to rely on faulty math.)
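The standard algebraic version of the argument being recited here (a sketch, not a formal proof) can be written out:

```latex
\begin{aligned}
x &= 0.\overline{9} \\
10x &= 9.\overline{9} \\
10x - x &= 9.\overline{9} - 0.\overline{9} = 9 \\
9x &= 9 \quad\Longrightarrow\quad x = 1
\end{aligned}
```

This agrees with the 1/3 + 2/3 argument above: .9 repeating is exactly 1, not merely close to it.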

One time pads (feel free to point out any flaws in this, but I'm just reciting from memory):
In cryptography a 'perfect' way to encrypt something is to use a 'pad', a sequence of random data the same length as the original message, and encrypt against it. Also, this pad must only be used once (hence, 'one time pad'). Because the pad is random and only used one time, the resultant ciphertext can decrypt to _anything_. There is no way to tell what the original contents of the message are. Even if you know what the first few bits of the original message are, you still cannot get the rest, because there is no pattern to the encryption. It _can_ be broken if you compare multiple messages encrypted with the same pad (and in fact, it is a really bad idea to use a one time pad more than once), but a true one time pad represents 'perfect' security... so long as nobody (hostile) knows the pad.
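A minimal sketch of the XOR construction described above (the message and candidate plaintext are made up for illustration):

```python
import os

def otp(data: bytes, pad: bytes) -> bytes:
    # XOR each byte with the pad; the same operation encrypts and decrypts.
    assert len(pad) == len(data)
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = os.urandom(len(message))      # random pad, used exactly once
ciphertext = otp(message, pad)

# Only the correct pad recovers the original message.
assert otp(ciphertext, pad) == message

# Without the pad, the ciphertext "explains" every plaintext of the same
# length equally well: for any candidate, some pad decrypts to it.
candidate = b"RETREAT AT ONCE"[:len(message)]
fake_pad = otp(ciphertext, candidate)
assert otp(ciphertext, fake_pad) == candidate
```

That last pair of lines is the whole point: since every same-length plaintext corresponds to some equally likely pad, the ciphertext alone carries no information about which one was sent.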

As for "impossible" versus "haven't figured out a way to do it": I guess that's just why a lot of us have been careful about what we say. As I said before, certain things, like decrypting one time pads and compressing random data, are impossible. Not just because we _think_ they are, but because it has been proven to be such by every mathematical model known to man.
Some things, like discrete logarithms or faster-than-light travel, may be possible, but since we cannot prove that they aren't, we don't know.

Ah well. Tired of typing.
*looks for new posts*
LoK: Well, if you don't think the compression FAQ's examples are sufficient, why don't you look up the proofs behind it? They are there, you know.

Mmm. FreedomCrunch-a-licious...

Deathy
Aaaah! Aaaah! I don't think anyone reads a thing I post.

AIG even copied the darn thing right into _this_ page. We've seen it already! Yeesh.


Lack

Christianity, Creation, metric, Dvorak, and BeOS for all!
Exactly are right you... me no exist...

hehe... don't try to be smart with me, okay...

just playing with ya... why don't we end this thread and reply to some people who need help... oh, btw, I need some help. Can anyone answer my questions in the Graphics and Theory area? 3D solid cube, and voxel/non-voxel landscapes... thanks ppl... and of course I was just playing with y'all... (I still believe that everything is possible... just not now, at the present time...)

thanks for the entertainment...
I poked around for a bit and didn't find much. But that's up to _you_ guys, since it's your side of the argument.

On the .9 repeating thing, that's what I meant. But .9 repeating doesn't equal one, though you just proved mathematically that it does. It's infinitely close, but not equal.


Lack

Christianity, Creation, metric, Dvorak, and BeOS for all!

Edited by - LackOfKnack on 4/18/00 6:15:06 PM

Edited by - LackOfKnack on 4/25/00 3:28:44 PM
Lack - my proof is on page 11, but it seems like you agree with it anyway. You agreed with me saying that you can't compress every single file. Now, my question for you is: what is the difference between a file that can be compressed and one that can't? Everyone who knows anything about compression would say that the difference is that a file that is random or nearly so is one that cannot be compressed losslessly. It would be nice if you actually looked at some of the web pages people posted about compressing random data before you go shooting off your mouth. And thanks for saying what every child knows about apples, but this is not biology, so please stay somewhat on topic.
Let me clarify something: by 'random data is not compressible', we mean that not every file of length n is compressible by the same algorithm. I could easily write an algorithm that could compress/uncompress one particular file to a size of 1 byte, but for every other file of the same length, I would see detriment, not compression.
Gladiator - tell me some way we could possibly name the highest integer; since there is no highest integer, it's impossible.
The sign of a flawed compression algorithm is one that continues to compress a file after being run more than once. Unfortunately, he claims his does, which means his algorithm is far from optimal.
One last thing: this seems to be an argument between faith and science. I would love to believe that there is a magic compression algorithm, but to most of us it's obvious that this just isn't so.
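The "one file to 1 byte" algorithm described above can be sketched directly (the choice of favored file is arbitrary, made up here for illustration):

```python
SPECIAL = b"the one file this algorithm favors"

def compress(data: bytes) -> bytes:
    if data == SPECIAL:
        return b"\x00"               # the chosen file shrinks to 1 byte
    return b"\x01" + data            # every other file grows by 1 byte

def decompress(data: bytes) -> bytes:
    if data == b"\x00":
        return SPECIAL
    return data[1:]

# Lossless round trip, but only SPECIAL actually gets smaller.
assert decompress(compress(SPECIAL)) == SPECIAL
assert decompress(compress(b"anything else")) == b"anything else"
assert len(compress(SPECIAL)) == 1
assert len(compress(b"anything else")) == len(b"anything else") + 1
```

The 1-byte tag on every other file is not an implementation wart; it is the counting argument in miniature. The savings on the favored file must be paid for somewhere else.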


Mike
"Unintentional death of one civilian by the US is a tragedy; intentional slaughter of a million by Saddam - a statistic." - Unknown
Advertisement
Thanx for the explanation, deathlok.
Lack, I think I've found a kind of definition of what random data means:
Imagine you have a random bit generator. You generate a bit string, and for each bit the probability that it's 1 is 50%. If the generated bit string has length n, it becomes truly random in the limit n -> infinity.
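The thread's central claim can be tested empirically against such a generator, with Python's `os.urandom` and `zlib` standing in for the random source and a general-purpose compressor:

```python
import os
import zlib

random_data = os.urandom(4096)           # 4 KB from a random bit generator
compressed = zlib.compress(random_data, 9)

# With no pattern to exploit, the compressor falls back to storing the
# data essentially verbatim plus its own header, so the output does not
# get smaller (in practice it comes out slightly larger).
print(len(random_data), len(compressed))
assert len(compressed) >= len(random_data)
```

The result is typical, not cherry-picked: any general-purpose compressor behaves this way on data with no statistical structure.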

Visit our homepage: www.rarebyte.de.st

GA
quote: Original post by Vetinari

Lack - my proof is on page 11, but it seems like you agree with it anyway. You agreed with me saying that you can't compress every single file.


That much I do, yes.

quote: Now, my question for you is: what is the difference between a file that can be compressed and one that can't? Everyone who knows anything about compression would say that the difference is that a file that is random or nearly so is one that cannot be compressed losslessly.


I say (and I can do this since you are being equally vague in your backup) that the only files that can't be compressed at all are the ones that are too small to contain any pattern large enough to weed out with good results. But then I've also said that those that are that small, nobody would need to compress anyway. Also, you speak of using a universal algorithm, and with that, perhaps your statements are true. But if your algorithm is an incorporation of many methods, including rearranging and masking and such, just about anything could be compressed.

quote: It would be nice if you actually looked at some of the web pages people posted about compressing random data before you go shooting off your mouth.


It'd be nice if you'd read what I posted, too. I already said that I read through it. Also, I don't believe I've been shooting off my mouth. If I have, I apologize, but may I also point out the arrogant and immature statements people on your side have been making as well. Someone even said I was a fool because I'm Christian. Did I start a flame war? No, I kept on with the discussion that I was interested in.

quote: And thanks for saying what every child knows about apples, but this is not biology so please stay somewhat on topic.


Since no one had mathematics to back up the statement that data cannot be compressed repeatedly, and people were making vague statements as I was, I thought I'd introduce an analogy. I suppose it was a bit off topic. Sorry.

quote:
Let me clarify something: by 'random data is not compressible', we mean that not every file of length n is compressible by the same algorithm. I could easily write an algorithm that could compress/uncompress one particular file to a size of 1 byte, but for every other file of the same length, I would see detriment, not compression.


Right, but if you have an array of methods and algorithms to use, sure you could.

quote: The sign of a flawed compression algorithm is one that continues to compress a file after being run more than once.


Says who? Why not? This is where the DNA/seed analogy comes in.

quote: One last thing: this seems to be an argument between faith and science.


More like vague scientific opinions versus vague scientific opinions. The faith-only posters (at least those who were on topic) stopped posting a while back.
Lack
Christianity, Creation, metric, Dvorak, and BeOS for all!
I posted the link again because some people aren't reading it (or maybe aren't understanding it). ;-)

Here's some "math" for ya:

You have 256 pigeons, and 254 holes. And you're not allowed to use the same hole twice. Is it mathematically possible to put each pigeon in a hole?

explanation:
- You have 8 bits, which allows 256 different values.
- It is IMPOSSIBLE to compress ALL 256 combinations to UNIQUE 7-bit (or less) compressed files, using any single algorithm:
- because 2^7 (all possible unique 7-bit compressed files) + 2^6 (all possible unique 6-bit compressed files) + 2^5 + 2^4 + 2^3 + 2^2 + 2^1 = 128 + 64 + 32 + 16 + 8 + 4 + 2 = 254
- therefore, the sum of *all possible unique* 7-bit-and-under files is 254. Yet, because 8-bit files can have up to 256 unique combinations, THERE IS NO WAY TO REPRESENT ALL OF THEM in 7-bit-and-under files. In other words, it is "MATHEMATICALLY IMPOSSIBLE" for 256 to be less than or equal to 254 (and once someone figures out a way this IS possible, we won't be needing to compress data anymore).

Does that help? No matter how you massage 7 bits or less, you''ll NEVER come up with the 256 different combinations needed. The exact same thing is happening in large "random" files, just with bigger numbers. No matter what algorithm is used, even ones that aren''t invented yet, there will always be files that cannot be compressed by that algorithm.
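The counting argument above is small enough to check directly:

```python
# Every possible compressed file of 1 to 7 bits, as in the post above.
shorter_files = sum(2**k for k in range(1, 8))   # 2 + 4 + ... + 128 = 254
eight_bit_files = 2**8                           # 256 distinct 8-bit inputs

print(shorter_files, eight_bit_files)   # 254 256
# At least two 8-bit files have no shorter representation left for them.
assert shorter_files < eight_bit_files
```

Scaling both numbers up changes nothing: for any length n, there are always fewer shorter outputs than n-bit inputs, so no single lossless algorithm can shrink them all.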

That's it. I'm done posting math, and links. ;-) If you don't get it or agree by now, more of the same isn't going to make a difference. Besides, 256 pigeons just make a lot of crap. ;-)

aig
ga, is that it? You could find patterns in that for sure.


Lack

Christianity, Creation, metric, Dvorak, and BeOS for all!

This topic is closed to new replies.
