Do you believe me, though?
I'm not going to code anything at this point, unless you'd care to point me to some bit-manipulation resources for C++. But I can do it on paper.
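On paper, the kind of bit twiddling I mean looks something like this - a rough C++ sketch only, and the BitWriter name and interface here are made up for illustration, not taken from anybody's real code:

#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical BitWriter: packs values of arbitrary bit width into bytes,
// most significant bit first.
struct BitWriter {
    std::vector<uint8_t> bytes;
    int used = 0; // bits already used in the last byte

    void put(uint32_t value, int width) { // write the low 'width' bits of value
        for (int i = width - 1; i >= 0; --i) {
            if (used == 0) bytes.push_back(0);
            bytes.back() |= ((value >> i) & 1u) << (7 - used);
            used = (used + 1) % 8;
        }
    }
};

int main() {
    BitWriter w;
    w.put(5, 3); // 101
    w.put(1, 1); // 1
    w.put(9, 4); // 1001 -> one byte: 10111001
    std::printf("%02x\n", w.bytes[0]); // prints b9
}

Any decent C++ reference will cover the shifts and masks; the rest is bookkeeping.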
Lack
Christianity, Creation, metric, Dvorak, and BeOS for all!
MP3-Beating Compression
Impossible compression algorithms? Not if kieren_j is using quantum physics and the general theory of relativity.
Here's how I see it. Mr. k's algorithm does not compress data. Instead it merely generates a quantum signature (these are random data and cannot be further compressed) that the decompression program can use in the future.
In the future, Mr. k's decompression algorithm opens a wormhole in the space-time continuum to the point in space and time where the compression algorithm is running. The decompression program communicates the quantum signature using tachyons, which can travel back in time. The compression algorithm then merely pumps the data through the wormhole (using neutrinos, of course) and the decompressed data arrives in the future.
Since there was no compression, the algorithms are lossless. The algorithm can 'compress' any file, and you never have to store more than one quantum signature per 'compressed' file.
As I see it, Mr. k has two problems.
First, you cannot read the quantum signature without changing it. The result is that you decompress the 100M Q9 demo and instead get someone's Great Aunt Edna's digital photos of her bunions.
Second, Acme, Inc. already has a patent on a similar device used not for data compression, but matter compression. Check out their catalog for the Acme Portable Hole, Model 192854, at http://www.uranidiot.com. Anyone who has studied the work of those great physicists, the Warner Brothers, has seen one of these babies at work.
(I have too much time on my hands at work!)
Mike Roberts
aka milo
mlbobs@telocity.com
Dang! Forgot one thing about Mr. k's algorithm. He uses pigeons to transport the data through the wormhole.
Mike Roberts
aka milo
mlbobs@telocity.com
Lack, no, I don't believe you can. I believe you think you can. I don't think that you could compress zip files or MP3 files. Nor do I think Kieren can.
Mike
"Unintentional death of one civilian by the US is a tragedy; intentional slaughter of a million by Saddam - a statistic." - Unknown
Let's say you have a file that may or may not be random, and you have a set of compression algorithms. It doesn't matter which algorithms they are, so we'll just label them A, B, C, and so on. If you run A on the file, you'll get some newer, more random file. Then run B on that, and say it compresses it just a fraction, but definitely not as much as A did. Then C, then D, and so on. Even with an infinite set of algorithms, eventually one of them must produce an output file that NO algorithm will EVER compress. What kind of file is that? Well, duh, it's random. But how random must a file be, and how large or small must it be, so that no algorithm whatsoever can compress it?
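To make that concrete, here's a rough sketch (assuming zlib is available; link with -lz) that compresses a buffer of random bytes and then tries to compress the result again - the second pass reliably fails to shrink it:

#include <cstdio>
#include <cstdlib>
#include <vector>
#include <zlib.h>

// Compress 'in' with zlib and return the compressed bytes.
static std::vector<unsigned char> deflate_buf(const std::vector<unsigned char>& in) {
    uLongf outLen = compressBound(in.size());
    std::vector<unsigned char> out(outLen);
    compress(out.data(), &outLen, in.data(), in.size());
    out.resize(outLen);
    return out;
}

int main() {
    std::vector<unsigned char> data(100000);
    for (auto& b : data) b = std::rand() & 0xFF; // incompressible noise

    auto pass1 = deflate_buf(data);  // random input: output is not smaller
    auto pass2 = deflate_buf(pass1); // compressed output looks random too: no gain
    std::printf("original %zu, pass1 %zu, pass2 %zu\n",
                data.size(), pass1.size(), pass2.size());
}

Run it on a real file instead of rand() output and you'll see the same thing: the first pass shrinks it, the second doesn't.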
ColdfireV
P.S. Does anyone else here think kieren_j got hit by a bus on the way to the patent office?
Edited by - ColdfireV on 4/21/00 4:47:48 PM
ga wrote:
-How will he keep the wormhole opened??-
With bags stuffed with all the cash he will make from his compression algorithm. Heh, heh.
Speaking of which, are you getting anywhere on that demo kieren_j?
Mike Roberts
aka milo
mlbobs@telocity.com
It's ridiculous to try and carry on a meaningful discussion here, but you're misunderstanding the point slightly, ColdFire. There are no specific "random" files that cannot be compressed. I can create a compression algorithm to compress any file I want; I simply hard-code my algorithm to handle that specific case. That's not the point. When we say that an algorithm can't compress random data, we are essentially saying that it can't compress ALL files. This isn't exactly true, but it's much closer than claiming that some specific files are "random" while others are not. With a truly random distribution, EVERY file is equally likely... We really ought to be qualifying some of the statements made here with probabilities, not certainties. Of course, none of this changes that random data is incompressible, and no algorithm can compress all files.
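The counting argument behind that last sentence, as a sketch: for every length n there are 2^n files of exactly n bits, but only 2^n - 1 files of any shorter length, so a lossless compressor cannot map them all to something smaller.

#include <cstdint>
#include <cstdio>

int main() {
    // Pigeonhole: 2^n files of exactly n bits, but only
    // 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1 shorter files.
    // A lossless compressor is one-to-one, so at least one
    // n-bit file must map to something at least as long.
    for (int n = 1; n <= 16; ++n) {
        uint64_t filesOfLenN  = 1ull << n;
        uint64_t shorterFiles = (1ull << n) - 1;
        std::printf("n=%2d: %6llu files, %6llu shorter slots\n",
                    n, (unsigned long long)filesOfLenN,
                    (unsigned long long)shorterFiles);
    }
}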
-Brian
Damn! Why didn't I get on this thread earlier? Now to argue any of my points on space-time and all that gibberish is futile; it would seem late and unnecessary. I agree with whoever said that acceleration in free fall is due to gravity, and not the curvature of space... Quantum mechanics will refine relativity... Either that or something entirely new will replace them both.
This suffering has gone on for long enough.
I know I really shouldn't have, but (as many of you suspected all along) - yes, it's not really true.
For a while I've been thinking of doing something (stupid) like this, and I just couldn't hold myself back any longer.
I'd like to thank absolutely everybody who has posted on this board, in particular those who believed me - thanks:
Matt Cruikshank
Chico
Kevin Field
SRMeister / Incubator
Tim Wallace
John Peregrine
Andy
...and all the others who I can''t remember!
I hate to say it, but thanks also for putting up a good argument against me - without people like Ridcully, Gromit, and those anonymous posters it probably wouldn't have worked quite as well.
Again, I'm really, really sorry but hey - this will be one thread that I don't think will be easily forgotten. And besides, I think we all benefited from having everybody's theories and views on compression. Once again I'm very sorry (take a hint) to everyone who thinks I should be sorry to them (that sounds circular).
And hey - no hard feelings? We had fun!!
Thanks to all who have contributed to this thread.
kieren_j