
MP3-Beating Compression

Started by April 06, 2000 01:58 PM
494 comments, last by kieren_j
Yeah, no shit zipster.

If you've got working code, forget everyone here, because they're not going to take your word for it. Just look at what happens when somebody comes along with a solution long before the people who think they have a right to it (case in point: genetic mapping).

Look at last month, how many scientists got angry because a company went ahead and stated as fact that they had 100% of the genome mapped. They got so pissy it made me sick.

So if you have a better compression algorithm and people mouth off, it's just because they're upset they didn't do it first. Personally, I hope it does work, so I don't need another 20 gig HD.

int main() {
   if(reply.IsSpam()) {
      while(true) {
         int* ptr = new int[1000000];
         reply.RandomInsult(); } }
   else std::cout << "mailto:amorano@bworks.com";
}
quote: Original post by 3dModelMan

I've followed this thread since the beginning and I think it's time to help kieren_j's argument here...

I agree with everyone that random data cannot be compressed...



I find this to be the most ignorant idea that has been circulating around here on this topic.

If random data couldn't be compressed, how would anything work?

The computer doesn't know A from B. All data is random to the machine and to the algorithm. Now, if you are a purist and take the attitude that patterns must be visible in the data in order to compress it, I point you to chaos theory.

Even in random data there are patterns. Otherwise, I suppose people have been wasting tons and tons of money, time, and resources on chaotic compression algorithms that actually work?!
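
For what it's worth, "the machine's view of the data" can be made concrete by measuring empirical byte entropy. A minimal sketch follows (an illustration only, not anyone's actual codec); structured data scores well below 8 bits per byte, while good random data sits near 8, which is exactly where general-purpose compressors stall:

#include <cmath>
#include <cstdio>

// Empirical byte entropy in bits per byte (0..8). Low means exploitable
// structure; near 8 means nothing left for a lossless coder to squeeze.
double byte_entropy(const unsigned char *data, unsigned long n)
{
	double count[256] = { 0 };
	for (unsigned long i = 0; i < n; ++i) count[data[i]] += 1.0;
	double h = 0.0;
	for (int b = 0; b < 256; ++b) {
		if (count[b] == 0.0) continue;
		double p = count[b] / (double)n;
		h -= p * std::log2(p);
	}
	return h;
}

int main()
{
	const unsigned char text[] = "aaaaaaaabbbbccd";	// structured sample
	std::printf("structured: %.2f bits/byte\n", byte_entropy(text, sizeof(text) - 1));
	return 0;
}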

Get the facts straight around here, people; a lot of you talk out of turn about things you have no clue about. Do a little research first.


int main() {
   if(reply.IsSpam()) {
      while(true) {
         int* ptr = new int[1000000];
         reply.RandomInsult(); } }
   else std::cout << "mailto:amorano@bworks.com";
}
The reason I'm debating this is that people dismissed his ideas so fast it made my head spin! While we wait for the demo, rather than say "fuck everyone", I wanted to discuss why it could or couldn't work. No harm in that. I don't know why people are so upset.

Anyhow,

Ridcully: Do you even look at anything but the size of the thread before posting? Obviously, I for one am seriously entertaining the idea.

ga: Won't bother for now; kieren took the post.

MENTAL: Oh, so it is. I had it bookmarked. Why would they close it? They didn't even post a reason... Well, I don't see a problem, and if they're too lazy or busy to fix the forum code, post a reason, or read the whole thread and be objective, then I'll respectfully ignore it for now.

I just wanted to say, especially to AIG and ga: no hard feelings. I'm glad we were able to keep discussing this.


Lack

Christianity, Creation, metric, Dvorak, and BeOS for all!
Two comments on two statements by Joviex.

1) That private company has not mapped 100% of the human genome. What they've done is the sequencing of genetic material that will let them map something like 98% of the genome. It was probably a misquote by the press, but that is why the rest of the scientists were upset. In reality there's still a lot of number crunching to go.

2) As I stated much earlier in the thread, chaos-theory-based compression algorithms achieve incredible compression ratios, but they are inherently lossy. Chaotic compression does something like storing a thousand trees in a picture as a single fractal pattern plus a couple of parameters for variance. But it does not decompress to the original image.
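
To make that concrete, here's a minimal sketch of the iterated-function-system (IFS) idea behind fractal compression. The coefficients are the classic Barnsley fern, used purely as an illustration, not anyone's actual codec; four affine maps regenerate a plant-like image with enormous apparent detail, but only approximately, which is where the lossiness comes from:

#include <cstdio>
#include <cstdlib>

// Four affine maps plus their probabilities stand in for a whole image.
int main()
{
	double x = 0.0, y = 0.0;
	for (int i = 0; i < 100000; ++i) {
		int r = std::rand() % 100;
		double nx, ny;
		if (r < 1)       { nx =  0.00;                 ny =  0.16 * y; }
		else if (r < 86) { nx =  0.85 * x + 0.04 * y;  ny = -0.04 * x + 0.85 * y + 1.60; }
		else if (r < 93) { nx =  0.20 * x - 0.26 * y;  ny =  0.23 * x + 0.22 * y + 1.60; }
		else             { nx = -0.15 * x + 0.28 * y;  ny =  0.26 * x + 0.24 * y + 0.44; }
		x = nx; y = ny;
		if (i > 20) std::printf("%f %f\n", x, y);  // plot these points to see the fern
	}
	return 0;
}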

Mike Roberts
aka milo
mlbobs@telocity.com
Joviex, not every kind of data contains patterns you can use for compression. Otherwise you could compress a file over and over until its length reached zero. With n bits there are 2^n possible ways to set those bits; if you "compress" all of them into fewer bits, you generally lose information. A compression algorithm can shrink only a few of the 2^n patterns.
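
To make the counting argument concrete, here's a minimal sketch (an illustration, nothing more). It tallies how many n-bit inputs exist versus how many strictly shorter outputs are available; since 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1, there is always one input too many:

#include <cstdint>
#include <cstdio>

// Pigeonhole check: 2^n possible n-bit inputs, but only 2^n - 1 strings
// of any strictly shorter length. No lossless scheme can shrink them all.
int main()
{
	for (int n = 1; n <= 16; ++n) {
		std::uint64_t inputs  = 1ull << n;        // all n-bit strings
		std::uint64_t shorter = (1ull << n) - 1;  // all strings shorter than n bits
		std::printf("n=%2d: %6llu inputs, only %6llu shorter outputs\n",
		            n, (unsigned long long)inputs, (unsigned long long)shorter);
	}
	return 0;
}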

kieren, you say "for some reason it detects '4-bit' data..."
In a file where every pattern occurs with the same frequency (my 10 bytes had "local fluctuations", but the algorithm doesn't work well with those either) and in no specific order, why should easy-to-compress 4-bit data appear after "compressing" with 16-bit patterns?

Visit our homepage: www.rarebyte.de.st

GA
kieren, I hope you're right.
For your sake, and for the sake of all those with shitty connections to the 'net (me included).

Unfortunately it doesn't look too good, considering that after many years of dedicated research by professionals, you come along after a couple of weeks of pondering with a wonderful, magical panacea. A bit unbelievable. Prove me wrong.

- IO Fission

Tearing 'em apart for no particular reason...
kieren_j, here's some optimising I've done for you. I don't know what kind of speed difference you're going to get, but I know it's fewer lines of code.

This is what your code reduces down to. Good luck!

typedef struct bitstream
{
	unsigned char *ptr;
	unsigned long byte;
	unsigned char bit;
} bitstream;

unsigned char inline read_bit(bitstream *bs)
{
	unsigned char ret;
	ret = (bs->ptr[bs->byte]) & (1 << (bs->bit));
	bs->bit++;
	if (bs->bit == 8)
	{
		bs->bit = 0;
		bs->byte++;
	}
	if (ret)
		ret = 1;
	return ret;
}

void inline write_bit(bitstream *bs, unsigned char value)
{
	if (value)
		bs->ptr[bs->byte] |= (1 << bs->bit);	/* was ">>= bs->bit", which shifts the byte and destroys data */
	bs->bit++;
	if (bs->bit == 8)
	{
		bs->bit = 0;
		bs->byte++;
	}
}
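
For reference, here's a quick round-trip check one could run against the routines above. It's a hypothetical driver (not from the thread), and it assumes the corrected write_bit plus a zeroed buffer:

#include <cstdio>
#include <cstring>

int main()
{
	unsigned char buf[4];
	std::memset(buf, 0, sizeof(buf));	// write_bit only sets bits, so start from zero

	// Write eight bits, then read them back with a fresh cursor.
	bitstream out = { buf, 0, 0 };
	const unsigned char bits[8] = { 1, 0, 1, 1, 0, 0, 1, 0 };
	for (int i = 0; i < 8; ++i)
		write_bit(&out, bits[i]);

	bitstream in = { buf, 0, 0 };
	for (int i = 0; i < 8; ++i)
		std::printf("%d", read_bit(&in));	// expect 10110010
	std::printf("\n");
	return 0;
}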


kieren_j, I believe you until your program is proven wrong, and not by some compression theory stuff. Anyway, it's only the program that matters, right?

ColdfireV

Edited by - ColdfireV on 4/16/00 7:18:15 PM
ColdfireV (jperegrine@customcall.com)
quote: Original post by ga

Joviex, not every kind of data contains patterns you can use for compression. Otherwise you could compress a file over and over until its length reached zero. With n bits there are 2^n possible ways to set those bits; if you "compress" all of them into fewer bits, you generally lose information. A compression algorithm can shrink only a few of the 2^n patterns.


Actually, you are wrong. Where does the mapping take place? If you don't have some sort of lookup table or hash set, how are you going to map those sequences back to the original byte order? That's why your file will never reach zero length.
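
For comparison, here's a minimal run-length coder, one example of a compressor whose mapping lives entirely in the algorithm rather than in a stored lookup table (a toy sketch, with runs capped at 9 so decoding stays trivial):

#include <cstdio>
#include <string>

// Encode runs as a digit followed by the repeated character.
std::string rle_encode(const std::string &in)
{
	std::string out;
	for (std::string::size_type i = 0; i < in.size();) {
		std::string::size_type run = 1;
		while (i + run < in.size() && in[i + run] == in[i] && run < 9) ++run;
		out += static_cast<char>('0' + run);
		out += in[i];
		i += run;
	}
	return out;
}

std::string rle_decode(const std::string &in)
{
	std::string out;
	for (std::string::size_type i = 0; i + 1 < in.size(); i += 2)
		out.append(in[i] - '0', in[i + 1]);
	return out;
}

int main()
{
	std::string s = "aaaabbbcc";
	std::printf("%s -> %s -> %s\n", s.c_str(), rle_encode(s).c_str(),
	            rle_decode(rle_encode(s)).c_str());	// aaaabbbcc -> 4a3b2c -> aaaabbbcc
	return 0;
}

Of course, on data without runs this scheme expands the file, which is exactly ga's point.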



int main() {
   if(reply.IsSpam()) {
      while(true) {
         int* ptr = new int[1000000];
         reply.RandomInsult(); } }
   else std::cout << "mailto:amorano@bworks.com";
}
Hey Lack,

Of course, no hard feelings at all - this has been kinda fun. I guess I was getting tired of theoretical debate when all we have to do is code the darn thing and see. Looking forward to Kieren's demo, even if just for some closure. ;-)

When this thread's finally finished, we should start up an "I can crack PGP encryption on my VIC-20" thread. ;-)

aig
Well, I guess it's time for my 2 cents. It's amazing how this thread has developed: from almost no one believing at the beginning to a few people starting to realize kieren may be telling the truth. I have believed him from the very start, just by the way he sounded. Now that he's posting code fragments, I really do believe him. I'm eagerly awaiting the demo, kieren, and if you are for real, I hope you remember all of us back here at the gd.net forums and at least thank us at your first press conference for our criticism.

--m0rpheus

This topic is closed to new replies.
