From this forum: MP3-Beating Compression
This one is amazing. So many people actually believing in OP.
“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”
Original File: 100mb
CAR File: 100mb
ZIPped Original: 99mb
ZIPped CAR: 3 bytes

LMFAO
A 3-byte file only has enough entropy to encode 256³ = 16,777,216 different combinations. Even if we assume that every one of those combinations decompresses to a 100MB file, those three bytes can only represent a very tiny percentage of all possible 100MB files. How tiny? My calculator can represent numbers up to 1e999, and it overflowed when calculating the number of possible 100MB files in existence. Since my calculator won't cut it, I started a calculation in GNU bc, which is an arbitrary-precision calculator. It's been running for about 5 minutes now, and it still hasn't given me the answer. This result would be a very special case of the algorithm.
On top of that, I'm pretty sure that the header for a ZIP file alone is well more than 3 bytes.
Edit:
After over 45 minutes, I just killed the job. Calculating it in a more sane manner, it's about 10^(-2,523,430), which is reeeealy tiny.
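The ZIP-header point is easy to verify: even a completely empty ZIP archive carries a 22-byte end-of-central-directory record, so no valid ZIP file can be only 3 bytes. A quick sketch in Python:

```python
import io
import zipfile

# Build an empty ZIP archive in memory and measure its size.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    pass  # no entries at all

size = len(buf.getvalue())
print(size)  # → 22, just the end-of-central-directory record
```

Every entry you add costs a local file header and a central-directory entry on top of that, so real archives only get bigger.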
That's okay, in Steins;Gate they compress a person's entire consciousness (2.5 petabytes, according to the lore) down to something like 50 bytes by using black holes generated by the Large Hadron Collider.
Or you could, you know, just use logarithms.
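For the curious, the logarithm route finishes instantly: the number of distinct n-byte files is 256ⁿ, so instead of computing the astronomically large values directly, compare their base-10 logarithms. A sketch in Python, assuming "100mb" means 100 MiB = 104,857,600 bytes (a different reading of "mb" shifts the exponent, not the conclusion):

```python
import math

# log10 of the count of distinct n-byte files is n * log10(256),
# so the huge numbers that stalled bc never need to be materialized.
LOG10_256 = math.log10(256)  # ≈ 2.408

three_byte_files = 3 * LOG10_256             # log10(256**3)
hundred_mb_files = 100 * 2**20 * LOG10_256   # log10(256**104857600)

# log10 of the fraction of 100 MiB files reachable from 3 bytes:
exponent = three_byte_files - hundred_mb_files
print(f"fraction ≈ 10^{exponent:.0f}")  # an exponent around -252.5 million
```

(Under this 100 MiB assumption the exponent comes out near −252,500,000; the −2,523,430 quoted above evidently assumed a different file size, but either way the fraction is unimaginably small.)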
Back then I did implement something based on the ramblings of the OP with regards to his technique - ironically, the data the program ultimately produced had the opposite effect: it caused zip compression to fail to reduce the size of any file it was run on, including small text files and mp3 files.
"You can't say no to waffles" - Toxic Hippo