All the articles I've found (not many) talk about using RLE for compressing save files. That may be well and good for maps, but my save files are written as binary blocks and RLE wouldn't do squat.
What is a quick method that _I_ can do myself, without using someone else's libs, to compress my files?
And if you're going to reply with "Just use such-and-such a library", don't bother. I will just ignore you. I want to do it myself.
"NPCs will be inherited from the basic Entity class. They will be fully independent, and carry out their own lives oblivious to the world around them ... that is, until you set them on fire ..."
"When you are willing to do that which others are ashamed to do, therein lies an advantage."
Look into the LZ range of algorithms; they are easily implemented, offer good compression, and are fast.
Or maybe use simple Huffman compression. It's easy: you can have the encoder/decoder in under 250 lines with comments. It gives a typical 30% compression and, well, is really easy to implement.
Or, if you know your data, you can create your own scheme. For example, if you know that a certain block will only ever contain 8 different symbols and that the block is, say, 255 bytes long, then you can cut it to well under half its size just by bit packing (3 bits per symbol instead of 8); see the sketch below.
Be creative.
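For the bit-packing idea, a rough sketch in C++ (it assumes the block really does use at most 2^bits distinct symbol values, e.g. 3 bits for 8 symbols; unpacking is just the reverse):

    #include <vector>
    #include <cstdint>

    // Pack values known to fit in 'bits' bits (e.g. 3 bits for 8 symbols)
    // into a tight byte buffer. Sketch only - no error checking.
    std::vector<uint8_t> BitPack(const std::vector<uint8_t>& symbols, int bits)
    {
        std::vector<uint8_t> out;
        uint32_t accum = 0;   // bit accumulator
        int count = 0;        // number of bits currently in the accumulator

        for (uint8_t s : symbols)
        {
            accum |= uint32_t(s) << count;   // append the low 'bits' bits of s
            count += bits;
            while (count >= 8)               // flush full bytes
            {
                out.push_back(uint8_t(accum & 0xFF));
                accum >>= 8;
                count -= 8;
            }
        }
        if (count > 0)                       // flush the partial last byte
            out.push_back(uint8_t(accum & 0xFF));
        return out;
    }

With 3-bit symbols, a 255-byte block packs down to 96 bytes.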
HardDrop - hard link shell extension.
"Tread softly because you tread on my dreams" - Yeats
Compression generally makes up for a wasteful design. The higher the compression you get on a file, the larger the amount of space that was wasted in it. Using a 32-bit field to hold a single true/false value is wasteful. That one is obvious, but there are many that are less apparent. An example is storing zero, default, or null values. You can use bit flags to indicate those: one bit flag says whether an individual field was written or not. If there is only one reason why it wouldn't be, that is all you need; you can add more saying why it was or wasn't written. You don't want to get too carried away, but some informed decisions about where and why you do it can result in some drastic savings.
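For example, the "which fields were written" flags could look something like this (the struct and field names are just made up for illustration, as are the defaults):

    #include <cstdio>
    #include <cstdint>

    // Hypothetical save record - most fields are usually at their defaults.
    struct EntityState
    {
        float    health;     // default 100.0f
        int32_t  gold;       // default 0
        uint32_t statusBits; // default 0
    };

    enum : uint8_t
    {
        HAS_HEALTH = 1 << 0,
        HAS_GOLD   = 1 << 1,
        HAS_STATUS = 1 << 2,
    };

    // Write only the fields that differ from their defaults, prefixed by a
    // one-byte flag saying which fields are actually present in the file.
    void WriteEntity(FILE* f, const EntityState& e)
    {
        uint8_t flags = 0;
        if (e.health != 100.0f) flags |= HAS_HEALTH;
        if (e.gold   != 0)      flags |= HAS_GOLD;
        if (e.statusBits != 0)  flags |= HAS_STATUS;

        fwrite(&flags, 1, 1, f);
        if (flags & HAS_HEALTH) fwrite(&e.health,     sizeof(e.health),     1, f);
        if (flags & HAS_GOLD)   fwrite(&e.gold,       sizeof(e.gold),       1, f);
        if (flags & HAS_STATUS) fwrite(&e.statusBits, sizeof(e.statusBits), 1, f);
    }

The reader checks the same flag byte to know which fields to read back and which to fill in with defaults.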
Running the file through something like PKZip will give you a good measure of how much you can compress it. You should be able to get all that and more by being a little wiser in how you write the data to the file. As an extreme example, you could write a million random numbers to a file, or you could just write the seed you would have used to generate those million random numbers. PKZip could not possibly reduce that 4 MB file to 4 bytes, but it can easily be done in that case. That is a very simple case, but generally just the knowledge of where a record starts and ends gives you an extreme advantage over a generic algorithm. Add in things like knowing what the default value is, and a generic algorithm can't really touch what you can do except by random chance.
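The seed example in code, using srand/rand purely for illustration:

    #include <cstdio>
    #include <cstdlib>

    // Wasteful: write a million generated numbers (roughly 4 MB).
    void SaveNumbers(FILE* f, unsigned seed)
    {
        srand(seed);
        for (int i = 0; i < 1000000; ++i)
        {
            int n = rand();
            fwrite(&n, sizeof(n), 1, f);
        }
    }

    // Smarter: write the 4-byte seed and regenerate the numbers on load.
    void SaveSeed(FILE* f, unsigned seed)
    {
        fwrite(&seed, sizeof(seed), 1, f);
    }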
Keys to success: Ability, ambition and opportunity.
Yes, I did that. Everything is as small as it can be without compression. But it's still big. And so I move to compression.
BTW, DigitalDelusion, I've previously been looking at the LZH algorithm, and it would probably be my best bet, especially since the encoder/decoder tables are easy to modify and it's only 10K of source for both compression and decompression. Plus, it's a pretty basic version, so I can already see a fair bit of room for improvement in speed.
Any other recommendations?
"NPCs will be inherited from the basic Entity class. They will be fully independent, and carry out their own lives oblivious to the world around them ... that is, until you set them on fire ..." -- Merrick