I encoded into ANSI before the compression, not after.
It still seems like you're needlessly adding size to your data.
15 hours ago, hplus0603 said:
It still seems like you're needlessly adding size to your data.
How? I have 1 byte per character, and if I use UTF-8 I get between 1 and 4 bytes.
Knowing that, for now, I only have letters, numbers and basic characters, so how am I adding size?
If you "encode into ansi" then any binary byte that gets encoded into more than one output character will be bigger than it needs to be.
You should send raw binary data. It sounds in your posts above as if you're trying to wedge your binary data into a std::string, and encoding as ansi/text/characters/utf8 to somehow avoid problems with std::string representations. This is the wrong way around. Marshal binary data into a std::vector<> instead. (Or some other binary buffer class of your choice.)
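In case a concrete picture helps, here is a minimal sketch of marshalling into a std::vector<> in C++; the field layout and helper names are invented for the example, not taken from this thread:

#include <cstdint>
#include <string>
#include <vector>

// Append a 32-bit integer to the buffer in little-endian byte order.
void put_u32(std::vector<uint8_t>& buf, uint32_t v)
{
    buf.push_back(static_cast<uint8_t>(v));
    buf.push_back(static_cast<uint8_t>(v >> 8));
    buf.push_back(static_cast<uint8_t>(v >> 16));
    buf.push_back(static_cast<uint8_t>(v >> 24));
}

// Append a length-prefixed string: raw bytes, no text re-encoding step.
void put_string(std::vector<uint8_t>& buf, const std::string& s)
{
    put_u32(buf, static_cast<uint32_t>(s.size()));
    buf.insert(buf.end(), s.begin(), s.end());
}

int main()
{
    std::vector<uint8_t> packet;
    put_u32(packet, 42);          // e.g. a message id
    put_string(packet, "hello");  // e.g. a payload field
    // packet.data() / packet.size() can be handed to send() or a compressor as-is.
}

Every value contributes exactly the bytes it needs, and nothing has to survive a round trip through a text encoding.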
7 hours ago, hplus0603 said: If you "encode into ansi" then any binary byte that gets encoded into more than one output character will be bigger than it needs to be.
You should send raw binary data. It sounds in your posts above as if you're trying to wedge your binary data into a std::string, and encoding as ansi/text/characters/utf8 to somehow avoid problems with std::string representations. This is the wrong way around. Marshal binary data into a std::vector<> instead. (Or some other binary buffer class of your choice.)
This is what I am doing:
Preparing my message, serializing it as JSON (which is UTF-8), converting the JSON string to ANSI, compressing it, sending it, receiving it, decompressing it, and putting it back into a string so I can inflate my Message object.
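For reference, here is a rough sketch of that round trip with the ANSI step left out, assuming zlib for the compression (the thread doesn't say which library is actually used); the compressor only sees raw bytes, so the UTF-8 JSON string can go straight in and come straight back out:

#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>
#include <zlib.h>

// Compress the raw bytes of a string (e.g. a UTF-8 JSON document).
std::vector<unsigned char> deflate_string(const std::string& json)
{
    uLongf destLen = compressBound(static_cast<uLong>(json.size()));
    std::vector<unsigned char> out(destLen);
    if (compress(out.data(), &destLen,
                 reinterpret_cast<const Bytef*>(json.data()),
                 static_cast<uLong>(json.size())) != Z_OK)
        throw std::runtime_error("compress failed");
    out.resize(destLen);
    return out;
}

// Decompress back into a string; originalSize is the uncompressed size,
// which has to travel with the packet in this simple scheme.
std::string inflate_string(const std::vector<unsigned char>& packed, std::size_t originalSize)
{
    std::string json(originalSize, '\0');
    uLongf destLen = static_cast<uLongf>(originalSize);
    if (uncompress(reinterpret_cast<Bytef*>(&json[0]), &destLen,
                   packed.data(), static_cast<uLong>(packed.size())) != Z_OK)
        throw std::runtime_error("uncompress failed");
    json.resize(destLen);
    return json;
}

A streaming inflate would avoid shipping the original size, but this keeps the example short.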
On 11/22/2017 at 10:29 PM, hplus0603 said: You should take out the base64 again -- it serves no purpose.
More importantly, it is probably actually expanding the data.
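To put a number on that: base64 emits 4 output characters for every 3 input bytes, so the encoded payload comes out roughly a third larger than what went in. A quick back-of-the-envelope check (standard base64 arithmetic, not something specific to this thread):

#include <cstddef>
#include <cstdio>

// Size of the base64 encoding of n bytes: 4 output characters per 3 input
// bytes, rounded up to a whole 4-character group because of '=' padding.
std::size_t base64_size(std::size_t n)
{
    return 4 * ((n + 2) / 3);
}

int main()
{
    std::printf("%zu\n", base64_size(1000));  // prints 1336: ~33% bigger
}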