Well, I got it to work by linking the raw zlib library into the UE4 client. I also had to encode the text as ANSI chars so the server could convert it into a std::string (I will need to look into UTF-8 on the server, but not for now, since I don't send any text messages yet).
For those interested, here is my solution for the compression and decompression; you can tweak it from there.
Client (UE4):
First of all, build the zlib library. I chose the same version as the one shipped with my current version of UE4 (4.18), so I used zlib 1.2.8.
I built it manually on Windows, but since it uses CMake, the process won't be much different on Linux or macOS:
mkdir C:\Builds\zlib; cd C:\Builds\zlib
cmake -G "Visual Studio 15 2017" -A x64 C:\local\zlib-x.x.x
cmake --build .
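For reference, a sketch of the equivalent out-of-source build on Linux/macOS (the `~/src`/`~/build` paths are my assumption; adjust them to wherever you unpacked the sources):

```shell
# Same CMake out-of-source build on Linux/macOS.
# Assumes the zlib 1.2.8 sources were unpacked at ~/src/zlib-1.2.8.
mkdir -p ~/build/zlib && cd ~/build/zlib
cmake ~/src/zlib-1.2.8
cmake --build .
```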
Then you need to tell UE4 to link it into your project (*.Build.cs):
PublicAdditionalLibraries.Add(@"PATHTOPROJECT/Binaries/Win64/zlibd.lib");
(If anyone knows how to resolve the path to the project dynamically, that would be nice.)
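On the dynamic-path question: ModuleRules exposes a ModuleDirectory property (the folder containing the Build.cs file), so a relative path can be built from it. A sketch for the Build.cs; the relative path below is an assumption about the project layout, so adapt it to yours:

```csharp
// ModuleDirectory points at the folder holding this *.Build.cs file,
// so the absolute path no longer needs hard-coding. The relative path
// here is an assumed layout; use your own.
PublicAdditionalLibraries.Add(System.IO.Path.Combine(ModuleDirectory, "..", "..", "Binaries", "Win64", "zlibd.lib"));
```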
Compress on the client:
void MessageManager::compressString(const FString &json, TArray<uint8> &compressedData)
{
    // Convert to ANSI so the server can cast it to std::string (UTF-8 can come later).
    auto jsonANSI = StringCast<ANSICHAR>(*json);
    TArray<uint8> UncompressedBinaryArray((uint8 *)jsonANSI.Get(), jsonANSI.Length());

    z_stream strm;
    strm.zalloc = Z_NULL;
    strm.zfree = Z_NULL;
    strm.opaque = Z_NULL;

    deflateInit(&strm, Z_DEFAULT_COMPRESSION);

    // deflateBound() returns an upper bound on the compressed size,
    // so there is no need to guess an over-allocation factor.
    compressedData.SetNum(deflateBound(&strm, UncompressedBinaryArray.Num()), true);

    strm.avail_in = UncompressedBinaryArray.Num();
    strm.next_in = (Bytef *)UncompressedBinaryArray.GetData();
    strm.avail_out = compressedData.Num();
    strm.next_out = (Bytef *)compressedData.GetData();

    // The actual compression work; with a deflateBound()-sized buffer,
    // a single Z_FINISH pass consumes all the input.
    deflate(&strm, Z_FINISH);

    // Shrink the array to the actual compressed size.
    compressedData.RemoveAt(strm.total_out, compressedData.Num() - strm.total_out, true);
    deflateEnd(&strm);
}
Decompress on the client:
FString MessageManager::decompressString(TArray<uint8> &compressedData)
{
    // The uncompressed size is unknown up front, so over-allocate generously;
    // a more robust version would loop on inflate() and grow the buffer.
    TArray<uint8> UncompressedBinaryArray;
    UncompressedBinaryArray.SetNum(compressedData.Num() * 1032);

    z_stream strm;
    strm.zalloc = Z_NULL;
    strm.zfree = Z_NULL;
    strm.opaque = Z_NULL;
    strm.avail_in = compressedData.Num();
    strm.next_in = (Bytef *)compressedData.GetData();
    strm.avail_out = UncompressedBinaryArray.Num();
    strm.next_out = (Bytef *)UncompressedBinaryArray.GetData();

    // The actual DE-compression work.
    inflateInit(&strm);
    inflate(&strm, Z_FINISH);

    // Null-terminate at the actual output length so the FString constructor
    // stops there instead of reading whatever follows in the buffer.
    UncompressedBinaryArray[strm.total_out] = '\0';
    inflateEnd(&strm);

    return FString((ANSICHAR *)UncompressedBinaryArray.GetData());
}
On the server (C++ using Boost and the STL), this is how to compress:
void MessageManager::compressString(const std::string &data, std::vector<char> &compressedData)
{
    std::stringstream original(data); // the uncompressed input
    std::stringstream compressed;
    boost::iostreams::filtering_streambuf<boost::iostreams::input> in;
    in.push(boost::iostreams::zlib_compressor());
    in.push(original);
    boost::iostreams::copy(in, compressed);

    const std::string str = compressed.str();
    compressedData.assign(str.begin(), str.end());
}
And decompress:
std::string MessageManager::decompressString(std::vector<char> &compressedData)
{
    std::stringstream compressed;
    // write() copies the whole buffer by length; unlike a string-based copy,
    // it will NOT stop at an embedded '\0' byte.
    compressed.write(compressedData.data(), compressedData.size());

    std::stringstream decompressed;
    boost::iostreams::filtering_streambuf<boost::iostreams::input> in;
    in.push(boost::iostreams::zlib_decompressor());
    in.push(compressed);
    boost::iostreams::copy(in, decompressed);

    return decompressed.str();
}
I hope this helps someone out there.