So I have been working on my FPS game's networking lately, and I have been thinking about a case I had not really considered yet. My server sends data to the clients on every third simulation frame, which ends up being 20 times per second at my 60 Hz simulation rate. Every packet has the server's simulation frame number (tick) attached to it.
Currently the client synchronizes its own tick number to the first packet received from the server, like this:
int clientTick = serverTick - 3;
And then it increases its own tick for every simulation frame that passes locally after that. The reason I subtract 3 from the server tick is that I buffer the packets locally on the client in a de-jitter buffer, and then dequeue them from the buffer based on the client's local tick number. So we basically "rewind" 3 ticks in time to set up a local delay that absorbs packet jitter.
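Roughly, the buffering and dequeue logic looks like this (a simplified sketch, not my actual code; the names and types are made up for illustration):

    #include <cstdint>
    #include <map>

    constexpr int32_t kJitterDelayTicks = 3; // how far we rewind behind the server

    struct Packet {
        int32_t serverTick;
        // ... snapshot data ...
    };

    std::map<int32_t, Packet> dejitterBuffer; // buffered packets, keyed by server tick
    int32_t clientTick = -1;                  // -1 = not synchronized yet

    void OnPacketReceived(const Packet& p) {
        if (clientTick < 0) {
            // First packet: sync our local tick, rewound by the jitter delay.
            clientTick = p.serverTick - kJitterDelayTicks;
        }
        dejitterBuffer[p.serverTick] = p;
    }

    void SimulationFrame() {
        // Consume the packet whose tick we have now caught up to, if any.
        auto it = dejitterBuffer.find(clientTick);
        if (it != dejitterBuffer.end()) {
            // ApplySnapshot(it->second); // hypothetical consume step
            dejitterBuffer.erase(it);
        }
        ++clientTick;
    }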
The end result is that we expect to have 1-2 packets in this buffer on the client, and if we have more than 2 packets in the buffer, we have drifted further behind the server than intended and need to speed up our local simulation a bit. This seems to be working, and it is by far the best solution to this problem I have come up with.
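The speed-up check is basically something like this (again just a sketch, assuming the dejitterBuffer from the snippet above; the 5% factor is an arbitrary example, not a tuned value):

    double simulationTimestep = 1.0 / 60.0; // fixed timestep of the local simulation

    void CheckDrift() {
        // With a 3-tick delay and a packet every 3 ticks, 1-2 buffered
        // packets is the steady state. More than that means we have fallen
        // behind the server and should simulate slightly faster to catch up.
        if (dejitterBuffer.size() > 2) {
            simulationTimestep = (1.0 / 60.0) * 0.95; // hypothetical 5% speed-up
        } else {
            simulationTimestep = 1.0 / 60.0; // back to the normal rate
        }
    }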
Is this a sane way to solve this problem? It seems sane to me, and it seems to be working, but I just want to get some feedback.