I'm trying to make a 2D multiplayer game. So far I've implemented client-side prediction, server reconciliation, and a method that keeps the player objects almost perfectly in sync, with at most a 1-pixel difference. It works like this:
- The client samples input, position, and other variables 60 times per second, stores them as a snapshot, and also queues the input in a packet buffer.
- Every 50 milliseconds (20 times per second), the queued input packets are sent to the server, each with a timestamp and sequence number (see the client-side sketch after this list).
- When the server receives an input packet, it adds half of the round-trip time to the timestamp, plus an extra 100 milliseconds, and stores the sequence number as the "last acknowledged packet".
- The inputs are stored in another buffer and applied according to the server clock, albeit 100 ms late (see the server-side sketch after the list).
- The server also stores snapshots, but at 30 per second, and at that rate sends a packet with all the positions and relevant variables, along with the "last acknowledged packet".
- When the client receives that packet, it reapplies all of its stored snapshots from the last acknowledged packet onwards on top of the server state (server reconciliation; sketched below).
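
To make that concrete, here's a trimmed-down sketch of my client-side sampling/prediction loop. All the names (`InputSample`, `applyInput`, `sendToServer`, and so on) are made up for this post, and the movement code is simplified:

```
// Client-side prediction sketch (illustrative names, simplified movement).
interface InputSample { seq: number; timestamp: number; dx: number; dy: number; }
interface PlayerState { x: number; y: number; }
interface Snapshot { input: InputSample; state: PlayerState; } // input + resulting position

const SAMPLE_RATE_HZ = 60;    // input/snapshot sampling rate (60 per second)
const SEND_INTERVAL_MS = 50;  // batched input send interval

let seq = 0;
let predicted: PlayerState = { x: 0, y: 0 };
const snapshots: Snapshot[] = [];    // history kept around for reconciliation
const outgoing: InputSample[] = [];  // packet buffer, flushed every send interval

// Called 60 times per second.
function sampleAndPredict(dx: number, dy: number, now: number): void {
  const input: InputSample = { seq: seq++, timestamp: now, dx, dy };
  predicted = applyInput(predicted, input);            // client-side prediction
  snapshots.push({ input, state: { ...predicted } });  // snapshot of input + position
  outgoing.push(input);                                // queued for the next send
}

// Called every SEND_INTERVAL_MS; each input carries its own timestamp and sequence number.
function flushInputs(): void {
  if (outgoing.length === 0) return;
  sendToServer({ inputs: outgoing.splice(0) });
}

function applyInput(s: PlayerState, i: InputSample): PlayerState {
  const dt = 1 / SAMPLE_RATE_HZ;
  return { x: s.x + i.dx * dt, y: s.y + i.dy * dt };
}

// Transport omitted; assume this serializes the packet and sends it to the server.
declare function sendToServer(packet: { inputs: InputSample[] }): void;
```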
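
The server side looks roughly like this. Again, the names (`onInputPacket`, `applyDueInputs`, `movePlayer`) are invented for illustration and the real code has more bookkeeping:

```
// Server-side delayed input buffer sketch (illustrative, simplified).
const INPUT_DELAY_MS = 100;

interface ServerInput { playerId: number; seq: number; applyAt: number; dx: number; dy: number; }

const pending: ServerInput[] = [];
const lastAckedSeq = new Map<number, number>();   // "last acknowledged packet" per player

function onInputPacket(
  playerId: number,
  rttMs: number,
  inputs: { seq: number; timestamp: number; dx: number; dy: number }[],
): void {
  for (const i of inputs) {
    pending.push({
      playerId,
      seq: i.seq,
      // shift the client timestamp by half the RTT plus the extra 100 ms buffer
      applyAt: i.timestamp + rttMs / 2 + INPUT_DELAY_MS,
      dx: i.dx,
      dy: i.dy,
    });
    lastAckedSeq.set(playerId, Math.max(lastAckedSeq.get(playerId) ?? -1, i.seq));
  }
}

// Called from the server tick: apply every buffered input whose time has come.
function applyDueInputs(serverNow: number): void {
  pending.sort((a, b) => a.applyAt - b.applyAt);
  while (pending.length > 0 && pending[0].applyAt <= serverNow) {
    const input = pending.shift()!;
    movePlayer(input.playerId, input.dx, input.dy);   // authoritative simulation
  }
}

// Authoritative movement code omitted.
declare function movePlayer(playerId: number, dx: number, dy: number): void;
```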
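
And here's the reconciliation step when a server state packet arrives, reusing the made-up types and the `applyInput` helper from the client sketch above:

```
// Client-side reconciliation sketch (illustrative, simplified).
interface ServerStatePacket { lastAckedSeq: number; state: PlayerState; }

function onServerState(packet: ServerStatePacket): void {
  // Snap to the authoritative state the server sent.
  predicted = { ...packet.state };

  // Throw away snapshots the server has already acknowledged...
  while (snapshots.length > 0 && snapshots[0].input.seq <= packet.lastAckedSeq) {
    snapshots.shift();
  }

  // ...and re-apply the remaining, unacknowledged inputs on top of it.
  for (const snap of snapshots) {
    predicted = applyInput(predicted, snap.input);
    snap.state = { ...predicted };   // keep the stored history consistent
  }
}
```

That replay is what keeps my predicted position within about a pixel of the server's under normal conditions.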
This seems to work pretty well, but I'm concerned about packet loss and any desynchronization that might happen. I've tried faking packet loss, and it appears that once the character strays too far from the true server position, it doesn't "correct" itself; the client just stays in the wrong position.
When I play games like Overwatch, I sometimes get terrible lag spikes: everything freezes for a while, characters start flying off to completely random positions, and when my connection recovers everything "jumps" back to a correct state. I'm not sure how to account for situations like these.
Also, the 100 ms delay seems necessary, since it makes sure the inputs are applied in the correct order and properly spaced out, but I'm not sure if it's a good idea. Is there anything else I might be forgetting? What am I doing wrong?