Which frame number does a client start from when joining

Started January 12, 2017 04:01 PM
3 comments, last by coffeebon 7 years, 10 months ago
Hi, I'm trying to get some simple networking going as a proof of concept (eventually heading towards two cube "tanks" moving and firing shells at each other). At first this is more about learning the concepts. I've read lots of articles (gafferongames, the Valve ones, etc.) and I think I've got most of the ideas straight: the client collects the moves and sends them to the server, predicts/extrapolates what happens (i.e. runs the full simulation locally), and then corrects when it gets data back from the server.
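In rough code terms, the client loop I'm picturing looks like this (a sketch only; the types and helper functions here are placeholders I've made up, not from any particular engine):

    #include <cstdint>

    struct InputCmd { uint32_t tick; /* buttons, axes, ... */ };
    struct Snapshot { uint32_t tick; /* authoritative state ... */ };

    // Engine-specific pieces, assumed to exist elsewhere:
    void     SendToServer(const InputCmd& cmd);
    bool     ReceiveSnapshot(Snapshot* out);
    void     PredictLocally(const InputCmd& cmd);
    void     RewindTo(const Snapshot& snap);
    InputCmd SampleLocalInput();
    InputCmd GetStoredInput(uint32_t tick);

    // One client tick: sample input, send it, predict locally, and
    // reconcile whenever an authoritative snapshot arrives.
    void ClientTick(uint32_t tick)
    {
        InputCmd cmd = SampleLocalInput();
        cmd.tick = tick;
        SendToServer(cmd);      // server buffers this until it needs it
        PredictLocally(cmd);    // run the same sim step the server will run

        Snapshot snap;
        while (ReceiveSnapshot(&snap))
        {
            RewindTo(snap);     // snap back to the server's authoritative state
            for (uint32_t t = snap.tick + 1; t <= tick; ++t)
                PredictLocally(GetStoredInput(t));  // replay pending inputs
        }
    }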
However, when I tried to write the code I found there's still one thing I haven't got down: how to join a client at the right frame.
I've looked around and the nearest thing I could find was https://www.gamedev.net/topic/683580-server-client-ticks-lag-compensation-game-state-etc/ and I just want to check that I've understood it correctly (apologies for raking this up again; feel free to point me at any other articles/posts you're aware of).
My mental model of how this works:
  1. Server starts and is running a simulation. Let's say a frame every 30ms.
  2. At frame 1000 a client connects.
  3. Server adds a player object in frame 1000 and starts sending state packets back starting at frame 1000.
  4. Client receives frame 1000. By this point it has also determined that its ping (round trip) is 100ms.
  5. It now knows that frame 1000 happened ~50ms ago on the server, or just under 2 frames' worth.
  6. When it sends a packet to the server, the server will get it in ~50ms and will be nearly at frame 1004.
My question is: what does the client do now? Does it take frame 1000, extrapolate to frame 1004, display that, and start sending moves to the server so that the server gets movement data for frame 1004, with the client staying about 4 frames "ahead"?
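For concreteness, here's the arithmetic I'm doing there as a quick sketch (the constants and names are mine, assuming a fixed 30ms tick and a symmetric 100ms ping):

    #include <cmath>
    #include <cstdint>

    const double kTickMs = 30.0;    // server runs a frame every 30 ms
    const double kRttMs  = 100.0;   // measured round trip

    // One-way latency is ~50 ms, i.e. just under 2 ticks each way. The
    // client received state for tick 1000, so by the time anything it
    // sends arrives, the server will be near tick 1000 + 2 + 2 = 1004.
    uint32_t TargetClientTick(uint32_t receivedServerTick)
    {
        uint32_t oneWayTicks =
            (uint32_t)std::ceil((kRttMs / 2.0) / kTickMs);   // -> 2
        return receivedServerTick + 2 * oneWayTicks;         // 1000 -> 1004
    }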
The job of the protocol is to make sure the client sends commands so they arrive just ahead of when the server needs them.
If you have a good offset-compensation system, then the server will see that it received a command for frame 1000 when it's actually at 1004, and will send time-offset information back to the client telling it to move its compensation ahead by 4 ticks.
A suitably self-adjusting time offset will converge on the right value very quickly, within the first two round trips (so within 200 ms in your case).
If you want to avoid that initial correction, you can tell the client to start simulating at tick T + margin (say, 10), which means the client will send commands for tick 1010. The server will then notice that this is 6 frames ahead of its current tick (1004) and tell the client to back off by 5 or so.
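In code terms, that adjustment might look something like this (a sketch of the idea only; the names and the margin value are made up):

    #include <cstdint>

    // Server side: a command stamped for tick cmdTick arrives while the
    // server is at serverTick. Positive = arrived early, negative = late.
    int32_t MeasureOffset(int32_t cmdTick, int32_t serverTick)
    {
        return cmdTick - serverTick;
    }

    // Client side: apply the server's report so commands keep arriving
    // with a small safety margin. A report of +6 with a desired margin
    // of 1 backs the client off by 5 ticks, as in the example above.
    // In practice you'd smooth or clamp this so jitter doesn't oscillate.
    void AdjustClientTick(int32_t reportedOffset, int32_t desiredMargin,
                          int32_t* clientTick)
    {
        *clientTick -= reportedOffset - desiredMargin;
    }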
enum Bool { True, False, FileNotFound };
I was going to start simple by assuming a constant ping (and artificially make that happen while client and server are on my machine, then work out how to adjust later once I have the simple case working). But in general the client should run its simulation N frames ahead of what it last received, display that, and get the user input and send it back, where N gets adjusted depending on ping time, right?
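For the adjusting-N part later, I imagine something like this (a sketch with made-up names; the smoothing factor is a guess on my part):

    #include <cmath>
    #include <cstdint>

    struct TickLead
    {
        double smoothedRttMs = 100.0;   // running estimate of the round trip

        void OnPingSample(double rttMs)
        {
            // exponential moving average so one spike doesn't yank N around
            smoothedRttMs = 0.9 * smoothedRttMs + 0.1 * rttMs;
        }

        // How many frames ahead of the last received state to simulate.
        uint32_t FramesAhead(double tickMs, uint32_t margin) const
        {
            return (uint32_t)std::ceil(smoothedRttMs / tickMs) + margin;
        }
    };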

Yes, that is typically how it's done.
enum Bool { True, False, FileNotFound };

Sweet - thanks.
