
Client prediction and timing

Started April 12, 2006 01:10 PM
7 comments, last by hplus0603 18 years, 9 months ago
So, I'm trying to get my head around certain bits of client prediction. It seems to me that the server begins at gamestate 0 and regularly ticks forwards. When a client connects, the server sends it the latest gamestate. Then the client begins ticking forwards, sending its controls to the server with a timestamp. The server looks at this timestamp, backs up the simulation a few steps to match, plays the simulation forwards again, and then sends the new latest gamestate back to the client. That much I get.

What I'm wondering is, should the clients be trying to synchronize their notion of "now" with the server? It seems to me that there are two options:

A.) Clients do not try to figure out their latency. The first state is the latest message from the server, and the client advances from there. If the server sends them a slightly old state, they bring it up to date locally. A client with a 50ms lag sees the gamestate of 50ms ago.

B.) Clients attempt to work out their latency and extrapolate forwards. The server sends them a gamestate, then the client adds the estimated latency to it to work out when "now" should be, and extrapolates what that gamestate would look like 50ms in the future.

So, should the clients care about their own latency, or should they just accept that they may all be slightly behind the server? Does it matter? If they SHOULD care about their own latency, should they regularly send extra ping messages to the server, or just try to work it out based on the normal input->gamestate time difference?
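For concreteness, here's how I picture the two options in code (just a sketch; the names are made up, and "latency" means the estimated one-way delay):

struct EntityState {
    float pos[3];
    float vel[3];
    double serverTime;   // server timestamp attached to this state
};

// Option A: draw the state exactly as received; the view lags by the latency.
EntityState viewStateA(const EntityState& latest) {
    return latest;
}

// Option B: dead-reckon the state forward by the estimated latency, so the
// view approximates the server's "now".
EntityState viewStateB(const EntityState& latest, double estimatedLatency) {
    EntityState out = latest;
    for (int i = 0; i < 3; ++i)
        out.pos[i] += latest.vel[i] * static_cast<float>(estimatedLatency);
    out.serverTime += estimatedLatency;
    return out;
}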
Both approaches could work. There are no hard and fast rules -- different games can get away with different approximations.

Note that, typically, the connection between server and client will be based on a packet stream, where X packets are sent per second; within each packet will be N messages plus some protocol overhead (sequence numbers, authentication, whatnot). Part of the framing of the packet could be a few bytes of timing data, which can help with RTT estimation.
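As a sketch, such framing might look like this (the field sizes here are assumptions, not a prescription):

#include <cstdint>

struct PacketHeader {
    uint16_t sequence;       // packet ordering / duplicate detection
    uint16_t clientTimeMs;   // sender's clock in milliseconds (wraps)
    uint16_t echoedTimeMs;   // last timestamp seen from the other side
    uint16_t echoDelayMs;    // how long ago that timestamp arrived
};
// The N messages follow the header inside the packet payload.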
enum Bool { True, False, FileNotFound };
I've been doing some network prediction myself this week.

I did something like you described in A.
The server sends its state to the client upon connecting. Every time the server sends a position update to the client, it also sends its state.

The client compares its version of the server state with the one in the arriving packet. If the difference is negative, the client updates its server state. In other words, the client tries to find the minimum "ping" and uses that to calculate the time it took for the packet to arrive.

I divide this difference by 2 and use cubic splines to smooth out the lag.

This method seems to be the only decent one I could find in my case.
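The bookkeeping for that minimum-"ping" estimate is roughly this (a sketch; the names are made up and the spline smoothing is left out):

struct ServerClockEstimate {
    bool   valid  = false;
    double offset = 0.0;   // serverTime - localTime, from the fastest packet seen

    void onPacket(double serverTimestamp, double localNow) {
        double candidate = serverTimestamp - localNow;
        // A larger candidate means the packet spent less time in flight,
        // i.e. this sample is the minimum "ping" so far.
        if (!valid || candidate > offset) {
            offset = candidate;
            valid  = true;
        }
    }

    double serverNow(double localNow) const { return localNow + offset; }
};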
It depends on your game (of course!). The client will always be 'seeing' the game state of 50ms ago, or whatever its one-way latency is, and its actions won't take effect until a full round trip (twice that) later. But if the effects of client-side actions (i.e. things the player does) are likely to be predictable over a 50ms timescale, client-side prediction can reduce the appearance of lag by 'pre-performing' those actions until the server confirms what actually happened.

Most games are predictable on short timescales, so prediction is possible. The question is then, does the lag have a bad effect on playability? Typically, fast-paced games (FPS, racing games and other action games) suffer the worst from lag and would be good choices for prediction, whereas strategy, puzzle and turn-based games don't suffer much at all and prediction is generally more trouble than it's worth.
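The usual shape of that 'pre-performing' loop, as a bare sketch (nothing here is specific to any engine):

#include <deque>

struct Input { int tick; float move; };

struct PredictedPlayer {
    float pos = 0.0f;
    std::deque<Input> pending;            // inputs the server hasn't acknowledged

    void applyLocal(const Input& in) {
        pos += in.move;                   // pre-perform the action immediately
        pending.push_back(in);
    }

    void onServerState(float serverPos, int lastAckedTick) {
        pos = serverPos;                  // accept the authoritative state
        while (!pending.empty() && pending.front().tick <= lastAckedTick)
            pending.pop_front();          // discard acknowledged inputs
        for (const Input& in : pending)   // replay the unacknowledged ones
            pos += in.move;
    }
};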

Quote:
The server looks at this timestamp, backs up the simulation a few steps to match, plays the simulation forwards again, and then sends the new latest gamestate back to the client.
(emphasis added)

Uh ... surely the server can't 'roll back' for each and every client action! Many things could have happened during those cycles, including other client actions from other players, that can't be re-simulated. You have to accept that commands sent from the client will run a little after they are sent.
Quote:
Original post by Bob Janova
Uh ... surely the server can't 'roll back' for each and every client action!


Well, not exactly roll back, but it could remember the last hundred or so states of each actor and play just that actor forwards. Collision logic would be a bit tricky, but doable. It's easier to have the server always take in actions in realtime, but for that to look right on the client, the client would actually have to be AHEAD of the server, wouldn't it?

Like, player A is flying towards a missile, so he steers to dodge it at time T=10. The server won't get it for, say, 5 turns, so for the player to successfully dodge the missile, the server needs to be at least 5 turns behind the player. That seems a little backwards to me, so the server needs to be able to work with old data to at least some degree, doesn't it?
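Something like this is what I have in mind, as a rough sketch (the buffer size and the integration step are placeholders):

#include <array>

struct ActorState { float pos; float vel; };

struct ActorHistory {
    static const int kSize = 128;   // roughly "the last hundred or so states"
    std::array<ActorState, kSize> states;

    ActorState& at(int tick) { return states[tick % kSize]; }

    // Apply a late input at inputTick, then play just this actor forward to
    // currentTick. Collisions with other actors are ignored here, which is
    // exactly the tricky part.
    void replayFrom(int inputTick, int currentTick, float steer) {
        at(inputTick).vel += steer;
        for (int t = inputTick; t < currentTick; ++t) {
            ActorState next = at(t);
            next.pos += next.vel;   // trivial integration step
            at(t + 1) = next;
        }
    }
};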
In most cases, the player will just move a little later than he actually pressed the button. This is why ping is important in online gaming. Client-side prediction only smooths over certain parts of the playing experience to make it look like the response is instant.

In your example, the player would only dodge the missile if he saw it coming 5 turns before it hit him. Imagine if there was a player with a lag of 10 seconds; now, if you roll back to compensate for lag, everyone would have to play 10 seconds behind, as you could never tell if the current state might be altered by something that player did. That's clearly a bad idea!
So then, let's assume I go with the idea that the clients actively try to figure out their lag from the server and predict to that time. Is there a standard best-practice way to figure out the lag?

One idea that comes to mind is to number the packets I send to the server, and to have messages from the server include the number of the most recent packet they've received. Then the client could average over the last 5 or 6 received packets.
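Roughly what I'm picturing for that first idea (a sketch; an exponential average stands in for the 5-or-6-packet average, and 16-bit wraparound is ignored):

#include <cstdint>
#include <map>

struct RttEstimator {
    std::map<uint16_t, double> sendTimes;   // packet id -> local send time
    double avgRtt = 0.0;

    void onSend(uint16_t id, double now) { sendTimes[id] = now; }

    void onAck(uint16_t lastReceivedId, double now) {
        std::map<uint16_t, double>::iterator it = sendTimes.find(lastReceivedId);
        if (it == sendTimes.end()) return;
        double sample = now - it->second;
        // Smooth over roughly the last five or six samples.
        avgRtt = (avgRtt == 0.0) ? sample : avgRtt * 0.8 + sample * 0.2;
        sendTimes.erase(sendTimes.begin(), ++it);   // drop this and older entries
    }
};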

Another idea would be to regularly send packets specifically for pinging, and a third idea would be to actually use a standard ping on a separate port to the destination machine.

Thoughts?
Some form of tagging packets in a way the client can use to figure out a time delay is normal. If you're already tagging them for ordering purposes, your suggestion of the server sending the last packet ID received from that client would be good; the client would then have to remember when it sent the last few packets and do a comparison.

It would seem best to me not to add a specific sort of 'ping' packet, or at least reserve it for when it matters that the ping is accurate (perhaps if a player requests his ping?). Adding packets, however small and infrequent, uses up bandwidth and processing power, and for no good purpose in this case. Having said that, it's probably the easiest way to find out your ping in terms of coding.

Quote:
and a third idea would be to actually use a standard ping on a seperate port to the destination machine

Don't do that. There's no guarantee that such a service is available on the machine, and certainly no guarantee that a ping obtained that way will match your actual latency over an open connection to the game server.
Typically, your packet to the server will include "this is my current time". Packets from the server to the player will include "this is your timestamp from the last packet, which I received X time ago".

From this data, you can calculate round-trip transmission time as current-time - your-timestamp - X (where X represents "processing time" on the server, loosely).

This adds 2 bytes from client->server (assuming you're OK with millisecond precision, and seeing roll-over every 32 seconds) and 4 bytes from server->client (last-timestamp and time-since-received).
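A small sketch of that calculation with 16-bit millisecond timestamps; the signed cast keeps the subtraction correct across roll-over, as long as round trips stay well under the wrap period:

#include <cstdint>

struct ClientTiming { uint16_t clientTimeMs; };                            // client -> server: 2 bytes
struct ServerTiming { uint16_t echoedClientTimeMs; uint16_t heldForMs; };  // server -> client: 4 bytes

int rttMs(uint16_t clientNowMs, const ServerTiming& t) {
    // Wrap-safe difference: reduce modulo 2^16, then reinterpret as signed.
    int16_t delta = static_cast<int16_t>(
        static_cast<uint16_t>(clientNowMs - t.echoedClientTimeMs));
    return static_cast<int>(delta) - static_cast<int>(t.heldForMs);   // RTT minus server hold time
}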

Note that you typically pack a number of messages into a single packet, so the "overhead" for determining ping is paid only once per network packet (as part of the framing). Also, that data isn't wasted if you actually need the ping, either to present it to the user or to do interpolation calculations with. I would think it's well worth collecting.
enum Bool { True, False, FileNotFound };

