Game server frame rate

I've read on this forum that, for an action game, you generally want to send network updates to clients about 10 or 20 times per second. But what about the frame rate on the server itself? Should the simulation run at 60 Hz, 30 Hz, or something else?
There are two common approaches. One is to use a variable time step: servers and clients tick as fast or as slow as they can. Often there's an upper limit on how frequently they tick -- 60 Hz is not uncommon. Unreal Engine, for example, uses this method. The drawback is that the server and client WILL see slightly different behavior, and the client has to be corrected by the server all the time.
The other is that physics runs at a fixed time step, and if a render frame takes longer than the real-world length of a physics tick, you run multiple physics ticks per rendered frame. This means the server and client CAN deterministically arrive at exactly the same output, so it's POSSIBLE to send only client input, rather than full state, at the network tick rate. (You may not want to do this for other reasons -- for example, you may use a physics engine that isn't actually deterministic, like PhysX.)
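If it helps, here's a minimal sketch of that second approach -- the usual accumulator-style fixed-time-step loop. The names and the MAX_TICKS_PER_FRAME safety valve are illustrative, not from any particular engine:

```cpp
// Fixed-time-step loop with an accumulator (illustrative sketch).
// The simulation only ever advances in whole ticks of FIXED_DT, so server
// and client can stay in lockstep if the simulation itself is deterministic.
#include <chrono>

constexpr double FIXED_DT = 1.0 / 60.0;   // physics tick length in seconds
constexpr int    MAX_TICKS_PER_FRAME = 5; // safety valve if we fall behind

void run_loop(/* Game& game */) {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    double accumulator = 0.0;

    while (true /* !quit */) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed ticks as real time allows; a frame that took
        // longer than FIXED_DT results in several physics ticks.
        int ticks = 0;
        while (accumulator >= FIXED_DT && ticks < MAX_TICKS_PER_FRAME) {
            // game.simulate(FIXED_DT);
            accumulator -= FIXED_DT;
            ++ticks;
        }

        // game.render();  // on the client; a dedicated server would sleep instead
    }
}
```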
In general, you'll want to put a lower limit on the size of the time step on the server (i.e., an upper limit on how often it ticks) -- 1/60 s is common; 1/120 s would only be worth it for hard-core action players, and 1/30 s may save some CPU on the server if that matters. Often these numbers are driven by physics engine behavior (will you see tunneling through thin walls? will object stacks be stable? will suspensions oscillate when turning? etc.)
In any case, yes, it is almost always the case that the server (and client) will tick more often than it sends network packets. This means each network packet "covers" more than one simulation tick, for whatever "covers" means in your game.
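For example, a server that simulates at 60 Hz but sends snapshots at 20 Hz has each packet covering three ticks. A minimal sketch (hypothetical names) of how that's typically wired up:

```cpp
// Simulate every tick, but only broadcast state every Nth tick.
constexpr double TICK_DT = 1.0 / 60.0;    // 60 Hz simulation
constexpr int    TICKS_PER_SNAPSHOT = 3;  // 60 Hz sim / 20 Hz network

void server_tick(/* World& world, Network& net, */ long long tick_number) {
    // world.simulate(TICK_DT);
    if (tick_number % TICKS_PER_SNAPSHOT == 0) {
        // net.broadcast_snapshot(world, tick_number);
    }
}
```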
The game is very simple at this stage (and will probably remain simple). The player clicks to go somewhere, which gives them a velocity, and the simulation step on the server computes their position at the current time. Then, 10 or 20 times per second, the server sends the client its new position and the positions of the other clients. The client can do prediction by using the last few received positions to estimate a velocity, or by using the last received velocity value(s), if those are sent.
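Something like this is what I have in mind for the prediction (all names are hypothetical, and it assumes the server stamps each packet with a time):

```cpp
// Dead-reckoning on the client when the server only sends positions:
// estimate velocity from the last two snapshots and extrapolate to "now".
struct Vec2 { double x, y; };

struct Snapshot {
    Vec2   position;
    double server_time;  // timestamp the server put on the packet
};

Vec2 predict_position(const Snapshot& prev, const Snapshot& last, double now) {
    double dt = last.server_time - prev.server_time;
    if (dt <= 0.0) return last.position;  // nothing to extrapolate from

    // Velocity estimate from the two most recent positions.
    Vec2 velocity = { (last.position.x - prev.position.x) / dt,
                      (last.position.y - prev.position.y) / dt };

    // Extrapolate forward from the newest snapshot to the current time.
    double ahead = now - last.server_time;
    return { last.position.x + velocity.x * ahead,
             last.position.y + velocity.y * ahead };
}
```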
So, assuming that the server ticks as fast as it can (with an upper limit)... I can set that upper limit as low as I like, and as long as I'm not seeing tunneling or other artifacts on the server, it should be fine, right?
Because clients will tick as fast as they can (and therefore process packets at different times), they will see different things. But as long as the difference in latency between clients isn't too large, that shouldn't be a big deal (depending on the game), right?
hplus0603, thanks for that first detailed reply. It helps me a ton.
At some point, I won't be able to bring that upper limit down any further. So, assuming a tick rate that works without tunneling (e.g., 30 Hz)... there will come a time when my server can't handle more clients, because I can no longer guarantee that each frame completes within 1/30 s. That's when I would need to add servers, correct?
Add more servers, and break out the champagne :-) Or just optimize your code so it runs faster... With a good profiler, you can do wonders.
Note that "just add servers" is OK if your game is OK with putting different players in different instances of the same world (like different counter-strike servers playing the same map.) If you have a single, shared world, it becomes harder -- this is the "un-sharded virtual world" problem, which I highly recommend you don't try to solve as a single developer...