
In which model/architecture do the server and client run at different tickrates?

Started by raz0328, August 21, 2024 02:09 PM
2 comments, last by saimitrfgolli 3 months, 3 weeks ago

Hi all, I've been developing a multiplayer solution for a game engine for almost 2 years now in my spare time. It is a server-authoritative model where clients interpolate server snapshots and all that stuff. I've been reading some posts here lately, and some articles online, about a model/architecture where the server and the client send packets at different rates, but I can't wrap my head around it.

In my current model, the server and the client both run at a “network fixed rate” (independent of framerate), let's say 60 Hz, so when the client produces an input, it says “this input is for your tick 88, server”, and when it reaches the server it is queued for a (hopefully) very short time and then processed. It works great!
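Roughly, my flow looks something like this (the names are just for illustration, not my actual code):

```cpp
// Rough sketch of the fixed-rate model: client stamps each input with the
// server tick it is meant for, the server buffers it until that tick runs.
#include <cstdint>
#include <map>

struct InputPacket {
    uint32_t targetTick;    // "this input is for your tick 88, server"
    float    moveX, moveY;  // whatever per-tick input state the game needs
};

class Client {
public:
    // Called once per fixed network tick (e.g. 60 Hz), independent of render framerate.
    InputPacket SampleAndStamp(uint32_t estimatedServerTick) {
        InputPacket p{};
        p.targetTick = estimatedServerTick;  // client-side estimate of the tick this input lands on
        // ... fill p.moveX / p.moveY from the local input state ...
        return p;
    }
};

class Server {
public:
    void Receive(const InputPacket& p) { pending_[p.targetTick] = p; }

    // Called once per fixed server tick, at the same 60 Hz rate as the client.
    void Simulate(uint32_t currentTick) {
        auto it = pending_.find(currentTick);
        if (it != pending_.end()) {
            // ... apply it->second to the player's simulation for this tick ...
            pending_.erase(it);
        }
        // else: the input was late or lost; a typical fallback is to repeat the last one
    }

private:
    std::map<uint32_t, InputPacket> pending_;  // one input per tick in this model
};
```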

But I've seen some articles online about the client sending inputs every frame (every render frame, not every fixed tick), so if the player has 120 fps it will send 120 pps, and if it has 40 fps it will send 40 pps… which actually makes sense to me, because if you are running the game at 144 fps why should I limit you to 60 pps?

If I were to implement this right now, the server queue would be flooded, because it would be receiving packets at a much faster rate than it processes them, which would result in a never-ending queue of pending inputs. I would appreciate it if someone could help me understand this better; maybe I've been thinking about my approach for so long that I can't think outside the box.

Now that I'm thinking… maybe, even though the client is sending an arbitrary number of inputs per second, I could still mark them “for fixed tick X” and then process them together on the server? So if the client produces 20 inputs during fixed tick 44, all those inputs are marked as “for tick 44” and processed together on the server… That actually might work, but I'd still like some second opinions. Thanks!
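Here is a rough client-side sketch of what I mean (made-up names; the tick comes from the same fixed-rate clock the client already runs):

```cpp
// Client-side sketch of the "many samples per fixed tick" idea (illustrative names only).
#include <cstdint>
#include <vector>

struct InputSample {
    uint32_t targetTick;   // the fixed tick this render-frame sample belongs to
    float    moveX, moveY;
};

class PerFrameInputSender {
public:
    // Called once per render frame. currentFixedTick comes from the same
    // fixed-rate clock the client already uses for its 60 Hz network tick.
    void OnRenderFrame(uint32_t currentFixedTick, float moveX, float moveY) {
        // A 144 fps client will tag two or three samples with the same tick at 60 Hz.
        outgoing_.push_back(InputSample{ currentFixedTick, moveX, moveY });
    }

    // Flush whatever accumulated; this could also be batched per tick to save bandwidth.
    std::vector<InputSample> TakeOutgoing() {
        std::vector<InputSample> out;
        out.swap(outgoing_);
        return out;
    }

private:
    std::vector<InputSample> outgoing_;
};
```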


raz0328 said:
Which actually makes sense to me, because if you are running the game at 144 fps why should I limit you to 60 pps?

Actually, the render framerate does not have to agree with the update/simulation framerate at all. It is very common for a game to render at an arbitrary framerate but only simulate physics at, say, 50 or 25 Hz. Such a system then interpolates between the last two simulated states for each additional frame that is rendered, so that each frame still produces a different image. The same applies to a server model: you don't necessarily need to process the network at the rate the game is rendered at.
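As a rough illustration (not any particular engine's API), the usual decoupled loop looks something like this:

```cpp
// Fixed-rate simulation, arbitrary render rate, interpolating between the
// last two simulated states for each rendered frame (illustrative sketch).
#include <chrono>

struct State { float x = 0.0f; /* ... whatever the simulation tracks ... */ };

State Interpolate(const State& a, const State& b, float alpha) {
    return State{ a.x + (b.x - a.x) * alpha };
}

void RunLoop() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 50.0;   // simulate at 50 Hz regardless of framerate
    double accumulator = 0.0;
    State previous, current;

    auto last = clock::now();
    for (;;) {                      // one iteration per render frame
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - last).count();
        last = now;

        while (accumulator >= dt) { // run as many fixed steps as real time demands
            previous = current;
            // current = Simulate(current, dt);
            accumulator -= dt;
        }

        // Blend between the two most recent simulated states for this frame.
        float alpha = static_cast<float>(accumulator / dt);
        State renderState = Interpolate(previous, current, alpha);
        // Render(renderState);
        (void)renderState;
    }
}
```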

Processing input each frame as you describe is more often done to decrease input lag in fast-paced games like shooters. For those kinds of games it can make sense, but something slower like a turn-based RPG or a card game would not really see much benefit from processing input at render frequency.

raz0328 said:
Now that I'm thinking… maybe, even though the client is sending an arbitrary number of inputs per second, I could still mark them “for fixed tick X” and then process them together on the server? So if the client produces 20 inputs during fixed tick 44, all those inputs are marked as “for tick 44” and processed together on the server… That actually might work, but I'd still like some second opinions.

Yes, that is how I'd say this has to be done. It's a bit of the inverse of what I described for render interpolation: in that system, the gameplay loop/simulation produces frames, and the renderer consumes those frames in order, interpolating them as time progresses. In your example, the server would consume all the events the client sends, at whatever rate the client is running, that fall within the server's tick duration.
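A rough server-side sketch of that grouping (made-up names; how you combine the samples, by summing, averaging, or replaying them as sub-steps, is a per-game design decision):

```cpp
// Server-side sketch: gather every input sample stamped with the current tick
// and apply them together during that one fixed tick (illustrative names only).
#include <cstdint>
#include <unordered_map>
#include <vector>

struct InputSample {
    uint32_t targetTick;
    float    moveX, moveY;
};

class ServerTickProcessor {
public:
    void Receive(const InputSample& s) { byTick_[s.targetTick].push_back(s); }

    // Called once per fixed server tick.
    void Simulate(uint32_t currentTick, double tickDt) {
        auto it = byTick_.find(currentTick);
        if (it == byTick_.end())
            return;  // nothing arrived in time for this tick

        const std::vector<InputSample>& samples = it->second;
        // One option: treat each sample as a sub-step covering an equal slice of the tick,
        // so a 144 fps client and a 40 fps client both advance by exactly one tick of time.
        const double subDt = tickDt / static_cast<double>(samples.size());
        for (const InputSample& s : samples) {
            // AdvancePlayer(s.moveX, s.moveY, subDt);
            (void)s; (void)subDt;
        }
        byTick_.erase(it);
    }

private:
    std::unordered_map<uint32_t, std::vector<InputSample>> byTick_;
};
```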


This topic is closed to new replies.
