Hello. I’ve been working on a networked hobby game for a while. Everything seems to work well enough, but I have a few questions about some common suggestions and conventions I’ve come across.
For context, my game is a relatively fast-paced PvE ARPG.
Question 1) Using integer frames or tick counts instead of milliseconds / microseconds.
In my game I clock sync and then use milliseconds exclusively for all my time units. Most resources I’ve read online use integer tick counts, and I don’t understand why. It seems that even with an integer tick count you still need to translate it into smaller units to account for client frame rates anyway. For example, if the server runs the simulation at 20 ticks per second but the client is rendering at 60 Hz (or even higher these days), the simulation delta needs to be broken down into smaller units. Or do we only actually process the client sim at 20 Hz and use the real frame rate only for animation / rendering code? E.g., if the fixed step interval is 20 Hz, we update the sim every 3 rendered frames.
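To make sure I’m describing the pattern correctly, here’s a minimal sketch of the fixed-timestep loop I mean (in the spirit of Fiedler’s “Fix Your Timestep”); SimState, Integrate, and Render are hypothetical placeholders, not from any particular engine:

```cpp
#include <chrono>

struct SimState { float x = 0.0f; };

// Advance the simulation by exactly one fixed tick.
SimState Integrate(SimState s, double dt) {
    s.x += 1.0f * static_cast<float>(dt);   // stand-in for real game logic
    return s;
}

// Rendering blends the last two sim states; the sim never sees a partial step.
void Render(const SimState& prev, const SimState& curr, double alpha) {
    float displayX = prev.x + (curr.x - prev.x) * static_cast<float>(alpha);
    (void)displayX;                          // ... draw at displayX ...
}

int main() {
    using clock = std::chrono::steady_clock;
    constexpr double kTickSeconds = 1.0 / 20.0;  // 20 Hz fixed sim step

    SimState prev, curr;
    double accumulator = 0.0;
    auto lastTime = clock::now();

    while (true) {  // render loop: runs at whatever rate the display allows
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - lastTime).count();
        lastTime = now;

        // Consume elapsed real time in whole 50 ms ticks.
        while (accumulator >= kTickSeconds) {
            prev = curr;
            curr = Integrate(curr, kTickSeconds);
            accumulator -= kTickSeconds;
        }

        // The leftover fraction of a tick only affects presentation.
        Render(prev, curr, accumulator / kTickSeconds);
    }
}
```

In other words, the sim only ever steps in whole ticks and the frame rate only matters for the interpolation factor passed to rendering. Is that the intended reading?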
Question 2) Running the client sim RTT/2 ahead of the server.
I understand that with prediction the local player is effectively RTT/2 “ahead”, but I don’t understand the advantage of actually running the client game clock / tick ahead, or of post-dating the messages. Is this simply for input queueing on the server? E.g., we queue an input on the client and send it so it arrives just in time on the server? Is there any other reason? In my game the server processes any received client inputs immediately (they are sequenced and sent redundantly, so they always arrive in order). I don’t understand why you would ever want to wait or delay processing, or reject inputs based on some timestamp. Also, wouldn’t running the clock ahead make updates coming from the server extra behind? E.g., if the client is running RTT/2 ahead of the server, a server update will have all its timestamps a full RTT behind once it reaches the client, rather than RTT/2.
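To make the convention concrete, here’s a rough sketch of how I understand the “run the client ahead and post-date inputs” scheme (again, this is not what my game currently does, and the tick rate, jitter margin, and helper names are just my assumptions):

```cpp
#include <cstdint>
#include <map>
#include <vector>

struct Input { uint32_t sequence; uint32_t targetTick; /* buttons, axes ... */ };

constexpr double   kTickRate          = 20.0;  // server ticks per second
constexpr uint32_t kJitterMarginTicks = 1;     // small safety buffer against jitter

// --- client side ---------------------------------------------------------
// Stamp the input with the server tick it is intended for: the client's
// estimate of the current server tick, plus one-way latency, plus a margin.
uint32_t TargetTickFor(uint32_t estimatedServerTick, double rttSeconds) {
    uint32_t halfRttTicks = static_cast<uint32_t>(rttSeconds * 0.5 * kTickRate + 0.5);
    return estimatedServerTick + halfRttTicks + kJitterMarginTicks;
}

// --- server side ---------------------------------------------------------
std::map<uint32_t, std::vector<Input>> pendingInputs;  // inputs keyed by target tick

void OnInputReceived(const Input& in, uint32_t currentServerTick) {
    if (in.targetTick < currentServerTick) {
        return;  // arrived too late for its tick: drop it (or apply it immediately)
    }
    pendingInputs[in.targetTick].push_back(in);  // hold until that tick is simulated
}

void SimulateTick(uint32_t currentServerTick) {
    auto it = pendingInputs.find(currentServerTick);
    if (it != pendingInputs.end()) {
        for (const Input& in : it->second) {
            (void)in;  // ApplyInput(in): every client's input for this tick is
                       // applied on the same tick rather than whenever it arrives
        }
        pendingInputs.erase(it);
    }
    // ... advance the rest of the simulation ...
}
```

If that reading is right, the only benefit I can see is that inputs get applied on a consistent tick instead of whenever they happen to arrive, which is exactly the part I’m questioning, since my server just applies them immediately.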
This leads into my final question:
Question 3) When applying server updates, do we play them out as is, or adjust time values for the estimated RTT/2?
For example, if an enemy is starting some attack animation, by the time the client receives that update it is already RTT/2 behind. Should we A) start that attack animation at t=0 and play the whole thing at normal speed, knowing it’s slightly behind, B) snap the t value RTT/2 ahead so that it is immediately in sync, effectively cutting off the initial portion of the animation, or C) do some sort of t-value error tracking / reduction and interpolate it (similar to the physics error reduction Glenn Fiedler talks about in his tutorials), which would effectively speed up the initial portion of the animation?
I feel like C is probably the answer, but something about always fast-forwarding the initial frames of every animation doesn’t seem right.
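Roughly, here’s what I imagine option C would look like; the per-frame reduction factor, the clamp, and the names are just placeholders:

```cpp
#include <algorithm>

struct RemoteAnimation {
    double localTime = 0.0;   // where the client is currently playing the animation
    double timeError = 0.0;   // how far behind the server's timeline we believe we are
};

// Called when a server update says the animation started; halfRtt is our latency estimate.
void OnAnimationStarted(RemoteAnimation& anim, double halfRttSeconds) {
    anim.localTime = 0.0;            // start from the beginning visually...
    anim.timeError = halfRttSeconds; // ...but remember we are this far behind
}

// Called every rendered frame.
void AdvanceAnimation(RemoteAnimation& anim, double frameDt) {
    // Bleed off a fraction of the error each frame, which plays the animation
    // slightly faster than real time until the error is gone.
    double correction = anim.timeError * 0.10;
    // Clamp so the speed-up never exceeds 2x real time.
    correction = std::min(correction, frameDt);
    anim.timeError -= correction;
    anim.localTime += frameDt + correction;
}
```

Is that the usual approach, or do most games just accept option A’s constant RTT/2 delay for purely cosmetic things like animations?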