
Dynamic input delay

Started by December 22, 2024 11:12 PM
2 comments, last by Periwink 17 hours, 36 minutes ago

Hi,
I've been working on a library to implement netcode for multiplayer games.

  • Clients are running in the future compared to the server, approximately RTT/2 ahead so that client inputs at tick T arrive on the server roughly when the server is processing tick T
  • Clients speed up their time slightly (by +/- 10%) to make sure that this is respected and that the input buffer doesn't grow too large or too small. Basically we always want to make sure that the server has an input to process for a given tick T
  • I've implemented client-side prediction and input-delay. Latency can be hidden by a combination of input-delay and prediction. For example, if the client is running 9 ticks ahead of the server (~150ms at 60Hz), we could have 50ms of input-delay (3 ticks) and 100ms of prediction (6 ticks). In that case inputs are delayed by 3 ticks, and the client timeline is actually RTT/2 - input_delay ahead of the server
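The clock speed-up from the second bullet can be sketched as a small proportional controller. This is just an illustration with made-up names and constants (TARGET_BUFFER, the 0.05 gain), not the library's actual logic:

```python
# Sketch of the client clock-speed adjustment described above.
# TARGET_BUFFER and the gain are hypothetical tuning values.
TARGET_BUFFER = 3      # desired number of buffered inputs on the server, in ticks
MAX_ADJUST = 0.10      # speed up / slow down by at most +/- 10%

def time_scale(reported_buffer_len: int) -> float:
    """Return the multiplier applied to the client's tick rate.

    If the server reports fewer buffered inputs than the target, the client
    is running too close to the server and must speed up; if the buffer is
    growing, it must slow down.
    """
    error = TARGET_BUFFER - reported_buffer_len   # positive -> speed up
    # proportional controller, clamped to +/- 10%
    adjust = max(-MAX_ADJUST, min(MAX_ADJUST, 0.05 * error))
    return 1.0 + adjust
```

With the buffer at its target the scale is 1.0; an empty buffer pushes the client to 1.1x speed, an overfull one down to 0.9x.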

I am trying to implement the logic described in this guide: https://www.snapnet.dev/docs/core-concepts/input-delay-vs-rollback/ Which is to make the input delay value dynamic based on the client's latency.
For example at 30ms of latency we could cover it via input-delay alone. But if the client's network conditions change and the latency jumps to 140ms, we could cover that by increasing the input-delay and adding a bit of prediction.
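The split the guide describes could look something like this, assuming a fixed cap on how much input delay is acceptable (MAX_DELAY_TICKS is my own arbitrary choice, not a value from the guide):

```python
# Hypothetical split of one-way latency into input-delay + prediction.
import math

TICK_MS = 1000 / 60          # ~16.67 ms per tick at 60 Hz
MAX_DELAY_TICKS = 3          # largest input delay we tolerate for game feel

def split_latency(one_way_ms: float) -> tuple[int, int]:
    """Return (input_delay_ticks, predicted_ticks) covering the latency."""
    total_ticks = math.ceil(one_way_ms / TICK_MS)
    delay = min(total_ticks, MAX_DELAY_TICKS)    # cover as much as we can with delay
    predicted = total_ticks - delay              # predict the rest
    return delay, predicted
```

At 30ms this yields 2 ticks of delay and no prediction; at 140ms it yields 3 ticks of delay and 6 predicted ticks, matching the idea above.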

What I don't understand is how it is possible to modify the input-delay dynamically, as this might cause some inputs to be missing or overwritten.
For example let's say that the input-delay should change from 4 ticks to 3 ticks, then we would have:
- tick 100, delay = 4, write input A in buffer for tick 104
- tick 101, delay = 3, write input B in buffer for tick 104 → input gets overwritten!

In the reverse case where the input-delay increases, we would have missing inputs for a tick.
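One common workaround for both cases (an assumption on my part, not from the guide): clamp the target tick so it never goes backwards, and fill any gap left by an increased delay by repeating the last input:

```python
# Never overwrite an already-scheduled tick; fill gaps when the delay grows.
def schedule(buffer: dict, last_target: int, tick: int, delay: int, action):
    target = max(tick + delay, last_target + 1)   # never overwrite tick 104 case
    for t in range(last_target + 1, target):      # delay grew: fill the gap
        buffer[t] = buffer.get(last_target, action)
    buffer[target] = action
    return target

buf = {}
last = schedule(buf, 103, 100, 4, "A")   # writes A for tick 104
last = schedule(buf, last, 101, 3, "B")  # 101+3 = 104 is taken -> pushed to 105
last = schedule(buf, last, 102, 5, "C")  # delay grew: tick 106 filled with "B"
```

The effective delay then drifts toward the new value over a few ticks instead of jumping, so no input is lost or overwritten.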

Any ideas?

Almost any networked game system needs to keep a clock that counts game ticks, and timestamp each input and update with what timestamp it's intended for.

You may send more than one set of inputs in the same network message, as long as they each have a different timestamp.
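A minimal sketch of that batching (names are hypothetical); re-sending the last few inputs per packet also masks packet loss, since the server applies only the ticks it hasn't seen yet:

```python
# Each input carries the tick it is intended for; a packet carries several.
from dataclasses import dataclass

@dataclass
class TimestampedInput:
    tick: int
    action: str

def build_packet(history: list[TimestampedInput], redundancy: int = 3):
    """Send the newest input plus the last few, each with its own tick."""
    return history[-redundancy:]

history = [TimestampedInput(102, "jump"), TimestampedInput(103, "left"),
           TimestampedInput(104, "left"), TimestampedInput(105, "fire")]
packet = build_packet(history)   # carries ticks 103..105
```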

Also, this timestamp is what the server uses to tell the client it's too late or too early. If a timestamp shows up for a tick the server has already simulated, it will tell the client “you are X ticks behind,” and the client can increase its clock compensation delay. If the client sends a command that is way too early (so it would sit in the queue for many ticks before executing), the server can send a message that the client is too early, so it can reduce its lead.
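The server-side check might look like this (the message names and the max_lead threshold are my own invention, just to illustrate the two cases):

```python
# Classify an incoming input against the server's current tick.
def classify_input(input_tick: int, server_tick: int, max_lead: int = 20):
    if input_tick <= server_tick:
        # already simulated: tell the client how many ticks behind it is
        return ("too_late", server_tick - input_tick + 1)
    if input_tick > server_tick + max_lead:
        # would sit in the queue for a long time: client can shrink its offset
        return ("too_early", input_tick - (server_tick + max_lead))
    return ("ok", 0)
```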

It's important to make sure that this correction doesn't oscillate back and forth, though! It could get into a self-reinforcing feedback loop where it overshoots in each direction. Typically, you'll want to refuse to adjust delay downwards for at least a few seconds after you've adjusted it upwards, to avoid the worst of this.
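That cooldown can be sketched as follows (the 3-second window is an arbitrary choice, not a rule):

```python
# Raising the delay is always allowed; lowering it is refused until a
# cooldown has passed since the last increase, to damp oscillation.
class DelayController:
    COOLDOWN_TICKS = 180            # ~3 s at 60 Hz, tune to taste

    def __init__(self, delay: int):
        self.delay = delay
        self.last_increase_tick = -self.COOLDOWN_TICKS

    def adjust(self, tick: int, suggested: int) -> int:
        if suggested > self.delay:
            self.delay = suggested               # raising: always allowed
            self.last_increase_tick = tick
        elif (suggested < self.delay
              and tick - self.last_increase_tick >= self.COOLDOWN_TICKS):
            self.delay = suggested               # lowering: only after cooldown
        return self.delay
```

A request to lower the delay shortly after an increase is simply ignored; the same request a few seconds later goes through.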


Hm, working with input timestamps directly instead of ticks might work, thanks
