Hello GameDev Community,
this is the first time I've reached out to you for help, but I really need it because I'm completely stuck. None of the well-known tutorials (GafferGames, MPDev, ...) could explain to me what I'm doing wrong.
The issue I'm facing is this: I know in advance that my client-side prediction will always be wrong. I need prediction to simulate zero latency for the user, but in the end the prediction turns out wrong and I have to correct the player. So yes, I'm completely stuck. Something in my network design must be fundamentally broken for client-side prediction not to fit properly.
In the game engine I use (like almost every other engine), movement is expressed as speed * time. So let's say speed is something like 5 m/s: entities move a distance that depends on both speed and elapsed time. The time is very important, as this is a real-time 3D game.
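To be concrete, this is a minimal sketch of the per-frame integration I mean; the `Vector3` type and function names are just my illustration, not engine API:

```cpp
struct Vector3 { float x, y, z; };

// Move an entity by speed * elapsed time along a direction.
// Any error in the assumed elapsed time becomes a positional error.
void MoveEntity(Vector3& position, const Vector3& direction,
                float speedMetersPerSecond, float deltaTimeSeconds)
{
    position.x += direction.x * speedMetersPerSecond * deltaTimeSeconds;
    position.y += direction.y * speedMetersPerSecond * deltaTimeSeconds;
    position.z += direction.z * speedMetersPerSecond * deltaTimeSeconds;
}
```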
I've read most of the good tutorials out there on the internet about authoritative networking, but I can't wrap my head around client-side prediction and server reconciliation. Please don't misunderstand me: I know for sure that this is needed to simulate zero latency and to hide the RTT between an input and its visible effect. But:
Every input that triggers an action depending on the previous one (which is most of them) will be mispredicted.
Let me explain this with an example:
The server runs at 10 ticks per second, so there are 100 ms between snapshots. Let's say the client is currently interpolating between the 500 ms and 600 ms states. The RTT is 100 ms, so the one-way latency is 50 ms. So while the client interpolates between the 500 ms and 600 ms snapshots, the server is already at 650 ms.
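For reference, this is the snapshot interpolation I'm describing, sketched under the assumption that each snapshot carries a server timestamp and a position (names and the single-axis simplification are mine):

```cpp
#include <cstdint>

struct Snapshot {
    uint64_t serverTimeMs;  // server timestamp of this world state
    float    positionX;     // one axis only, for brevity
};

// Blend the rendered position between two received snapshots.
// renderTimeMs lags the newest snapshot so there is always a pair to blend.
float InterpolatePosition(const Snapshot& older, const Snapshot& newer,
                          uint64_t renderTimeMs)
{
    const float span = float(newer.serverTimeMs - older.serverTimeMs); // 100 ms at 10 Hz
    const float t = (renderTimeMs - older.serverTimeMs) / span;        // 0..1 blend factor
    return older.positionX + (newer.positionX - older.positionX) * t;
}
```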
Now the client sends the button press "W", which triggers the move-forward action. It arrives at the server at 700 ms, but the client already starts running (its local prediction) at what corresponds to server time 650 ms. The server sends back an ACK for this action stamped with server time 700 ms, because that is when the server marks the character as running. The client can then compare the 700 ms world state it will receive against its own 650 ms snapshot of its position to know whether the predicted state is correct.
Now at 700 ms the client sends the "SPRINT" action to the server, which triggers double speed. This time the input packet takes 100 ms instead of 50 ms to arrive, so the server sends an ACK stamped with 800 ms. To summarize:
Client -> 650 ms MoveForward -> Server -> ACK @ 700 ms (one-way 50 ms)
Client -> 700 ms Sprint -> Server -> ACK @ 800 ms (one-way 100 ms)
So in the client's predicted state we run for 50 ms and then sprint.
But in the server's state we run for 100 ms and then sprint.
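To make the mismatch concrete, here is a toy calculation of the divergence using the numbers above (the 5 m/s run speed comes from my earlier example; the 200 ms measurement window is my own illustration):

```cpp
#include <cstdio>

int main()
{
    const float runSpeed    = 5.0f;   // m/s, as in the example above
    const float sprintSpeed = 10.0f;  // double speed while sprinting

    // Client prediction: run for 50 ms, then sprint for the rest of a 200 ms window.
    const float clientPos = runSpeed * 0.050f + sprintSpeed * 0.150f; // 1.75 m

    // Server authority: run for 100 ms (the sprint arrived late), then sprint 100 ms.
    const float serverPos = runSpeed * 0.100f + sprintSpeed * 0.100f; // 1.50 m

    // The 0.25 m difference is exactly the misprediction that must be corrected.
    std::printf("client: %.2f m, server: %.2f m, error: %.2f m\n",
                clientPos, serverPos, clientPos - serverPos);
    return 0;
}
```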
This example is exaggerated. In a real-world scenario it would be more like a latency of 50 ms, then 55 ms, then 48 ms. The latency jitters constantly and is never the same twice. Knowing this, I'm certain that my predicted states will always be wrong and that I will have to correct them.
So what am I missing that keeps me from wrapping my head around this? Almost every authoritative game uses client-side prediction because the latency would otherwise be too noticeable. But it can't be that they constantly correct the predicted state; in fact, a good prediction system should need the fewest corrections possible.
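For context, this is my current understanding of the correction (reconciliation) step, sketched under the assumption that inputs carry client-side sequence numbers and that the client replays unacknowledged inputs on top of the authoritative state; all names here are illustrative:

```cpp
#include <cstdint>
#include <deque>

// One buffered input, stamped with a client-side sequence number.
struct PendingInput {
    uint32_t sequence;
    float    moveForward;       // simplified input: 1.0 while "W" is held
    bool     sprint;
    float    deltaTimeSeconds;  // how long this input was applied
};

struct PlayerState { float positionX = 0.0f; };

// Deterministic simulation step shared (conceptually) by client and server.
void ApplyInput(PlayerState& state, const PendingInput& input)
{
    const float speed = input.sprint ? 10.0f : 5.0f; // m/s, as in the example
    state.positionX += input.moveForward * speed * input.deltaTimeSeconds;
}

// On receiving an authoritative state acknowledging inputs up to
// lastProcessedSequence, rewind to it and re-predict everything newer.
PlayerState Reconcile(PlayerState authoritative,
                      uint32_t lastProcessedSequence,
                      std::deque<PendingInput>& pending)
{
    while (!pending.empty() && pending.front().sequence <= lastProcessedSequence)
        pending.pop_front();                  // these inputs are confirmed

    for (const PendingInput& input : pending) // replay the unacknowledged rest
        ApplyInput(authoritative, input);

    return authoritative; // this becomes the new predicted state
}
```

But even with this replay scheme, my example above suggests the server simulates each action for a different duration than I predicted, so the corrections never stop.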
So please help me see what I'm missing x)