This is actually much more confusing than I thought when I started to write this thread...
I mean, the update rate and the receiving rate are basically the same regardless of latency. If I send 30 packets a second and the latency is 2 seconds, the packets all arrive 2 seconds late, but they still all arrive at the 30-per-second rate... That makes the whole prediction thing different from how I'm used to thinking about it. Generally I just think in terms of a single event.
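To make sure I'm not fooling myself, here's a tiny sketch of what I mean (assuming a constant latency, no jitter and no loss):

```cpp
#include <cstdio>

int main() {
    const double sendRate = 30.0;          // packets per second
    const double sendInterval = 1.0 / sendRate;
    const double latency = 2.0;            // constant latency in seconds (no jitter assumed)

    double prevArrival = -1.0;
    for (int i = 0; i < 5; ++i) {
        double sendTime = i * sendInterval;
        double arrivalTime = sendTime + latency;   // every packet is just shifted by the latency
        if (prevArrival >= 0.0)
            std::printf("gap between arrivals: %.4f s\n", arrivalTime - prevArrival);
        prevArrival = arrivalTime;
    }
    // The gaps are still ~0.0333 s, so the receive rate is still 30/s;
    // everything is just 2 seconds old.
    return 0;
}
```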
The way latency influences prediction is the elastic effect caused by the 'delay and catch up' behaviour you get when predicting: either a slow start and then a catch-up (if the player goes from stopped to moving), or overshooting and snapping back (if they go from moving to stopped). The update rate, on the other hand, is just the granularity of the movement performed by the players: a low rate will make a player look like they move in straight lines, while a high rate will more accurately describe a zigzagging movement, if there is any. Is that right?
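To check my own understanding, this is roughly the kind of extrapolation I'm picturing (plain dead reckoning; all the names and numbers here are made up for illustration):

```cpp
#include <cstdio>

// Plain dead reckoning: keep extrapolating from the last known state.
struct RemoteState {
    double pos;        // last known position
    double vel;        // last known velocity
    double timestamp;  // local time when that state was received
};

double predictPosition(const RemoteState& last, double now) {
    return last.pos + last.vel * (now - last.timestamp);
}

int main() {
    RemoteState last{10.0, 5.0, 0.0};  // last packet says: moving at 5 units/s
    // If the player actually stopped right after that packet, we keep
    // predicting them forward until the next update arrives, then pull
    // them back: the "go beyond and back" effect.
    for (double now = 0.0; now <= 0.5001; now += 0.1)
        std::printf("t=%.1f  predicted pos=%.2f\n", now, predictPosition(last, now));
    return 0;
}
```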
But that's not even what I was thinking about...
Should one try to adjust update rates (packet sending) according to RTT measurements?
I understand that gameplay-wise, the higher the rate the better; bandwidth-wise, the minimum necessary the better (pretty obvious).
But sending fewer or more packets under different latencies doesn't change anything game-wise; it's the latency itself that changes things (worse prediction). So is it really just about softening a problem that MAY be caused by congestion? Because if congestion isn't the problem, you're lowering your rates for no reason, right?
Is it best practice to step on the brakes in the case of a bad connection?
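Just to make the question concrete, this is the kind of 'stepping on the brakes' I'm imagining: only throttling when the RTT trend suggests congestion, not just because RTT is high (thresholds and names are completely made up):

```cpp
#include <algorithm>
#include <cstdio>

// Back off the send rate only when RTT keeps climbing past its baseline
// (a hint of queues building up, i.e. congestion), not merely because RTT is high.
struct SendRateController {
    double sendRate = 30.0;    // packets per second
    double smoothedRtt = 0.0;  // exponentially smoothed RTT, seconds
    double baselineRtt = 0.0;  // lowest RTT seen, treated as the "uncongested" latency

    void onRttSample(double rtt) {
        smoothedRtt = (smoothedRtt == 0.0) ? rtt : 0.9 * smoothedRtt + 0.1 * rtt;
        baselineRtt = (baselineRtt == 0.0) ? rtt : std::min(baselineRtt, rtt);

        if (smoothedRtt > baselineRtt * 1.5)
            sendRate = std::max(10.0, sendRate * 0.75);   // back off under suspected congestion
        else
            sendRate = std::min(30.0, sendRate + 1.0);    // slowly recover otherwise
    }
};

int main() {
    SendRateController ctrl;

    // Stable 200 ms RTT: high latency, but no growth, so the rate stays up.
    for (int i = 0; i < 10; ++i) ctrl.onRttSample(0.200);
    std::printf("after stable RTT:  %.0f pkt/s\n", ctrl.sendRate);

    // RTT ramping up: looks like congestion, so the rate backs off.
    for (int i = 0; i < 10; ++i) ctrl.onRttSample(0.200 + i * 0.05);
    std::printf("after rising RTT:  %.0f pkt/s\n", ctrl.sendRate);
    return 0;
}
```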