Clock synchronization.
Hey, I'm having a problem with my network game that I think is a clock synchronization problem. Basically, it's a tank-battle game where 2 players each have a tank and they fight each other. Let's call them players A and B. It doesn't matter who's hosting; the problem is the same regardless.

When player A moves his tank in a straight line, it appears to jump forward every few seconds on B's computer. Inversely, when player B moves in a straight line, it appears to halt for a frame on player A's computer. The problem, I believe, is that player A's computer thinks a second is slightly shorter than what player B's computer thinks a second is, and thus player A processes more frames per one of B's seconds than player B does. In other words, I believe I have an issue of unsynchronized clocks. (Sound right?)

So my big question is... what do I do about this? Here's one thought I had: the server periodically sends out its clock time to the client. The client collects these and keeps track of them, then uses the past 10 or so samples to compute the rate at which the server's time is increasing compared to the rate at which the local client's time is increasing. The client then uses the resulting value to adjust the rate at which it performs game updates.

Does this sound plausible? It should work fine assuming lag is constant, but I'm concerned about how much it will deviate if the lag fluctuates a lot. Any advice?
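For what it's worth, the rate-estimation scheme described above could be sketched roughly like this: keep the last N (local_time, server_time) pairs and do a least-squares fit through them. The slope estimates how fast the server's clock runs relative to the local one; constant lag only shifts the intercept, so it cancels out, while jitter shows up as noise that the regression smooths over. Everything here (function name, sample format) is made up for illustration, not taken from the game's actual code:

```python
def estimate_clock_rate(samples):
    """Least-squares slope of server_time against local_time.

    samples: list of (local_time, server_time) tuples, e.g. the
    last 10 timestamp messages received from the server.
    Returns server seconds elapsed per local second (~1.0 if the
    clocks tick at the same rate).
    """
    n = len(samples)
    mean_l = sum(l for l, _ in samples) / n
    mean_s = sum(s for _, s in samples) / n
    num = sum((l - mean_l) * (s - mean_s) for l, s in samples)
    den = sum((l - mean_l) ** 2 for l, _ in samples)
    return num / den

# Example: server clock runs 1% fast, with a constant 50 ms lag.
# The constant lag does not affect the estimated rate.
samples = [(t, t * 1.01 + 0.05) for t in range(10)]
rate = estimate_clock_rate(samples)  # ~1.01
```

The client would then multiply its logical timestep by `rate` (or equivalently scale its update frequency) so both simulations advance at the same wall-clock speed.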
You should make sure that all units are in seconds (m/s, km/s, etc.).
Clocks should not become unsynced enough to generate any significant jump after a few seconds. A few thousand years, maybe, but a few seconds?
Your problem seems to be lag. One computer starts accelerating, then it tells the other one, then the other one starts accelerating, resulting in a mismatch.
The solution is to 1. have synced clocks (I can tell you how to do it, if you really want to), and 2. delay all actions until all players have received the data.
So basically, instead of saying "I am accelerating NOW", say "I will accelerate at x m/s² at time z", where z is further in the future than the worst-case lag for any of the clients.
This should do it, but it makes the clients seem more laggy... (waiting 200-500 ms to apply an action is noticeable, about 2-5 synced updates per second).
There are other ways... but I'm not very knowledgeable on this subject.
From,
Nice coder
Okay, I think I figured out what it is. It's not a clock synchronization problem, but it's not a lag problem either. I believe it has to do with how I use a fixed logical update rate in conjunction with a variable frame rate. When the frame rate is relatively low, the problem shows up often, but as it gets higher, it becomes much more infrequent.
Thanks anyway!
The method I employ is very much like what that article describes - fixed update rate with graphical interpolation between frames.
The problem (I'm speculating) is that it's not possible to guarantee that an update will occur EXACTLY every 100 ms, and the margin of error increases as the graphical frame rate decreases.
This is the only thing I can think of at the moment. If anyone can think of anything else (especially a solution!), let me know.
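For what it's worth, the usual way to make a fixed 100 ms logic rate exact despite a variable frame rate is a time accumulator: the simulation always advances in whole fixed steps, and the leftover fraction drives the interpolation. This is a generic sketch of that pattern (not the game's actual loop), using integer milliseconds to avoid floating-point drift in the accumulator:

```python
DT_MS = 100  # fixed 100 ms logic step

def run_frames(frame_times_ms, update, render):
    """Drive a fixed-timestep simulation from irregular frames.

    frame_times_ms: real elapsed time of each rendered frame.
    update(dt_ms) advances the logic; render(alpha) draws with
    alpha in [0, 1) blending previous state toward current state.
    """
    accumulator = 0
    for frame_dt in frame_times_ms:
        accumulator += frame_dt
        while accumulator >= DT_MS:   # run every whole step we owe
            update(DT_MS)
            accumulator -= DT_MS
        render(accumulator / DT_MS)   # leftover time -> blend factor

# Example: wildly irregular frames still yield exactly 10 updates
# per simulated second.
steps = []
run_frames([250, 50, 300], lambda dt: steps.append(dt), lambda a: None)
# 600 ms elapsed -> 6 updates of 100 ms each
```

The key point is that a slow frame doesn't make an update late; it just makes the loop run several updates in a row to catch up, so the simulated positions on both machines stay in step even when rendering doesn't.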
This topic is closed to new replies.