
Synchronizing clients with servers?

Started by March 07, 2002 01:07 PM
5 comments, last by Ironside
Does anyone have some sample code for synchronizing a time variable on the client and the server? Basically I'm looking for an implementation of NTP, but it doesn't have to be accurate to the nanosecond; + or - 5ms would be just fine, as clients are typically 50ms out of sync with the server at a minimum. I'm using UDP as my transport, and I know it's possible to synchronize time with an unreliable transport because NTP does it. Any help or reference to resources would be greatly appreciated.
A suggestion (which I thought of whilst reading your post)..

Before you send the sync message, first send about 10 messages, each requesting a ping, and time how long each message takes to return to you. Accumulate these times and divide by 20; you then have the average one-way trip time to the client in ms. Then you can set your timer on your server to 0, x ms after you send the message to your client. So, theoretically, at the time your client receives the message you will have set the counter on the server side to 0, and both should be within your +/- 5ms range.

Another thing: I would use TCP, personally, because I hate UDP, and also you don't want to lose the packet. If you're syncing at startup this would be fine, but if you're syncing constantly, UDP might be OK. You could probably reuse the lag time you worked out previously if you wanted to, or calculate it again using fewer messages to average; this would ensure that any lag spikes have minimal effect. One last point: if you're syncing constantly, don't go resetting the counters back to 0; instead set the client to the server's time.
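Something like this rough sketch of the averaging step, where sendPing() and waitForPong() are hypothetical placeholders for whatever your socket layer provides:

```cpp
#include <chrono>

// Hypothetical helpers supplied by your own networking layer.
void sendPing();      // fire off a small ping packet
void waitForPong();   // block until the matching reply arrives

using Clock = std::chrono::steady_clock;

// Average N round trips, then halve: the sum of N RTTs divided by 2N
// is the average one-way trip time in milliseconds.
double estimateOneWayMs(int samples = 10)
{
    double totalRttMs = 0.0;
    for (int i = 0; i < samples; ++i) {
        auto start = Clock::now();
        sendPing();
        waitForPong();
        totalRttMs +=
            std::chrono::duration<double, std::milli>(Clock::now() - start).count();
    }
    return totalRttMs / (2.0 * samples);
}
```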

I hope at least some of that made sense, or if not, sparked off a better idea.

-= DarkStar =-
I once did the averaging thing with ping times.

I found it was better to send a lot of packets and just use the packet with the shortest total travel time. That should get you close enough, unless there is something wrong with my thinking.
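In sketch form, assuming you have already collected the round trip samples somewhere:

```cpp
#include <algorithm>
#include <vector>

// Keep only the fastest sample: the shortest round trip is the one that
// suffered the least queuing delay, so it bounds the true latency best.
double bestRttMs(const std::vector<double>& rttSamplesMs)
{
    // Assumes at least one sample was collected.
    return *std::min_element(rttSamplesMs.begin(), rttSamplesMs.end());
}
```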

Then in your code, watch out for clients sending events to other clients for a time that hasn't actually taken place yet. That happened in Battlezone 1: I would see another player's tank fire, and the shots would hit it in its back end because the time was bad.

Also, NEVER assume the ping in one direction is the same as in the other! That's one reason you can still get bad client-to-client times no matter how much you approximate.

If you want to do further testing, try writing an app that actually simulates the internet on a single machine. Allocate a list of client objects with average pings to each other and connection quality (simulating lost packets and travel time fluctuation), and a linked list that delivers the packets to their destinations. It turns out to be a handy tool to test message processing logic, too!
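A minimal sketch of that simulator idea; the names here are illustrative, and it uses a priority queue ordered by delivery time rather than a plain linked list:

```cpp
#include <cstdlib>
#include <queue>
#include <string>
#include <vector>

struct Packet {
    int         dest;       // index of the receiving fake client
    double      deliverAt;  // simulated time at which the packet arrives
    std::string payload;
};

struct Link {
    double avgPingMs;  // average one-way travel time
    double jitterMs;   // random fluctuation added per packet
    double lossRate;   // fraction of packets silently dropped
};

struct EarliestFirst {
    bool operator()(const Packet& a, const Packet& b) const {
        return a.deliverAt > b.deliverAt;  // min-heap on delivery time
    }
};

class FakeInternet {
public:
    // Queue a packet with simulated loss, latency, and jitter.
    void send(const Link& link, int dest, std::string payload, double now) {
        if (std::rand() / double(RAND_MAX) < link.lossRate)
            return;  // the packet "got lost"
        double jitter = (std::rand() / double(RAND_MAX)) * link.jitterMs;
        inFlight.push({dest, now + link.avgPingMs + jitter, std::move(payload)});
    }

    // Pop every packet whose delivery time has passed; call each sim step.
    std::vector<Packet> deliverUpTo(double now) {
        std::vector<Packet> arrived;
        while (!inFlight.empty() && inFlight.top().deliverAt <= now) {
            arrived.push_back(inFlight.top());
            inFlight.pop();
        }
        return arrived;
    }

private:
    std::priority_queue<Packet, std::vector<Packet>, EarliestFirst> inFlight;
};
```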

- Waverider


Edited by - Waverider on March 7, 2002 3:13:37 PM
It's not what you're taught, it's what you learn.
Check http://www.codewhore.com/howto1.html

HTH
www.persistentrealities.com for Inline ASM for VB, VB Fibre, and other nice code samples in C++, PHP, ASP, etc.
Play Yet Another Laser Game!
Wow, that's exactly what I needed. Thanks.

I had already come up with a scheme on my own that turned out to be remarkably similar (sketched in code after the steps).

1. Client sends its current time.
2. Server responds with its local time.
3. Client calculates the round trip time.
4. Client estimates the time to send a packet to the server = .5 * round trip.
5. Client sends a guess of what the server's time will be.
6. Server compares the guess to the actual time and can now fairly accurately predict how long it took to send packets and what the delta between client and server is.
7. Server responds with the delta.
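In rough code, with the message types and send/receive helpers as hypothetical stand-ins for your UDP layer, the exchange looks something like:

```cpp
#include <cstdint>

// Hypothetical message types and helpers standing in for your UDP layer.
struct TimeReply { uint32_t serverTime; };
struct TimeGuess { uint32_t guess; };

uint32_t clientTimeMs();               // client's local clock
uint32_t serverTimeMs();               // server's local clock
void sendTimeRequest();                // step 1
TimeReply awaitTimeReply();            // step 2
void sendTimeGuess(const TimeGuess&);  // step 5
void sendTimeDelta(int32_t);           // step 7

// Steps 1-5, on the client:
void clientSync()
{
    uint32_t t0 = clientTimeMs();
    sendTimeRequest();                           // 1: client sends its time
    TimeReply reply = awaitTimeReply();          // 2: server's local time
    uint32_t rtt = clientTimeMs() - t0;          // 3: round trip time
    uint32_t oneWay = rtt / 2;                   // 4: assume a symmetric path
    sendTimeGuess({reply.serverTime + oneWay});  // 5: guess arrival-time clock
}

// Steps 6-7, on the server:
void onTimeGuess(const TimeGuess& msg)
{
    int32_t delta = int32_t(serverTimeMs() - msg.guess);  // 6: compare to actual
    sendTimeDelta(delta);                                 // 7: report the delta
}
```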
By the way, just to note: if this is for a game environment, you should not need to send actual times; everything should be done in game ticks, so framerate will not affect the data sent back and forth. Look into fixed time step game loops. Basically, a fixed time step rips your game logic from your framerate, so ALL machines on tick x will have all the objects in the same location (assuming the data is accurate and up to date at the time). Again, this is just an FYI, and may not have much to do with what you are trying to do with syncing the client and server.
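A bare-bones fixed time step loop of the kind described above might look like this (running(), updateGameLogic(), and render() are placeholders for your own game):

```cpp
#include <chrono>
#include <cstdint>

// Hypothetical hooks into the rest of the game.
bool running();                       // true until the game quits
void updateGameLogic(uint64_t tick);  // advance the simulation one tick
void render();                        // draw the current state

using Clock = std::chrono::steady_clock;

void runGameLoop()
{
    const std::chrono::milliseconds tickLength(50);  // 20 logic ticks per second
    auto nextTick = Clock::now();
    uint64_t tick = 0;

    while (running()) {
        // Run as many fixed ticks as wall time demands; rendering speed
        // never changes how far the simulation advances.
        while (Clock::now() >= nextTick) {
            updateGameLogic(tick++);
            nextTick += tickLength;
        }
        render();
    }
}
```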
The problem is that the client and the server have to agree on the game tick. Say the server sends a packet that says a projectile was at position x,y and moving with x velocity and y velocity at time 100 on the server. If the client receives the packet at time 102 (server tick time) but the client's tick is 150, the client is going to render the projectile way the heck on the other side of the map, because the client will think the projectile has been traveling at that velocity for 50 ticks (150 - 100), when on the server only 2 ticks have transpired.

So basically it's OK if the client's ticks are at 150 and the server's are at 102; the client just needs to know the delta between the two clocks, so that when it receives a packet from the server saying "Event occurred at time t" it knows to add or subtract a certain amount from its current tick to find out when the event actually occurred in client time.
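In other words, a tiny sketch of applying that delta, using the numbers from the example above:

```cpp
#include <cstdint>

// Measured during the sync handshake: clientTick - serverTick.
const int64_t clockDelta = 150 - 102;  // = 48 in the example above

// Translate a server-stamped event time into the client's tick domain.
int64_t serverToClientTick(int64_t serverTick)
{
    return serverTick + clockDelta;
}

// The projectile stamped at server tick 100 maps to client tick 148, so at
// client tick 150 it has only been flying for 2 ticks, just like on the server.
```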

This topic is closed to new replies.
