
Synchronized time.

Started by May 02, 2005 06:12 AM
5 comments, last by Hexxagonal 19 years, 9 months ago
So, I have my super fast, low latency UDP client/server code up and running. Now I need to have a synchronized time among all clients and the server. Does anyone have a good strategy for how to do this?

Basically I need to show an "animation" on several screens that are located in the same room. I'd reckon that if the animation was out of sync on the different screens it would be very noticeable, right? The animation contains several swift camera cuts, going from bright to dark areas, and I'd like the screens to "flicker" synchronously. The animation runs at 60 fps, so the timer divergence should be somewhere around 1/120 of a second (~8 ms), or? Is this possible, and how?

I was thinking that the server pings the client, takes its current time, adds half of the ping and sends this time back to the client. The client then adjusts its current time by a fraction of the difference, clientTime += clamp(serverTime - clientTime, -0.01f, 0.01f), to avoid visible "jumping" of the time. Is this a good strategy, or? Any help?
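For what it's worth, a minimal sketch of that clamped-adjustment idea (assuming the client already has the server's time-plus-half-ping value in hand; the names here are made up):

#include <algorithm>

// Sketch only. serverTime is assumed to be the server's clock plus half the
// measured round-trip time; clientClock is the client's local game clock
// (both in seconds).
void onTimeSync(double serverTime, double& clientClock)
{
    // Nudge the local clock toward the server's estimate, but by no more
    // than 10 ms per message, so the animation never visibly jumps.
    double diff = serverTime - clientClock;
    clientClock += std::max(-0.01, std::min(0.01, diff));
}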
Time synchronisation is not an easy topic.
Since the delays during data transmission are not necessarily constant when using IP, you cannot simply calculate the offset back from a single measurement.
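For reference, NTP-style schemes timestamp both directions of a round trip and assume the one-way delays are roughly symmetric; a minimal sketch of the classic offset estimate (the function names are mine, not from any particular library):

// t0: client send time, t1: server receive time,
// t2: server send time, t3: client receive time (all in seconds).
// A positive offset means the server clock is ahead of the client clock.
double estimateOffset(double t0, double t1, double t2, double t3)
{
    return ((t1 - t0) + (t2 - t3)) * 0.5;
}

double estimateRoundTripDelay(double t0, double t1, double t2, double t3)
{
    return (t3 - t0) - (t2 - t1);
}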

Maybe you should read on those pages:
http://www.ntp.org/
http://homepage.ntlworld.com/robin.d.h.walker/cmtips/timesync.html
http://www.horology.com/hs-synch.html

One simple idea would be to let the times converge to each other. That's similar to a smoothing filter in image processing: in each step you calculate the average of the times and set that average as the new time. In the end they will converge to a global time. Note that you will have to take into account that time goes on while you do this.
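A toy sketch of that averaging idea, assuming every node periodically receives some other node's current time (the type and names are made up):

// Whenever a node hears another node's clock it moves halfway toward it.
// Repeated across all nodes, the clocks converge on a common value.
// "Time goes on" is handled by keeping an offset from the local clock
// instead of storing a frozen absolute timestamp.
struct SyncedClock
{
    double offset = 0.0;                          // correction applied to the local clock

    double now(double localTime) const { return localTime + offset; }

    void onPeerTime(double peerTime, double localTime)
    {
        double mine    = now(localTime);
        double average = 0.5 * (mine + peerTime); // average of the two clocks
        offset += average - mine;                 // move halfway toward the peer
    }
};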
I don't like the idea of letting the server time change.
My server is my God :)

Since the computers are on a closed LAN it might not be that bad.
I'll simply have to try it out.
But I'd like to use my code for internet applications as well, and a synchronized time is always a nice thing (especially since I like to do a lot of deterministic stuff).
Another idea:
1 When a client connects, measure the average ping for some time.
2 Server: send initTime(serverTime + pingTime * 0.5)
3 Client: receive serverTime, clientTime = serverTime
// serverTime and clientTime are now quite similar, but not good enough yet.
4 Server: send getTimeDiff(serverTime)
5 Client: receive serverTime, send timeDiff(clientTime - serverTime)
6 Server: receive timeDiff, send setNewTime(serverTime + timeDiff)
7 Client: receive serverTime, clientTime += clamp(serverTime - clientTime, -0.01, 0.01)
8 Client: send timeDiff(clientTime - serverTime)
9 repeat from 6

After a few iterations things should stabilize, or?
Basically the client tells the server what its estimate of the server->client lag is.
The server sends an update message based on this estimate, the client takes a new guess (hopefully slightly more accurate), and then it loops.
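A rough sketch of steps 6-8, just to make the loop concrete (the message plumbing and clock helpers are assumed to exist elsewhere; all names are made up):

#include <algorithm>

double serverClockNow();                     // assumed: server's local clock, seconds
void   sendToClient_setNewTime(double t);    // assumed: server -> client message
void   sendToServer_timeDiff(double d);      // assumed: client -> server message
extern double clientClock;                   // assumed: client's local game clock

// Step 6. timeDiff is the client's last (clientTime - serverTime), which,
// once the clocks are roughly aligned, approximates the one-way lag.
void serverOnTimeDiff(double timeDiff)
{
    sendToClient_setNewTime(serverClockNow() + timeDiff);
}

// Steps 7-8. Adjust in small clamped steps, then report the remaining error
// so the server can refine its next estimate (step 9 loops back to 6).
void clientOnSetNewTime(double compensatedServerTime)
{
    double diff = compensatedServerTime - clientClock;
    clientClock += std::max(-0.01, std::min(0.01, diff));
    sendToServer_timeDiff(clientClock - compensatedServerTime);
}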
If you're on a LAN, you have good control over all the participants. Thus, you can have one machine be "time keeper" and send out commands on a broadcast port to all the others, about what frame you're on. The others would just read this port, and update their clocks as appropriate (probably updating a baseline for the clocks, and using an internal timer to advance forward from that).

Again, if you're on a LAN, the latencies will be so low that you don't need to worry about more than that. Instead, you should worry about things like getting all your different display monitors gen-locked...
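A rough sketch of that broadcast idea with plain BSD sockets (error handling omitted, the port number is arbitrary):

#include <cstring>
#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <unistd.h>

// Time keeper: broadcast the current frame number to everyone on the LAN.
void broadcastFrameNumber(unsigned int frame)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    int yes = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes));

    sockaddr_in addr;
    std::memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port = htons(40123);                        // arbitrary port
    addr.sin_addr.s_addr = inet_addr("255.255.255.255"); // LAN broadcast

    unsigned int netFrame = htonl(frame);
    sendto(sock, &netFrame, sizeof(netFrame), 0,
           reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    close(sock);
}

Listeners would read that port, set their frame baseline from whatever arrives, and advance with a local timer between broadcasts.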
enum Bool { True, False, FileNotFound };
I did some tests and got some really good results (using a slightly modified version of the above). However, all my tests were done in the same program (to be able to compare server and client times) and the internet problems were simulated.
Even with 25% packet loss, 25% packet corruption (data and/or length, which essentially results in a dropped packet), packet reordering and packet delays (simulating a highly varied ping), plus a different sleep time before network processing every frame (and different for client/server), I didn't see a time divergence of more than +/-0.7 ms. The "setup" phase takes about 10 seconds under these severe conditions (about 2 seconds when no nastiness is enabled). After initiation I send an adjust packet every 2 seconds (it might not get through though!). I ran this setup for about an hour and the worst time difference was 0.7 ms. Since I'm going to use a LAN (with heavy security that might slow things down), I feel that this will work fine, since my theoretical maximum accepted divergence is around 8 ms. So even if it performs 10 times worse than in the test case, it's good enough.
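For illustration, the kind of "nastiness" layer described above could look roughly like this (the percentages mirror the test; everything else is made up):

#include <cstdlib>
#include <vector>

struct FakePacket
{
    std::vector<unsigned char> data;
    double deliverAt;                 // simulated delivery time, seconds
};

bool chance(double p) { return std::rand() / (double)RAND_MAX < p; }

// Returns false if the packet is "lost"; otherwise possibly corrupts it
// (the receiver's validation would then drop it) and assigns a random delay.
bool mangle(FakePacket& p, double now)
{
    if (chance(0.25))                                    // 25% packet loss
        return false;

    if (chance(0.25) && !p.data.empty())                 // 25% corruption
        p.data[std::rand() % p.data.size()] ^= 0xFF;     // flip a random byte

    p.deliverAt = now + 0.010 + 0.5 * (std::rand() / (double)RAND_MAX); // varied "ping"
    return true;                                         // caller queues it until deliverAt
}

Reordering then falls out naturally if the receiving side drains its queue by deliverAt rather than arrival order.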

Now a second problem:
How do you compare times between two different machines to verify how synched they are?
I was actually thinking of using my camera to take a picture of two monitors with the time rendered on screen.
This will "only" capture the time up to the refresh rate of the monitors (and/or the shutter speed).
Any other suggestions?

(As you mentioned, this is getting theoretical now, but I'm still interested; my immediate problem is solved.)

Edit: 10x worse is not acceptable, since this time (0.7 ms) was measured between the server and a client. Assume that client A's timer is +0.7 ms and client B's timer is -0.7 ms; that's a difference of 1.4 ms. But around 5 times worse would still be good enough.
Game Programming Gems 3 (I think it was) had some information about client-server time synchronization. It should be at any bookstore; it's a short read (30 minutes to an hour at most) and very friendly. I believe it was the first article in the networking section.

