I am working on making my game multiplayer, and I do not have a lot of experience with network programming. I was wondering whether there are known issues to avoid or specific practices to use when networking a game. I am using OpenGL/SDL, and it is a lot of fun figuring out how to do it myself; however, I can see that it can get involved, so it might be a good idea to seek some guidance.

To be a little more specific: I now have threads that manage packet transmission. How should I control how often they iterate? For example, should the client listen more slowly than the server emits packets, or vice versa? And is it a bad idea to have a thread that is not "frame" controlled (one that just iterates without any kind of delay)?

Also, I'm having some kind of visual glitching that seems to get worse with more packet traffic, and I'm wondering what is causing that. Any help or book/guide suggestions are appreciated. (I do know of Beej's guide, so I do not need to be pointed there.)
Thanks
(NOTE: When I mention threads and iteration control, I'm speaking specifically about packet transmission speed: how fast the server should send packets and how fast the client should listen for them.)
I assume that you're doing some kind of action game, with continuous simulation. If it's a turn-based game, the rules are somewhat different.
A typical game runs physics at a fixed tick rate (30-120 Hz, 60 Hz typical), runs graphics as fast as it can (with interpolation/extrapolation from physics, although 60 Hz frame-locked is common), and runs networking at yet another fixed tick rate (10-50 Hz, 20 Hz typical). All messages that need to be sent to a specific client from the server, or to the server from a client, are put in a queue, and that queue is drained and a single packet sent each time the network tick comes around.

If you send so much data that the queue or network stack backs up, the client can't keep up, and you need to either drop that client or optimize your network traffic so that you send less data. If you send more than 10 kB/second, you're likely doing it wrong (many action FPS games get by with about 2 kB/second per client).
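For concreteness, here is a minimal sketch of one loop running those three rates in a single thread. SDL_GetTicks() is real SDL; running(), step_physics(), flush_network_queue(), and render() are hypothetical placeholders for your own code, not anything from SDL:

```cpp
#include <SDL.h>

static const Uint32 PHYSICS_DT_MS = 1000 / 60; // 60 Hz simulation
static const Uint32 NETWORK_DT_MS = 1000 / 20; // 20 Hz network tick

bool running();              // placeholder: your quit condition
void step_physics(float dt); // placeholder: advance the simulation
void flush_network_queue();  // placeholder: drain queue, one send() per client
void render();               // placeholder: draw a frame

void game_loop() {
    Uint32 nextPhysics = SDL_GetTicks();
    Uint32 nextNetwork = nextPhysics;

    while (running()) {
        Uint32 now = SDL_GetTicks();

        // Fixed-rate physics: step repeatedly to catch up if a frame ran long.
        while (now >= nextPhysics) {
            step_physics(PHYSICS_DT_MS / 1000.0f);
            nextPhysics += PHYSICS_DT_MS;
        }

        // Fixed-rate networking: drain the outgoing queue into a
        // single packet each time this tick fires.
        if (now >= nextNetwork) {
            flush_network_queue();
            nextNetwork += NETWORK_DT_MS;
        }

        // Graphics run as often as they can, interpolating between
        // the last two physics states.
        render();
    }
}
```

The point of the sketch is that the send rate is controlled by the sender's own network tick, not by how fast the other side listens; the receiver just reads whatever has arrived each iteration.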
Other than that, there are two main approaches to networking: lock-step simulation (everyone simulates the same thing) or predictive display with server tie-breaking of differences. Lock-step is very robust once you get it working and uses very little network bandwidth, but it introduces a lag between command and effect that you need to hide somehow. It also de-syncs if you have any kind of consistency bug. Predictive display lets you show an action immediately on the local client and isn't as sensitive to de-sync, but it ends up with lag-induced problems like "how could you shoot me when I just dove behind this cover?"
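To make the lock-step idea concrete, here is a sketch of the kind of message it sends; the struct and field names are my own illustration, not a standard format:

```cpp
#include <cstdint>

// In lock-step, only player inputs cross the wire; every peer feeds
// the same inputs into the same deterministic simulation, so full
// game state never needs to be sent. (Illustrative names only.)
struct InputCmd {
    uint32_t tick;     // simulation tick this input applies to
    uint8_t  playerId; // which player issued the input
    uint8_t  buttons;  // bitmask of buttons held that tick
};

// A peer may only advance to tick N once it has every player's
// InputCmd for N; that waiting is the command-to-effect lag you
// have to hide. Any nondeterminism (float rounding, iteration
// order) makes the simulations silently diverge, which is the
// "consistency bug" de-sync mentioned above.
```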
All messages that need to be sent to a specific client from the server, or to the server from a client, are put in a queue, and that queue is drained and a single packet sent each time the network tick comes around.
Do you mean it is good practice to put network data on a queue or are you referring to what is happening behind the scenes on the network layers?
It also de-syncs if you have any kind of consistency bug.
Could you explain what you mean by this?
Do you mean it is good practice to put network data on a queue or are you referring to what is happening behind the scenes on the network layers?
I mean you should explicitly keep your own queue. Once you formulate and send a packet, it should be a single send() on the socket, and it should contain the freshest info available about each of the entities that go into that packet. Thus, when your game decides "the hitpoints value of X has changed," it shouldn't enqueue "hitpoints of X is Y," but instead enqueue "send a hitpoints update for X." That way, if the value changes again before the next network tick, the latest value is what gets sent when the queue is drained.
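Here is a sketch of that pattern; all the names (Packet, current_hitpoints(), send_packet(), and so on) are hypothetical placeholders, not an existing API:

```cpp
#include <cstdint>
#include <unordered_set>

struct Packet { void write_hitpoints(uint32_t id, int hp); }; // placeholder
int  current_hitpoints(uint32_t id); // placeholder: read the live value
void send_packet(const Packet& p);   // placeholder: one send() on the socket

// The queue records WHICH entities changed, not their values.
std::unordered_set<uint32_t> dirtyHitpoints;

void on_hitpoints_changed(uint32_t entityId) {
    // Inserting an already-dirty id is a no-op, so an entity that
    // changes five times between network ticks is still sent once.
    dirtyHitpoints.insert(entityId);
}

// Called once per network tick.
void flush_network_queue() {
    Packet packet;
    for (uint32_t id : dirtyHitpoints) {
        // Read the value at send time, so the packet carries the
        // freshest state rather than a stale snapshot.
        packet.write_hitpoints(id, current_hitpoints(id));
    }
    dirtyHitpoints.clear();
    send_packet(packet);
}
```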
Could you explain what you mean by this?
I could, but this article does a fantabulous job on its own.