Hey,
I'm currently working on a school assignment which is briefly explained in a thread I started here:
http://www.gamedev.net/topic/676440-multiplayer-network-design/
But to recap, it's a simple top-down game. Four players each control a "ship"; they're divided into two teams and each player spawns in a separate corner. There is also a ball, which both teams try to push into the screen edge behind the opposing team's spawn to score. It's "physics based", meaning collisions cause bounces, accelerations etc. - Very simple game.
However, the school assignment is to convert this four-player local game into an online multiplayer game. So I have the game and physics already written and working locally, but I need to implement online multiplayer. It's written in Java; no network libraries etc. are allowed. The main parts of the assignment are to:
1. Fight lag
2. Fight cheating
So far I have a server which sends a full world snapshot at a given rate. Since the world is very small and I'm under a bit of a time constraint, I'm not going to bother with delta snapshots but instead just keep sending the whole world.
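To give an idea of what the server sends, the snapshot is roughly something like this (a simplified sketch, not the exact classes or field names from my code):

// Rough sketch of the full-world snapshot the server broadcasts at a fixed rate.
// Field names are placeholders.
class WorldSnapshot implements java.io.Serializable {
    int serverFrame;                       // the frame this snapshot was taken on
    float ballX, ballY, ballVelX, ballVelY;
    float[] shipX    = new float[4];       // one entry per player
    float[] shipY    = new float[4];
    float[] shipVelX = new float[4];
    float[] shipVelY = new float[4];
}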
The clients at the moment send their input every frame, that is, which buttons they're pressing on that frame. I know this isn't optimal, but I need to get things going before I start to optimize. The server reacts to the input once it receives it, which on the whole just gives clients delayed input, but everything stays perfectly synced.
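The per-frame input message is more or less just the button state (again a simplified sketch; the frame number is what step 1 below adds):

// Rough sketch of what a client sends every frame: the held buttons
// plus the frame number they belong to. Names are placeholders.
class InputMessage implements java.io.Serializable {
    int frame;                     // client frame this input belongs to
    boolean up, down, left, right;
}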
However, now that I've got that working I'm trying to implement client-side prediction, and it's not working very well at all. The client-side prediction at the moment works as follows:
1. The client stores the input for each frame and sends it along with the frame number.
2. When the client receives a world update from the server, it sets the client's m_frame variable to what the update says (provided that it's the most recent update; otherwise the client ignores it).
The client then checks the round-trip time to the server and updates the local world (latency / time_per_frame) times, incrementing the m_frame variable for each update. For every update, the client also checks the input buffer mentioned in step 1 and replays the input for each frame being predicted. (There's a rough sketch of this below the list.)
3. At this point I'm thinking that the client should be processing ahead in time. So if the client now sends, say, input for frame 1000 and the server is currently at frame 960, the input should arrive in time for the server to process frame 1000.
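Roughly, the reconciliation in steps 2-3 currently looks like this (a simplified sketch, assuming a deterministic world simulation; World, applyLocalInput, loadFrom and the other names are placeholders, not my exact code):

import java.util.HashMap;
import java.util.Map;

class PredictionClient {
    World world;                          // deterministic local simulation
    Map<Integer, InputMessage> inputBuffer = new HashMap<>(); // frame -> input sent (step 1)
    int m_frame;                          // current local frame
    int lastSnapshotFrame = -1;           // newest server frame seen so far
    long latencyMs;                       // measured round-trip time to the server
    static final float MS_PER_FRAME = 1000f / 60f;

    // Called whenever a world snapshot arrives from the server.
    void onSnapshot(WorldSnapshot snap) {
        if (snap.serverFrame <= lastSnapshotFrame) return;  // ignore out-of-date updates
        lastSnapshotFrame = snap.serverFrame;

        world.loadFrom(snap);             // snap back to the server's authoritative state
        m_frame = snap.serverFrame;

        // Predict forward by roughly one round-trip so our inputs arrive in time,
        // replaying the locally stored input for every frame being predicted.
        int framesAhead = (int) (latencyMs / MS_PER_FRAME);
        for (int i = 0; i < framesAhead; i++) {
            m_frame++;
            InputMessage stored = inputBuffer.get(m_frame); // input recorded in step 1
            if (stored != null) {
                world.applyLocalInput(stored);              // re-apply our own input for that frame
            }
            world.step();                                   // advance the deterministic simulation one frame
        }
    }
}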
Now my problem is that the local player is jittering, a whole lot. When I'm testing I'm running everything locally but with simulated latency, so packet loss shouldn't be an issue at this point. Maybe I'm wrong, but if there is only a single player connected, shouldn't there be no visible artifacts at all, no matter how much latency? Since the server is only playing back the input given by that player on a deterministic game.
Any suggestions on how to proceed from here?