
Interpolation Issue

Started by April 03, 2015 03:23 PM
6 comments, last by hplus0603 9 years, 7 months ago

I am tweaking my client code to interpolate movements between the server position and the client's current position. This seems to work "ok", but I still notice a little bit of jitter while the NPC is being interpolated to his destination. Hopefully someone can inspect this code and see what I am doing wrong. I ported this from another post I found: http://www.gamedev.net/topic/639568-network-interpolation-help/.

To test this I set up an NPC on the server side to move 0.3f every ~16 msec, but I only send the position packet every ~160 msec. Perhaps this test case is wrong or inaccurate? Note: the "Update" method is called manually from an entity manager class every frame.


using UnityEngine;
using System.Collections;
using SharedNetworkLibrary.Types;

public class EntityType
{
	public int ID { get; private set; }

	private GameObject _gameObject;

	private Vector3 _displayPosition;
	
	private Vector3 _latestPosition;
	
	private float _latestPositionTime;
	
	private string _currentAnimation = "idle";

	private Vector3 _previousPosition;
	
	private bool _isMoving = false;

	private const float _interpolationPeriod = 0.3f;

	private Animation _animation;


	public EntityType(int id, GameObject gameObject)
	{
		ID = id;
		_gameObject = gameObject;
		_animation = _gameObject.GetComponentInChildren<Animation>();
	}

	// Update is called once per frame
	public void Update ()
	{
		if (_isMoving == true)
		{
			float t = 1.0f - ((_latestPositionTime - Time.time) / _interpolationPeriod);

			_displayPosition = Vector3.Lerp(_previousPosition, _latestPosition, t);
		}
		else
		{
			_displayPosition = _latestPosition;
		}

		_gameObject.transform.position = _displayPosition;

	}
	
	public void UpdateAnimation(string animation)
	{
		_currentAnimation = animation;
	}

	public void UpdatePosition(EntityStateType entityType)
	{
		if (entityType.IsMoving == true)
		{
			_isMoving = true;

			_latestPosition = new Vector3 (entityType.X, entityType.Y, entityType.Z);
			_latestPositionTime = Time.time + _interpolationPeriod;
			_previousPosition = _displayPosition;

			if(_animation.isPlaying == false)
			{
				if(_currentAnimation.Length > 0)
				{
					_animation.Play (_currentAnimation);
				}
			}
		}
		else
		{
			_isMoving = false;
		}
	}
}

Have you checked out http://www.mindcontrol.org/~hplus/epic/ ?

In general, to avoid jitter, you must either accept a guess about the future that will sometimes be wrong, or interpolate between two known-old positions. In this case that means: when you receive the position that is 160 ms old, your entity will just have reached the position that is 320 ms old, and will spend the next 160 ms moving toward the 160 ms old position.
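That buffered-interpolation idea can be sketched in a language-agnostic way. Here is a minimal Python version; the snapshot-buffer shape and the `render_time` convention (current time minus one update interval, so we always blend between two positions we have actually received) are illustrative, not from the original code:

```python
import bisect

def interpolate_snapshots(snapshots, render_time):
    """Interpolate a position from a buffer of (timestamp, position) snapshots.

    `snapshots` must be sorted by timestamp. `render_time` is deliberately
    in the past (e.g. now minus one send interval), so we interpolate
    between two known-old snapshots instead of guessing the future.
    """
    times = [t for t, _ in snapshots]
    i = bisect.bisect_right(times, render_time)
    if i == 0:                  # render_time is before the first snapshot
        return snapshots[0][1]
    if i == len(snapshots):     # no newer snapshot yet: hold the last one
        return snapshots[-1][1]
    (t0, p0), (t1, p1) = snapshots[i - 1], snapshots[i]
    alpha = (render_time - t0) / (t1 - t0)
    return tuple(a + (b - a) * alpha for a, b in zip(p0, p1))
```

The key design point is that the entity always renders slightly in the past, trading a fixed, constant delay for smoothness.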
enum Bool { True, False, FileNotFound };

Personally, I avoid jitter by measuring the deviation from the known rate of incoming packets.

I.e.: if the server is sending 20 updates per second, I know that each incoming packet *should* arrive 50 ms apart. So I record the arrival time of each update packet, compare it with the previous one to determine how late latency jitter has made it, and use that to extrapolate incoming object positions from their velocity.

This keeps irregular latency from making positional updates look like objects jerkily accelerating and decelerating. Dropped packets must be accounted for as well, so a lost update doesn't look like one giant acceleration (it appears as a really, really late packet, when really the previous one never arrived). This can be done by measuring in multiples of the known update spacing (50 ms, for instance).
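A minimal sketch of that measurement, assuming a 20 Hz send rate; the class and function names are illustrative, not from any posted code:

```python
SEND_INTERVAL = 0.05  # server sends 20 updates/sec (assumed)

class JitterTracker:
    """Track how late each update arrives relative to the expected cadence."""
    def __init__(self, interval=SEND_INTERVAL):
        self.interval = interval
        self.last_arrival = None

    def lateness(self, arrival_time):
        """Seconds late (negative = early) vs. the expected spacing.
        Whole missing intervals are treated as dropped packets, not latency,
        so one lost update does not read as a huge acceleration."""
        if self.last_arrival is None:
            self.last_arrival = arrival_time
            return 0.0
        gap = arrival_time - self.last_arrival
        self.last_arrival = arrival_time
        dropped = max(0, round(gap / self.interval) - 1)
        return gap - (1 + dropped) * self.interval

def extrapolate(position, velocity, seconds_ahead):
    """Dead-reckon a position forward using the last known velocity."""
    return tuple(p + v * seconds_ahead for p, v in zip(position, velocity))
```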

Thanks for the replies, I will check out the linked article and code. Is 160 msec too much time between update packets? This game is not an FPS, so I figured a slightly longer delay on position updates would be OK, and it lets me balance the load of sending state data by handling different zones on different iterations. For example: on iteration 0 I send out zone001 states, on the next server cycle (~16 msec) zone002, then zone003, and so on. Perhaps there is a better approach, I am not sure; I am open to suggestions!
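The zone-staggering scheme described above is a simple round-robin over the server's tick counter. A tiny sketch (zone names and the tick argument are illustrative):

```python
def zone_for_tick(zones, tick):
    """Pick which zone's entity states to broadcast on this server tick.
    With N zones and a ~16 ms tick, each zone is refreshed every N * 16 ms."""
    return zones[tick % len(zones)]
```

With three zones this yields one state broadcast per zone every ~48 ms, well under the 160 ms budget discussed in the thread.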

160 ms is just fine so long as your scheme for dealing with sparse information is sound. 160 ms is 6.25 Hz, which is plenty if you incorporate the received information properly. If your game doesn't need everything to be as fast as possible, then the strategy hplus mentioned, buffering updates and interpolating between two known-good updates, is probably your best bet.

You could even go as far as using cubic splines to interpolate across multiple updates for maximum smoothness. The real issues are how fast your objects move, how much inertia they have (which helps hide update intervals), and how you interpolate between updates.
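For the cubic-spline option, a Catmull-Rom segment is a common concrete choice because it passes exactly through the buffered update positions. A minimal per-component sketch (positions as tuples; the function name and calling convention are illustrative):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom spline: smoothly interpolate between p1 and p2 for
    t in [0, 1], using p0 and p3 as the surrounding control points."""
    return tuple(
        0.5 * ((2.0 * b)
               + (-a + c) * t
               + (2.0 * a - 5.0 * b + 4.0 * c - d) * t * t
               + (-a + 3.0 * b - 3.0 * c + d) * t * t * t)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )
```

Note this needs four buffered updates, so it adds one more update interval of latency than plain linear interpolation between two snapshots.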

One strategy I had was to just constantly interpolate entities toward their correct position, letting known velocity affect them as well. This was basically a form of interpolation without there ever being a direct path from 'where I am' to 'where I should be'. The objects were allowed to float around using normal physics, and network updates would simply give them a goal to move toward. This allowed for significant error at times, but was extremely smooth.

Basically I would set an interpolation time of about 1 second, and each time I received an update I would calculate the delta between the displayed position and the actual position, and scale it to a vector that would take 'interpolation-time' seconds to move the entity from where it was to where it should be. Typically this worked out so that by the time the object reached where it should be, a new update had arrived and it would begin moving to the next position, which kept it in constant flow. At the time I was not dealing with jitter, so it didn't work too well over the internet, but it was nice with a low update rate.
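That constant-convergence scheme might look roughly like this per frame, assuming a 1-second convergence time; the function name and signature are illustrative:

```python
CONVERGE_TIME = 1.0  # seconds to close the gap to the server position (assumed)

def converge_step(current, target, velocity, dt, converge_time=CONVERGE_TIME):
    """Advance an entity by its known velocity plus a correction velocity
    sized so the position error would close over `converge_time` seconds.
    The entity never snaps; it continuously drifts toward where it should be."""
    return tuple(
        c + (v + (t - c) / converge_time) * dt
        for c, t, v in zip(current, target, velocity)
    )
```

The trade-off is exactly as described above: positions can be noticeably wrong for up to `converge_time` seconds, but motion stays continuous.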

if the server is sending 20 updates per second, I know that each incoming packet *should* be 50ms apart


If you use a fixed timestep, and timestamp each update message, you don't need to measure jitter; the de-jittering will automatically come from the time-stepping. (You still have to figure out how late you get your timesteps, and estimate the link delay based on that.)
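A rough sketch of that idea, assuming the server stamps each message with its fixed-timestep tick number; the class name, delay depth, and payload shape are illustrative:

```python
class DejitterBuffer:
    """Buffer updates keyed by their server tick stamp, and play them back
    a few ticks behind the newest stamp seen. Network jitter then only
    changes how deep the buffer runs, not the playback cadence."""
    def __init__(self, delay_ticks=3):
        self.delay_ticks = delay_ticks
        self.updates = {}        # tick -> payload
        self.newest_tick = None

    def receive(self, tick, payload):
        self.updates[tick] = payload
        if self.newest_tick is None or tick > self.newest_tick:
            self.newest_tick = tick

    def playback(self):
        """Return the payload for the tick we should be rendering, or None
        if that tick's update was dropped or has not arrived yet."""
        if self.newest_tick is None:
            return None
        return self.updates.get(self.newest_tick - self.delay_ticks)
```

Estimating the link delay (how far `newest_tick` lags the server's real tick) is still needed to pick a sensible `delay_ticks`, as the parenthetical above notes.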


Yea, I don't timestamp messages, but that was the other option, definitely.

Yea, I don't timestamp messages


Good! That means you have some low-hanging fruit to pick :-)

This topic is closed to new replies.
