
Latency in Wireless Connections

Started by June 26, 2014 07:14 PM
6 comments, last by hplus0603 10 years, 4 months ago

We have been investigating wireless latency (WiFi and cellular) and its effect on realtime multiplayer games.

First, we have some questions about real-world wireless latency. It seems that the limited latency testing available in the wild is not sufficient for preparing a game for real-world use. Ookla's Speedtest is the most used network testing tool, with 5 million tests a day, but we found that it only reports the best of 10 samples, which naturally gives highly optimistic results. What are better sources of real-world latency data?
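
To illustrate why a best-of-n result skews optimistic, here is a small simulation with a made-up long-tailed latency model; the numbers are assumptions for illustration, not measurements:

import random

random.seed(1)

def rtt_sample():
    # Hypothetical long-tailed latency model: ~40 ms baseline plus occasional spikes.
    base = random.gauss(40, 10)
    spike = random.expovariate(1 / 80.0) if random.random() < 0.1 else 0.0
    return max(5.0, base + spike)

all_rtts = [rtt_sample() for _ in range(10000)]
best_of_10 = [min(rtt_sample() for _ in range(10)) for _ in range(1000)]

print("mean of raw samples : %.1f ms" % (sum(all_rtts) / len(all_rtts)))
print("mean of best-of-10  : %.1f ms" % (sum(best_of_10) / len(best_of_10)))

Under this model the best-of-10 average comes out far below the average of the raw samples, which is the bias we are worried about.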

Looking at latency data, what’s a good way to decide the range of latency a game should accommodate? For example, you can look at the min/max/standard deviation, the mean, or the tail (95th percentile).

What is the best practice for actually testing the effect of latency on a game? Is it better to test with deterministic or random latencies? For example, for deterministic testing you might use a constant latency at the 95th percentile. Deterministic latencies make it easier to find issues in a controlled and repeatable manner, while random latencies are closer to what your game will see out in the real world.
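
One way to get both styles of test from the same harness is a small delay queue with a switchable mode. The sketch below is hypothetical (not any particular engine's API) and uses a seeded generator so that even the "random" mode can be replayed exactly:

import heapq
import random
import time

class LatencyInjector:
    """Buffers outgoing packets and releases them after a configurable delay."""

    def __init__(self, fixed_delay=None, mean=0.080, jitter=0.030, seed=42):
        self.fixed_delay = fixed_delay   # e.g. the 95th-percentile value, in seconds
        self.mean, self.jitter = mean, jitter
        self.rng = random.Random(seed)   # seeded, so "random" runs are still repeatable
        self.queue = []                  # min-heap of (release_time, seq, packet)
        self.seq = 0

    def _delay(self):
        if self.fixed_delay is not None:
            return self.fixed_delay
        return max(0.0, self.rng.gauss(self.mean, self.jitter))

    def send(self, packet):
        self.seq += 1
        heapq.heappush(self.queue, (time.monotonic() + self._delay(), self.seq, packet))

    def poll(self):
        """Return every packet whose delay has elapsed."""
        now, ready = time.monotonic(), []
        while self.queue and self.queue[0][0] <= now:
            ready.append(heapq.heappop(self.queue)[2])
        return ready

A regression suite might construct LatencyInjector(fixed_delay=0.180) for the constant 95th-percentile case and LatencyInjector(seed=7) for a randomized but replayable run.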

We have posted more details about this topic and are very interested in what other developers have to say on the matter. If you are interested in reading more about our thought process, check out our post about mobile latency here.

I'm not aware of any good, thorough latency data. Your best bet is probably to try a number of different devices in different service areas. For example, a home 100 Mbps connection in Korea, a 4G MiFi WiFi adapter in the US, a 20 Mbps DSL connection in Europe, and a 1 Mbps satellite-downlink, modem-uplink connection may all perform very differently in terms of loss, latency, jitter, and bandwidth.

If you're designing your game to play well for most users, then you should design for at least the 99th percentile, if not the 99.9th or better. Percentiles are also quite hard to pin down -- latency and jitter may vary more over macro time (time of day, day of week, internet weather) than they vary from packet to packet.
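
For what it's worth, once you have raw RTT samples from a ping loop or a field log, pulling out a tail value is only a few lines; this sketch assumes a plain list of millisecond values and uses nearest-rank percentiles:

import math

def percentile(samples, p):
    """Nearest-rank percentile; p in (0, 100], samples a non-empty list of RTTs in ms."""
    ordered = sorted(samples)
    rank = max(1, int(math.ceil(p / 100.0 * len(ordered))))
    return ordered[rank - 1]

rtts = [48, 51, 47, 120, 52, 49, 310, 50, 53, 47]   # hypothetical ping log, ms
for p in (50, 95, 99):
    print("p%d = %d ms" % (p, percentile(rtts, p)))

Grouping samples by hour of day or by network type before taking percentiles is one way to see the macro-time variation mentioned above.
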
enum Bool { True, False, FileNotFound };

I have a strange habit of assuming a nearly-worst-case scenario for end-user situations (hardware, internet, etc.) and aiming to make everybody as happy as possible, without disappointing those who are more invested in their setups.

The worst case is something like 33 kbps of bandwidth and 1800 milliseconds of latency (GPRS, satellite, etc.).
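
To put that worst case in perspective, a quick back-of-the-envelope packet budget (the tick rates here are assumptions for illustration, not recommendations):

LINK_KBPS = 33   # worst-case downlink from the post

for ticks_per_second in (10, 20, 30):
    bytes_per_packet = LINK_KBPS * 1000 // 8 // ticks_per_second
    # IPv4 + UDP headers alone consume 28 of those bytes.
    print("%2d updates/s -> about %3d bytes per packet" % (ticks_per_second, bytes_per_packet))
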
enum Bool { True, False, FileNotFound };

Thanks for your responses, guys.

I'm not aware of any good, thorough latency data. Your best bet is probably to try a number of different devices in different service areas. For example, a home 100 Mbps connection in Korea, a 4G MiFi WiFi adapter in the US, a 20 Mbps DSL connection in Europe, and a 1 Mbps satellite-downlink, modem-uplink connection may all perform very differently in terms of loss, latency, jitter, and bandwidth.

If you're designing your game to play well for most users, then you should design for at least the 99th percentile, if not the 99.9th or better. Percentiles are also quite hard to pin down -- latency and jitter may vary more over macro time (time of day, day of week, internet weather) than they vary from packet to packet.

hplus, how does one go about field testing on all the different connections you mentioned? Would one use a third party testing company for field testing? It seems expensive/time-consuming for a typical studio to do this kind of testing and make it statistically meaningful. It appears hard to test and troubleshoot issues with latency using just field tests during development.

Our opinions below are informed by experience in other fields. We are hoping to understand current best practices in game development.

A little more background on our perspective: personal computing has been moving from hard-wired to wirelessly connected devices. Portable devices (smartphones, tablets, and laptops) use only wireless connections, and even consoles and PCs increasingly connect wirelessly. Realtime multiplayer action games work best on hard-wired connections, so this transition from hard-wired to wireless connections requires developers to address higher and more variable latency.

We'd like to start off by making a distinction between field tests in the wild and lab tests during development. During development there is a need for regular regression testing of builds; this would include unit, integrated functional, and performance tests. Regression tests are most useful when automated (run without human input).

Lab tests, both automated and live-player, require setting up repeatable yet representative network conditions. This allows direct comparison of test results to assess software quality and uncover bugs. In contrast, when comparing field tests in the wild, both the network conditions and the quality of the software are variable.

How does one know where to test, and under what conditions? And how does one know whether the conditions customers will actually see have been covered? If we spend most of our time testing the tails of the distribution, we may not have confidence in the outcomes the majority of players will experience. The further out you test the tail (i.e., a higher percentile) in field tests, the more tests you have to run for the data to be statistically significant, adding to the cost. Field tests are more expensive than lab tests, so it is better to use fewer field tests and more extensive lab tests.
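
On the statistical-significance point, a common rule of thumb is that estimating the pth percentile with any confidence requires at least some fixed number of observations beyond it, so the required sample count grows roughly as k / (1 - p). The sketch below just prints that for a few tails; k = 10 is an arbitrary illustrative choice:

K = 10   # desired observations beyond the percentile (arbitrary illustration)

for p in (0.95, 0.99, 0.999):
    n = int(round(K / (1.0 - p)))
    print("p%.1f tail: on the order of %d samples" % (p * 100, n))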

Network engineers have been concerned with similar issues for a long time. IETF RFC 3393 addresses measurement of packet delay (latency) variation for IP Performance Metrics; interestingly, it deprecates the term "jitter" because of its ambiguity. It treats latency as a distribution whose statistics are inferred from measurements. Network emulators are commonly used in the development of networking products: statistical parameters of the measured distribution are used to configure the emulated test conditions. Testing confidence is then improved by running tests long enough and by importance sampling (stress testing), and repeatability in statistical tests can be achieved by using pseudo-random sequences.
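
In that spirit, one way to make a statistically "random" lab run repeatable is to pre-generate the per-packet delay trace from a seeded generator configured with the distribution parameters inferred from measurements. The log-normal shape and the parameter values below are assumptions for illustration:

import math
import random

def latency_trace(n, median_ms=60.0, sigma=0.5, seed=1234):
    """Generate n per-packet one-way delays (ms) from a log-normal model.

    The same seed always yields the same trace, so a failing run can be replayed exactly.
    """
    rng = random.Random(seed)
    mu = math.log(median_ms)
    return [rng.lognormvariate(mu, sigma) for _ in range(n)]

trace = latency_trace(100000)   # feed this into the emulator in packet order

Feeding the same trace into the emulator on every run keeps the comparison between builds apples-to-apples.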

Would one use a third party testing company for field testing?


Note that wireless LANs (WiFi) and wireless access (Mobile, Microwave, Satellite) are different kinds of beasts.
Testing over WiFi is reasonably simple, as all you need is a few kinds of hardware adapters and access points, covering the main chipsets and vendors in the market.
Testing wireless last-mile connectivity is harder, because it varies so much based on locale.

Many game developers who target consoles let the platform owners (Microsoft, Sony, Nintendo) do the testing/validation. If you enter the developer program for those platforms, you will receive recommendations that they have developed and that they will test your game against to certify it for compliance on each platform. Because of the gatekeeper function of those companies, it's their requirements, not on-the-ground requirements, that matter.

Testing cellular is harder, although you can get a few different phones or cell modems with different technologies (GPRS, 3G, 4G, LTE, WiMax, etc) and drive around town to find different levels of reception, and then run a test through those set-ups.
You may also be able to use crowd-sourcing solutions, like odesk, freelancer, etc, to run tests for you if you can clearly define the tasks and reports you need.

I doubt most game developers go to that length, though. The big ones know their networking works because they certify for the consoles and likely use very similar code for PC. The small ones have bigger problems to worry about, and probably make do with a soft network emulator like netem, wanem, or gns3. Make some educated guesses (up to 500 ms round-trip latency, up to 100 ms jitter, 10% packet loss, bandwidth limited to 256 kilobit, or whatever) and call it good when it works.
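
Extending the delay-queue sketch from earlier in the thread, a crude in-process stand-in is easy to write when setting up a full emulator is overkill; this is only a rough sketch, with the example guesses above (250 ms one-way for a 500 ms round trip, up to 100 ms of jitter, 10% loss, 256 kilobit) as freely adjustable defaults:

import heapq
import random
import time

class SoftEmulator:
    """Very rough in-process stand-in for netem: delay, jitter, loss, and a bandwidth cap."""

    def __init__(self, one_way_ms=250, jitter_ms=100, loss=0.10, kbit=256, seed=7):
        self.one_way, self.jitter, self.loss = one_way_ms, jitter_ms, loss
        self.bytes_per_sec = kbit * 1000 / 8.0
        self.rng = random.Random(seed)
        self.pending = []          # min-heap of (deliver_at, seq, packet)
        self.link_free_at = 0.0    # when the emulated link finishes sending the previous packet
        self.seq = 0

    def push(self, packet):
        """Queue a packet (bytes) for emulated delivery; it may be silently dropped."""
        if self.rng.random() < self.loss:
            return
        now = time.monotonic()
        # Serialization time at the capped rate, queued behind earlier packets.
        start = max(now, self.link_free_at)
        self.link_free_at = start + len(packet) / self.bytes_per_sec
        delay = (self.one_way + self.rng.uniform(0, self.jitter)) / 1000.0
        self.seq += 1
        heapq.heappush(self.pending, (self.link_free_at + delay, self.seq, packet))

    def pop_ready(self):
        """Return packets whose emulated delivery time has passed, in delivery order."""
        now, out = time.monotonic(), []
        while self.pending and self.pending[0][0] <= now:
            out.append(heapq.heappop(self.pending)[2])
        return out
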
enum Bool { True, False, FileNotFound };

Thanks again for your insight into the industry.

Latency on WiFi connections is actually just as high and variable as on cellular networks. This is due to the quality of the radio channel (fading) and the traffic, and has little to do with the hardware involved. Recent data collected over a 6-month period by OpenSignal demonstrates this point and is summarized here: http://opensignal.com/blog/2014/03/10/lte-latency-how-does-it-compare-to-other-technologies/

OpenSignal collects data from over 1.2 billion WiFi points and 800,000 cell towers.

Our limited testing confirms this as well: http://sugarcanegames.com/latencysummary2.html

This is due to the quality of the radio channel (fading) and the traffic, and has little to do with the hardware involved


I agree -- the fact that it's radio doesn't change between vendors. Unfortunately, it is also the case that some vendors' hardware, drivers, and firmware are better than others', and the race to the bottom on cost that drives the US market means that a vendor that was great last year may ship a dud the next year :-(
If you then want to support Linux, your woes multiply.
enum Bool { True, False, FileNotFound };

This topic is closed to new replies.
