
Limiting bandwidth usage of a download from the client side

Started March 31, 2018 06:26 PM
10 comments, last by Ian Reed 6 years, 7 months ago

I'd like my game client to download some extra assets in the background, but I want it to rate limit itself so it has less effect on any other network traffic.
I imagine I could rate limit on the server side through a web server setting, but I'd rather the client gets to choose its own rate limit so it can also have times when it downloads at a faster rate.
What is the right way to do this?

 

I'll be using HTTPS for the downloads.
Is there some way to inform the TCP or HTTP stack that you only want to receive data at a certain speed for a particular connection?
Or do I have to just slow down the rate at which I am reading it from the HTTP stream?
Slowing down my rate of reading seems like I am relying on TCP fallback logic to handle my slow processing speed, rather than informing it up front that I want to be a good citizen and use less bandwidth.

 

You have to just read at a slower rate. By reading more slowly you will be sending acknowledgements more slowly and thus effectively performing the rate limiting you're after. TCP is designed to handle this properly.
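As a rough sketch of what that looks like in C# (the class and method names, the 16 KB chunk size, and the bytesPerSecond parameter are all illustrative, not any particular API):

using System;
using System.Diagnostics;
using System.IO;
using System.Net;
using System.Threading;

class ThrottledDownload
{
    // Sketch: read the response in small chunks and sleep whenever we are ahead
    // of the target rate. TCP's receive window then closes on the sender for us.
    public static void Fetch(string url, string destPath, int bytesPerSecond)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        using (var response = request.GetResponse())
        using (var input = response.GetResponseStream())
        using (var output = File.Create(destPath))
        {
            var buffer = new byte[16 * 1024];   // small reads keep the throttle responsive
            long totalRead = 0;
            var clock = Stopwatch.StartNew();

            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, read);
                totalRead += read;

                // How long should receiving this many bytes have taken at the target rate?
                double targetSeconds = (double)totalRead / bytesPerSecond;
                double aheadBy = targetSeconds - clock.Elapsed.TotalSeconds;
                if (aheadBy > 0)
                    Thread.Sleep(TimeSpan.FromSeconds(aheadBy));
            }
        }
    }
}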


There is no standard way to do rate limiting on a TCP socket.

Reading at a limited rate is fine, because TCP keeps an "open transmission window" which tells the other end how fast it can send. When the other end has sent everything that will fit in the window, it won't send more, until the window opens up again. The window opens up when the receiving end acknowledges receipt of data, which it will only do if there is free buffer space in the kernel.

Note that you should set the SO_RCVBUF socket option right after calling socket() to configure the receive buffer size. This has two effects:

  1. A bigger buffer will let the other end send more before getting choked.
  2. A bigger buffer while establishing the connection will turn on window scaling and make TCP more efficient on fast, high-latency links (i.e., most home internet links).

Of course, a TCP implementation is allowed to turn on window scaling if agreed with the other end even without bigger buffers, but the classic way to make sure this happens is to configure a big buffer.
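If you're working with a raw socket from C#, that corresponds to Socket.ReceiveBufferSize; a minimal sketch, with the 1 MB figure only as an example, not a recommendation:

using System.Net.Sockets;

// Sketch: configure the receive buffer right after creating the socket, before
// Connect(), so the size is already in effect when window scaling is negotiated
// during the TCP handshake.
var socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
socket.ReceiveBufferSize = 1024 * 1024;   // equivalent to setsockopt(SO_RCVBUF); value is illustrative
// socket.Connect(host, port);            // then connect and read as usual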

 

enum Bool { True, False, FileNotFound };

Ok, thanks to both of you.
I did some reading on the TCP receive window, window scaling, TCP congestion control, and HTTP ranges.

 

Wikipedia said: Windows Vista and Windows 7 have a fixed default TCP receive buffer of 64 kB, scaling up to 16 MB through "autotuning", limiting manual TCP tuning over long fat networks.
https://en.wikipedia.org/wiki/TCP_window_scale_option

 

I am using C# and see that HttpWebRequest.ServicePoint.ReceiveBufferSize is set to -1 after constructing a new HttpWebRequest (despite the documentation saying the default is 8192).
I assume the -1 indicates "unset" and allows the operating system to auto tune it.
My game clients will be run on Windows, and the download servers and game servers will be run on Linux.
Would I be better off letting the OS auto tune the window scale / receive buffer size, rather than choosing it myself?
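For reference, this is roughly what I'm looking at (the URL is just a placeholder):

using System;
using System.Net;

// The URL is a placeholder; the -1 is what I observe on my machine.
var request = (HttpWebRequest)WebRequest.Create("https://example.com/assets/pack1.dat");
Console.WriteLine(request.ServicePoint.ReceiveBufferSize);   // prints -1, i.e. "let the OS decide"

// If I did want to pick a size myself instead of letting the OS auto tune:
// request.ServicePoint.ReceiveBufferSize = 256 * 1024;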

 

Auto tuning is probably fine, especially if you want to limit the receive speed.

Note that the HTTP client library you're using needs to also limit how much it buffers for the "read at a fixed rate" limiting to work. If the library itself buffers enough, you will not be able to limit the rate at which data is downloaded on the client.

You can also limit the rate at which data is sent on the server, using similar mechanisms, again, assuming that you either use a server/proxy with send rate limiting built in, or that you have control over the send-a-file loop chunk timing.

 

enum Bool { True, False, FileNotFound };

Thanks, that makes sense.

 

I expect this background asset downloader to run at the same time as network game play.
If I know the max download bandwidth for a client, is there a percent threshold I should keep the combined asset download and game play traffic under in order to avoid lost game play packets?
Is running an asset downloader in the background a bad idea?
I want to utilize the extra bandwidth by downloading assets ahead of time and not making the player wait at download screens as often or as long, but not if it will negatively affect network game play.

 

5 hours ago, Ian Reed said:

If I know the max download bandwidth for a client, is there a percent threshold I should keep the combined asset download and game play traffic under in order to avoid lost game play packets?

Percentage is probably the wrong way to approach this. You should be able to figure out the exact bandwidth requirements of your gameplay - subtract that from the total and leave a bit of a margin.
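To put rough numbers on it (every figure below is made up; measure your own game's traffic):

using System;

// All numbers are illustrative only.
const double totalDownstreamBps = 10_000_000;   // e.g. a measured ~10 Mbit/s link
const double gameplayBps        =    256_000;   // measured peak gameplay traffic
const double marginBps          =  1_000_000;   // headroom for OS updates, browsers, etc.

double assetBudgetBps = totalDownstreamBps - gameplayBps - marginBps;
Console.WriteLine($"Asset download budget: {assetBudgetBps / 1_000_000:F1} Mbit/s");   // ~8.7 Mbit/s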

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

If you're grabbing assets over HTTP, then presumably you're using something like apache or nginx as your server.

Apache has the mod_ratelimit module to define rate limiting options (it's enabled when downloading files from GameDev.net). I think there is a similar module for nginx, but I haven't used it.
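For example, a minimal mod_ratelimit configuration looks like this (rate-limit is in KiB/s; the path and value are placeholders):

# Illustrative httpd.conf snippet using mod_ratelimit
<Location "/downloads">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 400
</Location>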

Admin for GameDev.net.

@swiftcoder, thanks. I guess I was wondering how much that margin needs to be, whether that be a percentage of total bandwidth, or just a set amount.
I intended to subtract the exact bandwidth requirements of the gameplay from the total available bandwidth as you say.
Then the decision is how much of that remaining bandwidth can I use for downloading assets in the background without negatively affecting the gameplay traffic.
I suppose my main concern is that using 90% of the available bandwidth would result in a greater than linear increase in lost packets as compared with using 10%, even though both are clearly under 100% with a margin.
My hope would be that I can expect a linear increase (or less) in lost packets as I saturate the available bandwidth.
If the increase in lost packets is more than linear, then that might make it worth staying at much lower bandwidth usage for the background asset downloader, even though more bandwidth is available.
I hope that better explains my concern. Thanks for the help.

 

@khawk, thanks, I will probably be using Digital Ocean "spaces" / object storage.
They may have a rate limit setting built into their admin interface.
I prefer to limit on the client, since I'm writing the client anyway, and each player's available bandwidth will be different, so they can rate limit at different speeds based on that.
The rate limiting is to protect the client's gameplay experience while still downloading assets in the background, not to protect the server from being overwhelmed.

 

Lost packet proportion is unlikely to depend on how much bandwidth you use, but again I'd argue this is mostly a problem you don't need to concern yourself with at the application level.

The fact is, a typical computer these days is probably performing all sorts of internet traffic in the background alongside your game - OS updates, updates to whichever obnoxious software demands to change every day, browsers left open and polling advert networks, etc. From your game's point of view it's probably sufficient to just give the user a drop-down box to say how much bandwidth they want to use for background asset loading.

This topic is closed to new replies.
