[Redis] - Architecture Design with Multiple Instances

Started April 18, 2016 03:48 AM
1 comment, last by WombatTurkey 8 years, 7 months ago

Hey guys!

So, I am feeling a bit worried about how my server architecture is designed for my game, so I decided to create a flow chart (rough as it is) instead of trying to explain everything:

[Flow chart image: za7jHFn.jpg]

So, everything is working fine. I am using the PM2 module to manage the node instances.

Here are some questions and concerns I have about the above setup.

1. If Player A is on the 9300 instance and Player B is on the 9301 instance, and they are chatting or in a game together, the server would have to use a lot of Pub/Sub signals (movement positions, whether a player used a skill, etc.). Wouldn't it be far better to somehow get those players onto the same instance so I don't have to keep Pub/Subbing between nodes? If so, how would I go about doing that? Or is this how it's supposed to be when scaling instances with Redis? Should I rely on Redis' Pub/Sub feature for that, or use something like ZeroMQ maybe? I just need Redis to share memory across the instances.

2. Now, what if I add another server in the same data center? I would obviously add it inside my nginx upstream block, but then what if two players on two different servers are trying to communicate? I would then be Pub/Subbing between servers and instances! Am I over-thinking all of this; is it premature optimization? Or do I have a right to be concerned? I keep thinking I am doing something wrong, and sending a ton of Pub/Sub signals across instances makes me uncomfortable.

I feel like if I got the players onto the same node instance, it would minimize latency. But that wouldn't scale very well, as ultimately one instance could end up with a ton of players on it, and that would defeat the idea of utilizing Redis in the first place.

Sorry for the long, drifted questions, just trying to wrap my head around things. Thanks for reading, and looking forward to some advice!

the server would have to use a lot of Pub/Sub signals


Yes. When you have a system where ANYONE can talk to ANYONE, you naturally end up with an N-squared problem, and when N is big, the solution has to be big.

The best thing you can do is to horizontally shard everything, including your Redis instances (if you use Redis for your message bus -- it's fine for light-to-medium duty, but will fall down on heavy systems or when you need guaranteed low latency.)
Split your game into many instances (this often comes naturally.) Split your community into many instances (separate guilds, forums, worlds, topics, etc.)
Then use some kind of service discovery to figure out which Redis to talk to for which channel.

Thus, the group "friends of user X" lives on Redis 3, and the group "game instance 32" lives on Redis 5, or whatever.
You then have to come up with a method for when to create and tear down groups (and how to keep them alive) as well as for mapping group IDs to specific Redis instances.
"Consistent Hashing" is one common solution to the second problem. Timeout when groups are empty is a common solution to the first one.
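The "timeout when groups are empty" idea can be sketched as a group object whose idle timer is reset on activity. The idle period, the `onTeardown` hook, and the group id are illustrative assumptions; in practice teardown would also unsubscribe the group's Redis channel.

```javascript
// Sketch of "timeout when empty" group teardown: each group holds a
// timer that is reset on activity and fires a teardown hook after a
// quiet period.
function makeGroup(id, onTeardown, idleMs = 60000) {
  let timer = setTimeout(() => onTeardown(id), idleMs);
  return {
    id,
    // Call on every message or member join to keep the group alive.
    touch() {
      clearTimeout(timer);
      timer = setTimeout(() => onTeardown(id), idleMs);
    },
    // Explicit shutdown, e.g. when the last member leaves.
    close() { clearTimeout(timer); },
  };
}
```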
If you have eight Redis instances, calculate crc32(name) and modulo 8 to get the Redis instance to talk to, for example.
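The crc32-modulo mapping above might look like this. The CRC-32 here is the standard reflected polynomial (0xEDB88320) written out table-less; any stable hash works, as long as every node computes the same one. `NUM_REDIS` and the channel names are illustrative.

```javascript
// Standard reflected CRC-32 (polynomial 0xEDB88320), table-less form.
function crc32(str) {
  let crc = 0xFFFFFFFF;
  for (let i = 0; i < str.length; i++) {
    crc ^= str.charCodeAt(i) & 0xFF;
    for (let j = 0; j < 8; j++) {
      crc = (crc >>> 1) ^ (0xEDB88320 & -(crc & 1));
    }
  }
  return (crc ^ 0xFFFFFFFF) >>> 0;
}

const NUM_REDIS = 8; // number of Redis instances in the fleet

// Map a channel/group name to the index of the Redis instance owning it.
function redisFor(name) {
  return crc32(name) % NUM_REDIS;
}
```

Note that plain modulo remaps most keys whenever `NUM_REDIS` changes; consistent hashing exists precisely to limit that reshuffle to roughly 1/N of the keys.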

One thing you can do is to only keep pub/sub channels per player, and identify the sender in the message.
Thus, when sending a message to "guild X," you actually iterate over all users that are members of guild X, and send a message on each of those users' queues.
This is great for cases where you may have many topics, but each topic has a finite number of users. (Can you cap the size of guilds to 20, for example?)
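The per-player-channel fan-out described above can be sketched as follows. The roster lookup and the `publish` function are stand-ins (with Redis, `publish` would be a client's publish-to-channel call), and the naming scheme is an illustrative assumption.

```javascript
// Fan a guild message out to one Pub/Sub channel per member, with the
// sender identified inside the message itself.
function sendToGuild(guildId, payload, roster, publish) {
  const members = roster.get(guildId) || [];
  const msg = JSON.stringify(payload); // sender id travels in the payload
  for (const userId of members) {
    publish(`user:${userId}`, msg); // one channel per player
  }
  return members.length; // messages fanned out, bounded by the guild cap
}
```

Because the fan-out cost is the member count, capping guild size (20 in the example above) puts a hard bound on per-message work regardless of how many guilds exist.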

Also, there exists nothing pre-built that actually scales to "big" in this space. Google Hangouts does it. Facebook Chat does it. Skype does it. We do it. Each of those implementations is home-grown, and not available for purchase (unless you want to buy a company.)
It turns out that ACTUALLY deploying and working out all the bugs of such a system requires significant effort at REAL scale (simulation doesn't cut it), so it's basically impossible to do this as open source.
The closest you can come is ejabberd, which for a very long time actually lost scalability as you added nodes (it didn't federate well) but it might actually be able to scale out these days. Don't know of an instance with more than about 10,000 users, though.

(Perhaps someone will post a link to one!)
enum Bool { True, False, FileNotFound };
Thus, the group "friends of user X" lives on Redis 3, and the group "game instance 32" lives on Redis 5, or whatever.

Ha, I wish my game could utilize multiple Redis servers :P. I don't believe we'll be that big! But, I totally understand your point regardless.

Well, you basically confirmed that this is normal, and I do trust you, so I will continue development now. Although I might just use ZeroMQ after reading your thoughts on Redis as a message bus. Also, after looking over these benchmarks, you're right: Pub/Sub is not Redis' forte.

I have been thinking about a more Diablo-like approach where some game actions are peer-to-peer. It would help me so much with scaling. In fact, David Brevik said at GDC several weeks ago that Battle.net only utilized one server, basically a gateway. (Although we all know the security of Diablo.)

For what it's worth, I was thinking of just sending movement data directly between players in my game. My game is instance-based, up to 6 players per game, so I feel like that would take a lot of stress off: physically off the server, and mentally off me. I feel like I could cheat a bit and allow some WebRTC to flourish. Obviously, you know the downsides to this. However, most things are server-side already (players in range of a mob, skill cooldowns, etc.), so I feel allowing just a tiny bit of P2P action for movement would help a lot.

In any event, I really appreciate your post. It has been extremely helpful.

This topic is closed to new replies.
