Hi all,
I really hope this is in the right spot, feel free to move it if not.
I am trying to cobble together my own average-Joe's understanding of how multiplayer games work and wanted to explain it back to an experienced community to see if I had the right end of the stick. Correct me where I am wrong:
When playing a multiplayer game you access a client, a piece of software on your machine that translates your control/input into information which is then sent via your network/internet to a central server. All players of the game do this, and the server continually collects all of their input. The server then runs all of the players' input through the game and returns the outcome to the respective players (for example, player A swings a sword, player B gets in the way - the game installed on the server calculates that player B was in the wrong place and tells them they are dead, meanwhile telling player A that they have scored a hit etc....). This information travels back down the tube* to the respective player and is displayed in the game client (in the form of game action).
Now for my questions:
* I have been led to understand that the tube through which the information travels between the server and client (i.e. the internet connection) is referred to as a 'Socket'. Is this the case? If so, what is the significance of the socket - why do I hear it referred to so much in the context of multiplayer game design?
Is the port kind of like an address at the server end? I envisage it like the server having say 1000 different doors. The client needs to know which door to send information to in order to get the server to accept it properly. Does this work both ways (does the server need to access the correct port when sending back to the client?).
When a player hosts a multiplayer game, I assume that the computer they are playing from doubles as both the server and the client?
When looking at big console games (take something like GTA or Halo for example), are the consoles acting as servers in these instances or is that done by bigger servers - if so, where are they/who operates them? The game developers? The console developers?
When you get old defunct community run games, such as 90's titles etc... how are these run online? If I were to switch on a copy of Quake or something - who is running the server?
Presumably modern PCs are growing in their capabilities as servers as they evolve and as internet connections evolve - how would you summarise what their limitations are? Are we looking at a future where massive, graphics-intensive MMOs with tons of players are hosted on laptops - are traditional servers' days numbered?
-
Sorry if this is basic stuff, but I really want to get a better understanding of how it all works.
Thanks for your patience!
A novice's understanding of how multiplayer games work
A socket is a technical term for how 2 computers on the internet handle sending data between them. You can think of it as the pipe through which the data travels, although obviously there is no single pipe. The socket is the interface through which a program (like a game client, or a game server) communicates across the network.
The port is technically separate from the address, but forms part of the unique identifier of "where the data will go". A socket refers to an address and a port. (Address: https://en.wikipedia.org/wiki/IP_address Port: https://en.wikipedia.org/wiki/Port_(computer_networking))
You could think of an address/port pair as being like a phone number and an extension number, or like a building number and an apartment number. Your 1000 doors analogy is quite good too.
The server does need to know the correct port in order to respond, but that information is sent to it as part of the connection process.
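To make that concrete, here's an illustrative sketch in Python using the standard socket module (the messages are made up, and port 0 just asks the OS for any free port). Note how the server learns the client's address and port from the incoming packet - that's the "sent as part of the connection process" detail:

```python
import socket

# Server side: bind a UDP socket. Port 0 asks the OS to pick any
# free "door"; a real game would use a fixed, published port.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server_addr = server.getsockname()  # the (address, port) pair

# Client side: the OS assigns the client its own ephemeral port.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"swing sword", server_addr)

# recvfrom() hands the server both the data and the sender's
# (address, port) - that's how it knows which door to reply to.
data, client_addr = server.recvfrom(1024)
server.sendto(b"hit confirmed", client_addr)

reply, _ = client.recvfrom(1024)
print(reply.decode())  # hit confirmed
```

Real games layer their own protocol on top of this, but every exchange ultimately bottoms out in sends and receives like these.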
When a player hosts a multiplayer game, it can mean they act as both client and server, yes. Not only does their computer act as both, but sometimes the same program is essentially a client and a server. These are fairly loose terms, where a server implies some degree of authority and a client implies relying upon a server.
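The "same program is essentially a client and a server" idea can be sketched too. In this toy Python example one code path listens (the host) while another connects (a joining player); a real game would choose the path from a "Host game"/"Join game" menu rather than a thread:

```python
import socket
import threading

def run_host(info):
    # "Host a game" path: listen for joining players.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))       # OS picks a free port
    info["port"] = listener.getsockname()[1]
    listener.listen(1)
    info["ready"].set()                   # tell the joiner we're up
    conn, _ = listener.accept()
    conn.sendall(b"welcome to my game")
    conn.close()
    listener.close()

info = {"ready": threading.Event()}
threading.Thread(target=run_host, args=(info,)).start()
info["ready"].wait()

# "Join a game" path: the same executable, run in client mode.
joiner = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
joiner.connect(("127.0.0.1", info["port"]))
greeting = joiner.recv(1024)
joiner.close()
print(greeting.decode())  # welcome to my game
```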
Big online games are usually hosted on Linux or Windows machines running in a data centre somewhere. Note however that many online games may only use their servers to facilitate connections between players, who then send game information directly to each other rather than via a server - this is called 'peer to peer' networking, and the servers that provide the minimum necessary to join players together tend to be called 'matchmaking' servers.
Many games allow you to run your own server, perhaps as part of the game itself (as mentioned above). This allows you to host servers many years after the original developers stopped doing so. Other games don't provide this functionality, but cunning reverse engineering has allowed hobbyist programmers to write their own emulated servers. Provided there is a way for your client to find such servers, or if it is a peer-to-peer game and you can find other peers (or use a matchmaking server to do so), it's possible to play without the original developer needing to run anything.
Modern PCs are certainly growing in terms of capability but there are technical constraints that make the more massive game types challenging to run - and most MMOs will be spreading their players across several (or indeed many) computers running simultaneously, because it's not possible to handle it all on just one computer. That might be due to CPU requirements, it could be down to memory requirements, it could be because of network bandwidth, or many other constraints. You can certainly host games of yesteryear on a laptop of today, but we're nowhere near the point where a single consumer-level laptop computer is able to take the place of the server setups that the big games are using.
* I have been led to understand that the tube through which the information travels between the server and client (i.e. the internet connection) is referred to as a 'Socket'.
A "socket" is the software API that most programs use to talk to the tubes; it's not the tubes themselves. "Sockets" is to networking what "OpenGL" or "Direct3D" is to 3D graphics. The tubes are to networking what the graphics card is to 3D graphics.
Is the port kind of like an address at the server end?
Yes! The "number of doors" analogy is fine. There are in fact a number of addresses that stack on top of each other. The various numbers are:
- What network protocol are you using? This is almost always "4" for IPv4, or "6" for IPv6. Unless you're a crazy person running Banyan Vines or AppleTalk or Novell Netware on your pre-historic DOS re-enactment computer, or something, that is. This number tells routers how to interpret the next set of addresses.
- What IP address is the target host? This tells routers how to forward the packet to get to the other end of the tubes. IP addresses might be like streets.
- What IP protocol is used for the packet? This is things like "ICMP" (for ping, etc), "TCP" (for web browsing and downloads and remote shell), "UDP" (for IP phones and action games, etc), and so forth. Protocol numbers may be like house numbers on a street full of apartment buildings.
- For TCP, what TCP port is being used. For UDP, what UDP port is being used. For other protocols, other protocol-specific sub-addresses. The important bit here: A UDP port is different from the TCP port of the same number -- just because you live behind door 80 in the TCP building, doesn't mean you have anything to do with whomever lives behind door 80 in the UDP building!
- For game protocols, games will often add game-specific sub-addresses here, such as "player 3" or "small gray rock object #7123761." There is no standard for this; each game is different.
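The "different buildings" point about TCP and UDP ports can be demonstrated directly in Python: a TCP socket and a UDP socket can bind to the same port number on the same machine without clashing, because the operating system keys each binding on the protocol as well as the port:

```python
import socket

# Bind a TCP socket to whatever free port the OS gives us...
tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_sock.bind(("127.0.0.1", 0))
port = tcp_sock.getsockname()[1]

# ...then bind a UDP socket to the *same* port number. No error:
# door N in the TCP building is unrelated to door N in the UDP one.
udp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_sock.bind(("127.0.0.1", port))

print("TCP and UDP both bound to port", port)
tcp_sock.close()
udp_sock.close()
```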
When a player hosts a multiplayer game, I assume that the computer they are playing from doubles as both the server and the client?
There is a distinction between the "host" (who gets to set the game rules, kick other players, etc) and the "server" (the computer everyone else sends packets to). This matters when the game company in question runs the servers but still allows players to host - for example, custom matches in Overwatch. However, in most player-hosted games, the "host" and the "server" are one and the same.
When looking at big console games (take something like GTA or Halo for example), are the consoles acting as servers in these instances or is that done by bigger servers
There are multiple systems involved. Yes, most of the time, especially for games with < 30 players, one of the consoles is the "server." Separately, there is a big back-end that does things like scoreboards and matchmaking and friends lists and so forth. The console game developer develops the code that actually runs the game server; the platform provider (Microsoft, Sony, ...) provides the back-end that does matchmaking, score-boarding, banning of bad accounts, and so forth.
MMOs are different; the consoles don't run those servers, the game developers do; typically they develop servers that they then host in approved/provided data centers that are tied into the console platform network. Even some non-MMO games use developer-provided servers, to cut out the chance of having to migrate servers when players rage-quit. I know Gears of War did this, for example.
When you get old defunct community run games, such as 90's titles etc... how are these run online? If I were to switch on a copy of Quake or something - who is running the server?
Either the game supports player-hosted servers, and players tell each other "please connect to my server!", or someone reverse-engineers the server-side protocol and runs their own server somewhere on the internet (and, again, says "please connect to my server!").
Presumably modern PCs are growing in their capabilities as servers as they evolve and as internet connections evolve - how would you summarise what their limitations are? Are we looking at a future where massive, graphics intensive MMOs with tons of players are hosted on laptops - are traditional servers days numbered?
The problems with player-hosted servers have almost nothing to do with PC capability. They have more to do with network problems - home networks are not well suited to the low-latency uploads that servers need. When your sister starts torrenting the latest Linux distribution, all other players in your game will suddenly get a bad connection. When you rage quit (sorry, when "your console crashes"), all other players in the game either get kicked, or some disruptive server migration event occurs. This is a bad user experience. Thus, large games that can guarantee a revenue stream will likely want to host their own servers, to make sure the gaming experience is smooth.
Separately, for player-hosted servers, because the hardware is under the control of a particular player, that player could conceivably hack/cheat the game, by modifying the server software, and/or modifying the network packets that flow to/from the server. Because the hardware is under control of the player, there is nothing that a developer can do to deter a sufficiently advanced/determined hacker, and once a hacker has developed a script, less advanced cheaters can easily re-use that script. This, as well as the quality of the experience, is often one of the main reasons why developer-hosted servers will probably never go away.
On 8/14/2017 at 10:07 PM, hplus0603 said: "Either the game supports player-hosted servers, and players tell each other 'please connect to my server!', or someone reverse-engineers the server-side protocol and runs their own server somewhere on the internet."
This is true for games like Minecraft, but there's an extra layer involved for something like Quake, which has an in-game server browser.
So to add to that response: in order for a server browser to function, a "Master Server" (a central server that keeps track of active game servers) is used. When a game server is launched (dedicated, local PC, whatever), it sends information about itself to the Master Server (what its IP address is, the server name, etc). Any client that opens the Server Browser will contact the Master Server for the server list.
Source games typically use Steam's server system as a Master Server. For Quake, it was originally dev-hosted, but now I'm not entirely sure who's handling it - you can still find games via the server browser for Quake 1 and other derived clients thanks to the code being open-sourced, which is why those games are still active today (you can always direct-connect via IP, but a server browser makes it easier to keep a small community alive).
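The mechanism is easy to sketch. This is a toy, in-memory Python model, not the actual Quake or Steam protocol: the server names are invented, the IPs are from the documentation address range, and the 60-second timeout is an arbitrary assumption standing in for real heartbeat handling.

```python
import time

class MasterServer:
    def __init__(self, timeout=60.0):
        self.servers = {}        # (ip, port) -> (name, last_seen)
        self.timeout = timeout   # drop servers that stop announcing

    def register(self, ip, port, name, now=None):
        """A game server announces itself (and re-announces periodically)."""
        self.servers[(ip, port)] = (name, now if now is not None else time.time())

    def list_servers(self, now=None):
        """A client's server browser asks for all currently-live servers."""
        now = now if now is not None else time.time()
        return [(ip, port, name)
                for (ip, port), (name, seen) in self.servers.items()
                if now - seen < self.timeout]

master = MasterServer()
master.register("203.0.113.5", 27960, "Frag Palace", now=0.0)
master.register("203.0.113.9", 27960, "Newbies Welcome", now=0.0)
print(master.list_servers(now=10.0))   # both servers still announcing
print(master.list_servers(now=120.0))  # both have timed out: []
```

A real master server would do all of this over the network of course; the point is just that game servers push their address up, and clients pull the list down.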
Thanks for these replies - great knowledge coming in from everyone!
How does it work for things like iPhone apps?
A program like WhatsApp or Snapchat must have some kind of server system backing it up - or could something like that be done with peer-to-peer messaging (using each phone to store the images/texts etc...)?
iPhones are computers just like any other - just smaller!
Programs like WhatsApp and Snapchat are usually a mixture of peer-to-peer and client-server technology, but it depends entirely on the app. WhatsApp has servers to facilitate connecting users, and the servers are used to relay (but not store) encrypted messages. However voice calls are handled peer-to-peer - after the server has helped them set up the connection, that is.
Interesting - so would I be right in thinking that a setup involving a peer-to-peer interaction, started by a matchmaking server that just puts two users together and then leaves the equation, is less 'data/memory/usage intensive' (excuse my basic terms!) than a setup where a central server connects all users AND operates the app or program itself?
Sure, from the server's point of view, doing nothing is cheaper than doing something.
The main consideration is always whether a server is necessary. It might be necessary for various reasons:
- to provide a centralised point where shared content is stored (e.g. a web server, or a directory matching user names to remote devices for matchmaking)
- to perform processing that is difficult on a remote device (e.g. something involving a lot of server power, or access to a private database)
- to perform processing that can't be entrusted to a remote device (e.g. online game calculations where a player may cheat)
- to collect data on what the clients are doing (e.g. web or game analytics)
And obviously there is a lot of overlap between those categories.
A program like WhatsApp or Snapchat must have some kind of server system backing it up
Peer-to-peer is almost never the right choice, especially not on mobile networks where nodes can come and go with the speed of a car or train entering a tunnel.
Also, most of those social applications retain a log of who you've talked to, and what you've said; this is available even if you log in from another device, and thus all the messages go through a server.
Even with a peer-to-peer fabric (which is a bad idea,) you'll need at least some kind of centralized point for the peers to find out about each other. This can be a matchmaker, this can be a "game listing server," or this can be something as simple as a web forum where users post their IP addresses. But, as I said, all of the bigger systems use mainly a server-based system, not only to make sure that players/participants can find each other, but also to work around things like NAT routers and firewalls that block direct connections, and so forth.
Discord, for example, makes it a point of pride that all the data goes to their servers and then to other players, because this means other players won't see your IP address, and thus you can't easily be rage-booted/DDoS-ed by IP address.