Game server & mpsc design

If I have a multiplayer game server and need a way to handle input events (from N different networked clients), is the standard approach to use:

  1. 1 unbounded mpsc channel

  2. 1 bounded mpsc channel

  3. N bounded mpsc channels


It is not obvious to me which way to go and I am curious what the idiomatic approach is.
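For concreteness, the three options can be sketched with std's channels; tokio's `mpsc::unbounded_channel` and `mpsc::channel(cap)` are the async analogues. The capacities and client count below are arbitrary placeholders:

```rust
use std::sync::mpsc;

// Option 3: one bounded channel per client (helper name is made up).
fn make_per_client(n: usize) -> Vec<(mpsc::SyncSender<u8>, mpsc::Receiver<u8>)> {
    (0..n).map(|_| mpsc::sync_channel::<u8>(64)).collect()
}

fn main() {
    // 1. one unbounded mpsc channel shared by all clients
    let (_tx, _rx) = mpsc::channel::<u8>();
    // 2. one bounded mpsc channel shared by all clients
    let (_tx, _rx) = mpsc::sync_channel::<u8>(1024);
    // 3. N bounded mpsc channels, one per client
    let per_client = make_per_client(8);
    assert_eq!(per_client.len(), 8);
}
```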

There is no single idiomatic approach, because there are two separate issues embedded in your question:

  1. Whether the queue is bounded or not depends on the consequences of that bound being reached, and more generally on how you want to handle backpressure within your system. If a client suddenly throws an enormous amount of data at you (which you can't trust it not to do, because in general you can't trust the client), do you want to attempt to buffer all that data? Is it acceptable for (parts of) the system to block until the data can be dealt with?

  2. Whether you have one or multiple channels depends largely on what you're doing with the messages and how parallelizable that is. If your game state is a monolithic blob, having multiple queues accepting the messages won't buy you anything, because they'll all have to wait for exclusive access to the state. OTOH, if the game state can be divided into lots of independent pieces, you might have N channels to apply changes to those pieces in parallel.
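A minimal sketch of that second point, using std's bounded channels with one worker thread per independent piece of state. The `Event` type and the shard layout are made up for illustration:

```rust
use std::sync::mpsc;
use std::thread;

// An illustrative event type; a real game would have many variants.
enum Event {
    Add(i64),
}

// Spawn `n` shards, each owning one independent slice of game state and
// draining its own bounded channel; returns the summed final state.
fn run_shards(n: usize) -> i64 {
    let mut senders = Vec::new();
    let mut workers = Vec::new();
    for _ in 0..n {
        // sync_channel(cap) is std's bounded mpsc channel.
        let (tx, rx) = mpsc::sync_channel::<Event>(64);
        senders.push(tx);
        workers.push(thread::spawn(move || {
            let mut score = 0i64; // this shard's slice of the game state
            for ev in rx {
                match ev {
                    Event::Add(x) => score += x,
                }
            }
            score
        }));
    }
    // Route each event to the shard that owns the data it touches.
    for (i, tx) in senders.iter().enumerate() {
        tx.send(Event::Add(i as i64)).unwrap();
    }
    drop(senders); // closing all senders lets the workers' loops end
    workers.into_iter().map(|w| w.join().unwrap()).sum()
}

fn main() {
    println!("{}", run_shards(4)); // sums the events 0 + 1 + 2 + 3
}
```

The key design point is that each shard is owned by exactly one consumer, so no locking of shared state is needed.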


This depends on a few factors, but what you should be thinking about is the desired behavior when processing can't keep up and the queues begin to fill with messages:

  • Do you want "fairness" between the clients? Then maybe N channels served in a round-robin fashion would be better than 1 channel.
  • Do you absolutely need to process every message, or could you drop messages when the queue is full? In the latter case, you could use try_send on a bounded queue instead of send, which waits for space in the queue.
  • Is it not OK for the senders to wait, and also not OK for messages to be dropped? Then use unbounded queue(s).
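The drop-on-full option can be sketched with std's bounded channel and `try_send`; tokio's `mpsc::Sender::try_send` behaves the same way. The `send_or_drop` helper name is made up:

```rust
use std::sync::mpsc::{sync_channel, SyncSender, TrySendError};

// Try to enqueue a message; report whether it was accepted instead of
// blocking. A full queue simply drops the message.
fn send_or_drop<T>(tx: &SyncSender<T>, msg: T) -> bool {
    match tx.try_send(msg) {
        Ok(()) => true,
        Err(TrySendError::Full(_)) => false,         // queue full: drop it
        Err(TrySendError::Disconnected(_)) => false, // receiver is gone
    }
}

fn main() {
    let (tx, rx) = sync_channel::<u32>(2);
    assert!(send_or_drop(&tx, 1));
    assert!(send_or_drop(&tx, 2));
    assert!(!send_or_drop(&tx, 3)); // capacity 2 is full, message dropped
    assert_eq!(rx.recv().unwrap(), 1);
}
```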

Note that bounded queues using the blocking send can cause deadlocks when there are inter-dependent queues that fill up, so care should be taken to avoid that.


Unbounded is a recipe for running out of memory. Nothing is really infinite, so whether you like it or not, every channel has a depth limit. Unbounded just won't handle hitting it gracefully.

A depth of 0 or 1 may unnecessarily block senders when the receiver is busy or just waiting on the OS to schedule its thread.

If you can block on the sender side without problems, pick something high enough to handle typical bursts of sends but still reasonably low, e.g. the number of players, or double that.

If blocking is undesirable, pick something very high, and treat a full queue as an error.

  1. I agree with this.
  2. This makes me very uncomfortable. It seems everything needs to be bounded; otherwise it is an out-of-memory failure waiting to happen under pressure.

So if the # of players is dynamic, set it to max_num_users * max_msgs_per_user ?

Yes. Most elements of your program are inherently β€œbounded” β€” not going to allocate arbitrary memory β€” simply by being sequential. When you introduce a channel β€” meaning the sender and receiver are concurrent β€” you start needing to introduce explicit back-pressure via mechanisms such as bounded channels.

The back-pressure must be able to be propagated within your program all the way back to the part where you read data from the network, so that incoming messages (whether accident or DoS) cannot cause any part of your program to over-allocate. The network will then propagate this back-pressure to the computer sending the data.
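A minimal sketch of that chain with std's bounded channel, where a spawned thread stands in for the network-read loop (in a real server, the stalled reader would stop reading from the socket, and TCP would push the pressure back to the client; `pump` and the event type are illustrative):

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

// The "reader" thread uses the blocking `send`, so when the bounded queue is
// full it stalls instead of buffering unboundedly; the receiving side drains
// at its own pace. Every event still arrives, just no faster than the
// consumer allows.
fn pump(events: Vec<u32>, capacity: usize) -> Vec<u32> {
    let (tx, rx) = sync_channel(capacity);
    let reader = thread::spawn(move || {
        for e in events {
            tx.send(e).unwrap(); // blocks while the queue is full
        }
    });
    let mut processed = Vec::new();
    for e in rx {
        processed.push(e); // the (possibly slower) game loop
    }
    reader.join().unwrap();
    processed
}

fn main() {
    let n = pump((0..100).collect(), 4).len();
    println!("processed {} events through a queue of depth 4", n);
}
```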

You can also choose to close connections β€” this might be an appropriate solution if all individual connections are behaving reasonably but you have too many of them to serve at full speed. It depends on whether, for your game, it is better to provide slow service to many players, or normal service to a limited number of players; this will depend on game mechanics (e.g. turn-based games have much lower latency requirements than FPSes) and player expectations.

