Hello, and welcome to the fourth devlog!
Today we're bringing in friends! In other words, experimental online multiplayer has been implemented!
Winter Social Media Hiatus
Let's rewind for a bit. I took a break from posting about the game throughout December, January, and February. However, I was still heavily developing the game.
The short reason for the break is that I was constantly thinking "make something new to post". Instead of properly fleshing out ideas and realizing features, I ended up drowning in experimental and half-baked code. So I took a break to get back into the flow of actually progressing the development of the game properly.
Winter Progress
Throughout the winter months, a long list of things has been worked on. Originally, this devlog covered various topics:
- Experimental online multiplayer
- Utility that automatically processes Aseprite files, exporting images and animation data
  - I'm considering releasing this as a standalone tool
- Updating millions of tiles instantaneously
- Even faster tile rendering
- Procedural world generation overhaul
However, I want to keep this post short and self-contained, so today we'll only be focusing on experimental online multiplayer.
Experimental Online Multiplayer
The gameplay is not really at a point yet where actually having online multiplayer makes sense. However, implementing networking early definitely helps with structuring the rest of the game and engine. The main requirement I set was that the implementation must be generic, such that it can still be used later when it's actually needed.
The Requirements
So what does the game need in relation to online multiplayer? Well, here's the checklist I made:
- Client and server must exchange their protocol version number to verify compatibility before anything else
- Client sends client messages, e.g. "this is my position"
- Server broadcasts server messages, e.g. "[player-id] is at this position"
- Server persists player data, e.g. player should spawn at the location where they left
- When a client disconnects (regardless of how), broadcast that event to other clients
- When a new client connects broadcast the new player to all other clients
- Send all existing players to the new client as well
- Clients with a bad connection shouldn't cause issues for other players
In the future, I want to write a more technical post with more of the implementation details, in particular the responsibilities of the server compared to the client.
For instance, in the future part of the world simulation will occur on the server. However, currently the server only broadcasts the world seed to all clients. Additionally, the clients currently only send positional data for the player. So if a player chops a tree, that isn't reflected amongst the other players yet.
Again, the overall goal was to implement the backbone for online multiplayer. The current implementation handles all the communication aspects needed, so in the future I only need to add new message types, e.g. a "player chopped tree" message. In other words, I only need to add new message variants, without needing to touch the actual network implementation at all.
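For the curious, here's a minimal sketch of what such message enums could look like, assuming serde derives; the variant and field names are hypothetical, not the game's actual definitions:

```rust
use serde::{Deserialize, Serialize};

/// Messages sent from a client to the server.
#[derive(Serialize, Deserialize)]
enum ClientMessage {
    /// "This is my position."
    Position { x: f32, y: f32 },
}

/// Messages broadcast from the server to its clients.
#[derive(Serialize, Deserialize)]
enum ServerMessage {
    /// "[player-id] is at this position."
    PlayerPosition { player_id: u64, x: f32, y: f32 },
    /// A new player connected.
    PlayerConnected { player_id: u64 },
    /// A player disconnected, regardless of how.
    PlayerDisconnected { player_id: u64 },
}
```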
The Implementation
The current implementation is super simple and uses a single TCP stream between the server and client. The server binds a single TcpListener, then clients connect using a single TcpStream.
Both the server and client split the sending and receiving parts of the stream into two separate threads:
- The receiver thread indefinitely does blocking reads from the stream, until the connection is closed
- Inversely, the game loop sends outgoing client messages through a channel to the sender thread
- The sender thread indefinitely does blocking reads from the channel, writing all received client messages to the stream, until the connection is closed
The same logic applies to both the server and the client.
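Here's a simplified sketch of that thread split, assuming std::net::TcpStream and a std::sync::mpsc channel; the real implementation surely differs in its details:

```rust
use std::io::{Read, Write};
use std::net::TcpStream;
use std::sync::mpsc;
use std::thread;

// Splits a connected stream into a receiver thread and a sender thread.
// Returns the channel the game loop uses to queue outgoing messages.
fn spawn_io_threads(stream: TcpStream) -> mpsc::Sender<Vec<u8>> {
    let (tx, rx) = mpsc::channel::<Vec<u8>>();
    let mut reader = stream.try_clone().expect("failed to clone stream");
    let mut writer = stream;

    // Receiver thread: blocking-reads from the stream until it closes.
    thread::spawn(move || {
        let mut buf = [0u8; 1024];
        while let Ok(n) = reader.read(&mut buf) {
            if n == 0 {
                break; // a read of zero bytes means the connection closed
            }
            // ... hand buf[..n] over to the game loop ...
        }
    });

    // Sender thread: blocking-reads from the channel, writes to the stream.
    thread::spawn(move || {
        while let Ok(bytes) = rx.recv() {
            if writer.write_all(&bytes).is_err() {
                break; // stop once the connection is gone
            }
        }
    });

    tx
}
```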
Additionally, on the server, the game loop receives all incoming client messages. The server acts upon these and translates them into zero-to-many server messages, i.e. a client message saying "this is my position" is translated into "[player-id] is at this position". The translated messages are then broadcast to the necessary clients.
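To make that concrete, here's a hypothetical sketch of the translation step, reusing the message enums from the sketch above; the function name and return shape are illustrative, not the game's actual API:

```rust
// Hypothetical server-side translation: one incoming client message
// becomes zero-to-many outgoing server messages.
fn handle_client_message(player_id: u64, msg: ClientMessage) -> Vec<ServerMessage> {
    match msg {
        // "this is my position" -> "[player-id] is at this position"
        ClientMessage::Position { x, y } => {
            vec![ServerMessage::PlayerPosition { player_id, x, y }]
        }
    }
}
```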
The Protocol
The current protocol is equally super simple. It is split into two states: the first is the handshaking state, which on success transitions into the messages state.
Handshaking
When a client connects to the server, they first perform a handshake by exchanging their protocol version numbers: simply a 32-bit unsigned integer used to verify compatibility. If they are compatible, they transition into the messages state. Otherwise, the server refuses the connection.
In other words, if someone is running an old version of the server, it might not be compatible with a newer version of the game. If they started communicating anyway, they'd likely start sharing data that they each interpret as garbage. So to prevent that from happening, they first check their compatibility before anything else.
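As a rough sketch, assuming the version number is exchanged as a little-endian u32 (the actual byte order is an implementation detail), the handshake could look something like this:

```rust
use std::io::{Read, Write};
use std::net::TcpStream;

const PROTOCOL_VERSION: u32 = 1;

fn handshake(stream: &mut TcpStream) -> std::io::Result<bool> {
    // Send our protocol version.
    stream.write_all(&PROTOCOL_VERSION.to_le_bytes())?;

    // Read the other side's protocol version.
    let mut buf = [0u8; 4];
    stream.read_exact(&mut buf)?;
    let their_version = u32::from_le_bytes(buf);

    // Compatible only if the versions match exactly.
    Ok(their_version == PROTOCOL_VERSION)
}
```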
Messages
The connection remains in the messages state until either the client or server disconnects. The messages state allows both the server and client to send each other zero-to-many messages over time.
Each message is transmitted by first sending a 32-bit unsigned integer representing the byte size of the serialized message, followed by the actual serialized data of the message itself. This nicely allows the receiving end to do a blocking read, since it always knows exactly how much data is incoming.
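A sketch of that framing, under the same little-endian assumption as before; the real code presumably layers serialization on top of helpers like these:

```rust
use std::io::{Read, Write};
use std::net::TcpStream;

fn write_message(stream: &mut TcpStream, payload: &[u8]) -> std::io::Result<()> {
    // First the 32-bit byte size, then the serialized message itself.
    stream.write_all(&(payload.len() as u32).to_le_bytes())?;
    stream.write_all(payload)
}

fn read_message(stream: &mut TcpStream) -> std::io::Result<Vec<u8>> {
    // Knowing the size up front lets us do a blocking read_exact.
    let mut len_buf = [0u8; 4];
    stream.read_exact(&mut len_buf)?;
    let len = u32::from_le_bytes(len_buf) as usize;

    let mut payload = vec![0u8; len];
    stream.read_exact(&mut payload)?;
    Ok(payload)
}
```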
Serialization
Most serialization libraries in Rust only require swapping a handful of lines of code, so I didn't put much thought into which one to pick, since I'm not able to do a proper benchmark yet anyway. Currently the game uses bincode, purely because I've used it in the past.
On the other hand, a binary format can be hard to debug when inspecting network traffic and raw incoming data. So in debug builds the game instead uses serde_json. This makes the data easier to inspect, but comes with its own set of problems.
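One way to implement such a per-build switch is with cfg(debug_assertions); this is just a sketch using bincode's 1.x serialize function, not necessarily how the game wires it up:

```rust
use serde::Serialize;

fn serialize_message<T: Serialize>(msg: &T) -> Vec<u8> {
    #[cfg(debug_assertions)]
    {
        // Debug builds: human-readable JSON for easier inspection.
        serde_json::to_vec(msg).expect("failed to serialize message")
    }
    #[cfg(not(debug_assertions))]
    {
        // Release builds: compact binary.
        bincode::serialize(msg).expect("failed to serialize message")
    }
}
```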
Many data types can be stringified without any loss of data. However, floating-point numbers cannot!
Here's an exaggerated version, where client 1 echoes back the position of client 2, while client 2 teleports between its own known position and the received position, causing it to "shake" between the two.
As a workaround, all floating-point numbers are wrapped in a newtype, which when serialized transmutes the float into its bits, such that the bitwise value is preserved.
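Here's a sketch of such a newtype (the name ExactF32 is made up), implementing serde's traits manually so the float round-trips through its exact bit pattern:

```rust
use serde::{Deserialize, Deserializer, Serialize, Serializer};

#[derive(Clone, Copy)]
struct ExactF32(f32);

impl Serialize for ExactF32 {
    fn serialize<S: Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
        // Transmit the bit pattern as a u32 instead of a decimal string.
        serializer.serialize_u32(self.0.to_bits())
    }
}

impl<'de> Deserialize<'de> for ExactF32 {
    fn deserialize<D: Deserializer<'de>>(deserializer: D) -> Result<Self, D::Error> {
        // Rebuild the exact same float from its bits.
        let bits = u32::deserialize(deserializer)?;
        Ok(ExactF32(f32::from_bits(bits)))
    }
}
```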
In the future, I want to benchmark and compare various serialization formats. The game needs a format that is both compact and fast to serialize and deserialize. However, again, a useful comparison cannot be made currently, so I will postpone this until later.
I did however find a benchmark showing that rkyv is one of the best-performing serializers available (among those tested in 2021).
Future Improvements
Congestion
Neither the client nor the server currently handles congestion on the sending side. In the future, the sender thread should at least check all pending messages: if the connection is poor, there might be multiple messages queued up where one message cancels out others.
As an example, if the connection is poor, the sender's message queue might contain multiple "[player-id] is at this position" messages, in which case the sender should skip sending all but the newest one.
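A hypothetical sketch of that coalescing step, draining the channel without blocking and keeping only the newest position per player:

```rust
use std::collections::HashMap;
use std::sync::mpsc::Receiver;

// The hypothetical ServerMessage enum from earlier, repeated for context.
enum ServerMessage {
    PlayerPosition { player_id: u64, x: f32, y: f32 },
    PlayerConnected { player_id: u64 },
    PlayerDisconnected { player_id: u64 },
}

// Drains everything currently queued, keeping only the newest position
// message per player; all other messages pass through untouched.
fn drain_coalesced(rx: &Receiver<ServerMessage>) -> Vec<ServerMessage> {
    let mut latest_positions: HashMap<u64, ServerMessage> = HashMap::new();
    let mut others = Vec::new();

    // try_recv never blocks; it returns Err once the queue is empty.
    while let Ok(msg) = rx.try_recv() {
        match msg {
            ServerMessage::PlayerPosition { player_id, x, y } => {
                // A newer position overwrites any older queued one.
                latest_positions
                    .insert(player_id, ServerMessage::PlayerPosition { player_id, x, y });
            }
            other => others.push(other),
        }
    }

    others.extend(latest_positions.into_values());
    others
}
```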
Players In-View
Currently, the positions of all players are sent to all players. In the future, it would make more sense to only broadcast player positions between players that are in proximity of each other, or more specifically players that are (almost) in view of each other. Player positions are (potentially) updated very frequently, so given enough players, this could greatly reduce the network load for both the server and clients.
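As a rough sketch, the in-view check could be as simple as a squared-distance comparison; the radius and types here are made up:

```rust
// Hypothetical view radius in world units.
const VIEW_RADIUS: f32 = 64.0;

struct PlayerPos {
    x: f32,
    y: f32,
}

fn in_view(a: &PlayerPos, b: &PlayerPos) -> bool {
    let (dx, dy) = (a.x - b.x, a.y - b.y);
    // Compare squared distances to avoid the square root.
    dx * dx + dy * dy <= VIEW_RADIUS * VIEW_RADIUS
}
```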
Inputs Not Positions
Alternatively, network load could be reduced even further by only transmitting player inputs, including a timestamp of when each input happened, which could then be used to predict player movement. This is of course a whole separate can of worms that can result in delayed movement, sluggish input response, and incorrect predictions causing a player to "jump" around on screen. Even worse, a bad connection could result in a player continuing to move even though they already released the key.
Notably, Factorio sends inputs and not positions. In FFF #302 they covered some latency issues and what they did to solve the issues. An additional notable mention is FFF #149.
Here's another video showing 5 players:
This post was a bit more technical than the previous ones. I attempted to balance it between being technical for developers, while still being somewhat understandable for non-developers. Personally, I enjoy writing more technical posts, and I wouldn't be opposed to including actual code snippets. All in all, it depends on what you, the reader, want. If people are into this, then I'll keep including technical stuff; if not, I'll focus more on high-level game updates.
Thanks for reading! Feel free to leave a comment through any of the following links: