Can we use Rust for a TCP server with Godot?


hi all! i was trying elixir and crystal-lang, but elixir is a bit too confusing and complicated for me (not a functional guy), and crystal-lang's memory doesn't get freed back to the OS, which is kind of scary for a game server

with that said, i use the godot engine and it sends a 32-bit unsigned int before the message in a tcp stream. i was curious, is it possible to read that using rust's tcp net module? if so, can rust send the same kind of packet back to godot as well?

and 1 more question: if memory gets freed in rust, does the OS get that memory back? or does it stay in use forever, like with crystal-lang?

thanks in advance, rust’s syntax looks really promising and i’m enjoying the online playground


Disclaimer: I'm not an OS dev, but I have read quite a few memory allocator designs and papers, and also tried to implement my own memory allocator at some point, so I probably got some bits wrong, but hopefully I'm largely right about the memory stuff.

That isn't as big a problem as you might think. Returning memory to the OS is expensive, and even if you call the system's free, the system may not reclaim it anyway (e.g. in C/C++); it usually reclaims your process's memory when the process exits instead. You don't get to call free directly in safe Rust, but even when you "release" some heap memory in Rust, it won't necessarily go back to the OS immediately.

Garbage-collected languages like Crystal and Elixir/Erlang leave memory management to the runtime, which handles reuse and so on. So memory not being freed back to the OS is not a problem (it will be reused later), but total memory usage growing endlessly is a problem (you've got a memory leak).

Obviously GC languages cannot guarantee memory behaviour absolutely, so depending on the nature of your server, this may or may not be acceptable. If your server is for a critical system like a flight controller on an airplane, then this is obviously not acceptable and you need something that doesn't exhibit dynamic memory behaviour at all. But since you're just using it for a game server, you can tolerate a slight bit of randomness w.r.t. memory use (e.g. you may get 1-2 MB of leakage due to GC and memory fragmentation every now and then, and that's okay).

Rust feels a bit functional in places, but it's still an imperative language overall, which may or may not be a problem for you. Elixir/Erlang is best for most servers imo, so it might be worth the effort to get used to the functional paradigm if you need to write a lot of servers in the future.

The handling of messages inside TCP packets is left completely to your program logic, so there's nothing stopping you from decoding and encoding the messages to interact with the Godot engine (it also means nothing inside the TCP libs of any language would do it for you either).

Similar to the answer above, probably not. The OS deals with memory in pages (commonly 4 KiB chunks), so even if the OS wants to reclaim your memory, an entire page needs to be cleared first, and even then the memory allocator (e.g. jemalloc) may not return the page to the OS either.


I don't think you need to worry too much about freeing memory back to the OS if you keep memory usage largely constant, which is usually the goal for servers. More importantly, it is really difficult to enforce freeing, since the memory allocator's actual behaviour is not under your control anyway; even if you "freed" your memory, it may still be held by the allocator, which is equivalent to "staying in use forever".

If you want to observe this behaviour, just do some random large allocations in C/C++, then free them and check the task manager: the memory usage shouldn't change much.


I'm no expert on Rust's standard TCP library, but I'm pretty sure you can send over whatever you want, as it's just a raw socket.

As for memory allocation, once you release something it goes back to the allocator, and it's up to the allocator and the OS to decide what to do with it. There is no managed memory and thus no garbage collector that takes care of memory in Rust.


wow @darrenldl what a nice response. ok sounds like i could do it all in rust then and be fine. thank you for all the insight and well thought out reply.

i was just curious about the memory not going back to the OS, because let's say a DDoS starts happening and ram usage climbs to 800mb on a 1gb VPS. in crystal-lang, after the DDoS is over, your memory stays at 800mb and doesn't go down. so, in essence, you'll have 3 players online and your app is still utilizing 800mb. doesn't make sense to me. i want to de-allocate that ram and let the OS have it again


When you (de)allocate in Rust, you're not talking to the OS directly - there's an allocator managing the memory for you (e.g. jemalloc, or the libc/system allocator). Whether it decides to release memory back to the OS depends on its implementation. As mentioned upthread, an OS allocates memory in page multiples, often referred to as the allocation granularity. So freeing an object doesn't mean the underlying page is free, as it may contain other allocations. This is where the allocator comes in and manages these things.

For a server that wants constant memory usage, you usually allocate your own arenas and then allocate objects out of them. This gives you control over usage and allows you to, e.g., reject a request if there's not enough memory in the arena, even though more may be available from the OS/allocator. So if you want precise control, you'll need to do the memory management manually.


To deal with DDoS, you essentially need to make sure you have a strict upper bound for all buffers in your server. Some people avoid this restriction, since having a strict upper bound means you need to decide exactly what to do when the limit is reached (which can be difficult to determine), while not having one means you can just wave it off and restart the server when too much memory is used.

There are several things you can do when your buffer is full to avoid growing it: 1) drop new packets/messages/whatever, and process things already in the queue first, 2) drop old stuff instead, and process new things, 3) randomly drop things (some old and some new). There are more strategies obviously; you can look up Linux QoS policies for more complex arrangements of queues and rules, since network QoS, traffic shaping, and network DDoS mitigation are not too different from what you want to do - things just happen at different layers of the OSI model.

The most difficult part is deciding which strategy is okay for which buffer. If you're dealing with real-time position information of your players, then favouring new info over old info is best, so you drop old info. If you're dealing with an account login queue, then you probably don't want to drop old info, since that means an attacker could always "flush" out players trying to log in; but dropping new info means you need to inform players outside the queue that they just need to wait (there are some more waiting strategies/algorithms for that).

It is difficult to get all these decisions right, but fortunately yours is a game server, so you can tolerate some level of failure. Either way, to buy yourself (your server program, rather) some time before crashing, there are some basic rules you can follow

  • Avoid overly complex packet structure. If validation/verification takes too long, then DDoS will be very effective. This also helps with preventing vulnerabilities (see below).
  • Make decoding as strict as possible. This also reduces the workload: if you want to tolerate errors, you need extra processing to correct the information, which again makes DDoS more effective. This also helps with preventing vulnerabilities (see below).
  • Look into which layer you want to start tackling DDoS (or other attacks) at. If you're hosting the server yourself, look into using the OS firewall to aid you. Either way, your server program should have dedicated code for blocking and filtering traffic as well. The quicker you drop invalid traffic, the better.
  • Why avoid complex packet structure and why make decoding strict? Complex packet structure means you can more easily get things wrong when writing the encoder and decoder, and getting things wrong means introducing vulnerabilities. Strict decoding reduces flexibility somewhat (e.g. in some instances you could actually calculate the correct value if you didn't drop the data so early), but it also prevents erroneous data from entering your core program logic. Reducing the lifespan of invalid data in your program helps greatly with security, and also with speed, since you avoid wasting CPU time and memory down the road.

Basically all languages that can use multiple cores properly (i.e. not OCaml or Python etc.) are pretty good for game servers. Erlang/Elixir handles redundancy really nicely and also supports hot code swapping (i.e. you can do zero-downtime server maintenance/updates), but it doesn't have a static type system, so I'd still prefer Rust - you need to balance your own decisions.

You can always go back to C++, and I’m sure there are a lot of frameworks available in C++, but Rust will make sure you don’t get (trivial) memory leaks or other memory misuse.

Lastly, look into ZeroMQ and other message-passing libraries and see if they are suitable for you. They deal with a lot of the messiness of networking internally (e.g. message queues, host serving policy), and that messiness is also where you can easily trip up. Note that they don't deal with serialization/deserialization, so my comments w.r.t. simple encoding/decoding still stand.