HashMap or bad architecture

I'm trying to build an MMORPG server for academic purposes, and I've run into architecture difficulties.

My program has five parts: connect-listener, readers, writers, client-holder, and handler. All of them communicate with each other via crossbeam channels.

Connect-listener
A thread with a for loop that accepts new connection streams. When a new stream is received, I create a Client structure with some data, plus a Reader and a Writer wrapping the TcpStream and some encoding stuff. Then I:
Send the Reader and the client's login to the readers thread via a channel.
Send the Writer and the client's login to the writers thread via a channel.
Send the just-created Client to the client-holder thread via a channel.
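
Roughly, a simplified sketch of this thread (the Reader, Writer and Client types and the message enums here are just placeholders for my real ones, and the login handshake is omitted):

```rust
use std::net::{TcpListener, TcpStream};
use crossbeam_channel::Sender;

// Placeholder types standing in for the real ones.
struct Reader { stream: TcpStream }
struct Writer { stream: TcpStream }
struct Client { login: String }

enum ReaderMsg { Add(String, Reader) }
enum WriterMsg { Add(String, Writer) }
enum HolderMsg { Add(Client) }

fn run_listener(
    listener: TcpListener,
    to_readers: Sender<ReaderMsg>,
    to_writers: Sender<WriterMsg>,
    to_holder: Sender<HolderMsg>,
) -> std::io::Result<()> {
    for stream in listener.incoming() {
        let stream = stream?;
        // In the real code the login comes from a handshake; hard-coded here.
        let login = "player".to_string();

        let reader = Reader { stream: stream.try_clone()? };
        let writer = Writer { stream };
        let client = Client { login: login.clone() };

        // Hand each piece to its owning thread over a channel.
        to_readers.send(ReaderMsg::Add(login.clone(), reader)).ok();
        to_writers.send(WriterMsg::Add(login.clone(), writer)).ok();
        to_holder.send(HolderMsg::Add(client)).ok();
    }
    Ok(())
}
```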

Readers
A dedicated thread with a HashMap<String, Reader> (keyed by the client's login) and a loop that tries to receive a message from its channel, then tries to read data from all readers (non-blocking), then sleeps a little (no need to burn 100% CPU), and repeats. The channel messages tell it to add or remove a reader (when a new client connects or someone disconnects). When a Reader produces data from its socket, the thread sends a message with the received data (Vec<u8>) and the client's login to the handlers' channel.
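
A simplified sketch of that loop (placeholder types again; the real one also does decoding and error handling):

```rust
use std::collections::HashMap;
use std::io::Read;
use std::net::TcpStream;
use std::time::Duration;
use crossbeam_channel::{Receiver, Sender, TryRecvError};

struct Reader { stream: TcpStream }

enum ReaderMsg {
    Add(String, Reader),
    Remove(String),
}

// (login, raw bytes) forwarded to the handler threads.
type HandlerMsg = (String, Vec<u8>);

fn run_readers(rx: Receiver<ReaderMsg>, to_handlers: Sender<HandlerMsg>) {
    let mut readers: HashMap<String, Reader> = HashMap::new();
    let mut buf = [0u8; 4096];

    loop {
        // Drain control messages without blocking.
        loop {
            match rx.try_recv() {
                Ok(ReaderMsg::Add(login, reader)) => {
                    reader.stream.set_nonblocking(true).ok();
                    readers.insert(login, reader);
                }
                Ok(ReaderMsg::Remove(login)) => { readers.remove(&login); }
                Err(TryRecvError::Empty) => break,
                Err(TryRecvError::Disconnected) => return,
            }
        }

        // Poll every socket once, non-blocking.
        for (login, reader) in readers.iter_mut() {
            match reader.stream.read(&mut buf) {
                Ok(0) => { /* peer closed; real code would remove it */ }
                Ok(n) => {
                    to_handlers.send((login.clone(), buf[..n].to_vec())).ok();
                }
                Err(ref e) if e.kind() == std::io::ErrorKind::WouldBlock => {}
                Err(_) => { /* real code would remove the reader */ }
            }
        }

        std::thread::sleep(Duration::from_millis(5));
    }
}
```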

Writers
A dedicated thread with a HashMap<String, Writer> (keyed by the client's login) and a loop that blocks on channel.recv(). A message can say to add or remove a writer from the HashMap, or to write data (Vec<u8>) to a stream with its Writer.
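
Roughly like this (again with placeholder types):

```rust
use std::collections::HashMap;
use std::io::Write;
use std::net::TcpStream;
use crossbeam_channel::Receiver;

struct Writer { stream: TcpStream }

enum WriterMsg {
    Add(String, Writer),
    Remove(String),
    Write(String, Vec<u8>),
}

fn run_writers(rx: Receiver<WriterMsg>) {
    let mut writers: HashMap<String, Writer> = HashMap::new();

    // Blocking recv() is fine here: this thread only reacts to messages.
    while let Ok(msg) = rx.recv() {
        match msg {
            WriterMsg::Add(login, writer) => { writers.insert(login, writer); }
            WriterMsg::Remove(login) => { writers.remove(&login); }
            WriterMsg::Write(login, data) => {
                if let Some(writer) = writers.get_mut(&login) {
                    // Real code would handle partial writes and errors properly.
                    writer.stream.write_all(&data).ok();
                }
            }
        }
    }
}
```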

Client holder
A thread with an Arc<Mutex<HashMap<String, Arc<Mutex<Client>>>>> (I know that's crazy and I need help :wink: with that too); the String is again the login. It runs a loop on channel.recv() and can receive messages to add or remove clients from the HashMap.
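
A minimal sketch of what that thread does (Client is a placeholder struct):

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use crossbeam_channel::Receiver;

struct Client { login: String }

type SharedClients = Arc<Mutex<HashMap<String, Arc<Mutex<Client>>>>>;

enum HolderMsg {
    Add(Client),
    Remove(String),
}

fn run_holder(rx: Receiver<HolderMsg>, clients: SharedClients) {
    while let Ok(msg) = rx.recv() {
        match msg {
            HolderMsg::Add(client) => {
                let login = client.login.clone();
                // Adding a client takes the lock on the *whole* map.
                clients.lock().unwrap().insert(login, Arc::new(Mutex::new(client)));
            }
            HolderMsg::Remove(login) => {
                clients.lock().unwrap().remove(&login);
            }
        }
    }
}
```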

Handler
A thread (I have 5 identical ones), again with a loop on channel.recv(). The messages always come from a Reader and contain the client's login and the received data (Vec<u8>). I need to take the client out of that crazy Arc<Mutex<HashMap<String, Arc<Mutex<Client>>>>>, which also exists here (shared memory). So I lock the whole HashMap, get the client, lock it, and work with it (I can change data in the Client struct, for example).

For example, if a player equips a sword, I receive the request through a Reader and pass it to a Handler. Then I need to change the data in the Client struct (mark the sword as equipped) and tell the player's Writer that the sword was equipped. But that's not all: I also need to go through all the clients (in that crazy HashMap), find the players near this client, and send them the information that a player near them equipped a sword.
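
A simplified sketch of how a handler deals with that case today (placeholder types, a made-up position field and "nearby" radius, and a stripped-down WriterMsg):

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use crossbeam_channel::Sender;

struct Client {
    login: String,
    position: (f32, f32),
    sword_equipped: bool,
}

type SharedClients = Arc<Mutex<HashMap<String, Arc<Mutex<Client>>>>>;

enum WriterMsg { Write(String, Vec<u8>) }

fn handle_equip_sword(login: &str, clients: &SharedClients, to_writers: &Sender<WriterMsg>) {
    // 1. Lock the whole map, clone the Arc for this client, drop the map lock.
    let client = clients.lock().unwrap().get(login).cloned();
    let Some(client) = client else { return };

    // 2. Lock only this client and update its state.
    let my_pos = {
        let mut c = client.lock().unwrap();
        c.sword_equipped = true;
        c.position
    };

    // 3. Tell the player itself.
    to_writers
        .send(WriterMsg::Write(login.to_string(), b"you equipped a sword".to_vec()))
        .ok();

    // 4. Lock the whole map again and notify everyone nearby.
    //    This is exactly the map-wide locking I'd like to get rid of.
    let map = clients.lock().unwrap();
    for (other_login, other) in map.iter() {
        if other_login.as_str() == login { continue; }
        let other = other.lock().unwrap();
        let near = (other.position.0 - my_pos.0).abs() < 50.0
            && (other.position.1 - my_pos.1).abs() < 50.0;
        if near {
            to_writers
                .send(WriterMsg::Write(
                    other_login.clone(),
                    b"a nearby player equipped a sword".to_vec(),
                ))
                .ok();
        }
    }
}
```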

The problem is that HashMap. I need to keep all connected clients in it and work with (and modify) them. But even when the server just gets a new connection, I have to lock the whole HashMap to add an element; even if I used an RwLock, I'd still need a write lock here. And while that lock is held, all of my handlers will wait.

My question is: where am I doing something wrong, and what should I use or read?

I'm not sure I understand what the problem is. Needing a write lock does not in and of itself imply a design error. Obeying the RwLock pattern is one of the simplest, obviously-correct ways to design your code, in both multi-threaded and single-threaded settings.

@H2CO3 thanks for your reply!
Getting a write lock is not the problem in itself. The problem is that taking a write lock on the whole HashMap, which I need in order to add a new client, also blocks all the read attempts that even the simplest handler operation needs. So connecting a new client blocks the handling of all current clients. I want to avoid this and keep handling current clients' actions while new clients are connecting and old ones are disconnecting.

It sounds like you need a concurrent hashmap. The dashmap crate provides one of these which splits up the hashmap into buckets, so that you only need to lock the bucket containing the key of the entry you are modifying.
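 
A minimal sketch of what that could look like for your client map (Client here is just a placeholder struct):

```rust
use dashmap::DashMap;
use std::sync::Arc;

// Placeholder for the real client state.
struct Client {
    sword_equipped: bool,
}

fn main() {
    // DashMap is sharded internally, so inserting a new client only locks
    // the shard that the key hashes to, not the whole map.
    let clients: Arc<DashMap<String, Client>> = Arc::new(DashMap::new());

    clients.insert("alice".to_string(), Client { sword_equipped: false });

    // get_mut locks just that entry's shard for the lifetime of the guard.
    if let Some(mut client) = clients.get_mut("alice") {
        client.sword_equipped = true;
    }

    // Iteration still works; it walks the shards one at a time.
    for entry in clients.iter() {
        println!("{}: sword equipped = {}", entry.key(), entry.value().sword_equipped);
    }
}
```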


If your workload is read-heavy, then you can try out evmap.


@Diggsey @RustyYato thank you very much! Both options look nice; I will try them and report back with feedback :+1:

Do the strings you use for hash map keys have a maximum size, or a known size that fits most of them?
Something like ArrayString or SmallString might be of interest to you, as they avoid heap allocations completely or most of the time, depending on which one you use.
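
For example, a minimal sketch using arrayvec's ArrayString as the key type (assuming a recent arrayvec where the capacity is a const generic, and that logins fit in 16 bytes):

```rust
use arrayvec::ArrayString;
use std::collections::HashMap;

// Logins are stored inline in a fixed 16-byte buffer,
// so building or hashing a key never touches the heap.
type Login = ArrayString<16>;

fn main() {
    let mut clients: HashMap<Login, u32> = HashMap::new();

    // `from` fails if the string does not fit in the fixed capacity.
    let login: Login = ArrayString::from("alice").unwrap();
    clients.insert(login.clone(), 42);

    assert_eq!(clients.get(&login), Some(&42));
}
```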

