I have a bunch of complicated tasks I want to run in parallel. Some of the tasks add more tasks to the queue, and most of the tasks involve data with non-static lifetimes.
Based on these requirements, it seems like crossbeam's scoped threads (for sending data with non-static lifetimes across threads) are a good fit, and crossbeam's `deque` work-stealing queues should work for parallelizing the tasks.
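To make "non-static lifetimes" concrete, here's a bare-bones sketch of the shape I have in mind (the real tasks are much more involved, and the numbers are placeholders): the spawned closures borrow `tasks` from the main thread's stack, which plain `std::thread::spawn` wouldn't allow.

```rust
fn main() {
    // Non-'static data owned by the main thread; the worker threads only borrow it.
    let tasks: Vec<String> = (0..16).map(|i| format!("task {}", i)).collect();

    crossbeam::scope(|s| {
        for _ in 0..4 {
            s.spawn(|_| {
                // Each thread can borrow `tasks` because the scope guarantees
                // every spawned thread is joined before `tasks` is dropped.
                let _count = tasks.len();
            });
        }
    })
    .unwrap();
}
```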
When setting up the work-stealing queues, I ran into a question about how best to share the stealers between threads. (The docs don't suggest a way to do it.) Here's where I've gotten:
- The docs for `Worker` recommend creating one worker queue per thread. So I `spawn` a thread and create a `Worker` queue in there. So far, so good.
- I also need an `Injector` to serve as the global task queue on the main thread. That's easy enough to populate - I `push` tasks onto it from the main thread, and `push` takes a reference, so I can share that reference across threads.
- Finally, each `Worker` has an associated `Stealer`, which is what lets other threads steal tasks from it - like in the `find_task` example from the docs (reproduced just below this list), a function which each thread calls to find a new task, passing its own `Worker`, the global `Injector`, and a slice of all the `Stealer`s.
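For reference, this is essentially the `find_task` function from the crossbeam-deque docs that I'm referring to (comments are my own paraphrase):

```rust
use crossbeam_deque::{Injector, Stealer, Worker};
use std::iter;

// Pop from the local queue first; otherwise take from the global Injector,
// or steal from one of the other threads' queues via their Stealers.
fn find_task<T>(
    local: &Worker<T>,
    global: &Injector<T>,
    stealers: &[Stealer<T>],
) -> Option<T> {
    local.pop().or_else(|| {
        iter::repeat_with(|| {
            global
                .steal_batch_and_pop(local)
                .or_else(|| stealers.iter().map(|s| s.steal()).collect())
        })
        // Keep trying while any steal operation says it needs to be retried.
        .find(|s| !s.is_retry())
        // Extract the stolen task, if there is one.
        .and_then(|s| s.success())
    })
}
```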
This is where things got more complicated. Since each thread creates its `Worker` in-thread, that's also where I have to create its `Stealer`. However, to call `find_task`, each thread needs to know about a slice of all the other stealers, even though each of those stealers was created on another thread.
To build this list of stealers, I set up a channel and had each thread send a message to it containing the `Stealer` it just created. Then the main thread gathered them all up into a `Vec`. This works, but then I hit a problem: how can I now give each thread access to (a slice of) the completed `Vec` of stealers?
The threads can't borrow it, since it's borrowed mutably on the main thread (to populate it), and that necessarily happens after the closures which would like to borrow it have been spawned. I could use channels again to send each thread a reference to the stealers `Vec`, but it seems like there ought to be a better way to do that.
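Here's a stripped-down sketch of where I've ended up - the `u32` task type and the fixed thread count are just placeholders, and the comment marks the spot I can't fill in:

```rust
use crossbeam::deque::{Stealer, Worker};
use std::sync::mpsc;

const NUM_THREADS: usize = 4;

fn main() {
    // Channel used only to collect each thread's Stealer on the main thread.
    let (stealer_tx, stealer_rx) = mpsc::channel::<Stealer<u32>>();

    crossbeam::scope(|s| {
        for _ in 0..NUM_THREADS {
            let stealer_tx = stealer_tx.clone();
            s.spawn(move |_| {
                // Each thread creates its own Worker and hands the matching
                // Stealer back to the main thread.
                let worker = Worker::<u32>::new_fifo();
                stealer_tx.send(worker.stealer()).unwrap();

                // ...this is where the thread would loop on find_task, but
                // that needs a slice of *all* the stealers, which doesn't
                // exist yet - the part I'm stuck on.
            });
        }
        drop(stealer_tx);

        // Gathering the stealers on the main thread works fine.
        let stealers: Vec<Stealer<u32>> = stealer_rx.iter().collect();
        assert_eq!(stealers.len(), NUM_THREADS);

        // Problem: the worker closures were spawned before `stealers`
        // existed, so they can't simply borrow it here.
    })
    .unwrap();
}
```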
It feels like I'm doing too much channel communication to set up something like this, but I couldn't think of a better way!