Data Processing in Parallel into a database (Hackathon Project)


In our team at work we have two hackathons per year, and for the next one I would like to introduce Rust :slight_smile: Since our daily work involves data processing, I am thinking about a challenge: read a huge CSV file and import each line in parallel into a database like SurrealDB.

As I currently do not have much experience with Rust, I do not know what would be a good starting point for this experiment.

During my research I came across the crates Rayon and Tokio.
I found an article by Umur Ozkul which shared the code below. Is this something I can start with?

use tokio::sync::mpsc;

#[tokio::main]
async fn main() {
    // Two bounded channels (capacity 32 for backpressure):
    // producer -> transformer, then transformer -> consumer.
    let (producer_tx, mut transformer_rx) = mpsc::channel(32);
    let (transformer_tx, mut consumer_rx) = mpsc::channel(32);

    let producer_handle = tokio::spawn(async move {
        for i in 1..=10 {
            println!("Producing {}", i);
            producer_tx.send(i).await.unwrap();
        }
        // producer_tx is dropped here, closing the channel and
        // ending the transformer's recv loop.
    });

    let transformer_handle = tokio::spawn(async move {
        while let Some(i) = transformer_rx.recv().await {
            println!("Transforming {}", i);
            transformer_tx.send(i * i).await.unwrap();
        }
    });

    let consumer_handle = tokio::spawn(async move {
        while let Some(i) = consumer_rx.recv().await {
            println!("Consumed {}", i);
        }
    });

    tokio::try_join!(producer_handle, transformer_handle, consumer_handle).unwrap();
}


Thank you very much,


I'm not familiar with SurrealDB; does it prefer this? Most databases don't; a single connection doing a bulk load tends to perform best.

If your goal is a high amount of concurrency rather than a high degree of parallelism, then this is one of many potential options. A task isn't a thread; each thread can handle a very large number of tasks.
