I have an iterator that produces a large amount of data (too much to fit in memory), which I'd like to process in parallel and write each result to the same file. I found rayon
to be a very nice tool for the parallel part, but obviously I cannot use a mutable buffer inside the parallel region, and I can't .collect()
the results because I cannot allocate that much memory. Is there any way to bring my application flow back to sequential execution for the writing step?
A (broken) code snippet to illustrate my problem:
use std::fs::File;
use std::io::{BufWriter, Write};
use rayon::prelude::*;

fn main() {
    let file = File::create("/tmp/data.txt").expect("Cannot create file");
    let mut buf = BufWriter::new(file);
    ["hello", "world"].par_iter()
        .map(|s| String::from(*s) + "_")
        // Back to sequential execution somehow
        .for_each(move |s| {
            buf.write(s.as_bytes()).expect("Cannot write to file");
        });
}
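For reference, the only workaround I could come up with is a minimal sketch that wraps the writer in a Mutex so every worker locks it before writing. It compiles, but I'm not sure it is idiomatic: the output order is nondeterministic, and the locking serializes the writes anyway, so I don't know if this is the right approach.

use std::fs::File;
use std::io::{BufWriter, Write};
use std::sync::Mutex;
use rayon::prelude::*;

fn main() {
    let file = File::create("/tmp/data.txt").expect("Cannot create file");
    // Share the writer between the worker threads behind a Mutex.
    let buf = Mutex::new(BufWriter::new(file));
    ["hello", "world"].par_iter()
        .map(|s| String::from(*s) + "_")
        .for_each(|s| {
            // Each worker locks the writer for the duration of its write,
            // so writes don't overlap, but their order is nondeterministic.
            let mut guard = buf.lock().expect("Mutex poisoned");
            guard.write_all(s.as_bytes()).expect("Cannot write to file");
        });
    // Flush explicitly so buffered write errors are not silently dropped.
    buf.into_inner()
        .expect("Mutex poisoned")
        .flush()
        .expect("Cannot flush file");
}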
Sorry if this question covers some basic things; I'm a newbie in both Rust and parallel programming.