Hello,
I'm trying to code a tool that recursively reads every ".csv" file inside a directory and writes an output file containing every line of those files that has "200" as its "status_code".
The single-threaded version works; it looks like this:
use scan_dir::ScanDir;
use std::fs::File;
use std::io::BufReader;
use std::path::PathBuf;

fn main() {
    // Collect every .csv file under the target directory, recursively.
    let ffuf_files: Vec<PathBuf> = ScanDir::files()
        .walk("/home/neolex/reconbug/hunt/samsclub.com/", |iter| {
            iter.filter(|&(_, ref name)| name.ends_with(".csv"))
                .map(|(ref entry, _)| entry.path())
                .collect()
        })
        .unwrap();

    let mut wtr = csv::Writer::from_path("out.csv").expect("error creating output file");
    wtr.write_record(&[
        "FUZZ", "url", "redirectlocation", "position", "status_code",
        "content_length", "content_words", "content_lines", "resultfile",
    ])
    .expect("error writing header");

    for file_path in ffuf_files {
        let input = File::open(file_path.as_path()).unwrap();
        let buffered = BufReader::new(input);
        let mut reader = csv::Reader::from_reader(buffered);
        for result in reader.records() {
            let record = result.expect("a CSV record");
            // Keep only rows whose status_code column is "200".
            if &record[4] == "200" {
                wtr.write_record(&record).expect("error writing record");
            }
        }
    }
    // Flush buffered output to disk.
    wtr.flush().expect("error flushing output");
}
But it is still a little slow when I have a lot of big files, so I want to make it multithreaded, but I'm a bit lost.
Especially on how to write to the output file without messing everything up; I guess I need locks.
Could you help me, please?
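Here is a minimal sketch of what I was imagining: one thread per input file, with the shared writer behind an Arc<Mutex<...>> so records from different threads can never interleave mid-row. This is just my guess at the structure, so it may well not be the right approach:

use scan_dir::ScanDir;
use std::fs::File;
use std::io::BufReader;
use std::path::PathBuf;
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let ffuf_files: Vec<PathBuf> = ScanDir::files()
        .walk("/home/neolex/reconbug/hunt/samsclub.com/", |iter| {
            iter.filter(|&(_, ref name)| name.ends_with(".csv"))
                .map(|(ref entry, _)| entry.path())
                .collect()
        })
        .unwrap();

    let wtr = csv::Writer::from_path("out.csv").expect("error creating output file");
    // Share one writer across threads; the Mutex serializes the writes.
    let wtr = Arc::new(Mutex::new(wtr));
    wtr.lock()
        .unwrap()
        .write_record(&[
            "FUZZ", "url", "redirectlocation", "position", "status_code",
            "content_length", "content_words", "content_lines", "resultfile",
        ])
        .expect("error writing header");

    let mut handles = Vec::new();
    for file_path in ffuf_files {
        let wtr = Arc::clone(&wtr);
        // One thread per file; probably fine for a handful of files,
        // but likely too many threads if there are thousands of them.
        handles.push(thread::spawn(move || {
            let input = File::open(&file_path).unwrap();
            let mut reader = csv::Reader::from_reader(BufReader::new(input));
            for result in reader.records() {
                let record = result.expect("a CSV record");
                if &record[4] == "200" {
                    // Hold the lock only while writing, so parsing stays parallel.
                    wtr.lock()
                        .unwrap()
                        .write_record(&record)
                        .expect("error writing record");
                }
            }
        }));
    }
    // Wait for all files to be processed before flushing.
    for handle in handles {
        handle.join().unwrap();
    }
    wtr.lock().unwrap().flush().expect("error flushing output");
}

Would a channel that sends matching records to a single writer thread, or a thread pool like rayon, be more idiomatic than locking the writer directly?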