Serializing a Web Scraper to CSV

    let args = Cli::parse();
    let v = &args.pattern;

    // Fetch the page and read the body as text
    let res = reqwest::get(v).await?.text().await?;
    let document = Document::from(res.as_str());

    // Print each image URL
    document
        .find(Name("img"))
        .filter_map(|n| n.attr("src"))
        .for_each(|x| println!("{}{}", v, x));

    // Collect the image URLs into a vector
    let list_of_x: Vec<_> = document
        .find(Name("img"))
        .filter_map(|n| n.attr("src"))
        .collect();

    // Serialize the URLs to a JSON file
    let file = File::create("b")?;
    let mut writer = BufWriter::new(file);
    serde_json::to_writer(&mut writer, &list_of_x).unwrap();

    //let iter = list_of_x.into_iter();

    // Write the URLs to a CSV file
    let path = "z.csv";
    let mut writerr = csv::Writer::from_path(path).unwrap();
    for row in list_of_x {
        writerr.write_record(row).unwrap(); // error: `&str` is not an iterator
    }



Hello rustaceans,
This group has been so helpful, thanks. I need help again on this simple project of mine. I have been able to serialize the scraped image URLs into CSV, but I want each image URL to be in a column and not a row. I tried to loop through the vector and then serialize, but I get an error on (row): `&str` is not an iterator. I tried adding `.into_iter()` to `list_of_x`, but it still shows the error.

Thanks a lot


    for row in list_of_x {
-       writerr.write_record(row).unwrap();
+       writerr.write_record(&[row]).unwrap();
    }


You're getting an error since write_record expects an iterator over fields (of a single record), while you were passing in a single field.


Okay, wow, thanks a lot. It works and I got the desired output. Thank you!

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.