I am using a toy example here, but the real case is a very large binary file that was written in the same manner.
I'm trying to read back a very large binary file that I wrote myself with bincode.
The whole idea is to use bincode to easily create big files and then iterate over them later.
#[macro_use]
extern crate serde;
extern crate bincode;

use bincode::serialize_into;
use std::fs::File;
use std::io::BufWriter;

#[derive(Serialize, Deserialize, PartialEq, Debug)]
pub struct MyStruct {
    counter: Vec<u32>,
    offset: usize,
}

impl MyStruct {
    // omitted for conciseness
    pub fn new(counter: Vec<u32>, offset: usize) -> Self {
        Self { counter, offset }
    }
}

fn main() {
    // fill entries in the counter vector
    let vec = vec![100, 500, 100, 500, 100, 500];
    let m = MyStruct::new(vec, 0);

    // write the first record through a buffered writer
    let mut f = BufWriter::new(File::create("foo2.bar").unwrap());
    serialize_into(&mut f, &m).unwrap();

    // write a second record into the same file
    let vec = vec![100, 600, 100, 500, 100, 600, 500, 101241241, 600];
    let m2 = MyStruct::new(vec, 35);
    serialize_into(&mut f, &m2).unwrap();
    drop(f);

    // read the records back, directly from the raw file
    let input = File::open("foo2.bar").unwrap();
    // let buffered = BufReader::new(input);

    let mut vecs: Vec<MyStruct> = Vec::new();
    loop {
        match bincode::deserialize_from(&input) {
            Ok(entry) => vecs.push(entry),
            Err(_) => break, // stop at EOF (or any other error)
        };
    }
}
However, because the loop reads directly from the raw, unbuffered File, it is very slow.
How can I use a BufReader to speed up the reads inside bincode::deserialize_from? I can wrap the file like this:
let input = File::open("foo2.bar").unwrap();
let buffered = BufReader::new(input);
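In case it helps, here is a minimal sketch of the read loop I am aiming for (assuming bincode::deserialize_from accepts any Read implementor, so the BufReader can be passed by mutable reference and reused across iterations; the read_all helper name is just for illustration):

use std::fs::File;
use std::io::BufReader;

fn read_all(path: &str) -> Vec<MyStruct> {
    let input = File::open(path).unwrap();
    // Wrap the file once; every subsequent read goes through the buffer
    // instead of hitting the OS for each small field.
    let mut buffered = BufReader::new(input);

    let mut vecs: Vec<MyStruct> = Vec::new();
    loop {
        // Pass the reader by mutable reference so it is not moved into
        // deserialize_from and can be reused on the next iteration.
        match bincode::deserialize_from(&mut buffered) {
            Ok(entry) => vecs.push(entry),
            Err(_) => break, // stop at EOF (or any other error)
        }
    }
    vecs
}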