Below is as far as I could get. But this implementation means that I have to parse the bytes myself and build the hashmap, whereas I am looking for a library that can do it for me.
// Download the object and expose its body as an AsyncRead.
let mut body = self
    .client
    .get_object()
    .bucket(bucket)
    .key(key)
    .send()
    .await
    .unwrap()
    .body
    .into_async_read();

// First attempt (abandoned): frame the body into lines with LinesCodec,
// but then every line still has to be parsed by hand to build the map.
// let map: HashMap<Value, Value> = HashMap::new();
// let framed = tokio_util::codec::FramedRead::new(body, LinesCodec::new());
// Second attempt: pull the whole body into memory, then let serde_json build the map.
// `read_u8` comes from `tokio::io::AsyncReadExt`; it returns an UnexpectedEof error
// when the stream ends, which terminates the loop.
let mut bs: Vec<u8> = vec![];
while let Ok(b) = body.read_u8().await {
    bs.push(b);
}
let map: HashMap<String, String> = serde_json::from_slice(&bs).unwrap();
The above also works. But the problem I see with this approach is that I have to read all the bytes into memory and only then build the map. I was wondering whether the stream could be converted into another stream and handed to a converter that gives me a map as output; I am thinking this might reduce the memory pressure, hence the question.
There is SyncIoBridge in tokio_util::io, which would let you feed an AsyncRead to serde, but unless you're getting into the gigabytes I doubt it would help much.
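A minimal sketch of that idea, assuming the reader returned by into_async_read() is Unpin + Send and that tokio-util is built with the io-util feature; the helper name read_json_map is made up for illustration:

use std::collections::HashMap;
use tokio::io::AsyncRead;
use tokio_util::io::SyncIoBridge;

// Deserialize a JSON object from any async byte source without first
// collecting it into a Vec<u8>. serde_json still builds the whole map in
// memory, but the raw bytes are consumed incrementally through the bridge.
async fn read_json_map<R>(reader: R) -> serde_json::Result<HashMap<String, String>>
where
    R: AsyncRead + Unpin + Send + 'static,
{
    // SyncIoBridge wraps the AsyncRead in a blocking std::io::Read, so the
    // deserialization has to run on a blocking thread, not on the async executor.
    let bridge = SyncIoBridge::new(reader);
    tokio::task::spawn_blocking(move || serde_json::from_reader(bridge))
        .await
        .expect("blocking task panicked")
}

The call site would then be something like let map = read_json_map(body).await.unwrap(); with body coming from into_async_read() as above. Note this only avoids the intermediate Vec<u8>; the deserialized HashMap still has to fit in memory, which is why it mostly pays off for very large objects.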