I'm having trouble integrating the async-bincode crate (v0.5.1) with async-std (v0.6.2). Specifically, I'm trying to read from an async_std::net::TcpStream in a task/thread, deserialize the data with bincode, and send it through a crossbeam synchronous channel.
The original implementation used a std::net::TcpStream with no problem, but migrating to async_std has raised a number of issues, chiefly that bincode::deserialize_from requires a reader implementing std::io::Read rather than async_std::io::Read. To solve this problem, I reached for async-bincode.
Unfortunately, I struggled to find usage examples that fit my use case. In addition, it appears to depend on tokio under the hood. A simplified view of my code appears below.
I've already imported the Stream trait from async_std like so: use async_std::prelude::*;. I assume the problem is due to trait bounds, but as I'm somewhat new to the language I'm struggling to diagnose and resolve the issue.
Ideally, I would like to avoid Streams altogether because I want the data to be ordered, but for now I just want the code to work as an MVP.
I'm running into a silly problem I can't seem to crack, and I was wondering if you could help me.
I'm creating a buffer of 8192 bytes and passing it to the read function, but it never reads anything from the TcpStream. The example below is synchronous and uses std::net::TcpStream as well.
async fn read_from_stream<T>(mut input: std::net::TcpStream, output: Sender<T>)
where
    T: for<'de> serde::de::Deserialize<'de>,
{
    use std::io::Read;
    const MAX_REDUCTIONS: usize = 2000;
    let mut reductions: usize = 0;
    let mut buffer = Vec::with_capacity(8192);
    println!("reading!");
    loop {
        if let Ok(_) = input.read(buffer.as_mut_slice()) {
            if let Ok(data) = bincode::deserialize(buffer.as_slice()) {
                println!("got some data!");
                if let Ok(_) = output.try_send(data) {
                    continue;
                }
            }
        }
        // println!("yielding read");
        task::yield_now().await
    }
}
If I print the contents of the Result returned by read, it's always 0 bytes. I'm not sure what I'm doing wrong. What do you think?
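While writing this up I put together a minimal standalone check of the buffer setup (no networking involved), in case the problem is there rather than in the stream itself:

```rust
fn main() {
    // Vec::with_capacity reserves memory but leaves the length at 0,
    // so the slice it yields is empty.
    let mut buffer: Vec<u8> = Vec::with_capacity(8192);
    assert_eq!(buffer.len(), 0);
    assert_eq!(buffer.as_mut_slice().len(), 0); // read() would see an empty slice

    // A zero-initialized Vec exposes the full length to read into.
    let mut filled = vec![0u8; 8192];
    assert_eq!(filled.as_mut_slice().len(), 8192);
}
```

So the slice I hand to read appears to have length 0, which would explain the 0-byte reads. Does that sound right, and is vec![0; 8192] the idiomatic fix?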