I am writing a custom serialization/deserialization format (similar to the Redis protocol), and I have an issue where "deserialize_seq" is always called when deserializing a Vec<u8> or a &[u8], when I would like "deserialize_byte_buf" and "deserialize_bytes" to be called respectively.
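For reference, the wire format frames a byte string the way a RESP bulk string does: a '$', the decimal payload length, CRLF, the raw payload, then a closing CRLF. A minimal illustration of the framing (standalone, not part of my deserializer):

```rust
fn main() {
    // "$<len>\r\n<payload>\r\n" -- the RESP-style bulk-string framing.
    let payload = "This is a test";
    let framed = format!("${}\r\n{}\r\n", payload.len(), payload);
    assert_eq!(framed, "$14\r\nThis is a test\r\n");
    println!("{:?}", framed);
}
```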
Here is the code (only the relevant parts, I believe) of the deserializer I've written:
pub fn from_reader<'reader, R: io::Read, T>(reader: &'reader mut io::BufReader<R>) -> Result<T>
where
    T: Deserialize<'reader>,
{
    let mut deserializer = Deserializer { reader };
    T::deserialize(&mut deserializer)
}
pub fn byte_buf_from_reader<R: io::Read>(reader: &mut io::BufReader<R>) -> Result<Vec<u8>> {
    let mut deserializer = Deserializer { reader };
    Vec::<u8>::deserialize(&mut deserializer)
}
Here is my implementation of "deserialize_byte_buf":
fn deserialize_byte_buf<V>(self, visitor: V) -> Result<V::Value>
where
    V: de::Visitor<'de>,
{
    visitor.visit_byte_buf(self.parse_bytes()?)
}
And here is the "parse_bytes" inherent method on my Deserializer struct:
fn parse_bytes(&mut self) -> Result<Vec<u8>> {
    match self.peek()? {
        Some(b) if b == b'$' => {
            self.consume(1);
            let len = self.read_line()?.parse::<usize>()?;
            let mut buf = vec![0u8; len];
            self.reader.read_exact(buf.as_mut())?;
            let final_delimiter = self.peekn(2)?;
            match final_delimiter {
                [0xD, 0xA] => Ok(buf),
                input => Err(Error {
                    kind: ErrorKind::DataError,
                    message: format!(
                        "Expected ending delimiter 'CR LF' for input of Bytes, found: {:?}",
                        input
                    ),
                }),
            }
        }
        input => Err(Error {
            kind: ErrorKind::DataError,
            message: format!("Expected '$' for input of Bytes, found: {:?}", input),
        }),
    }
}
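As a sanity check on the framing itself (independent of Serde), the same "$<len>\r\n<bytes>\r\n" parse can be sketched with nothing but std::io. "parse_bulk" here is an illustrative name, not part of my real deserializer:

```rust
use std::io::{BufRead, BufReader, Read};

// Standalone sketch of the bulk-string parse: read the "$<len>\r\n" header,
// then exactly `len` payload bytes, then consume the closing CRLF.
fn parse_bulk<R: Read>(reader: &mut BufReader<R>) -> std::io::Result<Vec<u8>> {
    let mut header = String::new();
    reader.read_line(&mut header)?; // reads "$<len>\r\n"
    let len: usize = header
        .trim_end()
        .trim_start_matches('$')
        .parse()
        .map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))?;
    let mut buf = vec![0u8; len];
    reader.read_exact(&mut buf)?; // raw payload; may itself contain CRLF
    let mut crlf = [0u8; 2];
    reader.read_exact(&mut crlf)?; // consume the closing "\r\n"
    Ok(buf)
}

fn main() -> std::io::Result<()> {
    let mut reader = BufReader::new("$4\r\nabcd\r\n$2\r\nhi\r\n".as_bytes());
    assert_eq!(parse_bulk(&mut reader)?, b"abcd".to_vec());
    assert_eq!(parse_bulk(&mut reader)?, b"hi".to_vec());
    Ok(())
}
```

Because the payload is length-prefixed, embedded CRLF inside the bytes (as in my second test string) is not a problem; only the trailing CRLF is treated as a delimiter.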
When I test "deserialize_byte_buf" by requesting the deserialization of a Vec<u8>, the underlying "deserialize_seq" is called instead of "deserialize_byte_buf". Here is the test code:
#[test]
fn test_byte_buf() -> Result<()> {
    let string1 = "This is a test".to_owned();
    let string2 = "This is also\r\na test...∑, 𖿢".to_owned();
    let input = format!(
        "${}\r\n{}\r\n${}\r\n{}\r\n",
        string1.len(),
        string1,
        string2.len(),
        string2
    );
    let reader = &mut io::BufReader::new(input.as_bytes());
    assert_eq!(string1.as_bytes(), byte_buf_from_reader(reader)?);
    assert_eq!(string2.as_bytes(), byte_buf_from_reader(reader)?);
    Ok(())
}
Any idea how to get Serde to invoke "deserialize_byte_buf" instead of "deserialize_seq"? I can't find any good references or documentation that explain this behavior.
Thanks in advance for any assistance or guidance.