Did I do something wrong, or is this a bug in `data_encoding`?

playground

```rust
extern crate data_encoding; // 2.5.0

fn main() {
    const LEN: usize = 64;
    let mut signature = [0u8; LEN];
    let s = data_encoding::BASE64.encode(&signature);
    println!("{}", s);
    // okay
    assert_eq!(data_encoding::BASE64.decode(s.as_bytes()).unwrap().len(), LEN);
    // panics: signature.len() (64) != decode_len(s.len()) (66)
    data_encoding::BASE64.decode_mut(s.as_bytes(), &mut signature).unwrap();
    // panics: decode_len(88) is 66, not 64
    assert_eq!(data_encoding::BASE64.decode_len(s.len()).unwrap(), LEN);
}
```

It looks like `decode_len` doesn't take padding into account when computing the output length, and `decode_mut`'s up-front check uses the result of `decode_len` directly. But such an obvious bug in such a widely used library makes me doubt myself.
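For the snippet above, the numbers work out like this (a minimal check, assuming `data_encoding` 2.5.0):

```rust
extern crate data_encoding; // 2.5.0

fn main() {
    // 64 raw bytes encode to ceil(64 / 3) * 4 = 88 Base64 characters,
    // the last two of which are '=' padding.
    let encoded_len = data_encoding::BASE64.encode_len(64);
    assert_eq!(encoded_len, 88);
    // decode_len reports 88 / 4 * 3 = 66: the maximum possible decoded
    // length for 88 input characters, not the 64 bytes actually produced
    // once the padding is taken into account.
    assert_eq!(data_encoding::BASE64.decode_len(encoded_len).unwrap(), 66);
}
```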

And it actually does exactly that: `decode_len` only looks at the encoded length and returns the maximum possible decoded length, so it cannot account for padding.

Got it. The correct usage is to pre-allocate a buffer with the maximum raw length corresponding to the encoded length (which is what `decode_len` actually returns; it should be named `decode_max_len` in my opinion), perform the decode, which returns the actual length, and read the result from `&buf[..actual_len]`. It's not a bug, but the API design and documentation could be better.
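A minimal sketch of that pattern (assuming `data_encoding` 2.5.0; `buf`, `max_len`, and `actual_len` are just illustrative names):

```rust
extern crate data_encoding; // 2.5.0

fn main() {
    const LEN: usize = 64;
    let signature = [0u8; LEN];
    let s = data_encoding::BASE64.encode(&signature);

    // decode_len gives the *maximum* decoded length for this encoded length;
    // the output buffer passed to decode_mut must be exactly this size.
    let max_len = data_encoding::BASE64.decode_len(s.len()).unwrap();
    let mut buf = vec![0u8; max_len];

    // decode_mut returns how many bytes were actually written.
    let actual_len = data_encoding::BASE64
        .decode_mut(s.as_bytes(), &mut buf)
        .unwrap();
    assert_eq!(actual_len, LEN);
    assert_eq!(&buf[..actual_len], &signature[..]);
}
```

The two-step dance follows from the split between `decode` (which allocates for you) and `decode_mut` (which writes into a caller-provided buffer and so needs the worst-case size up front).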

