Trying to understand why this always throws an error

I'm working on my own simple implementations of basic Unix utilities as an exercise to learn Rust; I have previous experience with C. Right now I'm stuck on the base32 and base64 utilities. In both cases I'm getting similar errors, using different crates from crates.io (data-encoding for base32, base64 for base64). A code snippet from my base64 program:

    let file = matches.value_of("INPUT").unwrap().to_string();
    let contents = fs::read_to_string(&file).unwrap();
    if matches.is_present("DECODE") {
        let decoded = &decode(contents.as_bytes()).unwrap();
        println!("{}", Base64Display::with_config(decoded, base64::STANDARD));
    } else {

Encoding works fine in both cases, but decoding fails in both.
base64:
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: InvalidByte(20, 10)', src/main.rs:32:52
base32:
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: DecodeError { position: 2104, kind: Length }', src/main.rs:33:59

In both cases the native system utilities can decode the input file just fine, and the input file was in fact generated with my own programs. The base32 code is very similar to the snippet above, except for how I display the output, due to differences between the crates I'm using.

Trailing newline, perhaps? (InvalidByte(20, 10) points at byte value 10, i.e. an ASCII newline, at offset 20.)

Edit: Try if

    let decoded = &decode(contents.trim_end()).unwrap();

works :wink:

Thanks, a variation of that did work. I had thought of that, but didn't know about trim_end(). Anyway, since it expects a byte string, I had to go with:

    let decoded = &decode(contents.trim_end().as_bytes()).unwrap();

Now I just need to add line wrapping and accept input from stdin, so that it has the same functionality as the native utilities.
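
Something like this is what I have in mind for the encoding path (an untested sketch; it assumes clap's value_of and the base64 crate's top-level encode, and wraps at 76 columns like the GNU utility's default):

    use std::io::{self, Read};

    // Read from the named file, or from stdin when no INPUT is given.
    let contents = match matches.value_of("INPUT") {
        Some(path) => fs::read_to_string(path).unwrap(),
        None => {
            let mut buf = String::new();
            io::stdin().read_to_string(&mut buf).unwrap();
            buf
        }
    };

    // Encode, then wrap the output at 76 columns.
    let encoded = base64::encode(contents.as_bytes());
    for chunk in encoded.as_bytes().chunks(76) {
        println!("{}", std::str::from_utf8(chunk).unwrap());
    }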

It doesn’t. According to the docs, decode is:

    pub fn decode<T: AsRef<[u8]>>(input: T) -> Result<Vec<u8>, DecodeError>

And there exists a

    impl AsRef<[u8]> for str

in the standard library, as well as this one:

    impl<'a, T, U> AsRef<U> for &'a T
    where
        T: AsRef<U> + ?Sized,
        U: ?Sized,

These together imply that T can be &str, i.e. input: &str is a valid parameter type for decode.
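
For example, with your contents string both of these compile and give the same result (using the base64 crate's top-level decode):

    // &str satisfies AsRef<[u8]>, so the explicit .as_bytes() is optional here:
    let from_str = base64::decode(contents.trim_end()).unwrap();
    let from_bytes = base64::decode(contents.trim_end().as_bytes()).unwrap();
    assert_eq!(from_str, from_bytes);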

I see. And yes, that does indeed work using that crate, but I still need the .as_bytes() in my base32 program using the data-encoding crate.
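
For reference, the base32 version currently looks roughly like this (data-encoding's Encoding::decode takes a &[u8] directly, so the conversion stays explicit):

    use data_encoding::BASE32;

    // BASE32 is an Encoding; its decode method takes &[u8], not a generic
    // AsRef<[u8]>, hence the explicit .as_bytes():
    let decoded = BASE32.decode(contents.trim_end().as_bytes()).unwrap();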

Thank you for all of your help.

You may be interested in the bstr crate, which makes byte strings more ergonomic to work with and can help avoid round-tripping through String.
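
With bstr, a sketch of the decode path could look like this (assuming the base64 crate's decode; the ByteSlice trait provides trim_end on byte slices):

    use bstr::ByteSlice;

    // Read the file as raw bytes (no UTF-8 validation), trim trailing
    // whitespace on the byte slice itself, then decode.
    let contents = std::fs::read(&file).unwrap();
    let decoded = base64::decode(contents.trim_end()).unwrap();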