Parse ASCII to u32 error

fn parse_c_str(slice: &[u8]) -> u32 {
    let mut res = 0;
    slice.iter().for_each(|n| {
        // Map the ASCII digit byte to its numeric value (b'0' is 48).
        let digit = (*n as char as u32).wrapping_sub('0' as u32);
        // Shift the accumulated result one decimal place and add the digit.
        res = res * 10 + digit;
    });
    res
}

let p = parse_c_str(&[48, 55]);
assert_eq!(p, 7);

This panics and I get 0 instead of 7.
How can I fix this?

Works for me (once I put the statements in a function).

But also, let me suggest the atoi crate. Or natively:

fn atoi(slice: &[u8]) -> Option<u32> {
    std::str::from_utf8(slice).ok()?.parse().ok()
}

fn main() {
    let p = atoi(&[48, 55]).unwrap();
    assert_eq!(p, 7);
}

Here, I am trying to convert ASCII to u32.

@quinedot
This function is used to parse dates, but *n as char as u32 gives a bad answer.

I just want to parse it as fast as possible.

I am not trying to use std::str::from_utf8(slice).ok()?.parse().ok();
there are too many checks in there.

I have made sure that the parameters will not go out of bounds, so the checks are not necessary; I just want to skip them.
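
(One way to keep the release build check-free while still documenting that invariant is a debug_assert!, which compiles away in release builds. Just a sketch, assuming the bytes really are guaranteed to be ASCII digits:)

fn parse_c_str(slice: &[u8]) -> u32 {
    let mut res: u32 = 0;
    for &b in slice {
        // Caller guarantees only ASCII digits; checked in debug builds only.
        debug_assert!(b.is_ascii_digit(), "parse_c_str expects only ASCII digits");
        res = res * 10 + (b - b'0') as u32;
    }
    res
}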

Hmm. I can't reproduce it. If you can reproduce it or at least put the entire error here, maybe I could say more about that. If you're working with larger numbers, your multiplication and addition could overflow and panic in a debug build.
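
To illustrate the overflow point, here is a sketch of a checked variant that returns None instead of panicking (the name parse_c_str_checked is just for illustration):

fn parse_c_str_checked(slice: &[u8]) -> Option<u32> {
    let mut res: u32 = 0;
    for &b in slice {
        let digit = (b as char as u32).wrapping_sub('0' as u32);
        // checked_mul / checked_add return None on overflow instead of
        // panicking in debug builds or wrapping in release builds.
        res = res.checked_mul(10)?.checked_add(digit)?;
    }
    Some(res)
}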

I still recommend the atoi crate. You'd probably want this method.
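
For example, with the crate's top-level atoi function (just a sketch; add atoi to Cargo.toml first):

fn main() {
    // atoi::atoi parses an integer from the ASCII digits in a byte slice,
    // returning None if the slice does not start with a digit.
    let p: Option<u32> = atoi::atoi(b"07");
    assert_eq!(p, Some(7));
}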

I'm sorry about that. I can't reproduce it either; the code is actually correct. Thanks for your reply again.

The error came down to this:

 let y = parse_c_str(&a[0..4]) as i32;
 let m = parse_c_str(&a[4..6]);
 let d = parse_c_str(&a[6..]);

 let h = parse_c_str(&u[0..2]);
 let m = parse_c_str(&u[3..5]);
 let s = parse_c_str(&u[6..]);

There are two m bindings, so the second one shadows the month.
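
(For reference, a sketch with distinct names so nothing shadows; a and u and the split points are taken from the snippet above.)

 let year  = parse_c_str(&a[0..4]) as i32;
 let month = parse_c_str(&a[4..6]);
 let day   = parse_c_str(&a[6..]);

 let hour   = parse_c_str(&u[0..2]);
 let minute = parse_c_str(&u[3..5]);
 let second = parse_c_str(&u[6..]);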
