I've got a start on a Luhn algorithm that takes a credit card number as a string, implemented via a `Digits` trait:
```rust
impl Digits for String {
    fn digits(self) -> Result<Vec<u32>, String> {
        self.chars()
            .map(|c| c.to_digit(10))
            .collect::<Option<Vec<u32>>>()
            .ok_or("Invalid Digits".to_string())
    }
}

pub trait Digits {
    fn digits(self) -> Result<Vec<u32>, String>;
}
```
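For concreteness, here's how that `String` impl behaves in practice (a self-contained copy so the snippet compiles on its own; the interesting part is that collecting into `Option<Vec<u32>>` short-circuits to `None` on the first non-digit character):

```rust
pub trait Digits {
    fn digits(self) -> Result<Vec<u32>, String>;
}

impl Digits for String {
    fn digits(self) -> Result<Vec<u32>, String> {
        self.chars()
            .map(|c| c.to_digit(10)) // None for any non-digit char
            .collect::<Option<Vec<u32>>>() // short-circuits on the first None
            .ok_or("Invalid Digits".to_string())
    }
}

fn main() {
    assert_eq!("4539".to_string().digits(), Ok(vec![4, 5, 3, 9]));
    assert_eq!(
        "45x9".to_string().digits(),
        Err("Invalid Digits".to_string())
    );
}
```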
It works, but I'm now trying to create a trait implementation that allows me to pass an integer instead of a string. The problem I'm running into is that

```rust
let ex2 = 4539148803436467;
```

is a literal out of range for `u32`, so I need

```rust
let ex2: u64 = 4539148803436467;
```

which requires refactoring the `Digits` trait to return a result with `Vec<u64>`.
Obviously there are ways I can refactor that don't require changing the trait to

```rust
pub trait Digits {
    fn digits(self) -> Result<Vec<u64>, String>;
}
```
But now I'm curious how I would do it. I've tried various ways of converting `u32` to `u64` in

```rust
.map(|c| c.to_digit(10))
.collect::<Option<Vec<u32>>>()
```

but I can't seem to figure it out.
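For reference, this is a self-contained sketch of the direction I'm after, not a finished solution: the trait widened to `Vec<u64>`, the `String` impl converting each digit with `u64::from`, and a hypothetical `u64` impl that peels digits off with `% 10` (names and structure are my own guesses at how this would look):

```rust
pub trait Digits {
    fn digits(self) -> Result<Vec<u64>, String>;
}

impl Digits for String {
    fn digits(self) -> Result<Vec<u64>, String> {
        self.chars()
            .map(|c| c.to_digit(10).map(u64::from)) // widen each u32 digit to u64
            .collect::<Option<Vec<u64>>>()
            .ok_or("Invalid Digits".to_string())
    }
}

impl Digits for u64 {
    fn digits(mut self) -> Result<Vec<u64>, String> {
        // Peel digits off the least-significant end; a u64 has no
        // "invalid digit" case, so this always succeeds.
        let mut ds = Vec::new();
        loop {
            ds.push(self % 10);
            self /= 10;
            if self == 0 {
                break;
            }
        }
        ds.reverse();
        Ok(ds)
    }
}

fn main() {
    let from_str = "4539148803436467".to_string().digits();
    let from_int = 4539148803436467u64.digits();
    assert_eq!(from_str, from_int);
}
```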