Question about openssl bignum functions

I have a question about the following code, specifically the line using bignum.to_dec_str. What is it doing?

The reason I am asking is that I need to recreate this function in C#. Step 1: hash the string with SHA-256. I can do that. What I don't understand is what the bignum functions are doing. I realize this may not be the right forum for this post; my apologies if it belongs elsewhere.

extern crate openssl;
use self::openssl::sha::sha256;
use self::openssl::bn::BigNum;
use utils::error::BIG_NUMBER_ERROR;

pub fn encode(s: &str) -> Result<String, u32> {
    match s.parse::<u32>() {
        Ok(_) => Ok(s.to_string()),
        Err(_) => {
            let hash = sha256(s.as_bytes());
            let bignum = match BigNum::from_slice(&hash) {
                Ok(b) => b,
                Err(_) => {
                    warn!("{}", BIG_NUMBER_ERROR.message);
                    return Err(BIG_NUMBER_ERROR.code_num);
                }
            };
            match bignum.to_dec_str() {
                Ok(s) => Ok(s.to_string()),
                Err(_) => {
                    warn!("{}", BIG_NUMBER_ERROR.message);
                    Err(BIG_NUMBER_ERROR.code_num)
                }
            }
        }
    }
}
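Conceptually, `BigNum::from_slice` reads the 32 hash bytes as one big-endian integer, and `to_dec_str` prints that integer in base 10. A minimal sketch of the same idea, using plain `u128` so it only works for slices up to 16 bytes (the function name is my own, not from any library):

```rust
// Sketch: what from_slice + to_dec_str do, demonstrated with u128.
// Each byte is a base-256 digit, most significant first.
fn small_slice_to_dec(bytes: &[u8]) -> String {
    assert!(bytes.len() <= 16, "u128 only holds 16 bytes");
    let mut n: u128 = 0;
    for &b in bytes {
        n = n * 256 + b as u128; // shift left one byte, add the next digit
    }
    n.to_string()
}

fn main() {
    // [0x01, 0x00] is 1*256 + 0 = 256
    assert_eq!(small_slice_to_dec(&[0x01, 0x00]), "256");
    assert_eq!(small_slice_to_dec(&[0xde, 0xad]), "57005");
    println!("ok");
}
```

A real SHA-256 hash is 32 bytes, which is why the original code reaches for a bignum type instead of a fixed-width integer.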

View the source to see the ffi functions being used.


I didn't consider looking at the raw source; I forget that it's available in Rust.

So I dug into the Rust code and came to the conclusion that it didn't help. The Rust library is calling
BN_bn2dec in another library.

What I wanted to know is the actual transformation of 's' into 1912512073317650557350815926626740347077674046295559837483249781945837465481: the math behind it, not another library call.

You can find the logic in openssl's source code:
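The math is base conversion: the hash bytes form a base-256 integer, and OpenSSL's BN_bn2dec converts it to base 10 by repeated division. A hedged sketch of that long-division approach in plain Rust, with no bignum library (assumptions: this is my own illustration of the technique, not OpenSSL's actual implementation, which works on larger words for speed):

```rust
// Convert a big-endian byte array of any length to a decimal string by
// long division: repeatedly divide the whole number by 10 and collect
// the remainders as decimal digits, least significant first.
fn bytes_to_dec_str(bytes: &[u8]) -> String {
    let mut num: Vec<u8> = bytes.to_vec(); // base-256 digits, big-endian
    let mut digits: Vec<u8> = Vec::new();  // decimal digits, reversed

    while num.iter().any(|&b| b != 0) {
        let mut rem: u32 = 0;
        // One pass of schoolbook long division of `num` by 10.
        for byte in num.iter_mut() {
            let cur = rem * 256 + *byte as u32;
            *byte = (cur / 10) as u8;
            rem = cur % 10;
        }
        digits.push(b'0' + rem as u8);
    }
    if digits.is_empty() {
        digits.push(b'0'); // the all-zero input prints as "0"
    }
    digits.reverse();
    String::from_utf8(digits).unwrap()
}

fn main() {
    // [0x01, 0x00] is 1*256 + 0 = 256
    assert_eq!(bytes_to_dec_str(&[0x01, 0x00]), "256");
    println!("ok");
}
```

Applied to the 32 bytes of sha256(s), this produces the same decimal string as to_dec_str, e.g. a number like the 76-digit one quoted above.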
