I'm not satisfied with my function

So I wrote this simple fn that parses a URL (e.g. /article?id=43&sort=true; it parses the parameters into a hashmap).
It passed all my test cases without breaking.

And I rewrote it 9 times (not exaggerating), and I still can't get this feeling out of my chest that this code is trash and can be optimised more (as I'm coming from a very low-level environment). So, can someone reassure me or review my code?

Data::Params is an enum variant (obviously, meh).
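For context, here's a rough guess at the Data enum's shape, reconstructed from how it's used below (the actual definition isn't shown in this post):

use std::collections::HashMap;

// Assumed definition, not the OP's actual one: only the Params variant
// appears, holding an optional map that borrows from the URL string.
pub enum Data<'a> {
    Params(Option<HashMap<&'a str, &'a str>>),
}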

pub fn parse_params(url: &str) -> (&str, Data) {
    match url.split_once("?") {
        Some((uri, res)) => {
            let mut hm: HashMap<&str, &str> = HashMap::new();
            let _d = res.split("&").for_each(|x| {
                println!("{x}");
                if let Some((a, b)) = x.split_once("=") {
                    hm.insert(a, b);
                }
            });
            return (uri, Data::Params(Some(hm)));
        }
        None => return (url, Data::Params(None)),
    };
}

and thank you :speaking_head:

You could use functional combinators to improve the design a little bit:

pub fn parse_params(url: &str) -> (&str, Data) {
    let (uri, params) = url
        .split_once("?")
        .map(|(uri, res)| {
            let mut hm: HashMap<&str, &str> = HashMap::new();
            res.split("&").for_each(|x| {
                println!("{x}");
                if let Some((a, b)) = x.split_once("=") {
                    hm.insert(a, b);
                }
            });
            (uri, Some(hm))
        })
        .unwrap_or((url, None));

    (uri, Data::Params(params))
}

You can also simplify the generation of the hashmap using iterators:

let hm = res.split("&").filter_map(|x| x.split_once("=")).collect::<HashMap<_, _>>();

This is what I came up with.

pub fn parse_params(url: &str) -> (&str, Data) {
    let Some((uri, res)) = url.split_once('?') else {
        return (url, Data::Params(None))
    };

    let hm = res
        .split('&')
        .inspect(|x| println!("{x}"))
        .filter_map(|x| x.split_once('='))
        .collect::<HashMap<_, _>>();

    (uri, Data::Params(Some(hm)))
}

(But I ran no benchmarks, if that's what you meant by optimized.)


This is a two step job:

1. Extract the query string from the URL.
2. Parse the query string.

So, consider using two functions:

pub fn parse_params(url: &str) -> (&str, Data) {
    if let Some((uri, query)) = url.split_once('?') {
        let params = parse_query(query);
        (uri, Data::Params(Some(params)))
    } else {
        (url, Data::Params(None))
    }
}

fn parse_query(query: &str) -> HashMap<&str, &str> {
    let mut params = HashMap::new();

    for key_value in query.split('&') {
        println!("{key_value}");
        if let Some((key, value)) = key_value.split_once('=') {
            params.insert(key, value);
        }
    }

    params
}

Alternatively, a more functional style, as shown by @quinedot:

fn parse_query(query: &str) -> HashMap<&str, &str> {
    query.split('&')
        .inspect(|kv| println!("{kv}"))
        .filter_map(|kv| kv.split_once('='))
        .collect()
}

Just a note: the original function doesn't handle URLs with fragments (the '#' anchor)...
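As a rough sketch (not from the thread), the fragment could be stripped off before looking for the query string; strip_fragment is a made-up helper name:

// Hypothetical helper: everything from the first '#' onward is the fragment
// and is not part of the query, so drop it before calling parse_params.
fn strip_fragment(url: &str) -> &str {
    url.split_once('#').map(|(before, _)| before).unwrap_or(url)
}

// e.g. parse_params(strip_fragment("/article?id=43#top"))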


I never thought about collecting into a HashMap, since a Vec could be of variable length. Thanks!


Great, this sure looks neat and understandable. Thank you!

HTTP clients never send the '#' anchor, I believe.

You might want to percent-decode the strings.

I use something like

let decoded: String = percent_encoding::percent_decode_str(encoded).decode_utf8_lossy().into();
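As a sketch (assumed, not from the thread), decoding could be folded into the query parsing by building an owned map; parse_query_decoded is a hypothetical name:

use std::collections::HashMap;
use percent_encoding::percent_decode_str;

// Hypothetical variant of parse_query that percent-decodes keys and values,
// so it returns owned Strings instead of borrowing from the raw query.
fn parse_query_decoded(query: &str) -> HashMap<String, String> {
    query
        .split('&')
        .filter_map(|kv| kv.split_once('='))
        .map(|(k, v)| {
            (
                percent_decode_str(k).decode_utf8_lossy().into_owned(),
                percent_decode_str(v).decode_utf8_lossy().into_owned(),
            )
        })
        .collect()
}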

If you are using web_sys::window() and getting window.location().href(), then you will get the anchor #.
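Roughly, for that browser/WASM case (a sketch, not the OP's server setup):

// Sketch for a WASM/browser context: location.href() includes any '#fragment',
// so it would have to be stripped before parsing out the query parameters.
let href: String = web_sys::window()
    .expect("no global `window`")
    .location()
    .href()
    .expect("failed to read location.href");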

EDIT: Never mind. I thought your code was going to run "in" the client/browser. I see now that you are probably writing server code.


Yes, I'm writing server code.