Crate to derive `foo_internal` to avoid monomorphisation?

It's a common pattern to do something like:

fn foo<T: AsRef<str>>(s: T) {
    foo_internal(s.as_ref());
}

fn foo_internal(s: &str) {
    // ton of complicated code follows
}

to avoid the cost of monomorphising foo_internal's body for every concrete T. It seems like such a common pattern that I would expect someone to have written a macro for it, but my search-fu is failing me today.

Is it? I have never come across this. And as for the "cost" of monomorphisation, I am guessing you are speaking from an embedded POV? Because, as far as I can see, monomorphisation still happens for foo, so the only thing you'd be saving is the space the multiple copies of foo_internal would take up in the binary (which matters only if foo_internal is large and you have very little memory at hand).
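
Concretely, here is a minimal sketch of what gets duplicated and what gets shared, reusing the foo/foo_internal definitions from the first post:

fn main() {
    // Each distinct argument type stamps out its own copy of the thin
    // generic shim...
    foo("a string slice");         // instantiates foo::<&str>
    foo(String::from("a String")); // instantiates foo::<String>
    // ...but both shims are one-liners that forward to the single,
    // non-generic foo_internal, whose large body is code-generated
    // and optimized exactly once.
}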


You're also saving compilation time: the body of the function doesn't get re-code-generated and re-optimized for every type it's instantiated with.

We often do it in std for things taking AsRef<Path>, like the fs functions.
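
The shape is roughly this (a simplified sketch of the idiom with a hypothetical function name, not the actual std source):

use std::fs;
use std::io;
use std::path::Path;

// The public generic shim is a one-liner; the real body lives in a
// private, non-generic nested function that is compiled only once.
pub fn read_to_vec<P: AsRef<Path>>(path: P) -> io::Result<Vec<u8>> {
    fn inner(path: &Path) -> io::Result<Vec<u8>> {
        fs::read(path)
    }
    inner(path.as_ref())
}

Nesting the inner function also keeps it out of the module namespace, so callers only ever see the generic entry point.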


Aside from the already-mentioned compile-time issue, the wasted memory can also hurt runtime performance: the instruction cache gets filled with multiple copies of the same code whenever the compiler fails to deduplicate them.

I believe the failure to deduplicate can happen because the compiler may not realize it's beneficial to hoist the conversions to the start of the function, so slight type-specific differences remain scattered through each copy and prevent the bodies from being merged. Not a compiler expert, correct me if I'm wrong.
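
As an illustration of how the merge can fail (hypothetical function, not taken from any real crate): if the conversion sits inside the shared logic instead of at the entry point, every instantiation interleaves type-specific calls with the common code:

fn process<T: AsRef<str>>(items: &[T]) -> usize {
    let mut total = 0;
    for item in items {
        // The T-specific as_ref() call is interleaved with the shared
        // loop body, so each instantiation compiles to slightly
        // different machine code that merging passes can't collapse.
        let s = item.as_ref();
        total += s.len();
    }
    total
}

Hoisting as_ref() to the top and forwarding to a non-generic inner function sidesteps that problem by construction.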

I've seen the pattern in std and some other crates multiple times. I also wrote such code recently.