So I landed on this Clippy page here, which makes the argument that function arguments should not be typed as &String, &Vec, &PathBuf, or Cow<_>.
Its explanation reads:
Requiring the argument to be of the specific size makes the function less useful for no benefit; slices in the form of &[T] or &str usually suffice and can be obtained from other types, too
I do not fully understand the rationale. I am guessing this has to do with the differences between DSTs and types whose size is statically known?
If so, is it because using something like &[T] or &str results in less memory usage? And if so, what is it about DSTs that makes them use less?
No, it's because the zero-cost conversions only go one way.
Take a fully concrete type as an example: there's no reason for a function to take &Box<i32> as a parameter, because there's nothing useful the function body can do with it that it can't do with a &i32. If your caller has a &Box<i32>, they can pass it as a &i32 with no problem; but if you take &Box<i32> as the parameter, then someone who only has a &i32 will need to allocate a Box just to call your function.
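Here's a minimal sketch of that asymmetry (the function names are made up for illustration):

fn takes_ref(n: &i32) {
    println!("{n}");
}

fn takes_boxed_ref(n: &Box<i32>) {
    println!("{n}");
}

fn main() {
    let plain = 5_i32;
    let boxed = Box::new(5_i32);

    // &Box<i32> deref-coerces to &i32 for free:
    takes_ref(&boxed);
    takes_ref(&plain);

    // But the other direction forces an allocation:
    takes_boxed_ref(&boxed);
    takes_boxed_ref(&Box::new(plain)); // new heap allocation just for the call
}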
(It's also a micro-optimization, because if you have v: &Vec<i32>, then v[0] needs to read the data pointer out of the Vec first, then read the integer. Whereas if you have v: &[i32], the reference itself already carries the data pointer, so it can read the integer directly -- one fewer indirection.)
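You can see the layout difference behind that in the reference types themselves; this is a small sketch, and the printed sizes assume a typical 64-bit target:

fn main() {
    let v: Vec<i32> = vec![1, 2, 3];

    // &Vec<i32> is a thin pointer to the Vec struct, which in turn
    // holds the pointer to the heap data: two hops to reach an element.
    println!("{}", std::mem::size_of::<&Vec<i32>>()); // 8

    // &[i32] is a fat pointer: the data pointer and length travel
    // together, so reaching an element is a single hop.
    println!("{}", std::mem::size_of::<&[i32]>()); // 16

    // The coercion pays the extra hop once, at conversion time:
    let s: &[i32] = &v;
    println!("{}", s[0]);
}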
It's because taking a &Vec requires you to actually have a Vec. If you have something that is not a Vec but would coerce to a slice, this forces you to allocate a Vec just to make the call.
For example:
fn take_vec(_: &Vec<u64>) {}
fn take_slice(_: &[u64]) {}

fn main() {
    let arr = [1337; 42];

    // works: &[u64; 42] coerces to &[u64]
    take_slice(&arr);

    // doesn't work: error[E0308]: mismatched types
    // take_vec(&arr);

    // works, but allocates a whole new Vec unnecessarily:
    take_vec(&arr.to_vec());
}
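The conversion in the other direction is free, though. Continuing the sketch above, a caller who does own a Vec can pass it to take_slice with no allocation or copying at all:

fn take_slice(_: &[u64]) {}

fn main() {
    let v: Vec<u64> = vec![1337; 42];

    // &Vec<u64> deref-coerces to &[u64] at zero cost: it just reads
    // the data pointer and length out of the Vec.
    take_slice(&v);
    take_slice(v.as_slice()); // or spelled out explicitly
}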
The general principle is to use the least-restrictive type that the function's body and semantics actually require.
&String requires:
That you have a valid UTF-8 sequence of known length,
That the caller own the allocation containing that sequence, and
That it be allocated using the String type.
On the other hand, &str requires:
That you have a valid UTF-8 sequence of known length.
There are other ways to obtain a valid UTF-8 sequence, including string literals (of type &'static str), containers like OsString and Vec<u8>, or even plain byte arrays. All of those can be used with a function accepting a &str, but none of them can be used with a function accepting a &String without first converting the data into a String.
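For instance (a small sketch; note that the Vec<u8> and OsString conversions are fallible, because those types don't guarantee valid UTF-8):

fn print_it(s: &str) {
    println!("{s}");
}

fn main() {
    // String literal: already a &'static str.
    print_it("hello");

    // String: deref coercion, zero cost.
    let owned = String::from("hello");
    print_it(&owned);

    // Vec<u8>: must validate the UTF-8 first.
    let bytes: Vec<u8> = vec![104, 105];
    print_it(std::str::from_utf8(&bytes).unwrap());

    // OsString: to_str() returns None if it isn't valid UTF-8.
    let os = std::ffi::OsString::from("hello");
    print_it(os.to_str().unwrap());
}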
If the function body actually uses the unique capabilities of String, then by all means require it; but if it's just treating it as a container for a sequence of characters, those extra requirements are restrictive for no benefit.
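To make the distinction concrete, here's a sketch with made-up functions: one that only reads, where &str is the least-restrictive choice, and one that genuinely uses a String capability (push_str grows the owned buffer in place):

// Only reads the text: &str is the least-restrictive choice.
fn count_vowels(s: &str) -> usize {
    s.chars().filter(|c| "aeiou".contains(*c)).count()
}

// Genuinely needs String: it grows the allocation in place.
fn append_greeting(s: &mut String) {
    s.push_str(", hello!");
}

fn main() {
    // works with any string-ish source:
    println!("{}", count_vowels("some literal"));

    let mut owned = String::from("world");
    append_greeting(&mut owned);
    println!("{owned}");
}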