Does automatic String deref work for Vec<&str>?

Is it possible to write a function that accepts either a Vec<String> or a Vec<&str> without using an enum? I know that a String argument automatically deref-coerces to &str, but that doesn't seem to happen for strings inside a vector.

/// Replace variables in template with (potentially multiple) values
/// Results equal the Cartesian product of the interpolated variables
fn expand_vars<'a>(template: &[&'a str]) -> Vec<Vec<String>> {
    // ...
}

The above signature works with let template = vec!["a", "b", "c"], but doesn't work with let template = vec!["a".to_owned(), "b".to_owned()] (a Vec<String>).

No. Each element of a Vec<String> is 24 bytes (pointer, capacity, length) on a 64-bit target, while each element of a &[&str] is 16 bytes (pointer, length). The layouts don't match, so the conversion would require allocating a new Vec to hold the &str values.
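You can verify the size difference yourself (the exact numbers below assume a 64-bit target; they halve on 32-bit):

```rust
use std::mem::size_of;

fn main() {
    // String: pointer + capacity + length  -> 3 usizes (24 bytes on 64-bit)
    println!("String: {} bytes", size_of::<String>());
    // &str: pointer + length               -> 2 usizes (16 bytes on 64-bit)
    println!("&str:   {} bytes", size_of::<&str>());
}
```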

You could use generics.

fn expand_vars<S: AsRef<str>>(template: &[S]) -> Vec<Vec<String>> { ... }
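A minimal sketch of how the generic signature accepts both element types; the body here just uppercases each element to show the bound in action, standing in for the real interpolation logic, which is elided:

```rust
// S can be String, &str, or anything else convertible to a &str view.
fn expand_vars<S: AsRef<str>>(template: &[S]) -> Vec<Vec<String>> {
    template
        .iter()
        // as_ref() gives a &str regardless of the concrete element type
        .map(|s| vec![s.as_ref().to_uppercase()])
        .collect()
}

fn main() {
    let owned: Vec<String> = vec!["a".to_owned(), "b".to_owned()];
    let borrowed: Vec<&str> = vec!["a", "b"];
    // Both element types compile against the same signature.
    assert_eq!(expand_vars(&owned), expand_vars(&borrowed));
}
```

Inside the function you call s.as_ref() wherever you previously used the &str directly; the rest of the logic is unchanged.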

Ahh, thank you for the explanation, that does make sense. Generics is a brilliant solution, I'll go with that. Thanks!
