I was surprised to discover that the usual trick of taking impl Into&lt;T>, so callers don't have to call .into() themselves, doesn't seem to work for strings. The compiler keeps complaining that str is unsized, even after I try to tell it unsized types are okay with ?Sized. What am I missing here?
#![allow(dead_code)]
#![allow(unused_variables)]
fn test(a: &(impl Into<String> + ?Sized)) // presence of ?Sized makes no difference, strangely
{
}
fn test2(a: String)
{
}
fn main()
{
//test("Hello world"); // fails to compile
test2("Hello world".into()); // but this is allowed, suggesting &str impls Into<String>
}
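For reference, here is a sketch of the fix applied below: drop the reference and the ?Sized bound, and take the impl Trait parameter by value. The &str literal then satisfies the bound through std's impl From<&str> for String, and an owned String satisfies it through the reflexive blanket impl.

```rust
// A sketch of the by-value signature. The helper name `into_string`
// is illustrative, not from the original thread.
fn into_string(a: impl Into<String>) -> String {
    a.into()
}

fn main() {
    // &str works: std provides `impl From<&str> for String`.
    assert_eq!(into_string("Hello world"), "Hello world");
    // An owned String works via the reflexive `impl<T> From<T> for T`.
    assert_eq!(into_string(String::from("hi")), "hi");
}
```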
Oh, I misunderstood the complaint; I thought it had to do with generic parameters being Sized by default. It's just that you can't pass unsized types by value.
I did as you suggested, and it works, but I thought it would prevent passing in &String, since the argument is now taken by value. Yet it still works. Why?
#![allow(dead_code)]
#![allow(unused_variables)]
fn test(a: (impl Into<String> + ?Sized)) // presence of ?Sized makes no difference, strangely
{
}
fn main()
{
test("Hello world"); // makes sense
test("Hello world".to_string()); // makes sense
let c = "Hello world".to_string();
test(&c); // why does this work? The parameter is by value, not a reference.
}
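The reason test(&c) compiles is that std provides a specific impl From<&String> for String (it clones the string), so &String itself satisfies Into<String> and can be passed to a by-value impl Into<String> parameter. A minimal sketch (the helper name into_string is illustrative):

```rust
fn into_string(a: impl Into<String>) -> String {
    a.into()
}

fn main() {
    let c = "Hello world".to_string();
    // &String -> String via std's `impl From<&String> for String` (a clone).
    let owned = into_string(&c);
    assert_eq!(owned, "Hello world");
    // Only a shared reference was converted, so `c` is still usable.
    assert_eq!(c, "Hello world");
}
```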
I thought maybe there would be a blanket impl making U: From<&T> whenever U: From<T>, but I don't see one.
T and &T are different types, just as different as T and Box<T>. A trait implemented for &T isn't automatically implemented for T, and vice versa. That's why std needs the specific implementation that @quinedot posted above.
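To illustrate the point, here is a small sketch (with made-up names Thing and Describe) of a trait implemented only for the reference type. The compiler accepts &Thing where the bound requires the trait, but would reject Thing itself:

```rust
trait Describe {
    fn describe(&self) -> String;
}

struct Thing;

// Implemented only for &Thing, the reference type, not for Thing.
impl Describe for &Thing {
    fn describe(&self) -> String {
        "a borrowed Thing".to_string()
    }
}

fn needs_describe<T: Describe>(x: T) -> String {
    x.describe()
}

fn main() {
    let t = Thing;
    // Compiles: &Thing implements Describe.
    assert_eq!(needs_describe(&t), "a borrowed Thing");
    // needs_describe(t); // would NOT compile: Thing has no Describe impl.
}
```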