I'm a bit embarrassed to admit that I've been working in Rust for 5 years now and I still haven't managed to get the variance inference behavior clear in my head.
Specifically, in the code below, I'm not sure what I have to do to make it clear that the lifetime in the use_container_generic function can be shortened. The compiler has no problem figuring that out when the parameter type is not generic (in use_container_specific).
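Roughly, the setup looks like the sketch below (a simplified reconstruction: Holder, the body of Container, and the exact signatures are stand-ins for my real code; only the last function is the one the compiler rejects):

use std::marker::PhantomData;

pub trait RefTrait<'a> {}

pub struct Container<'a, T: RefTrait<'a>> {
    item: T,
    short_lived: &'a usize,
}

impl<'a, T: RefTrait<'a>> Container<'a, T> {
    pub fn new(item: T, short_lived: &'a usize) -> Self {
        Container { item, short_lived }
    }
}

// a concrete implementor whose lifetime parameter is covariant
pub struct Holder<'a>(PhantomData<&'a ()>);
impl<'a> RefTrait<'a> for Holder<'a> {}

// accepted: Holder<'long> silently coerces to a Holder of the shorter
// borrow, because Holder is covariant in its lifetime
fn use_container_specific<'long>(long_lived: Holder<'long>) {
    let short_lived: usize = 2;
    let _ = Container::new(long_lived, &short_lived);
}

// rejected: the compiler only knows RZ: RefTrait<'long>, so it demands that
// &short_lived live for 'long ("`short_lived` does not live long enough")
fn use_container_generic<'long, RZ: RefTrait<'long>>(long_lived: RZ) {
    let short_lived: usize = 2;
    let _ = Container::new(long_lived, &short_lived);
}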
It can't. Trait bounds are always invariant; consequently, dyn Trait<'a> is also invariant. (PhantomFn was removed long ago.)
The problem with lifetimes in trait bounds in general is that the implementing type can have any kind of variance. For example, nothing prevents the user from implementing the trait for an invariant or contravariant type, such as:
use std::marker::PhantomData;

// an example of a contravariant type
struct Foo<'a>(PhantomData<fn(&'a ())>);
impl<'a> RefTrait<'a> for Foo<'a> {}

// use the contravariant type to call the generic function
use_container_generic(Foo(PhantomData));
In a generic context, given a type T and lifetimes 'a and 'b where 'b: 'a, it is true that &'b T can be used where &'a T is expected, but it is NOT true that T: Trait<'b> can be used where T: Trait<'a> is required.
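A minimal sketch of that asymmetry (RefTrait is just a placeholder trait here; only the commented-out call would fail to compile):

trait RefTrait<'a> {}

// references are covariant in their lifetime: a &'long coerces to a &'short
fn shorten_ref<'short, 'long: 'short>(r: &'long u32) {
    let _short: &'short u32 = r; // fine
}

// trait bounds carry no such relation: RZ: RefTrait<'long> does not give RZ: RefTrait<'short>
fn wants_short<'short, RZ: RefTrait<'short>>(_v: RZ) {}
fn has_long<'short, 'long: 'short, RZ: RefTrait<'long>>(v: RZ) {
    // wants_short::<'short, RZ>(v); // error: the trait bound `RZ: RefTrait<'short>` is not satisfied
    let _ = v;
}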
The usual workaround is to use an adapter type which explicitly shortens the lifetime in the trait bound:
pub struct RefTraitAdapter<T>(pub T);

// as long as the trait methods are compatible with a covariant `Self`,
// this should always be implementable
impl<'a, 'b: 'a, T> RefTrait<'a> for RefTraitAdapter<T> where T: RefTrait<'b> {
    // delegate all trait methods to the inner type
}
fn use_container_generic<'long, RZ>(long_lived: RZ)
where
    RZ: RefTrait<'long>,
{
    let short_lived: usize = 2;
    let _ = Container::new(RefTraitAdapter(long_lived), &short_lived);
}
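To make the delegation concrete: if RefTrait had, say, a single lifetime-returning method (a hypothetical get, not taken from the original code), the adapter impl could look like this:

pub trait RefTrait<'a> {
    fn get(&self) -> &'a str;
}

pub struct RefTraitAdapter<T>(pub T);

impl<'a, 'b: 'a, T> RefTrait<'a> for RefTraitAdapter<T>
where
    T: RefTrait<'b>,
{
    fn get(&self) -> &'a str {
        // the inner value's &'b str coerces to &'a str because 'b: 'a and
        // shared references are covariant; this is the "covariant Self" requirement
        self.0.get()
    }
}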
I am still left with one question - although perhaps it's more appropriate for the internals forum...
Since an "adapter" can be constructed for any trait whose methods permit a covariant Self, I wonder why the compiler's inference mechanism can't do this automatically. That is, if the compiler can validate the correctness of the adapter, it seems like it should already know everything it needs to avoid requiring the adapter in the first place.
This seems like a precise analog of the way PhantomData inside a struct determines the variance of the struct's parameters. It also seems like this was the original vision behind dropping explicit variance declarations, and why PhantomFn was cited as solving this problem.
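For illustration (a toy example of my own, not from the code above), the marker field alone decides whether a struct's lifetime parameter can be shortened:

use std::marker::PhantomData;

struct Covariant<'a>(PhantomData<&'a ()>);
struct Invariant<'a>(PhantomData<fn(&'a ()) -> &'a ()>);

// compiles: the marker makes Covariant covariant, so 'long can shrink to 'short
fn shrink<'short, 'long: 'short>(v: Covariant<'long>) -> Covariant<'short> {
    v
}

// does not compile: the fn-pointer marker makes Invariant invariant in 'a
// fn shrink_invariant<'short, 'long: 'short>(v: Invariant<'long>) -> Invariant<'short> {
//     v
// }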
Anyway, I don't mean to bike-shed the compiler's or language's decisions, but it's unclear whether this situation might improve in a future language edition, or whether it's an unfortunate and permanent consequence of other (sensible) choices interacting.