I can constrain a generic so that a type T implements a trait X which has an associated type which implements a trait Y (playground).
Once this associated type becomes a GAT (with a lifetime), this no longer seems possible (playground).
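For reference, the setup looks roughly like this — a reconstruction from the snippets below, since the playground links aren't inlined; the actual playground code may differ in details:

```rust
// Assumed definitions, reconstructed from the snippets in this question.
trait TraitOnOutput {
    fn it_works(&self) {
        println!("it works");
    }
}

// The GAT version: Output carries a lifetime parameter.
trait Resolve {
    type Output<'a>
    where
        Self: 'a;

    fn resolve(&self) -> Self::Output<'_>;
}

struct Thing<T>(T);

impl<T> Resolve for Thing<T> {
    type Output<'a> = &'a T where Self: 'a;

    fn resolve(&self) -> Self::Output<'_> {
        &self.0
    }
}

impl TraitOnOutput for &'_ i32 {}
```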
For example, it doesn't make sense that:
// references to i32 always implement trait
impl TraitOnOutput for &'_ i32 {}
// a type T can be passed to this and compile successfully
fn has_output_of_i32<T>(_: T)
where
for<'a> T: Resolve<Output<'a> = &'a i32>,
{
println!("has i32 output");
}
// but _not_ this, even though every T that can be passed to the above necessarily has an output implementing TraitOnOutput
fn has_trait_on_output<T>(_: T)
where
T: Resolve,
for<'a> <T as Resolve>::Output<'a>: TraitOnOutput,
{
println!("this won't work");
}
// Note that if 'a is lifted out to the function signature, this does compile, but then I can't call t.resolve(), because t doesn't outlive 'a.
fn has_trait_on_output<'a, T>(t: T)
where
T: Resolve + 'a,
<T as Resolve>::Output<'a>: TraitOnOutput,
{
    // error: can't call this, since `t` doesn't outlive `'a`
    let r = t.resolve();
r.it_works();
}
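One thing that does seem to work for this last variant (a sketch, with assumed definitions for Resolve/TraitOnOutput filled in, since the playground isn't inlined) is tying 'a to a reference parameter, so that the GAT's lifetime is anchored to an input lifetime rather than to the function body:

```rust
// Assumed definitions for illustration; the real playground code may differ.
trait TraitOnOutput {
    fn it_works(&self) {
        println!("it works");
    }
}

trait Resolve {
    type Output<'a>
    where
        Self: 'a;

    fn resolve(&self) -> Self::Output<'_>;
}

struct Thing<T>(T);

impl<T> Resolve for Thing<T> {
    type Output<'a> = &'a T where Self: 'a;

    fn resolve(&self) -> Self::Output<'_> {
        &self.0
    }
}

impl TraitOnOutput for &'_ i32 {}

// Taking `&'a T` anchors the GAT lifetime to an input lifetime, so a
// single (non-higher-ranked) bound suffices and `resolve` is callable.
fn resolve_and_use<'a, T>(t: &'a T)
where
    T: Resolve + 'a,
    <T as Resolve>::Output<'a>: TraitOnOutput,
{
    let r = t.resolve();
    r.it_works();
}

fn main() {
    resolve_and_use(&Thing(1i32));
}
```

This sidesteps rather than answers the `for<'a>` question, since the bound only has to hold for one concrete lifetime.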
I would expect the second function to compile, but I suspect there's an issue resolving the higher-ranked lifetime for<'a>.
My best guess is I'm looking for this magical syntax which doesn't exist (?):
fn has_trait_on_output<T>(_: T)
where
for<'a> (
<T as Resolve>::Output<'a>: TraitOnOutput,
T: Resolve + 'a
)
{
println!("this won't work");
}
Could anyone give some advice on how to tackle this, or on whether it's a type-system issue?
From the error message … is not implemented for `<_ as Resolve>::Output<'a>`, which contains an _ in place of the expected type Thing<i32>, I would guess that this might be a type-inference problem: the compiler somehow isn't properly aware that the parameter T of has_trait_on_output is supposed to be the type Thing<i32>.
The natural follow-up question is why type inference has any problems here. To that question, I don't have a good answer; it's worth searching the existing issues on GitHub and opening a new one if none covers this already.
I'm not familiar enough with the relevant compiler internals to say that it's actually an "inference problem" in the sense that inference itself is at fault. All I can say is that the problem appears to lie somewhere in the interplay between trait resolution and type inference, so it seems related to type inference. It might be related to normalization, too.
(Also, for a genuinely unknown type I would expect an entirely different error message, one complaining about an ambiguous type, suggesting an explicit turbofish, etc. I'm wondering whether an error message like "Foo is not implemented for Xyz" ever makes sense for a type Xyz containing a for-some-reason-unknown type indicated by an underscore, and what sensible diagnostics this way of displaying an unknown type with an underscore (presumably some kind of fallback) was even created for.)