I have been playing around with Rust traits and have hit a roadblock I don't see how to solve. The problem is a bit convoluted, so I will try to explain it as simply as I can.
My code is organized using general traits of the form "trait that returns some specific trait as an associated type", like this:
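As a sketch of that pattern (all names here are hypothetical, not the original code): a trait exposing a GAT that is itself constrained to implement some other trait might look like this:

```rust
use std::fmt::Debug;

// Hypothetical sketch: `Lender` returns, via the GAT `Lent`,
// a type that must implement some other trait (`Debug` here).
trait Lender {
    // The compiler asks for `where Self: 'a` on GATs used from `&self`.
    type Lent<'a>: Debug
    where
        Self: 'a;

    fn lend(&self) -> Self::Lent<'_>;
}

// Toy implementation lending out a borrowed string slice.
struct Words(Vec<String>);

impl Lender for Words {
    type Lent<'a> = &'a str
    where
        Self: 'a;

    fn lend(&self) -> Self::Lent<'_> {
        &self.0[0]
    }
}
```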
The problem is that the lifetime `'a` doesn't exist in that context. I have tried using a where clause like this:
```rust
trait AnotherSpecificTrait: AnotherTrait
where
    for<'a> <Self as AnotherTrait>::Type<'a> = TypeStructWithLifetime<'a>,
{
}
```
This works if the bound is another trait, but not if it is a struct, because the compiler doesn't allow `=` in a where clause.
At this point I have run out of ideas, and I wonder if it is possible to express the idea of "a Trait that restricts the GAT of another Trait to a specific struct" and if there is a good reason this is not allowed.
Thanks a lot. I didn't realize you could introduce a `for<'a>` in the trait declaration (actually, I don't think I have seen a `for` clause there before). The `'static` lifetime is indeed too restrictive, but that's not a problem with the trait declaration.
If I force the GAT not to have a `where Self: 'a` bound, everything works as expected.
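For concreteness, here is a compiling sketch of that working combination (names are hypothetical, mirroring the snippet above): when the GAT carries no `Self: 'a` bound, the equality can be expressed inside a higher-ranked supertrait bound rather than a where clause:

```rust
trait AnotherTrait {
    // Note: no `where Self: 'a` bound on the GAT.
    type Type<'a>;
}

struct TypeStructWithLifetime<'a>(&'a str);

// The `=` goes inside the trait bound itself, quantified by `for<'a>`,
// which the compiler accepts, unlike `=` in a where clause.
trait AnotherSpecificTrait:
    for<'a> AnotherTrait<Type<'a> = TypeStructWithLifetime<'a>>
{
}

struct Concrete;

impl AnotherTrait for Concrete {
    type Type<'a> = TypeStructWithLifetime<'a>;
}

// This impl only compiles because the equality bound holds for `Concrete`.
impl AnotherSpecificTrait for Concrete {}

// Callers can now require the restricted trait.
fn takes_specific<T: AnotherSpecificTrait>(_t: T) -> bool {
    true
}
```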
I have been tinkering a bit more with the code, and it seems there is no easy way to remove the where Self: 'a bound from the GAT.
I found a workaround by adding a method to the trait which uses the GAT but not the object itself, like in this post. It seems brittle, but I've seen no problems so far.
The compiler error points to this comment, which does lay out a workaround:
= note: this bound is currently required to ensure that impls have maximum flexibility
= note: we are soliciting feedback, see issue #87479 <https://github.com/rust-lang/rust/issues/87479> for more information
This breaks my code. Workaround?
First, if any code breaks from adding the required bounds, we really want feedback. Second, the workaround is to move the GAT into a super trait. Using the example above, our new code would look like:
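Concretely, the supertrait workaround might look like the following sketch (names are illustrative, not the original code): the GAT moves into a method-less supertrait, where no `Self: 'a` bound is demanded, and the subtrait keeps the methods:

```rust
// The GAT lives in a supertrait with no methods, so the compiler does
// not require a `where Self: 'a` bound on it.
trait LendItem {
    type Item<'a>;
}

// The subtrait adds the methods that use the inherited GAT.
trait Lender: LendItem {
    fn lend(&mut self) -> Self::Item<'_>;
}

// Toy implementation: lends a reference to an internal counter.
struct Counter(u32);

impl LendItem for Counter {
    type Item<'a> = &'a u32;
}

impl Lender for Counter {
    fn lend(&mut self) -> Self::Item<'_> {
        self.0 += 1;
        &self.0
    }
}
```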
Looking through the discussion, this comment already reports an (at least somewhat) similar-looking code example. I haven't read through all the answers, though.
I did not participate in the discussion about GATs, and the more I look at it, the less I understand the need for the `Self: 'a` bound. So far the borrow checker seems to understand that whatever `lendme()` returns is valid only as long as the lender is alive and unchanged.
As far as I know, GATs were eventually stabilized even though some usability issues remained. The benefits of stabilizing were judged to outweigh the downsides of the remaining issues, and there were no known shortcomings that couldn't be fixed later in a backwards-compatible manner.
The desire to help the user not forget useful `Self: 'a`-style bounds (which ideally never should have a downside) is what motivates these errors, not any soundness concern that would actually make them necessary. As far as I recall, the motivation for making this an error is that it remains easy to relax the constraints in the future, rather than going the opposite way and turning something that works into an error. This is thus in line with the approach of stabilizing GATs in a way that doesn't preclude future fixes for the remaining usability issues.
which ideally never should have a downside
For example, in the code at hand, the actual issue is arguably that the `for<'a> …` bound doesn't work. Ideally, either implicitly or with some explicit syntax, such a `for<'a>` trait bound could be limited to quantify only over lifetimes satisfying `Self: 'a`.