While adding explicit lifetimes to some code (to help figure out why the borrow checker was complaining about the elided version), I accidentally ran across an example of something I can’t explain to myself. Can anyone explain why the following code has an error?
Nothing good ever comes from adding a lifetime to `self`.
`Foo<'a>` means it contains a reference inside, and for that reference to stay valid, it must live longer than the `Foo` (so that it never becomes invalid while the `Foo` is using it). When you give `self` that same lifetime, you are saying that the `Foo` lives longer than itself.
It can certainly lead to confusion (at least for me)! Is there a lint that can be turned on to warn about it?
Hmm … from reading various things about lifetimes and references (including the subtyping section in the Nomicon), I thought that the reference inside a `Foo<'a>` only needed to live at least as long as the `Foo`, so that giving `self` the same lifetime would just mean that the `Foo` lives at least as long as itself. (EDIT: after reading that section in the Nomicon again, I think I was wrong about that)
And now that I think about it, if the problem really is that `&'a mut Foo<'a>` implies a `Foo` that lives longer than itself, how come my first snippet compiles without error when the only change is removing the call to `do_two`:
The “at least as long” is technically true, but there’s always some evaluation and destruction order, so no two things have exactly the same lifetime. With `&` references the compiler can ignore the difference (lifetime subtyping), but with `&mut` it’s super picky about them (invariant lifetimes).
Yes, that’s what I used to think! Thanks to your help, I believe I now have a better understanding of lifetimes and borrows – namely, that:
When a reference goes out of scope, its borrow will end if and only if the compiler can “shrink” the lifetime of the borrow (via lifetime subtyping) to match the lifetime of that scope; and the compiler cannot do this in the case of certain invariant lifetimes.