Why is the mutable borrow not dropped at end of the block?

Why is the mutable borrow not dropped in this code?

    enum Ref<'a> {
        Root,
        Child{parent: &'a mut Ref<'a>}
    }

    impl <'b, 'a: 'b> Ref<'a> {
        pub fn hello(&self) -> String {"hello".to_owned()}
        
        // Compilation fails here if I return Ref<'b>
        pub fn child(&'a mut self) -> Ref<'a> {
            Ref::Child {parent: self}
        }
    }


    #[test]
    pub fn test_ref() {
        let mut root = Ref::Root;

        {
            let child = root.child();
            child.hello();
        }

        // ERROR: cannot borrow `root` as immutable because it is also borrowed as mutable
        root.hello();

    }

I expected the child variable to go out of scope at the end of the inner block, dropping the mutable borrow.
How can I make it work? The requirement is that the child never outlives the root; I really don't get the underlying logic that prevents this from compiling.

tl;dr: make it pub fn child(&mut self) -> Ref<'_>

Note that 'a is removed from &mut self. &'a mut self means taking unique access to self for the entire lifetime 'a, and 'a is the lifetime parameter of the Ref<'a> type itself, so self ends up borrowed for its whole lifetime.


@Hyeonu
Implementing it as you suggested:

    impl <'a> Ref<'a> {
        pub fn hello(&self) -> String {"hello".to_owned()}

        pub fn child(&mut self) -> Ref<'_> {
            Ref::Child {parent: self}
        }
    }

throws this compilation error:

error[E0495]: cannot infer an appropriate lifetime for lifetime parameter `'a` due to conflicting requirements
   --> core/src/error.rs:208:13
    |
208 |             Ref::Child {parent: self}
    |             ^^^^^^^^^^
    |
note: first, the lifetime cannot outlive the anonymous lifetime #1 defined on the method body at 207:9...
   --> core/src/error.rs:207:9
    |
207 | /         pub fn child(&mut self) -> Ref<'_> {
208 | |             Ref::Child {parent: self}
209 | |         }
    | |_________^
    = note: ...so that the expression is assignable:
            expected error::test::Ref<'_>
               found error::test::Ref<'_>
note: but, the lifetime must be valid for the lifetime 'a as defined on the impl at 203:11...
   --> core/src/error.rs:203:11
    |
203 |     impl <'a> Ref<'a> {
    |           ^^
    = note: ...so that the expression is assignable:
            expected &mut error::test::Ref<'_>
               found &mut error::test::Ref<'a>

&'a mut self means to obtain unique access of self during lifetime 'a

I am really confused :thinking:
It makes sense, but I don't see why the borrow should extend beyond the inner scope, where the borrower itself is dropped.

I am also quite puzzled by such a seemingly simple situation. It might boil down to the following (take it with a grain of salt though, I'm no expert on the type system / lifetime stuff):

child() takes a &'a mut Self where Self is of type Ref<'a>. So the full type that child() takes is &'a mut Ref<'a>.
What you want to return from child() is some Ref<'b> where 'a: 'b, meaning that the parent field of the return value needs to have type &'b mut Ref<'b>. And this is the issue I believe, since you cannot convert self from &'a mut Ref<'a> to &'b mut Ref<'b>.
If we look into the Nomicon under "Variance", we see that a type &'a mut T is covariant with respect to 'a but invariant with respect to T. Since in our case T = Ref<'a>, we know that &'b mut Ref<'b> cannot be a supertype of &'a mut Ref<'a>.

Not quite certain whether this observation is correct.

Edit: I guess what you need to do to fix this is to separate the lifetimes. In practice that probably means making Ref<_> its own wrapper type and not conflating it with the "owning" Root case. Haven't thought this through yet though.


When you define root, it is given the type Ref<'a> for some yet-to-be-determined lifetime 'a, which will default to 'static if nothing else imposes constraints on it. Note that since the lifetime 'a is part of the type of root, it cannot be smaller than the actual lifetime of root.

When you call child on root, your method has the following signature:

  1. Given a mutable reference with the lifetime 'a:
  2. Return a Ref<'a> with the same lifetime.

The important thing here is that child uses the 'a from the type directly. This means that the lifetime of the reference given to child is equal to the one bound to the type.

So when you call child, the compiler creates a mutable reference to root with the lifetime 'a and passes it to child. Since child requires that this lifetime be equal to the one bound in the type of root, and since the lifetime bound in root's type lasts at least as long as root itself, this mutable borrow is also set to live at least as long as root.

This is why it is still borrowed after the scope.

The reason this usually works is that the reference passed to such a method is normally assigned a fresh lifetime that is only valid inside the scope, so the borrow also expires at the end of the scope.

It is the lifetime of the reference that decides how long it is borrowed. How long the returned object lives has nothing to do with it, besides that it is required to not live longer than the lifetime.


@alice @dthul
Thanks for your explanations, the logic behind it is quite clear now.
Anyway, I am still not able to find a working implementation :sweat_smile:
I am puzzled because the situation is apparently quite simple; would you have any practical code sample that solves it?

The code in this playground compiles.

The trick is using &'a Ref<'a> instead of &'a mut Ref<'a>.

The pattern &'a mut X<'a> is a very counter-intuitive type: whenever an X<'a> is mutably borrowed for its own lifetime 'a, then for all intents and purposes the original X<'a> is no longer usable:

  • First, if you try to borrow it for a shorter lifetime 'b (this is indirectly what @Hyeonu suggested), as in &'b mut X<'a>, you cannot get a &'b mut X<'b>.
    In technical terms, this is because the type F<T> = &'b mut T is invariant (in T) for every 'b, which is necessary for soundness reasons. In plain English: Ref<_> is behind a mut reference, which forbids this kind of "simplification".

  • Second, if you do borrow it for the lifetime 'a of Ref<'a> (not to be confused with a classic free lifetime parameter <'a> on a function), you are by definition borrowing the Ref<'a> until it is dropped. This is independent of any scope you may wrap around the value returned by the function.

    For instance, here is a degenerate example:

    use std::marker::PhantomData;

    struct Foo<'foo>(PhantomData<&'foo ()>);

    impl<'foo> Foo<'foo> {
        fn borrow_until_it_dies (self: &'foo mut Self)
          -> () // returns nothing
        {} // does nothing

        fn borrow_until_the_returned_thinggy_is_no_longer_used<'a> (
            self: &'a mut Self,
        ) -> PhantomData<&'a ()>
        {
            PhantomData
        }
    }
    
    fn main ()
    {
        let mut foo = Foo(PhantomData);
        {
            let returned_thinggy =
                foo.borrow_until_the_returned_thinggy_is_no_longer_used()
            ;
            // foo.borrow_until_the_returned_thinggy_is_no_longer_used() // ERROR
            drop(returned_thinggy);
            let _ = foo.borrow_until_the_returned_thinggy_is_no_longer_used();
            let _ = foo.borrow_until_the_returned_thinggy_is_no_longer_used();
        }
        {
            drop(foo.borrow_until_it_dies());
        } // whatever was returned must be dropped by now
        // And yet ... the following line causes a compilation error
        let _ = &foo; // ERROR: cannot borrow `foo` because it is still mutably borrowed
    }
    
    • Playground

    • As you can see, the only difference between the two functions is that one uses the special 'foo lifetime of the Self = Foo<'foo> type, whereas the other uses a "free" lifetime parameter 'a.

Solution

The best solution is to not use that many lifetime parameters at all, and to use 'static (runtime-managed) references such as Arc: enum Ref { Root, Child { parent: Arc<Ref> } }

But if you really want to experiment with lifetime-based / compile-time-managed references, then you need to avoid using mut for the reference: parent: &'a Ref<'a>

  • if you really need mutation, you will need to use interior mutability, such as Cell on a node value (e.g., if that value is an integer), or RefCell

Then, if you use no interior mutability (or just wrap the nodes values, as with the suggested Cell case), then the signature (self: &'_ Ref<'a>) -> Ref<'_> will work fine thanks to covariance ("lifetime simplifications" are not disrupted by being behind a shared reference & _).

Otherwise (in the RefCell case), you will get the "infinite borrow issue", except for the fact that since this time the borrows are non-exclusive (& _), it won't hurt ergonomics that much; you will just be unable to move the initially borrowed value.


@Yandros
thanks, it was really useful!
As I need interior mutability, I'll apply one of the solutions you suggested.

Nevertheless, I am still convinced that the borrow checker could do a better job here and allow the use of root once child is out of scope. I am tempted to open a GitHub issue for this, as I don't see any soundness issue with it.

I am curious whether a higher-level document on the order of precedence of these rules could capture most of this. It is necessarily complex, and yet I notice your answers follow a pattern.