Borrow lasts too long if the type that borrows is generic

Consider this struct:

use std::marker::PhantomData;

struct Getter<T>(PhantomData<T>);

impl<T> Getter<T> {
    fn get_thing<'a>(&'a mut self) -> Thing<'a> {
        panic!();
    }
}

This mimics an API that returns something that holds a borrow: for example, a deserializer returning structs that borrow data from an internal buffer.

Please ignore T for now; it will become relevant in a moment. The 'a is not strictly necessary here (it could be elided), but I added it for clarity.

Here's the Thing that Getter returns:

struct Thing<'a>(&'a ());

Thing implements a trait, Trait. This will become relevant in a moment:

trait Trait<'a> {}

impl<'a> Trait<'a> for Thing<'a> {}

Here's an example program that uses Getter in a loop:

fn main() {
    let mut getter = Getter::<Thing>(PhantomData);

    loop {
        let _thing = getter.get_thing();
    }
}

This compiles just fine. _thing borrows getter, but _thing is obviously dropped at the end of the loop iteration, so this borrow doesn't interfere with the next call to get_thing.

There's a single change that will make this program break: instead of returning Thing from get_thing, return a generic type that implements Trait.

struct Getter<T>(PhantomData<T>);

impl<T> Getter<T> {
    fn get_thing<'a>(&'a mut self) -> T where T: Trait<'a> { // return type in this line changed
        panic!();
    }
}

Now the example program no longer compiles:

error[E0499]: cannot borrow `getter` as mutable more than once at a time
 --> src/main.rs:8:22
  |
8 |         let _thing = getter.get_thing();
  |                      ^^^^^^ mutable borrow starts here in previous iteration of loop

I don't understand why it's not working. In my mind, literally nothing has changed from before:

  • _thing is still dropped at the end of the loop iteration, so it shouldn't interfere with the call in the next one.
  • Nothing about the lifetime is new. Thing already implemented Trait, and Trait's lifetime was already tied to the lifetime of Thing.
  • The compiler even knows that T is Thing!

I'm wondering if I'm missing something, and if there's some way to get this to compile. I can't seem to find a way.

This will compile:

fn get_thing<'a, 'b>(&'a mut self) -> T where T: Trait<'b> {
    panic!();
}

I can't put the precise rule into words, but when I see &mut I expect to have to split the lifetime parameter of the borrow itself from all the other lifetimes that may appear in the signature. Variance makes immutable borrows much more forgiving.

Thank you for the reply!

Unfortunately your solution is not applicable beyond this simple example. As I said above, Getter is meant to mimic an API that returns something that borrows from the API itself, for example a deserializer. In my actual application, this is based on Serde, but I didn't want to complicate this example with too many dependencies.

I'll demonstrate. Let's extend Trait like this:

trait Trait<'a> {
    fn deserialize(buf: &'a [u8]) -> Self;
}

Thing's implementation of Trait could then turn into something like this:

impl<'a> Trait<'a> for Thing<'a> {
    fn deserialize(buf: &'a [u8]) -> Self {
        Self(buf)
    }
}

And Getter would become this:

struct Getter<T> {
    buf: [u8; 256],
    _t:  PhantomData<T>,
}

impl<T> Getter<T> {
    fn new() -> Self {
        Self {
            buf: [0; 256],
            _t:  PhantomData,
        }
    }

    fn get_thing<'a>(&'a mut self) -> T where T: Trait<'a> {
        T::deserialize(&self.buf[..])
    }
}

This is closer to my real scenario. If I apply your suggestion now, get_thing no longer compiles:

error[E0495]: cannot infer an appropriate lifetime for lifetime parameter in function call due to conflicting requirements
  --> src/main.rs:32:25
   |
32 |         T::deserialize(&self.buf[..])
   |                         ^^^^^^^^^^^^
   |

I did some more experimentation (after all, it's much easier now that I have a minimal example) and thought about this some more. I think the problem is that the compiler just sees that 'a is the same as the lifetime argument of Trait, but doesn't know what this means.

It doesn't understand when this lifetime might end. If I return Thing instead of T (as in my initial example above), it understands that the lifetime is bound to the life of the Thing, and that the borrow ends once Thing is dropped.

If this is right, the question becomes whether there's a way to make the compiler understand that the borrow ends when T is dropped. I tried adding new lifetimes, and various bounds between them, but I can't come up with a way to express this.

Can you introduce another lifetime which is shorter than 'a?

fn get_thing<'smaller, 'this: 'smaller>(&'this mut self) -> T
where 
    T: Trait<'smaller> 
{
    T::deserialize(&self.buf[..])
}

If the receiver were an immutable reference (&self), variance would figure out an appropriate region for 'this and 'smaller, but because it's &mut we need to say explicitly that 'this outlives 'smaller.

Thank you for the reply.

Unfortunately that doesn't work either. With your suggestion, get_thing compiles, but the example program still fails with the same error (so nothing changes compared to using only one lifetime).

If my theory is correct, then this makes sense: We've told the compiler that 'this outlives 'smaller, but it still knows nothing about how long 'smaller actually lives. It still doesn't understand that the borrow ends once T is dropped.

Okay, now this is very interesting. It got me thinking:

I've changed &mut self to &self in get_thing:

fn get_thing<'a>(&'a self) -> T where T: Trait<'a> {
    T::deserialize(&self.buf[..])
}

With this change, everything compiles. By itself, that's not surprising, since multiple shared borrows are allowed at the same time.

However, it still works if I add this method:

fn do_thing_mut(&mut self) {}

And call it right after get_thing:

loop {
    let _thing = getter.get_thing();
    getter.do_thing_mut();
}

Now the compiler seems to understand that the borrow of getter ends once _thing is dropped.

Very weird; I don't understand it. But this workaround might be usable in my application. I'll have to look into that.

Short update: Unfortunately I can't just use &self in my application, as the Serde deserializer does some decoding in-place (i.e. requires a mutable reference to the buffer). Maybe I can come up with a working solution by splitting the API into multiple structs (one for reading into the buffer, one for deserializing from it). I'll have to try.
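To sketch the idea (untested, and reusing the extended Trait and Thing definitions from above, with Thing<'a> holding a &'a [u8]; Reader and read are placeholder names I just made up): the reading half takes &mut self but only hands out a shared slice, so the unique borrow ends with the iteration.

struct Reader {
    buf: [u8; 256],
}

impl Reader {
    fn new() -> Self {
        Self { buf: [0; 256] }
    }

    // Read and decode into the buffer in place; this needs &mut self, but the
    // unique borrow ends as soon as the returned shared slice is dropped.
    fn read(&mut self) -> &[u8] {
        // ... fill and decode self.buf here ...
        &self.buf[..]
    }
}

fn main() {
    let mut reader = Reader::new();

    loop {
        let buf = reader.read();
        // Deserialization only borrows the slice for this iteration, so the
        // next call to read can take &mut reader again.
        let _thing = Thing::deserialize(buf);
    }
}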

The reason it doesn't work is that only Thing<'a> implements Trait<'a>.

The contradiction gets smuggled in through this line:

let mut getter: Getter<Thing> = Getter::new();
//                     ^^^^^-- what's the lifetime here?

It's not obvious because the lifetime parameter of Thing has been elided, but you're making a promise here that there is some lifetime, let's call it '1, such that getter is of type Getter<Thing<'1>>. Importantly, since the type of getter is parameterized with this '1, '1 must outlive getter itself.

Now when you write

loop {
    let _thing = getter.get_thing();
}

The compiler knows that _thing must have the type Thing<'1> (that's what T is), and also that Thing<'1>: Trait<'a> must hold. Since the only impl is Trait<'a> for Thing<'a>, this forces 'a = '1. But that means you're borrowing getter for the entire lifetime '1! Since '1 must be valid for the entire lifetime of getter itself, you're taking a unique borrow of getter for its whole lifetime, which you clearly can't do twice.

The original non-generic code does not unify these lifetimes, because there are two different Thing lifetimes in play: the returned Thing<'a> and the Thing<'1> in getter's type are unrelated.
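Spelling the elided lifetimes out, the two signatures compare like this:

// Non-generic version: the returned Thing<'a> is tied only to this particular
// borrow of self; the '1 in Getter<Thing<'1>> never enters the picture.
fn get_thing<'a>(&'a mut self) -> Thing<'a>;

// Generic version: T is already fixed to Thing<'1> by getter's type, and the
// only impl gives Thing<'1>: Trait<'1>, so 'a is forced to equal '1.
fn get_thing<'a>(&'a mut self) -> T where T: Trait<'a>;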

In the first example you can make it compile by implementing Trait<'a> for Thing<'b> where 'b: 'a...

impl<'a, 'b: 'a> Trait<'a> for Thing<'b> {}

... but that also doesn't work in the final example.

The underlying problem is the lifetime '_ in Getter<Thing<'_>>. This is a promise you didn't mean to make. You wanted a type that provides all kinds of Things, but you got a type that provides one particular Thing.

You want Getter to be generic over a family of types, not just one concrete type. You could certainly achieve this with GATs, which unfortunately are still unimplemented. I know of some workarounds, but I'm not sure any of those apply to your case.
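Just to illustrate, a GAT-based version might look roughly like this (hypothetical syntax, since none of this compiles today; ThingFamily and ThingMarker are names I made up, and it reuses the Trait and Thing definitions from above):

// A family of Thing types, one per lifetime, instead of one concrete Thing<'1>.
trait ThingFamily {
    type Thing<'a>: Trait<'a>;
}

struct ThingMarker;

impl ThingFamily for ThingMarker {
    type Thing<'a> = Thing<'a>;
}

struct Getter<F> {
    buf: [u8; 256],
    _f:  PhantomData<F>,
}

impl<F: ThingFamily> Getter<F> {
    // The return type now names a different Thing for every borrow lifetime,
    // so the &mut borrow only has to last as long as the returned value.
    fn get_thing<'a>(&'a mut self) -> F::Thing<'a> {
        <F::Thing<'a> as Trait<'a>>::deserialize(&self.buf[..])
    }
}

main would then use Getter<ThingMarker>, and each call to get_thing would hand back a Thing whose borrow ends when it is dropped.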

Thank you, @trentj, that makes so much sense!

It also points to a solution: I think I can get away with moving the type parameter to get_thing. That doesn't accurately communicate the realities of the API (you can't actually get a different T every time), but it's certainly good enough for me.
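A quick sketch of what I mean (not my actual application; Getter loses its type parameter, and the Trait and Thing definitions are the ones from above):

struct Getter {
    buf: [u8; 256],
}

impl Getter {
    fn new() -> Self {
        Self { buf: [0; 256] }
    }

    // T moved from the struct to the method, so the caller picks it (and the
    // lifetime it borrows for) fresh on every call.
    fn get_thing<'a, T: Trait<'a>>(&'a mut self) -> T {
        T::deserialize(&self.buf[..])
    }
}

fn main() {
    let mut getter = Getter::new();

    loop {
        // Each iteration infers T = Thing<'_> with a lifetime that ends when
        // _thing is dropped, so the &mut borrow ends there too.
        let _thing: Thing = getter.get_thing();
    }
}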

The only thing I don't understand is why it started working when I changed the self argument of get_thing to &self. The lifetime should still be as you described, so the &self borrow of get_thing should conflict with the call to do_thing_mut.

Have you read the serde documentation on Deserializer lifetimes? It's an excellent guide, designed exactly for people in your situation.

I have, but couldn't find anything in there that applied to my situation.
