Difference in lifetime between passing trait and concrete struct as argument to function

Hi. I am new to Rust and I am playing around with async/await/tokio, although I have the impression my problem has nothing to do with that and is rather a general problem with lifetimes.

Anyway, to give a working example, I am using dependencies

tokio = { version = "0.2.0-alpha.4" }
futures-preview = { version = "=0.3.0-alpha.18", features = ["compat", "async-await", "nightly"] }

I have the following snippet:

extern crate tokio;

struct My<'a> {
    text: &'a str,
}

trait Guy<'a> {
    fn get_text(&'a self) -> &'a str;
}

impl <'a> Guy<'a> for My<'a> {
    fn get_text(&'a self) -> &'a str {
        self.text
    }
}

async fn hello<'a>(my: impl Guy<'a> + 'a) {
    println!("{}", my.get_text());
}

#[tokio::main]
async fn main() {
    let my = My{text: "a"};
    hello(my).await;
}

When I try to build, it fails with the following error:

error[E0597]: `my` does not live long enough
  --> src/main.rs:18:20
   |
17 | async fn hello<'a>(my: impl Guy<'a> + 'a) {
   |                -- lifetime `'a` defined here
18 |     println!("{}", my.get_text());
   |                    ^^-----------
   |                    |
   |                    borrowed value does not live long enough
   |                    argument requires that `my` is borrowed for `'a`
19 | }
   | - `my` dropped here while still borrowed

However, if I replace the function hello with

async fn hello<'a>(my: My<'a>) {
    println!("{}", my.get_text());
}

the code compiles and runs as expected.

So I would like to understand what the difference in lifetimes is between the trait and the concrete struct, such that I only get a problem with lifetimes in the first case.

Thanks!

My intuition here is that 'a is an early bound lifetime parameter, to quote this post:

Unlike the previous case, the lifetime parameter 'a appears within the type bound for I, which means that it is early bound. In case you forgot from my previous post, this means that we cannot wait until the fn is called to decide what lifetime to use for 'a, but rather must choose one immediately.

So in other words (someone correct me if I'm wrong), the compiler doesn't know what the lifetime 'a is, so it errors out. (I'm wrong, see this post) There are (at least) a few ways of fixing this though:

trait Foo<'a> {
    fn foo(&'a self) -> &'a str;
}

struct Bar(String);

impl<'a> Foo<'a> for Bar {
    fn foo(&'a self) -> &'a str {
        &self.0
    }
}

// 1) Add a reference (the compiler knows `x` lives long enough)
fn func1<'a>(x: &'a impl Foo<'a>) {
    println!("{}", x.foo());
}

// 2) HRTB (assure the compiler that for any lifetime 'r it's ok)
fn func2<T>(x: T)
where
    for <'r> T: Foo<'r> + 'r
{
    println!("{}", x.foo());
}

Otherwise, you don't actually need the lifetime in the trait you gave, so that's an option too:

// 3) No lifetime parameter
trait Foo {
    fn foo(&self) -> &str;
}

struct Bar(String);

impl Foo for Bar {
    fn foo(&self) -> &str {
        &self.0
    }
}

// O.K.
fn func(x: impl Foo) {
    println!("{}", x.foo());
}

Ok, thanks! I am trying to understand, although I am having a hard time reading the post you linked. However, what can I do if the trait object is constructed inside the function itself rather than passed as an argument? Example:

trait Foo<'a> {
    fn foo(&'a self) -> &'a str;
}

struct Bar<'a> {
    x: &'a str
}

impl <'a> Foo<'a> for Bar<'a> {
    fn foo(&'a self) -> &'a str {
        &self.x
    }
}

fn new<'a>(x: &'a str) -> Bar<'a> {
    return Bar{x: &x};
}

fn new_trait<'a>(x: &'a str) -> impl Foo<'a> {
    return Bar{x: &x};
}

// works
fn func1() {
    let x = new("a");
    println!("{}", x.foo());
}

// does not work
fn func2() {
    let x = new_trait("a");
    println!("{}", x.foo());
}

Ok, yeah, I'm a bit unsure about the fundamental issue here. This thread is basically your issue though. In your case, I would honestly just avoid putting a lifetime on self, which should almost never be done unless you have multiple lifetimes going on.

trait Foo<'a> {
    fn foo(&self) -> &'a str;
}

struct Bar<'a> {
    x: &'a str,
}

impl<'a> Foo<'a> for Bar<'a> {
    fn foo(&self) -> &'a str {
        self.x
    }
}

fn new_trait(x: &str) -> impl Foo {
    Bar { x: &x }
}

// works
fn func() {
    let x = new_trait("hello");
    println!("{}", x.foo());
}

Actually, you don't even need to return a reference to cause issues:

trait Foo<'a> { 
    fn foo(&'a self) {} 
}

struct Bar<'a>(&'a str);
impl<'a> Foo<'a> for Bar<'a> {}

fn new_foo(x: &str) -> impl Foo { Bar(x) }
fn use_foo<'a>(_: &'a impl Foo<'a>) {}

// error[E0597]: `x` does not live long enough
//   --> src/lib.rs:698:14
//    |
//698 |     use_foo(&x);
//    |              ^^ borrowed value does not live long enough
//699 | }
//    | |
//    | `x` dropped here while still borrowed
//    | borrow might be used here, when `x` is dropped and runs the 
//        destructor for type `impl vec::Foo<'_>`
fn test() {
    let x = new_foo("a");
    use_foo(&x);
}

I'll definitely defer to someone with a better understanding of the lifetime system here...since I'm really not sure what's going on.

First of all, and before I forget, thanks for posting the Cargo.toml [dependencies], which makes reproducing the error a piece of cake.

The theory as to what Rust is complaining about

Let's first consider a lifetime parameter: take, for instance,

fn foo<'a> (s: &'a str)
{
    ...
}

For how long is s borrowed? We cannot really know, since that is something chosen by the caller; but no matter how short that borrow is, 'a is a lifetime that spans the whole function body and beyond / 'a must end after the function foo returns.
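
To make that visible, here is a small sketch (a variation of foo that returns the borrow): the caller is free to keep using something tied to 'a after the call, so 'a necessarily extends past foo's body.

fn foo<'a> (s: &'a str) -> &'a str {
    s
}

fn caller() {
    let owned = String::from("...");
    let r = foo(&owned); // the caller chooses how long `owned` is borrowed here...
    println!("{}", r);   // ...and may keep using that borrow after `foo` has returned,
                         // so `'a` must span beyond `foo`'s body.
}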


Now let's imagine a type T that is generic over some lifetime 'a: T<'a>.

  • For the example, we will test two implementations:

    • covariant w.r.t. 'a

      struct T<'a> (
          &'a (),
      );
      
    • invariant w.r.t. 'a

      struct T<'a> (
          &'a mut &'a (),
      );
      

And let's study the following function:

fn foo<'a> (x: T<'a>)
{
    fn max_borrow<'a> (_: &'a T<'a>)
    {}

    max_borrow(&x);
}

  • covariant T<'a> compiles fine: Playground

  • invariant T<'a>, on the other hand, yields the compilation error you have been having:

    error[E0597]: `x` does not live long enough
      --> src/lib.rs:10:16
       |
    5  | fn foo<'a> (x: T<'a>)
       |        -- lifetime `'a` defined here
    ...
    10 |     max_borrow(&x);
       |     -----------^^-
       |     |          |
       |     |          borrowed value does not live long enough
       |     argument requires that `x` is borrowed for `'a`
    11 | }
       | - `x` dropped here while still borrowed
    

This happens because we want to borrow a T<'a> for the whole lifetime 'a; we usually kind of assume that this means borrowing the T<'a> for the whole of its own lifetime (as a binding), i.e., borrowing the T<'a> until it dies / is dropped.

It turns out that this is not always the case: as I said at the beginning of the post, a generic lifetime parameter of a function is assumed to last beyond the body of the called function (for obvious soundness reasons). Thus, when trying to borrow x for the lifetime 'a (in max_borrow), we were actually trying to hold the borrow beyond the end of the function. But since the function takes ownership of its T<'a> parameter, that parameter becomes local to the function and is thus dropped before the function returns. Hence the error.

The covariant case (the most usual one in practice) dodges this issue by shrinking (thanks to covariance) T<'a>'s lifetime into some inner anonymous lifetime that thus does not outlive the function, so the borrow no longer needs to be held that long, which makes it doable.

  • impl Trait<'a> (and dyn Trait<'a>) happen to conservatively be always invariant in 'a.
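
As a small sketch of that difference (Trait and Covariant here are just stand-in names): a covariant wrapper can have its lifetime shrunk, whereas a boxed dyn Trait<'a> cannot:

trait Trait<'a> {}

struct Covariant<'a> (
    &'a (),
);

// works: covariance lets a `Covariant<'long>` shrink into a `Covariant<'short>`
fn shrink_covariant<'short, 'long : 'short> (x: Covariant<'long>)
  -> Covariant<'short>
{
    x
}

// does not work: `dyn Trait<'a>` is invariant in `'a`, so the lifetime cannot shrink
fn shrink_dyn<'short, 'long : 'short> (x: Box<dyn Trait<'long> + 'short>)
  -> Box<dyn Trait<'short> + 'short>
{
    x
}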

Solution (in practice)

To solve this issue, as with many lifetime issues, one does not really need to understand them exactly: it suffices to spot &'a Foo<'a> as a "code smell" / potentially problematic pattern.

What the programmer most probably intends, more often than not, is that for some type Self (which can be Self = Foo<'a>), we just want to borrow Self, with no special constraint on the duration of this borrow.
Using &'a self = self: &'a Self = self: &'a Foo<'a> does not express that "I don't mind how long the borrow is" intent. On the contrary, it is telling Rust: "I want the borrow to last exactly 'a", where 'a is that lifetime parameter in Foo. Instead,

one must use &self = &'_ self = self: &'_ Self, where '_ is an elided and thus "free" generic lifetime parameter.

For instance,

trait Guy<'a> {
    fn get_text (self: &'a Self)
      -> &'a str;
}

becomes

trait Guy<'a> {
    fn get_text (self: &'_ Self)
      -> &'_ str;
}

which can then be simplified by dropping the now-unused lifetime parameter:

trait Guy {
    fn get_text (self: &'_ Self)
      -> &'_ str;
}

and, funnily enough, we can always name our free lifetime parameter, as long as the name is available. In this case, the name 'a is now free to use:

trait Guy {
    fn get_text<'a> (self: &'a Self)
      -> &'a str;
}

So, at the end of the day, we have just moved the genericity from the trait itself to the trait's associated function.

Result

struct My<'text> {
    text: &'text str,
}

trait Guy {
    fn get_text (self: &'_ Self)
      -> &'_ str;
}

impl<'text> Guy for My<'text> {
    fn get_text (self: &'_ Self)
      -> &'_ str
    {
        self.text
    }
}

async fn hello (my: impl Guy)
{
    println!("{}", my.get_text());
}

#[::tokio::main]
async fn main ()
{
    let my = My { text: "a" };
    hello(my).await;
}

Nice write up! This was the key realization for me:

impl Trait<'a> (and dyn Trait<'a> ) happen to conservatively be always invariant in 'a

For those looking for more about variance, the nomicon is a good reference.

Ah, impressive. I was looking at this earlier thinking that the issue might have to do with the bug in the compiler that causes it to assume that RPITs borrow from all input lifetimes. (there's an issue for this but I can't seem to find it right now!)

Because the example is so contrived, I assumed that there's more to the problem than shown here, and that there is a reason for it to have a lifetime bound. Under that assumption, I tried working around it with HRTB types like impl for<'a> Trait<'a>, and that at least seemed to work (for similar reasons, as the lifetime is no longer invariant).
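
For instance, in argument position that workaround looks something like this (a sketch, with the same Foo / Bar shapes as earlier in the thread):

trait Foo<'a> {
    fn foo(&'a self) -> &'a str;
}

struct Bar(String);

impl<'a> Foo<'a> for Bar {
    fn foo(&'a self) -> &'a str {
        &self.0
    }
}

// The bound holds for every lifetime, so the compiler is free
// to pick a short one when `foo` is called inside the function.
fn use_foo(x: impl for<'a> Foo<'a>) {
    println!("{}", x.foo());
}

fn main() {
    use_foo(Bar(String::from("a")));
}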


There is also this other very lengthy and detailed post I have written quite recently: Looking for a deeper understanding of PhantomData - #4 by Yandros

In there, I show a macro to test subtyping (and thus variance):

macro_rules! const_assert_subtypes {(
    for [<$($generics:tt)*]
    $T1:ty : $($T2:tt)*
) => (
    const _: () = {
        fn foo<$($generics)* (x: *const $T1)
          -> *const $($T2)*
        {
             x
        }
    };
)}

trait Trait<'lifetime> {}

const_assert_subtypes!(
    for [<'short, 'long : 'short>]
    dyn Trait<'long> : dyn Trait<'short>
);

// Same for dyn Trait<'short> : dyn Trait<'long>
// And for impl Trait
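
For contrast, the same assertion with a covariant type, such as a plain shared reference, does compile (a quick sketch):

const_assert_subtypes!(
    for [<'short, 'long : 'short>]
    &'long () : &'short ()
);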

@Yandros thanks a lot for the full explanation. That's extremely helpful!

Now I do understand how to fix the code snippet I posted above. May I still ask for one clarification, however: the reason why I wrote &'a self in the first place was to make the function send compile in the following snippet:

use tokio::sync::mpsc; // 0.1.22

struct Bar<'a> {
    s: String,
    sender: mpsc::Sender<Data<'a>>,
}

struct Data<'a> {
    s: &'a str,
}

impl <'a> Bar<'a> {
    fn send(&'a mut self) {
        self.sender.try_send(Data{s: &self.s});
    }
}

fn main() {
    let (snd, _) = tokio::sync::mpsc::channel::<Data>(100);
    let mut b = Bar{s: String::from("hello"), sender: snd};
    b.send();
}

The problem is: if send takes &'a mut self, then main does not compile. If I change send to take &mut self, then send itself does not compile anymore. Playground. I have the impression I am doing something totally wrong; I would just like to understand what.

By the way, if you have any other comprehensive references on rust lifetimes, I'd be interested to read them. Thanks!

You are setting up a self-referential struct. Your code is similar to

struct SelfReferential<'slf> {
    s: String,
    at_s: Option<&'slf str>,
}

impl<'slf> SelfReferential<'slf> {
    fn new (s: String)
      -> Self
    {
        Self { s, at_s: None }
    }

    fn init (self: &'slf mut Self) // borrow until `Self` dies
    {
        self.at_s = Some(&*self.s);
    }
}

In the best case scenario, .init() borrows its argument until it is dropped, thus making it unusable.
In the worst case scenario, as in your example, the 'slf lifetime spans in practice beyond the lifetime of the SelfReferential struct (as I showed in the previous post with T<'a>), and you simply cannot borrow it for that long.

You thus need to get rid of the self-referential setup. So Data cannot be tied to any borrow / lifetime, and should instead use an owned variant of str, such as String / Box<str> or Rc<str> / Arc<str>.

Then, for your .send() method, you have two choices:
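
For instance, a rough sketch of the two options, with Data now owning its String (the Bar1 / Bar2 names and fields are just for illustration):

use tokio::sync::mpsc; // 0.1.22, as above

struct Data {
    s: String,
}

// Option 1: keep a `String` in the struct and clone it on every send.
struct Bar1 {
    s: String,
    sender: mpsc::Sender<Data>,
}

impl Bar1 {
    fn send(&mut self) {
        // copies the whole string contents on each call
        let _ = self.sender.try_send(Data { s: self.s.clone() });
    }
}

// Option 2: keep an `Option<String>` and move the string out when sending.
struct Bar2 {
    s: Option<String>,
    sender: mpsc::Sender<Data>,
}

impl Bar2 {
    fn send(&mut self) {
        // panics if `send` is called a second time
        let s = self.s.take().expect("string was already sent");
        let _ = self.sender.try_send(Data { s });
    }
}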

If you intend to call .send() many times, then neither option is good: the former copies the String contents every time, the latter just straight up panic!s.

The solution then is to go with the .clone() option, but with a .clone() that, instead of copying the String contents, copies just a pointer (shallow copy). This is achieved by using Arc<str>:
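
Something along these lines (a sketch):

use std::sync::Arc;
use tokio::sync::mpsc; // 0.1.22, as above

struct Data {
    s: Arc<str>,
}

struct Bar {
    s: Arc<str>,
    sender: mpsc::Sender<Data>,
}

impl Bar {
    fn send(&mut self) {
        // `Arc::clone` only bumps a reference count; the string data itself is shared
        let _ = self.sender.try_send(Data { s: Arc::clone(&self.s) });
    }
}

fn main() {
    let (snd, _recv) = mpsc::channel::<Data>(100);
    let mut b = Bar { s: Arc::from("hello"), sender: snd };
    b.send();
    b.send(); // can be called as many times as needed, without copying the string
}

Rc<str> would work the same way if everything stays on one thread; Arc<str> is the thread-safe version, which is generally what you want with tokio.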


Thanks! That was really helpful!
