Lifetime elision rules for methods with generic param type

I've run into a situation where the compiler seems to be applying a lifetime elision rule that isn't described in the documentation. Here is a minimal program to demonstrate:

struct Container<T> {
    value: T
}

impl<T> Container<T> {
    fn explicit_param_type(&mut self, v: &str) {
    }

    fn generic_param_type(&mut self, v: T) {
    }
}

fn main() {
    let mut h: Container<&str> = Container { value: "foo" };

    {
        let s = String::from("Hello World");
        h.explicit_param_type(&s[..]);  // compiler allows this
        h.generic_param_type(&s[..]); // compiler complains about this
    }

    println!("{}", h.value)
}

The compiler is fine with the call to explicit_param_type, but complains about the call to generic_param_type with this error:

error[E0597]: `s` does not live long enough
  --> src/main.rs:19:31
   |
19 |         h.generic_param_type(&s[..]);
   |                               ^ borrowed value does not live long enough
20 |     }
   |     - `s` dropped here while still borrowed
21 | 
22 |     println!("{}", h.value)
   |                    ------- borrow later used here

The only difference between the two methods is that explicit_param_type has a parameter of type &str, but generic_param_type has a parameter of type T, which in this case resolves to &str.

Based on the lifetime elision rules in The Rust Programming Language and The Rust Reference, I would have expected the self and v parameters of both methods to each get their own distinct lifetime parameter. That seems to be the case for explicit_param_type(), but for generic_param_type() the compiler appears to infer the same lifetime for self and v, and therefore complains because the method argument has a shorter lifetime than self. I can't find anything in the documentation that explains this behavior; from my reading of the docs, both method calls in the sample program should compile.

Is there somewhere in the docs that explains why the compiler infers different lifetimes for a method parameter when its type is generic?

Thanks.

These methods are not the same after lifetime elision, because they don't have the same number of elided lifetimes. I'll rewrite the methods as free functions to try to make the problem more obvious:

fn explicit_param_type<T>(&mut Container<T>, &str);
fn generic_param_type<T>(&mut Container<T>, T);

After applying the rules of lifetime elision, these become:

fn explicit_param_type<'a, 'b, T>(&'a mut Container<T>, &'b str);
fn generic_param_type<'a, T>(&'a mut Container<T>, T);
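
You can check this desugaring yourself by writing the lifetimes out explicitly as methods; these compile and behave exactly like the elided originals (the _desugared suffix is mine, just to avoid clashing with your method names):

impl<T> Container<T> {
    // `self` and `v` each get an independent elided lifetime.
    fn explicit_param_type_desugared<'a, 'b>(&'a mut self, v: &'b str) {}

    // `v: T` contributes no elided lifetime at all; only `self` does.
    fn generic_param_type_desugared<'a>(&'a mut self, v: T) {}
}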

Neither 'a nor 'b is meaningful on its own, because each appears only once. However, in your example you call these methods with T = &'c str, leading to the following monomorphized signatures:

// signatures as seen by the borrow checker in fn main()
fn explicit_param_type<'a, 'b, 'c>(&'a mut Container<&'c str>, &'b str);
fn generic_param_type<'a, 'c>(&'a mut Container<&'c str>, &'c str);
  • explicit_param_type says "I'll accept a string with any lifetime".
    (in practice, this means: "I will parse this &str into an owned value, so a temporary string is ok")
  • generic_param_type says "I'll accept a string with the same lifetime as the strings I hold."
    (in practice, this means: "I might store this &str inside the container, so it had better remain valid!"; see the sketch below)

One more point: The fact that the methods take &mut self plays a role here as well. If the methods were instead:

fn explicit_param_type<'a, 'b, 'c>(&'a Container<&'c str>, &'b str);
fn generic_param_type<'a, 'c>(&'a Container<&'c str>, &'c str);

then both would work just fine, because given an arbitrary &Container<&'c str> and a &'b str, covariance allows the compiler to choose an even shorter lifetime for 'c: the intersection of 'b and 'c. Notice how neither of these methods is capable of, e.g., storing the string inside the container, because they can't mutate it, so this cannot be used to circumvent memory safety.
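
Here is the shared-reference variant written out as a complete program, just to show that it really does compile (the free functions and their bodies are mine, standing in for &self methods):

struct Container<T> {
    value: T,
}

fn explicit_param_type<T>(c: &Container<T>, v: &str) {
    // Read-only access; `v` cannot be stored into `c`.
    let _ = (c, v);
}

fn generic_param_type<T>(c: &Container<T>, v: T) {
    // Also read-only: `c` is a shared reference, so nothing can be stored.
    let _ = (c, v);
}

fn main() {
    let h: Container<&str> = Container { value: "foo" };

    {
        let s = String::from("Hello World");
        explicit_param_type(&h, &s[..]); // fine, as before
        generic_param_type(&h, &s[..]);  // now also fine: 'c can shrink for this call
    }

    println!("{}", h.value);
}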

In the case of &mut self methods, however, storing that string is possible, and so we must ensure it lives long enough. That's why &mut U is invariant in U, forbidding the compiler in this case from choosing a shorter lifetime for 'c.
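
And to close the loop on the original &mut version: the call is accepted as soon as the borrow really does live long enough. For example, a sketch that just reorders the declarations in your main, keeping your Container and generic_param_type as they are:

fn main() {
    // `s` now outlives `h`, so &s[..] is valid for the whole time the
    // container could be holding it.
    let s = String::from("Hello World");
    let mut h: Container<&str> = Container { value: "foo" };

    h.generic_param_type(&s[..]); // accepted: the argument outlives `h`

    println!("{}", h.value);
}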


Thank you for the fantastic answer! That all makes sense now.
