Can someone explain this variance vs. borrowing code?

I am trying to understand how variance relates to borrowing and lifetimes. In this playground:

Why does switching T between Covariant and Contravariant break things? And why does changing _x to be a mutable reference (for T = Covariant) break things?
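Since the playground code isn't shown here, here is a hypothetical sketch of what such a pair of types might look like (the struct bodies and field names are my assumptions, not the original code), with the two coercion directions exercised:

```rust
// Hypothetical stand-in types (the original playground code isn't shown).

// Covariant in 'a: the lifetime only appears in a read-only reference,
// so Covariant<'long> coerces to Covariant<'short>.
struct Covariant<'a> {
    x: &'a str,
}

// Contravariant in 'a: the lifetime only appears in argument position of
// a fn pointer, so the coercion runs the other way.
struct Contravariant<'a> {
    f: fn(&'a str) -> usize,
}

fn main() {
    // Covariance: a 'static lifetime shrinks to a shorter one.
    let local = String::from("hi");
    let mut short = Covariant { x: &local }; // inferred Covariant<'local>
    assert_eq!(short.x, "hi");
    short = Covariant { x: "hello" }; // Covariant<'static> -> Covariant<'local>: OK
    assert_eq!(short.x, "hello");

    // Contravariance: a fn accepting any &str serves as Contravariant<'static>.
    let c: Contravariant<'static> = Contravariant { f: |s| s.len() };
    assert_eq!((c.f)("world"), 5);
}
```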

See Borrowing something forever - Learning Rust and read at least 10.3, 10.4, and 10.6 (especially 10.6). Come back if you have additional questions.


I will read much more than just those sections - since the whole reference looks like what I need. Thanks!

On this page: Lifetime bounds - Learning Rust
it says both:

When you have a function argument with a nested reference such as &'b Foo<'a>, a 'a: 'b bound is inferred.

and

A T: 'a means that a &'a T would not be instantly undefined behavior.

My confusion is with the first. I would only have expected Rust to infer Foo<'a> : 'b, not 'a : 'b, and that second line somewhat reinforces that expectation.

Why does Rust infer 'a : 'b, without caring about how 'a is used in Foo<'a> (it may be contravariant, for instance)?

I can't find a source at the moment[1], but when the compiler sees Foo<'a>, it assumes that contained somewhere inside is a reference with lifetime 'a. That means Foo<'a> cannot live longer than 'a, or else it would be holding a dangling reference. In other words, 'a outlives Foo<'a>, which outlives &'b Foo<'a>, thus 'a outlives 'b.


  1. However, this is easy to convince yourself of: the compiler simply refuses to let you define a Foo<'a> type without mentioning 'a somewhere inside the type. No matter what variance that 'a ends up with, Foo<'a> is not allowed to outlive it. ↩
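A minimal sketch of the dangling-reference argument above, assuming a Foo<'a> that wraps a &'a str (the struct body and the get_inner helper are hypothetical):

```rust
// Hypothetical Foo<'a>: assumed to hold a reference with lifetime 'a.
struct Foo<'a> {
    inner: &'a str,
}

// Taking &'b Foo<'a> relies on the inferred 'a: 'b bound: the outer
// borrow 'b must not outlive the inner reference 'a, or `inner` would
// dangle while still reachable through the outer reference.
fn get_inner<'a, 'b>(x: &'b Foo<'a>) -> &'a str {
    x.inner
}

fn main() {
    let s = String::from("hello");
    let foo = Foo { inner: &s };
    assert_eq!(get_inner(&foo), "hello");
}
```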

But I can define:

struct Contravariant<'a> {
  weird: fn(&'a ()),
}

fn fun<'a, 'b>(x: &'b Contravariant<'a>) { /* ... */ }

which has no reference inside it at all. Admittedly, this is a very esoteric case. Covariance and invariance are much more common.
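Esoteric, but the contravariance of that struct is observable at runtime. Here is a sketch (the widen helper and the call counter are mine, not from the thread): a Contravariant<'short> coerces to Contravariant<'long>, because fn(&'short ()) accepts strictly more inputs than fn(&'long ()).

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// The struct from the post: it stores no reference at all, only a fn
// pointer, yet 'a still takes part in the syntactic outlives bound.
struct Contravariant<'a> {
    weird: fn(&'a ()),
}

// Counter so we can observe that the coerced fn pointer really runs.
static CALLS: AtomicUsize = AtomicUsize::new(0);

fn bump(_: &()) {
    CALLS.fetch_add(1, Ordering::SeqCst);
}

// Contravariance lets the shorter-lifetime struct be returned where the
// longer-lifetime one is expected.
fn widen<'short, 'long: 'short>(c: Contravariant<'short>) -> Contravariant<'long> {
    c
}

fn main() {
    let c = Contravariant { weird: bump };
    let w: Contravariant<'static> = widen(c);
    (w.weird)(&());
    assert_eq!(CALLS.load(Ordering::SeqCst), 1);
}
```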

Outlives relations are syntactic, so if Foo<'a>: 'b, then 'a: 'b, and vice-versa. The motivations are at the top of the RFC.

Variance is a property of super/subtyping, i.e. which lifetimes can coerce to one another. Bounds checks don't consider variance, they consider a specific type, ignoring its possible coercions. The bound is a property of the type, not some supertype or anything else it can coerce to.

I think this will be easier to explain with an example. Let's consider the function pointer type fn(&'a str).

fn witness_outlives<'a, T: 'a>() {}

fn example<'long: 'short, 'short>() {
    // These are fine
    witness_outlives::<'long,  fn(&'long  str)>();
    witness_outlives::<'short, fn(&'short str)>();
    witness_outlives::<'short, fn(&'long  str)>();

    // These are errors
    witness_outlives::<'static, fn(&'long  str)>();
    witness_outlives::<'static, fn(&'short str)>();
    witness_outlives::<'long,   fn(&'short str)>();
}

Even though a fn(&'long str) cannot coerce to a fn(&'short str), fn(&'long str): 'short. And even though a fn(&'short str) can coerce to a fn(&'long str) or fn(&'static str) due to covariance, it does not satisfy a : 'long or : 'static bound.

Generally this isn't a problem when lifetimes are inferred, as contravariance will kick in:

fn witness_outlives_value<'a, T: 'a>(_: T) {}

fn example_value<'long: 'short, 'short>(
    contravar: fn(&'short str),
    covar: &'short str,
    invar: fn(&'short str) -> &'short str
) {
    // OK
    witness_outlives_value::<'long, _>(contravar);
}

I am still staring at this in disbelief:

Why? :face_with_spiral_eyes:

It's not the same meaning; without the annotation forcing a specific type, the function pointer is free to coerce to a fn(&'long str) or fn(&'static str). (When the compiler infers the type parameter, due to _ or being elided, it's not limited to the exact type of contravar.)
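That freedom can be made concrete. In this sketch (witness and example are assumed helper names, mirroring witness_outlives_value above), the `_` lets inference pick the coerced supertype fn(&'static str) -> usize for T, which passes the syntactic 'static bound even though the argument's concrete type would not:

```rust
// Assumed helper, analogous to witness_outlives_value in the thread.
fn witness<'a, T: 'a>(t: T) -> T {
    t
}

fn example<'short>(contravar: fn(&'short str) -> usize) -> usize {
    // With `_`, inference coerces contravar to fn(&'static str) -> usize,
    // which satisfies T: 'static. Pinning the exact type with
    // witness::<'static, fn(&'short str) -> usize>(contravar)
    // would fail the bound instead.
    let f = witness::<'static, _>(contravar);
    f("hello")
}

fn main() {
    assert_eq!(example(|s| s.len()), 5);
}
```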

So coercion occurs before type inference!

I think my mistake was assuming (incorrectly) that both cases would work the same, which would have required coercion to be part of the bound check. That in turn would have required the bound check to be more than merely syntactic: type inference would run first, and the actual type would then be coerced, based on variance, to satisfy the bound (which it could in this contravariant case).

OK - on to the next confusion...
