Why do the lifetime semantics change between these two main functions?
When the reference to a is wrapped in a struct, the lifetime requirements are extended, and I don't understand how to prevent that.
If the reference is created in the loop, it should not need to last longer than the loop's body unless a Fn(&mut Context) gains some capability it doesn't have in the other case.
What am I missing? Is it possible to maintain the same lifetime requirements when a is wrapped in b?
use std::rc::Rc;

// Compiles
// fn main() {
//     let closures: Tree<A> = Tree::Group(
//         vec![
//             Tree::Leaf(Leaf { run: Rc::new(|_| {}) })
//         ]
//     );
//     let mut a = A {};
//     for _ in 0..2 {
//         closures.visit(
//             &mut a,
//             |node, ctx| {
//                 match node {
//                     Tree::Group(_) => (),
//                     Tree::Leaf(leaf) => {
//                         (leaf.run)(ctx);
//                     }
//                 }
//             }
//         );
//     }
// }
// Doesn't compile
fn main() {
    let closures: Tree<B> = Tree::Group(
        vec![
            Tree::Leaf(Leaf { run: Rc::new(|_| {}) })
        ]
    );
    let mut a = A {};
    for _ in 0..2 {
        let mut b = B {
            a: &mut a
        };
        closures.visit(
            &mut b,
            |node, ctx| {
                match node {
                    Tree::Group(_) => (),
                    Tree::Leaf(leaf) => {
                        (leaf.run)(ctx);
                    }
                }
            }
        );
    }
}
struct B<'a> {
    a: &'a mut A
}

struct A {}

type RunFn<Context> = Rc<dyn Fn(&mut Context)>;

enum Tree<C> {
    Leaf(Leaf<C>),
    Group(Vec<Tree<C>>),
}
impl<C> Tree<C> {
    // Iterative depth-first traversal; the `while let` already pops each
    // node, so the Leaf arm must not pop again (doing so would silently
    // skip the next pending node).
    fn visit(&self, ctx: &mut C, visit: impl Fn(&Self, &mut C)) {
        let mut stack: Vec<&Self> = vec![self];
        while let Some(tree) = stack.pop() {
            match tree {
                Tree::Leaf(_) => {
                    visit(tree, ctx);
                },
                Tree::Group(nodes) => {
                    stack.extend(nodes.iter().rev());
                }
            }
        }
    }
}
#[derive(Clone)]
pub struct Leaf<Context> {
    pub run: RunFn<Context>,
}
// After fixing `#![deny(elided_lifetimes_in_paths)]` errors
let closures: Tree<B<'_>> = ...
The resolved type of closures has a single lifetime, let's call it 'c.
let mut a = A {};
for _ in 0..2 {
    let mut b = B { a: &mut a };
    closures.visit(&mut b, |node, ctx| match node {
As per the method signature, the first argument to visit is a &mut B<'c>. It has to be 'c exactly because B<'c> is underneath a &mut _ -- due to invariance, to throw in some subtyping jargon. So every time through the loop, you have to exclusively borrow a for 'c. (&muts are exclusive borrows.)
error[E0499]: cannot borrow `a` as mutable more than once at a time
--> src/main.rs:28:28
|
28 | let mut b = B { a: &mut a };
| ^^^^^^ `a` was mutably borrowed here in the previous iteration of the loop
...
36 | }
| - first borrow might be used here, when `closures` is dropped and runs the destructor for type `Tree<B<'_>>`
And I believe it's still talking about destructors because, as far as the compiler can tell, you could be storing the &'c mut A in the Tree<B<'c>>.
In the compiling version, there's nothing forcing all the borrows of a to be active at the same time. Each one is created and ends immediately after the call, every time through the loop.
While the variance rules aren't intuitive, it's very good to know that that was the culprit.
For those who stumble on this in the future - there are obscure rules that change lifetime semantics when you use generics & lifetimes in certain ways.
In simple & possibly inaccurate terms: my use of a sub-lifetime (the 'c in B<'c>) left me stuck with one exact lifetime, rather than being "generic" or variant over lifetimes the way I'm allowed to be with plain A.
Oh, they are 100% intuitive. They are not ad-hoc or "obscure", as you put it. They are a direct logical consequence of what references can do. They are as basic and fundamental as any other borrowing and lifetime rule.
The basic idea is: you can pass a &'long T where a &'short T is expected – there's nothing wrong with providing a reference whose validity is longer than strictly necessary. (Not allowing this would in fact render around 99% of interfaces completely unusable – you could never pass a &'static T to anything that would be satisfied with a short, temporary lifetime.) This is called covariance.
The one thing that you have to consider is that mutable references are a two-way channel. You are allowed to read from, as well as write to, a mutable reference. This means the following. When you have a &'_ mut T<'lt>:
- The lifetime of the inner type can't be shorter than 'lt, because that could lead to a dangling pointer when "the other side" (the consumer) of the reference tries to read from the referent.
- But it can't be longer than 'lt, either – because that would cause a dangling pointer if the consumer of the mutable reference writes through it. Essentially, it would allow the other party to replace your T<'long> with a T<'lt> where 'lt is shorter than 'long. That's the exact same kind of dangling pointer/use-after-free error, except that it is now the creator of the mutable reference who experiences it, rather than the consumer.
Together, this means that a &'_ mut T<'lt> can only ever accommodate exactly a T<'lt>, which is called invariance.
Finally, when you have a function as the argument of a higher-order function, and it is declared to accept a parameter with lifetime 'lt, then you can only pass functions of which the argument admits a lifetime equal to or shorter than 'lt. This should, again, be completely obvious: if a function expects its argument to live longer than the caller can guarantee, that's once again a dangling reference/use-after-free. This phenomenon is called contravariance.