"Impossible" message from the borrow checker

This needs to be used as the definition of Lending in the context of the playground linked in the OP:

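Roughly like so (a sketch: the ImplBound and Ref names are the ones from the bound quoted further down; the surrounding guard machinery is my assumption of what the playground contained):

use core::marker::PhantomData;

// Assumed guard machinery: the default for __ImplBound mentions both
// 'lend and Self, so downstream code has to leave it at its default.
pub struct Ref<'lend, T: ?Sized>(PhantomData<&'lend T>);
pub trait ImplBound {}
impl<'lend, T: ?Sized> ImplBound for Ref<'lend, T> {}

pub trait Lending<'lend, __ImplBound: ImplBound = Ref<'lend, Self>> {
    type Lend: 'lend;
}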

It compiles, but that means that this bound:

Lender + Lending<'lend, __ImplBound, Lend = (usize, Self::IntoIterator)>

probably isn't met, or isn't doing what you want, since it puts the equality bound on an instantiation of Lending that is not the one Lender builds on (Lender depends on for<'x> Lending<'x, SomethingElse>)

This compiles though:

pub trait NodeLabelsLender<'lend, __ImplBound: lending::ImplBound = lending::Ref<'lend, Self>>:
    Lender + for<'all> Lending<'all, Lend = (usize, Self::IntoIterator)>

(Sorry for the messy playground.)

(Written before I read @quinedot's reply above, which seems to point at the same issue, but also suggests solutions.)


Oh my… yes, you’re combining different Lending bounds here:

Lender + Lending<'lend, __ImplBound, Lend = (usize, Self::IntoIterator)>

where Lender implies

for<'all> Lending<'all>

If these match, then they interfere even more problematically[1] (the compiler isn’t too good at handling multiple bounds like Trait<Associated = Foo> + Trait<Associated = Bar> that both constrain the same associated type but equate it to different things).


This somewhat explains all the confusing behavior. Nonetheless, if you actually want your bound to pin the type down to the tuple in question, e.g. in

while let Some((_, _)) = lender.next()

then the version that used the different types &'lend Self vs. Ref<'lend, Self> of course fully breaks.


  1. or at least more subtly, let’s say :slight_smile: ↩︎

The cause of the error is the combination of the trait bound and the println! call. It's an error in your code, not in the compiler, though the reasoning is a bit obscure. Let me spell it out.

The compiler complains that lender is borrowed as mutable twice, which means that the borrow in (&mut lender).next() lives longer than a single loop iteration. Normally this doesn't happen: the receiver is borrowed anew on each iteration. However, note that your modified iterator returns Option<Lend<'_, Self>>, which is tied to the lifetime of the &mut lender borrow. If something causes the returned item to live longer than a single iteration, the borrow of the iterator will also live longer.
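
To see that coupling in isolation, here's a plain-Rust sketch with no lending traits (Counter is made up):

struct Counter(u32);

impl Counter {
    // The item borrows from &mut self, like Option<Lend<'_, Self>> does.
    fn next(&mut self) -> Option<&mut u32> {
        self.0 += 1;
        Some(&mut self.0)
    }
}

fn main() {
    let mut c = Counter(0);
    let a = c.next(); // `c` stays mutably borrowed for as long as `a` lives
    // let b = c.next(); // error[E0499] if `a` is still used afterwards
    drop(a);
    let _b = c.next(); // fine once `a` is gone
}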

We have two clues: the trait bound and the println!("{:?}", x); call. The call invokes the trait bound: it implies that x: Debug. But note that the trait bound doesn't guarantee Lend<'_, Self>: Debug in general! Indeed, it holds only for the specific lifetime 'a, which is the same lifetime as in <S as SplitLabeling>::Lender<'a>. Note that <S as SplitLabeling>::Lender<'a> is the type of the returned iterator lender itself. This means that the item returned by lender.next() is Debug only if it lives as long as lender, and lender lives for the entire loop. So the returned item lives for the entire loop, and due to the lifetime bounds in Lender::next, this means that &mut lender is borrowed for the entire loop, causing the double borrow. :black_medium_small_square:

If you comment out the loop body, it is no longer required that x is Debug. It may or may not be, we don't know, and the inference above no longer prevents the timely release of the &mut lender borrow.

If you use the Some(_) pattern instead of Some(x), this is a special case of the above, but we also ensure that the returned item is immediately dropped (the _ pattern doesn't bind any value; anything matched against it is eagerly dropped).

If you change the trait bound to for<'a, 'b>, like @steffahn suggested, you break the requirement that Lend<'b, Self>: Debug implies that 'b == 'a, and the borrow can also be properly dropped.

This also compiles...

pub trait NodeLabelsLender<'lend>:
    Lender + for<'all> Lending<'all, Lend = (usize, Self::IntoIterator)>

...which makes me wonder if having to duplicate the default lifetime guarding parameter is a red flag. Something to play with on a rainy day...

This solution could still be incomplete in case the IntoIterator type is supposed to depend on 'lend.

Ah, could be (and almost surely the same holds for what I just posted). Have to step away right now though.

It does, but the rest of the project kinda explodes. :joy: I have to check whether it is just a matter of propagating this new bound correctly.

Yes—that's the whole point of the thing. We return an IntoIterator that depends on 'lend. Typically, you sort a lot of pairs lexicographically offline, and then return an iterator over iterators over successors. We want this to be lazy, so the IntoIterator returning the successors depends on the global state.
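
A crude eager sketch of that shape (made-up function, no traits; the real implementation is lazy, with the inner iterator borrowing the outer cursor, which is exactly what requires the lending design):

// Group sorted (src, dst) pairs by first coordinate, eagerly.
// group(&[(0, 1), (0, 2), (1, 3)]) == [(0, vec![1, 2]), (1, vec![3])]
fn group(sorted_pairs: &[(usize, usize)]) -> Vec<(usize, Vec<usize>)> {
    let mut out: Vec<(usize, Vec<usize>)> = Vec::new();
    for &(src, dst) in sorted_pairs {
        match out.last_mut() {
            Some((s, succ)) if *s == src => succ.push(dst),
            _ => out.push((src, vec![dst])),
        }
    }
    out
}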

I understand, but this does not compile: there's no trait and the loop is empty, so there's also something else happening.

Hmm.. put differently, the question is whether the two 'lend lifetimes of NodeLabelsLender and Lender are somewhat independent or not, I guess :thinking:

In any case, this is certainly a … complex … lifetimes situation :joy:

For context, I think this is as simple as the compiler sort-of focusing just on the concretely specified Lending<'lend, __ImplBound, Lend = (usize, Self::IntoIterator)> bound, coming from NodeLabelsLender<'a>, in order to make sense of the <_ as Lending<'_>>::Lend type for the type Lender<'a>: NodeLabelsLender<'a>. As soon as you want to do anything like while let Some((_, _)), you need it to focus on that specific instance anyway, as that’s the only one that promises the tuple type; but the compiler isn’t super smart around this, so the problem already surfaces without that.

This means that it only really accepts the fn next(&mut self) -> Option<Lend<'_, Self>> for the one lifetime it has the extra information about Lend<'_, Self> for, that is, 'a, the same 'a from the Lender<'a> associated type. It thus creates a &'a mut …::Lender<'a> reference, which borrows the thing forever and creates the borrow-check error.
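
The &'a mut Thing<'a> situation in isolation looks like this (a made-up sketch):

struct Lender<'a>(&'a mut Vec<u8>);

// Forcing both lifetimes to coincide is the classic "borrowed forever"
// shape: after this call, `l` stays mutably borrowed until 'a ends.
fn use_it<'a>(l: &'a mut Lender<'a>) { let _ = l; }

fn main() {
    let mut v = Vec::new();
    let mut l = Lender(&mut v);
    use_it(&mut l);
    // Any later use of `l` here would be a borrowck error: `l` is now
    // mutably borrowed for the rest of its lifetime.
}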


I might be leaving for today soon, too. If the problem turns out not to be easy to tackle, feel free to come back asking for help - ideally with extra info, such as some light commentary on the purpose of these traits, and a representative example type that implements them (in a way that makes use of the lending-iterator capabilities), so it becomes clearer what is supposed to borrow from what here.

Pretty much my thoughts. Although I don't see any good reason why the compiler should pick that lifetime in general, for the Some(_) pattern. Looks like a quirk of type inference. It's not the only one: if a single specific impl is available, type inference commonly chooses it as the solution, even though nothing guarantees that it is the correct impl. It's super common for inference errors with integer literals: if you have an impl of Trait for a unique integer type (e.g. i16), inference will infer the integer literal to that type whenever a Trait bound is required. If you have impls for at least two integer types, or none at all, inference will fall back to i32 regardless of existing impls.
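
For reference, that quirk in a self-contained form (Trait and needs are made up; behavior as described above):

trait Trait {}
impl Trait for i16 {}

fn needs<T: Trait>(_: T) {}

fn main() {
    // Unique integral impl: the literal is inferred as i16 and this compiles.
    needs(1);
    // Adding a second impl, e.g. `impl Trait for u8 {}`, makes inference
    // fall back to i32 instead, and the call stops compiling.
}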

Note that the specific choice of __ImplBound parameter value implies that the Lending<'lend, __ImplBound> explicit bound in NodeLabelsLender<'lend> is one of the assumed for<'a> Lending<'a> bounds in the Lender trait definition. With the choice of default values as in the example, Lending<'lend, __ImplBound> == Lending<'lend>.

If you change the default in the definition of Lending, or the specific bound in NodeLabelsLender, such that they are incompatible, the error goes away. For example, this slight modification compiles (__ImplBound == ()).

On the other hand, if we modify the linked example above to this (change loop pattern to Some((_, _))), the example no longer compiles, because the compiler can't prove the Lend<'_, Self> type in Lender::next to be of the required form.

I think it has little to do with the pattern at all. The compiler is generally really limited in reasoning about equality bounds like T: Trait<Associated = Foo>, and in a context with such a bound present it will pretty eagerly re-interpret/re-write any T::Associated type into Foo. In this case, there are bounds like T: for<'general> Trait<'general> and T: Trait<'concrete, Associated = Foo> combined, and this re-writing will turn <T as Trait<'some_lifetime>>::Associated into Foo eagerly, with the side effect of restricting 'some_lifetime and 'concrete to be the same lifetime.
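
Distilled to a skeleton, the combination being described is (made-up names):

trait Trait<'a> { type Assoc; }

// Under the eager re-writing described above, any occurrence of
// <T as Trait<'some_lifetime>>::Assoc in this scope tends to be turned
// into the equated type (here u32), with the side effect of forcing
// 'some_lifetime == 'concrete.
fn demo<'concrete, T>()
where
    T: for<'general> Trait<'general>,
    T: Trait<'concrete, Assoc = u32>,
{
}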

I would assume that this kind of “re-writing” of types under known equality constraints cannot discern lifetimes, as in “only re-write <T as Trait<'some_lifetime>>::Associated into Foo if 'some_lifetime is the same as 'concrete”, which is understandable, because questions like “is 'some_lifetime the same lifetime as 'concrete?” are not answerable in principle, given how lifetimes in Rust are designed. (The borrow checker only ever supports “I assert these two lifetimes are the same; please give me a compiler error much later if the combination of all of my assertions was contradictory”.)

As for whether there are any better approaches to handling type equalities than the apparent re-writing that happens, I don’t know; probably yes, but general term-rewriting systems are also super complex (and quickly become undecidable, I believe), so it’s understandable that a relatively simplistic approach is taken.

It can become almost somewhat of an art to dance around what kind of reasoning you can convince the compiler of around type equalities, using trait bounds with associated types. E.g. the example I’ve documented here shows code that takes advantage of the same associated type projection “<S as TypeIsEqual>::To” being handled in different ways in the caller vs. callee context. One context (the caller) has an S: TypeIsEqual<To = T> bound, which makes the compiler re-write this projection into T. The callee context doesn’t have this bound and instead applies trait resolution to find the blanket impl that rewrites/simplifies this type into S.
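
Reconstructed from that description (a sketch, not the actual linked code):

// A blanket impl makes <S as TypeIsEqual>::To provably equal to S,
// while an equality bound can name it T; the two contexts below each
// re-write the same projection differently.
pub trait TypeIsEqual { type To: ?Sized; }
impl<T: ?Sized> TypeIsEqual for T { type To = T; }

pub fn coerce<S, T>(s: S) -> T
where
    S: TypeIsEqual<To = T>,
{
    // Callee: no equality bound in scope, so trait resolution finds the
    // blanket impl and simplifies <S as TypeIsEqual>::To to S.
    fn inner<S>(s: S) -> <S as TypeIsEqual>::To {
        s
    }
    // Caller: the S: TypeIsEqual<To = T> bound re-writes the projection
    // in inner's return type to T.
    inner(s)
}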


Note that I’m not familiar with any of the compiler implementations around this; I’m just reporting on my mental model as a user of the compiler.

The type projection is irrelevant for the error, though. You can remove it, and still get the same error for the while let Some(_) case.

What's happening here is a surprising inference defaulting in the presence of redundant bounds. Basically, we have that exists<'a> T: Foo<'a>, and also for<'b> T: Foo<'b>, and we have a question about the inferred lifetime 'L in fn next(&'L mut self) -> Option<<T as Foo<'L>>::Bar>. Even though the existential bound is entirely subsumed by the universal one, and even though the signature of next doesn't restrict 'L in any way, the compiler still picks the single given existential bound as if it were the unique applicable trait bound (implying 'L == 'a), even though it isn't.

I think technically it can be considered an inference bug. But I don't think it can be fixed, because likely someone somewhere already depends on that behaviour, and also because it's likely too closely tied to the general inference defaulting. The latter is used all over the place, and I believe in some cases there is no way to get the required generic inference without this kind of defaulting.


What if you change the definitions of your traits like this:

pub trait Lender<T>: for<'all> Lending<'all, Lend = T> {
    fn next(&mut self) -> Option<Lend<'_, Self>>;
}

pub trait NodeLabelsLender<'lend, __ImplBound = Ref<'lend, Self>>:
    Lender<(usize, Self::IntoIterator)>
{
    type Label;
    type IntoIterator: IntoIterator<Item = Self::Label>;
}

This compiles. Does it work for your use case?

EDIT: nevermind. I can see that in your actual code, Lender is defined in an external crate.

Take #2. This compiles, and doesn't require any changes to Lender. However, when I tried to apply it to your webgraph-rs code, I got 42 errors in different adapters. I think the change I propose is not entirely wrong, but I have no idea how your crate works, so I can't tell whether those 42 errors can be easily fixed, or are a total dealbreaker.

Ok, at least it's good to know I'm not entirely stupid—it is complicated.

The purpose here is to have the lifetime of the IntoIterator in the returned (usize, IntoIterator) pair depend on the mutable borrow of next. The struct ArcListGraph in the crate shows why. It repackages an iterator over pairs of usize into an iterator over pairs (usize, IntoIterator<Item = usize>), grouping by the first coordinate. Since we want a lazy implementation, the Succ struct, which is the implementation of IntoIterator being returned, accesses the base iterator. Hence the need for the mutable borrow.
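
In trait-free miniature, the shape is roughly this (made-up names, not the actual ArcListGraph/Succ code):

struct Arcs {
    // Sorted (src, dst) pairs and a cursor into them.
    pairs: Vec<(usize, usize)>,
    pos: usize,
}

struct Succ<'a> {
    src: usize,
    arcs: &'a mut Arcs,
}

impl Arcs {
    // The lending "next": the returned Succ mutably borrows self, so
    // only one group of successors can be alive at a time. (The sketch
    // assumes each Succ is fully drained before the next call.)
    fn next_group(&mut self) -> Option<(usize, Succ<'_>)> {
        let &(src, _) = self.pairs.get(self.pos)?;
        Some((src, Succ { src, arcs: self }))
    }
}

impl<'a> Iterator for Succ<'a> {
    type Item = usize;
    fn next(&mut self) -> Option<usize> {
        match self.arcs.pairs.get(self.arcs.pos) {
            Some(&(src, dst)) if src == self.src => {
                self.arcs.pos += 1; // advance the shared cursor lazily
                Some(dst)
            }
            _ => None,
        }
    }
}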

The current design of NodeLabelsLender emerged here to solve this problem. It came out after carefully treading through other options, so I'm afraid any radical change to it will have pernicious effects downstream. Also, the current design has served us very well—this is the first time we hit a wall.

Thanks for the suggestion. I'll try to put it in and see how far I can go.