When to use lifetime bounds between lifetimes?

I was re-reading Learning Rust and I realized I wasn’t able to explain properly when "lifetime bounds between lifetimes" are required.

I remembered the serde book section where this is used: "Understanding deserializer lifetimes". It discusses the Deserializer<'de> lifetime and suggests the following:

// Do not do this. Sooner or later you will be sad.
impl<'de> Deserialize<'de> for Q<'de> {

// Do this instead.
impl<'de: 'a, 'a> Deserialize<'de> for Q<'a> {
  • Is it still required today? (I know that new lifetime elision rules were added a few years ago, and the "Advanced lifetimes" chapter of the book was removed around that time. Maybe that's unrelated, though; from memory, it concerned bounds between generic types and lifetimes, which is a different problem.)

  • Do you have an example where this would be a problem?

  • Can I simplify and illustrate the same problem with this code? (= Does it suffer from the same problem?)

struct Foo<'a>(&'a u8);

trait Decode<'de>: Sized {
    fn decode(src: &'de [u8]) -> Self;
}

impl<'de> Decode<'de> for Foo<'de> {
    fn decode(src: &'de [u8]) -> Self {
        Self(&src[0])
    }
}

(Playground)

You generally need separate lifetimes when working with exclusive loans (&mut), because they're invariant, meaning they're maximally inflexible/incompatible, and the compiler won't be able to unify them, at least not without causing unintended restrictions on the program.

Otherwise they're rarely needed, because lifetimes of shared temporary loans (&) can be automatically unified.

However, you should generally avoid storing temporary loans in structs, because it prevents the struct from owning its data, and makes the whole struct temporary and bound to the scope of whatever variable it borrows from.
In deserialization, temporary loans are rarely useful, because they mean no new data can be created during deserialization (by definition it's borrowed, not created/stored). Any string or data that isn't already present in the source exactly as-is can't be stored in the struct you deserialize into (e.g. JSON can never deserialize a number to a &u8). That's why most code uses DeserializeOwned from serde.

So lifetimes should rarely be needed at all. Explicit <'a> all over the code is either a very advanced generic use-case or a novice mistake; code in between uses very few lifetime annotations.
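To illustrate the point about exclusive loans being invariant, here's a minimal sketch (the `push_str` helper is hypothetical, not from this thread). With two lifetimes the code compiles; collapsing them into one `'a` would tie the exclusive borrow of the `Vec` to the lifetime of the stored strings, and the second call would fail with "cannot borrow `v` as mutable more than once":

```rust
// `&'v mut Vec<&'s str>` is invariant in 's, so 'v and 's must be kept
// separate; a single `&'a mut Vec<&'a str>` can't be shrunk by the compiler.
fn push_str<'v, 's>(v: &'v mut Vec<&'s str>, s: &'s str) {
    v.push(s);
}

fn main() {
    let hello = String::from("hello");
    let mut v: Vec<&str> = Vec::new();
    push_str(&mut v, &hello);
    push_str(&mut v, &hello); // fine with separate 'v and 's
    assert_eq!(v, ["hello", "hello"]);
}
```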


Thank you for the answer!

I appreciate the effort you put into your explanation, and I think it’s all good advice. However, to be honest, much of the information was familiar to me, and I still find myself seeking answers to my original questions. This likely indicates that I did not articulate my questions with sufficient clarity. I’m sorry about that.

I am looking for an example where this is required, i.e. where failing to specify the 'de: 'a bound would lead to a sad outcome.

I feel there is something here. I'm aware that lifetimes are automatically unified in most cases, but I don't grasp the comprehensive rules, so it's challenging for me to identify a precise example where unification does not apply.

In the simplified code snippet I provided, no exclusive loan appears, leading me to believe that the implementation's flexibility is not hindered by invariance. However, it's possible my simplification may not be accurate.

To provide additional context: I encountered a code snippet similar to the one above, which employed a lifetime bound 'de: 'a in the following manner:

impl<'de: 'a, 'a> Decode<'de> for Foo<'a> {
    fn decode(src: &'de [u8]) -> Self {
        Self(&src[0])
    }
}

However, my intuition is that it isn't necessary to write these bounds explicitly, but I can't properly explain why. (I'm not even sure my assumption is correct.)


Note that generic parameters of traits are always invariant. Using a bound in this case allows covariant-like usage.

I'd have to play around to see if there's more to it than this in the case of Serde.
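To make the difference observable with the Decode trait from this thread, here is a sketch under one assumption: decode takes a cursor `&mut &'de [u8]` instead of `&'de [u8]`, so that `'de` sits behind a `&mut` and the compiler cannot silently shrink it. Trait matching ignores variance, so in `caller` the bound becomes `Foo<'a>: Decode<'static>`; the tied `impl<'de> Decode<'de> for Foo<'de>` would only satisfy it for `'a = 'static` and should fail to compile, while the relaxed impl below accepts any `'a`:

```rust
struct Foo<'a>(&'a u8);

trait Decode<'de>: Sized {
    // Cursor-style signature: the `&mut` pins 'de (invariance),
    // so callers can't shrink it to match the output lifetime.
    fn decode(src: &mut &'de [u8]) -> Self;
}

// Relaxed impl: Foo<'a> can be decoded from any buffer outliving 'a.
impl<'de: 'a, 'a> Decode<'de> for Foo<'a> {
    fn decode(src: &mut &'de [u8]) -> Self {
        let (first, rest) = src.split_first().expect("empty input");
        *src = rest; // advance the cursor
        Foo(first)   // &'de u8 coerces to &'a u8 because 'de: 'a
    }
}

fn decode_into<'de, T: Decode<'de>>(src: &mut &'de [u8], out: &mut Vec<T>) {
    out.push(T::decode(src));
}

fn caller<'a>(out: &mut Vec<Foo<'a>>) {
    static DATA: [u8; 3] = [1, 2, 3];
    let mut cursor: &'static [u8] = &DATA;
    // `&mut cursor` is `&mut &'static [u8]`, forcing 'de = 'static;
    // `out` is behind `&mut`, forcing T = Foo<'a> exactly.
    // Requires Foo<'a>: Decode<'static> — only the relaxed impl provides it.
    decode_into(&mut cursor, out);
    decode_into(&mut cursor, out);
}

fn main() {
    let mut out = Vec::new();
    caller(&mut out);
    assert_eq!(*out[0].0, 1);
    assert_eq!(*out[1].0, 2);
}
```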


If you're interested in a non-Serde example, there is this SO question that shows where leaving out the dependency between lifetimes causes an error. I'll just paste that example:

struct DebugFoo<'a, 'b: 'a> {
    fmt: &'a mut std::fmt::Formatter<'b>
}

And if 'b: 'a is changed to just 'b, the compiler reports:

in type `&'a mut core::fmt::Formatter<'b>`, reference has a longer 
lifetime than the data it references

the pointer is valid for the lifetime 'a as defined on the struct at 1:0
but the referenced data is only valid for the lifetime 'b as defined on
the struct at 1:0

This seems like a much more obvious case, since an object must outlive references to that object. So I'm assuming this has nothing to do with invariance, but I'm not entirely sure.

That hasn't been the case since outlives bounds on structs became inferred (when required for field well-formedness).
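For reference, the version without the explicit bound compiles on modern Rust, since the `'b: 'a` bound is inferred from the field type. A small sketch (the `Wrapper` type is just a hypothetical way to get hold of a `Formatter`):

```rust
use std::fmt;

// No explicit `'b: 'a`: the outlives bound is inferred from the
// well-formedness of the `&'a mut fmt::Formatter<'b>` field.
struct DebugFoo<'a, 'b> {
    fmt: &'a mut fmt::Formatter<'b>,
}

struct Wrapper(u8);

impl fmt::Display for Wrapper {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let holder = DebugFoo { fmt: f };
        write!(holder.fmt, "value = {}", self.0)
    }
}

fn main() {
    assert_eq!(Wrapper(7).to_string(), "value = 7");
}
```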


Oh, thank you! I'll add a comment to the SO question to that effect, unless you'd rather do it?

Go ahead 🙂
