Changing lifetime elision comes up from time to time, and it's a topic about which there are a lot of strong opinions. You can find a lot of prior discussion, for example by following the links below.
Not having to declare function lifetimes in particular was accepted as a concept, then rejected before stabilization.
If nothing else, read this.
Back in the heyday of the Ergonomics Initiative, it was popular to refer to pre- vs. post-rigorous learning. Most "ergonomic" and "simpler" arguments appeal to the pre-rigorous stage. Similar logic is frequently used to argue for adding more inference, in favor of omitting explicitness and optimizing for writing in the language.
However, in my opinion, this approach is problematic when it actively makes things worse for those with a rigorous understanding. Whether or not you agree that this actually harms pre-rigorous programmers, I feel lifetime declaration elision is way too harmful to those who have reached a non-beginner understanding and appreciation of lifetimes.
Being able to type less is not really "ergonomic" or good language design when you end up obscuring critically important information. You may not appreciate knowing which lifetimes are local or not today, but if you become experienced enough tomorrow, you will.
That's the short version; below I write way too much more. You may find it interesting, but you could also just skip it.
RFC 2115 is kind of a mega-RFC about lifetime elision which was accepted, partially implemented, and then partially unaccepted.
There's an interesting mix here of having to type less and having to be more explicit:
```rust
// In-band lifetimes (unaccepted)
-fn foo<'a>(a: &'a str, b: &str) -> &'a str { a }
+fn foo(a: &'a str, b: &str) -> &'a str { a }

// `elided_lifetimes_in_paths` (allow by default)
struct Bar<'a>(&'a str);
-fn bar(a: &str) -> Bar { Bar(a) }
+fn bar(a: &str) -> Bar<'_> { Bar(a) }
```
And the reasoning was along the lines that naming lifetimes without declaring them was still clear enough yet more ergonomic and easier to understand, whereas complete lifetime elision with no `&` sigil is too confusing and should never have been allowed, because whether or not a return value (or type more generally) is borrowing something often matters a lot.
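To illustrate why that borrow signal matters, here's a minimal sketch (the `Parsed` type and `parse` function are my own hypothetical names):

```rust
// `Parsed` borrows from its input; the `<'_>` in `parse`'s return type
// is the reader's only signal that the result still borrows `input`.
struct Parsed<'a>(&'a str);

fn parse(input: &str) -> Parsed<'_> {
    Parsed(input.trim())
}

fn main() {
    let s = String::from("  hi  ");
    let p = parse(&s);
    // drop(s); // uncommenting this fails: `p` still borrows `s`
    assert_eq!(p.0, "hi");
}
```

With a fully elided `-> Parsed` return type, nothing in the signature would tell you that dropping `s` invalidates `p`.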
The counter-arguments were that lifetimes coming out of nowhere is more confusing, harder to read, and also opens a big can of worms with regard to accidentally using a lifetime from a prior scope, accidentally introducing a new lifetime, or accidentally conflicting with or changing the meaning of far away code -- i.e. the typical problems of "implicitly declared because you used it". There were various things suggested to mitigate some of the problems, like sigils or naming conventions to segregate implicitly declared and explicitly declared lifetimes, but none stuck.
Being able to have `struct`s implicitly generic over lifetimes wasn't part of the accepted RFC, but it was discussed as an option for single-lifetime `struct`s during the lead-up. Removing all or most visual clues that a type borrows would not have been acceptable under the reasoning above, but maybe there could have been
```rust
struct Bar<'_>(&str);
```
It was probably limited to single-lifetime `struct`s because when you have multiple lifetime-parameterized fields, deciding whether they have the same lifetime or different lifetimes is something you should give deliberate consideration (and changing your mind is a breaking change). It's analogous to how the function lifetime elision we do have only works when there's a single input lifetime.
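For reference, a quick sketch of how that function elision rule plays out (the function names here are my own):

```rust
// One input reference: elision assigns the output the input's lifetime.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

// Two input references: elision can't decide which one the output
// borrows from, so the signature must declare and name a lifetime.
fn longer<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    assert_eq!(first_word("hello world"), "hello");
    assert_eq!(longer("short", "longest"), "longest");
}
```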
Personally I'd be fine with the single-lifetime version. The only thing I can think of off-hand that it would encourage programmers to do worse is already something almost no one knows about or thinks about.
Anyway, that's some history; here I mostly dump my personal opinions.
My view on RFC 2115 is that it tried to take on too much at once, it was based too much around personal preferences, and it was way too rushed because some people wanted it to be part of edition 2018. Some good things did come out of it:
- `'_` is an almost 100% net positive IMO, and has applications beyond elision; it's rare for a Rust "ergonomic" "improvement" to be this successful
- I use `elided_lifetimes_in_paths` all the time and hope momentum to enable it (at least at the `warn` level) returns despite the larger RFC-unacceptance
But I'm in the camp that implicit declaration of lifetimes is a very bad thing, so I'm glad the in-band portion of the RFC died. I also disagree with the arguments that it streamlines learning or makes things any simpler. Omitting critical information or even making it less visible does not make things any simpler. It makes it visually less complicated, but actually reasoning about what goes on, or sometimes behavior itself, becomes more complicated.
If you have a function signature that requires naming a lifetime, where that lifetime came from is critical. Moreover, having `fn foo</* named lifetimes go here */>` in particular is valuable because it gives you a place to look; you don't have to read the signature and guess.
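As a sketch of why the origin matters (the `Holder` type here is hypothetical): two return lifetimes can look similar in a signature yet mean very different things depending on where they were declared.

```rust
struct Holder<'a> {
    s: &'a str,
}

impl<'a> Holder<'a> {
    // `'a` is declared on the impl block: the returned reference
    // borrows from the data the struct points at, so it can outlive
    // the `&self` borrow.
    fn get(&self) -> &'a str {
        self.s
    }

    // `'b` is declared on the function itself, so you know at a
    // glance it's local to this one signature.
    fn pick<'b>(&self, other: &'b str) -> &'b str {
        other
    }
}

fn main() {
    let h = Holder { s: "hello" };
    assert_eq!(h.get(), "hello");
    assert_eq!(h.pick("world"), "world");
}
```

With in-band lifetimes, `get` and `pick` would look declaration-free and you'd have to hunt for an enclosing `'a` to tell them apart.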
I also feel that admitting the feature needs some way of segregating local and non-local named lifetimes (case, length, an extra sigil, whatever) is half-admitting that this is, after all, about (a) optimizing writing over reading or perhaps (b) some war against `<...>`. Once you have such segregation, you're no longer removing the declaration; you're just presenting a new way to signal it. One that just doesn't stand out so much. One that is easier to miss... which is a demerit for critical information.
I say half-admitting because I recognize that there legitimately is some group of people out there who are really thrown off by the presence of `<...>` and earnestly believe it makes things less understandable. Or at least, that's the most charitable conclusion I've come to. In any case, some people just really, really want to make `<...>` go away. Personally I don't find it a problem at all, and sometimes it's a benefit.