I think the only thing I’m uncertain about is whether such a function would be instant UB.
Pros: You can’t really mutate anything through a &&mut T that you wouldn’t be able to mutate through a &&T as well (e.g. through interior mutability). And a &&mut T is not unique.
Cons: There might be some possibility for misoptimization that I’m missing, or simply some language-specification rule saying that it isn’t allowed. Getting creative here: the compiler might assume that, for x: &&mut T and y: &&mut T, if x as *const _ != y as *const _, then *x as *const T != *y as *const T.
Also, there could be APIs that only hand out &T, or non-unique Rc<T>, or whatever (with T: !Clone), so that you wouldn’t be able to get your hands on a value of type &&mut T any other way. And such a library could, potentially, for whatever reason, rely on this...
Another Pro (a counterargument to the second Con): libraries like take_mut likewise allow obtaining values, e.g. a T by value, from APIs where that would otherwise be impossible, e.g. an API that only hands out &mut T references while T: !Clone + !Default, etc.
I’d also appreciate e.g. links to previous discussions or existing libraries if you know of any.
John Regehr wrote a blog post in which misoptimization of UB-containing C code resulted in something like this:
```c
if (ptr_1 == ptr_2) {
    printf("%d %d\n", *ptr_1, *ptr_2);
}
```

output:

```
0 1
```
i.e. the same memory location appeared to hold two different values at the same time. (This was not on a quantum computer.) If I recall correctly, the source of the bug was a violation of strict aliasing, although I can't seem to find that particular blog post with the example code anymore.
If it's not UB, I believe it would still be library-unsafe. Imagine something that needed a Mutex-like guarantee that had a method like
```rust
/// INCORRECT: We statically know there are no `&Self` for `'a` if you can call this
fn ensure_exclusivity<'b, 'a: 'b>(self: &'b &'a mut Self) -> LockGuard<'a>;
```
Edit: The above is completely wrong reasoning, see immediately below.
Unique immutable closure borrows also came to mind, but I can't think of a way to make that "work" (break).
ensure_exclusivity is unsound (even if &&T -> &&mut T is UB), because you could call it twice with the same shared pointer. A &T can never statically prove exclusivity of anything, for any choice of T.
The easy answer to this, I think, is that yes, this is UB. Transmuting an &A into an &B has the same definedness/safety-behavior as transmuting an A into a B because references are defined to always refer to a valid live instance. (It's more complicated than this when you bring repr(C) aggregate types into the mix, but that's not relevant here.) That means that your question is equivalent to, "Is transmuting &'b T to &'b mut T sound or UB?" to which the Nomicon has a very clear answer.
So, this discussion (so far) suggests that transmuting between &'a &'a T and &'a &'b mut T is sound in both directions, i.e. going from &'a &'a T to &'a &'b mut T or from &'a &'b mut T to &'a &'a T is sound. Pay attention to the lifetimes ;-)
Since &'a &'b T is a subtype of &'a &'a T, the transmutation in question, from &'a &'b T to &'a &'b mut T, is sound, too. Moreover, by a similar argument, even transmuting &'a &'b T to &'a &'c mut T (in particular, &'a &'b T to &'a &'static mut T) is sound as well. Transitively, transmuting e.g. &'a &'b mut T to &'a &'static mut T is sound, too.
But, for example, the other way, transmuting &'a &'b mut T to &'a &'b T, is unsound.
It also seems to be sound to transmute between &'a &'a T and &'a Box<T>, again in both directions.
Thank you, that's a wonderful discussion. Here's how I've adjusted my mental model. If anyone spots a flaw in the reasoning, I'd appreciate hearing about it!
- &'a &'b mut T cannot be considered piecewise; the outside & "rewrites the whole type".
  - Specifically, &'a &'b mut T is &'a &'a T (mind the lifetimes).
  - For the lifetime of 'a, there's a shared reference to T, precluding exclusive access.
  - Thus &'a &'b mut T and &'a &'static mut T are the same, no problem.
- And &'a &'b T coerces to &'a &'a T, given 'b: 'a.
  - So &'a &'b T can be transmuted into &'a &'_ mut T, no problem ('b: 'a).
- However, the "overall type" of &'a &'b T acts differently:
  - You can pull out the &'b T using *.
  - You can extend it to &'b &'b T. (Edit: wrong, see below)
  - So it is unsound to transmute &'a &'b mut T to &'a &'b T.
  - Intuitively, when 'a "expires", &'b mut T can again be actively exclusive.
  - If &'a &'b T were allowed, you could also have a &'b T at the same time; an aliasing violation.
I was thinking that if I had a &'b T, I could always get a &'b &'b T, but that's incorrect. The inner reference's own (unnamed) lifetime may be shorter than 'b. (Or am I missing something more fundamental?)
(Side note, I believe everything else holds without the "extension". An implicit deref in one of the examples made me think something was going on which wasn't.)