Does pinning provide a backdoor to aliased mutability?

I continue to be intrigued by Rust's pointer model.

Consider this canonical self-referential struct:

struct A {
    x: UnsafeCell<isize>,
    ptr_x: *mut isize,
}

Setting ptr_x to point at x, then changing x through ptr_x and expecting to observe the change through a mutable reference to A, of course violates Rust's uniqueness rule that only a single mutable reference may exist to a given memory location.
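
Concretely, I assume the setup looks something like this (a sketch of my own, not the playground's exact code; the constructor is hypothetical):

    use std::cell::UnsafeCell;
    use std::ptr;

    impl A {
        // Hypothetical constructor: box the value first, then point ptr_x at x,
        // so the raw pointer refers to the heap allocation rather than a local.
        fn new() -> Box<A> {
            let mut a = Box::new(A {
                x: UnsafeCell::new(42),
                ptr_x: ptr::null_mut(),
            });
            a.ptr_x = a.x.get();
            a
        }
    }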

Indeed, this can be seen in this playground. Godot never comes, because the loop where we poll for a change to .x is optimized into an infinite loop.
Code:

    fn change_x(&mut self) {
        unsafe {
            self.ptr_x.write(0);
        }
    }

    #[inline(never)]
    fn wait_for_x(&mut self) {
        while unsafe { *self.x.get() == 42 } {
            self.change_x();
        }
    }
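
Driving it with something like this (again my sketch, assuming the hypothetical A::new() above) then never finishes:

    fn main() {
        let mut a = A::new();
        // Under --release the loop is optimized into an infinite loop: the
        // compiler may assume the &mut A receiver has exclusive access to x.
        a.wait_for_x();
    }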

What happens, however, if we wrap the self type in a Pin and use &mut Pin<Box<A>> as the receiver instead?

    fn change_x(self: &mut Pin<Box<A>>) {
        unsafe {
            self.ptr_x.write(0);
        }
    }

    #[inline(never)]
    fn wait_for_x(self: &mut Pin<Box<A>>) {
        while unsafe { *self.x.get() == 42 } {
            self.change_x();
        }
    }
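
For reference, this is roughly how I'd expect the pinned version to be set up and driven (my sketch, not the playground's code; as far as I know the &mut Pin<Box<A>> receiver itself already requires nightly's arbitrary_self_types):

    use std::pin::Pin;

    fn main() {
        // A has no PhantomPinned here, so it is still Unpin and get_mut() works.
        let mut a: Pin<Box<A>> = Box::pin(A {
            x: UnsafeCell::new(42),
            ptr_x: std::ptr::null_mut(),
        });
        let x_ptr = a.x.get();
        a.as_mut().get_mut().ptr_x = x_ptr;
        // Per the observed behavior, this version does see the write and returns.
        A::wait_for_x(&mut a);
    }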

Playground 2 (which I think doesn't even use Pin correctly, since the type lacks a PhantomPinned marker to make it !Unpin).
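
Making it properly !Unpin would just mean adding a marker field, for example:

    use std::cell::UnsafeCell;
    use std::marker::PhantomPinned;

    struct A {
        x: UnsafeCell<isize>,
        ptr_x: *mut isize,
        // PhantomPinned is !Unpin, so A becomes !Unpin and pinning it
        // actually forbids moving it out in safe code.
        _pin: PhantomPinned,
    }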

In this case, x is changed through a mutable alias (run under nightly + release).
What's more, Miri thinks the code that uses Pin does not violate the Stacked Borrows model.

Does this mean that std::pin is a backdoor to aliased mutability?

This is discussed in depth here: intrusive.md · GitHub

Ok, I will read your article again. On first read I didn't see how it answered my question about the interaction of Pin with aliasing.

The answer is that it doesn't provide a backdoor, and that we need a backdoor.

Let's see if I understand the current state of discussion correctly.

In general, there is no defined way to provide limited aliased mutability, due to the risks involved. Simply wrapping a field in an UnsafeCell does not provide such mutability (and it shouldn't). However, before the more aggressive exploitation of noalias in LLVM was re-enabled in March 2021, code that just used an UnsafeCell "worked" (i.e., it hid the UB). Since then it has started failing, unless the struct is also wrapped in a pinned pointer. This is, however, technically unsound, so the mega-tracking issue is still open.

Miri doesn't flag it unless you pass MIRIFLAGS=-Zmiri-track-raw-pointers, due to the ambiguity in the specification.

But is it the case that the current compiler turns off noalias for pinned pointers, as eddyb opined here, and that what's missing is a yet-to-be-discovered, more principled way of defining how to handle this situation? Informally, the compiler "thinks": this is pinned, and it's probably pinned for a reason, namely that someone holds a raw pointer that mutably aliases it (!?)

References:

(this post dates from before noalias optimizations were re-enabled in rustc)

The more principled approach that I suggest in the gist I linked is to introduce some sort of UnsafeAliasedCell type, but it's unclear whether this is a realistic solution in practice.
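
Just to make the shape of that idea concrete, such a type might look roughly like the following; to be clear, nothing here exists in std, and the semantics described in the comment are exactly what would need new language/compiler support:

    use std::cell::UnsafeCell;

    // Hypothetical wrapper: like UnsafeCell, but it would additionally tell the
    // compiler that the contents may be read and written through raw pointers
    // even while a &mut to the containing struct is live (i.e. the field is
    // opted out of noalias-style uniqueness assumptions).
    #[repr(transparent)]
    pub struct UnsafeAliasedCell<T: ?Sized> {
        value: UnsafeCell<T>,
    }

    impl<T> UnsafeAliasedCell<T> {
        pub const fn new(value: T) -> Self {
            UnsafeAliasedCell { value: UnsafeCell::new(value) }
        }

        pub fn get(&self) -> *mut T {
            self.value.get()
        }
    }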

No. It's not Pin that makes the difference in your example, it's Box. If you take the Pin version and replace each Pin<Box<_>> with just Box<_>, you get the same result. Conversely, if you take the non-Pin version but make each pointer pinned, it still hangs. You're not making an apples-to-apples comparison.
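
Spelled out, the Box-only variant looks like this (same bodies as before, only the receiver type changes; like the Pin version it relies on nightly receiver types as far as I can tell), and per the above it behaves the same as the Pin<Box<_>> version, i.e. the write is observed:

    fn change_x(self: &mut Box<A>) {
        unsafe {
            self.ptr_x.write(0);
        }
    }

    #[inline(never)]
    fn wait_for_x(self: &mut Box<A>) {
        while unsafe { *self.x.get() == 42 } {
            self.change_x();
        }
    }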

What Rust lacks is something that would allow one to write that second version with Pin but not Box (yes, there is still a Box in the example, but that's just there as a convenient parallel with the original code -- you can easily remove it to demonstrate the same problem).

The current state of affairs, as I understand it, is that this language limitation sometimes forces you to box things to make them sound, even when the extra indirection is undesirable. Pin is mostly orthogonal and only pops up because it's also closely related to self-referential structs, but it's not part of the solution.

That doesn't seem particularly relevant to the quoted text. Dangling pointers and incorrectly aliased pointers are distinct concerns. The context of the quote was about a soundness bug unrelated to noalias.

I see. It's the fact that Box wraps a raw pointer, correct? So we're not actually creating an &mut Self when we pass a reference to the Box as the receiver of a method, and hence it doesn't violate the aliasing rule for &mut references.

I will note that the example without Pin, but with Box, still produces a Miri warning when run with -Zmiri-track-raw-pointers.

I'm not certain, but I think it's just that Box is a pointer, regardless of whether it's raw or not. That is, the fact that &mut Box<_> is a pointer-to-a-pointer is enough to stymie noalias optimizations on the inner value. But, if I'm right, this is really just a fortuitous accident and not really an intended or desirable feature of the compiler. A better compiler would be able to do those optimizations anyway, which means we actually need to use *mut T instead of Box<T> if we're actually trying to use this strategy to make a data structure. (Note that the use of Box here is different from @alice's suggestion in the post linked above, where the Box-derived *mut T is inside the self-referential struct, and Box<T> itself is not stored.)

This seems to corroborate my theory: you actually need to use *mut T to avoid the uniqueness of Box<T>.
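
A sketch of what the *mut T approach looks like (my illustration, not @alice's exact code): allocate with Box, immediately turn it into a raw pointer, and route every later access through that pointer, so no Box<T> or &mut T uniqueness claims are in play while the aliasing pointer is used.

    use std::cell::UnsafeCell;

    fn main() {
        // Keep only a raw pointer to the heap allocation; the Box itself is not stored.
        let raw: *mut A = Box::into_raw(Box::new(A {
            x: UnsafeCell::new(42),
            ptr_x: std::ptr::null_mut(),
        }));
        unsafe {
            (*raw).ptr_x = (*raw).x.get();
            (*raw).ptr_x.write(0); // mutate through the alias
            assert_eq!(*(*raw).x.get(), 0); // observe it through the other path
            // Reconstitute the Box at the very end so the allocation is freed.
            drop(Box::from_raw(raw));
        }
    }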
