Differing match results between opt levels when casting to #[repr(C)] enum

I've been doing some serialization work and recently encountered a really unexpected situation where, depending on the compiler opt-level, a match statement can produce differing results. Does anyone have any insight on why this is? This almost seems like a compiler bug, since opt-level 0 and opt-level 1 shouldn't follow different logical branches on the same data, but it could easily be UB nose-demons manifesting due to the use of an unsafe type cast.

To demonstrate, try the code below with opt-level=0 and opt-level=1. With no optimizations, the discriminant is read as None; with opt-level 1 or higher, it's detected as Some.
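(The original snippet didn't survive the copy here; the following is a minimal sketch of the kind of code described, assuming the DummyOptional enum and is_some helper mentioned later in the thread. Since the behavior is UB, whether it actually reproduces the O0/O1 divergence will vary by compiler version and target.)

```rust
// Hypothetical reconstruction -- the original snippet was not preserved,
// so the names and details here are assumptions based on the thread.
#[repr(C)]
enum DummyOptional {
    None, // discriminant 0
    Some, // discriminant 1
}

// The thread notes that marking this #[inline(never)] hides the symptom.
fn is_some(opt: &DummyOptional) -> bool {
    matches!(opt, DummyOptional::Some)
}

fn main() {
    // Bytes of the Some discriminant as a native-endian u32; a #[repr(C)]
    // enum discriminant has the size and alignment of a C int (4 bytes on
    // most targets).
    let bytes: [u8; 4] = 1u32.to_ne_bytes();
    // UB: [u8; 4] has alignment 1, but DummyOptional requires alignment 4,
    // so this reference may be misaligned regardless of the byte values.
    let opt = unsafe { &*(bytes.as_ptr() as *const DummyOptional) };
    println!("is_some: {}", is_some(opt));
}
```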

Upon further exploration, it appears that this is related to inlining: setting #[inline(never)] on is_some causes the test to succeed even with opt-level=1.

If I run this with Miri, it reports UB due to insufficient alignment.
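(For anyone following along, Miri runs under the nightly toolchain; on a binary crate like the sketch above it flags the misaligned reference:)

```
cargo +nightly miri run
```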

You could fix this by making a struct that wraps a [u8; 4] and marking it #[repr(align(4))]. The inner array should then have the correct alignment.
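A minimal sketch of that wrapper, reusing the assumed DummyOptional definition from the first snippet:

```rust
#[repr(C)]
enum DummyOptional { None, Some }

// Wrapper forcing the byte buffer to the enum's 4-byte alignment.
#[repr(align(4))]
struct AlignedBytes([u8; 4]);

fn main() {
    let buf = AlignedBytes(1u32.to_ne_bytes());
    // The inner array now lives at a 4-byte-aligned address, so the
    // alignment half of the soundness requirement is met. The bytes
    // must still form a valid discriminant (0 or 1 here).
    let opt = unsafe { &*(buf.0.as_ptr() as *const DummyOptional) };
    assert!(matches!(opt, DummyOptional::Some));
}
```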

It's UB at least because align_of::<[u8; 4]>() < align_of::<DummyOptional>(), which makes the cast unsound.
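You can check that directly; on typical targets where a C int is 4 bytes, the two alignments come out as follows:

```rust
use std::mem::align_of;

#[repr(C)]
enum DummyOptional { None, Some }

fn main() {
    // An array's alignment is that of its element type.
    assert_eq!(align_of::<[u8; 4]>(), 1);
    // A #[repr(C)] fieldless enum is aligned like a C int
    // (4 bytes on most mainstream targets).
    assert_eq!(align_of::<DummyOptional>(), 4);
}
```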


It will also be UB if the bytes form anything other than a valid discriminant value (0 or 1 for this enum). You may know this already, but since you say this is for serialization, it's worth making sure: you must check that the data forms a valid value of the type before reading it as that type.
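One way to do that, sketched as a hypothetical checked decoder (read_dummy_optional is not from the thread, and it assumes the 4-byte C-int discriminant discussed above):

```rust
use std::mem::{align_of, size_of};

#[repr(C)]
enum DummyOptional { None, Some }

// Hypothetical checked decoder: validate length, alignment, and
// discriminant before reinterpreting the bytes as the enum.
fn read_dummy_optional(bytes: &[u8]) -> Option<&DummyOptional> {
    if bytes.len() != size_of::<DummyOptional>() {
        return None;
    }
    if bytes.as_ptr() as usize % align_of::<DummyOptional>() != 0 {
        return None; // misaligned: the cast would be unsound
    }
    // Assumes the discriminant is a 4-byte C int; only 0 and 1 are
    // valid values for this two-variant enum.
    let disc = u32::from_ne_bytes(bytes.try_into().ok()?);
    if disc > 1 {
        return None;
    }
    // SAFETY: size, alignment, and validity were all checked above.
    Some(unsafe { &*(bytes.as_ptr() as *const DummyOptional) })
}

fn main() {
    #[repr(align(4))]
    struct AlignedBytes([u8; 4]);

    let buf = AlignedBytes(1u32.to_ne_bytes());
    assert!(matches!(read_dummy_optional(&buf.0), Some(DummyOptional::Some)));
}
```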