I was recently working on speeding up the serialization of some heavy data structures.
To do so, we basically removed some checks that verify the structure is constructed correctly and is valid. I'm not talking about checking the byte-slice length or UTF-8 correctness, but about whether the structure makes sense in its context (elliptic curve elements).
Once we did so, this was more or less the result:
```rust
/// Create a `G1Affine` from a set of bytes created by `to_bytes_unchecked`.
///
/// No checks are performed and constant-time execution is not guaranteed. The
/// `infinity` attribute is also lost. The expected usage of this function is
/// for trusted bytes where performance is critical.
///
/// For secure serialization, check `to_bytes`.
///
/// ## Panics
/// If the slice passed has `len < 96`.
pub unsafe fn from_slice_unchecked(bytes: &[u8]) -> Self {
    let mut x = [0u64; 6];
    let mut y = [0u64; 6];
    let mut z = [0u8; 8];

    // Split the input into 8-byte chunks and decode each one as a
    // little-endian u64 limb, filling `x` first and then `y`.
    bytes
        .chunks_exact(8)
        .zip(x.iter_mut().chain(y.iter_mut()))
        .for_each(|(c, n)| {
            z.copy_from_slice(c);
            *n = u64::from_le_bytes(z);
        });

    let x = Fp::from_raw_unchecked(x);
    let y = Fp::from_raw_unchecked(y);
    let infinity = 0u8.into();

    Self { x, y, infinity }
}
```
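For comparison, here is a minimal sketch of how the same limb decoding could be exposed as a *safe* function, since nothing in the loop itself can violate memory safety. The `Fp`/`G1Affine` parts are omitted and the function name is invented; this just returns the raw limbs, and the "bytes must encode a valid curve point" contract would live in documentation rather than in the `unsafe` keyword:

```rust
/// Decode 96 little-endian bytes into twelve u64 limbs (x limbs, then y limbs).
///
/// # Panics
/// If `bytes.len() < 96`.
fn limbs_from_slice(bytes: &[u8]) -> ([u64; 6], [u64; 6]) {
    let mut x = [0u64; 6];
    let mut y = [0u64; 6];

    bytes
        .chunks_exact(8)
        .zip(x.iter_mut().chain(y.iter_mut()))
        .for_each(|(c, n)| {
            // `chunks_exact(8)` guarantees `c` is exactly 8 bytes,
            // so the conversion to `[u8; 8]` cannot fail.
            *n = u64::from_le_bytes(c.try_into().unwrap());
        });

    (x, y)
}

fn main() {
    // All-zero input decodes to all-zero limbs.
    let (x, y) = limbs_from_slice(&[0u8; 96]);
    assert_eq!(x, [0u64; 6]);
    assert_eq!(y, [0u64; 6]);
}
```

Whether this is preferable depends on whether you consider "the bytes must be a valid curve point" a contract worth surfacing at every call site, which is exactly the question below.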
So the question comes by itself: is this usage of `unsafe` the right choice? After checking the Rust book's chapter on unsafe code, I saw the following:
> In addition, `unsafe` does not mean the code inside the block is necessarily dangerous or that it will definitely have memory safety problems: the intent is that as the programmer, you'll ensure the code inside an `unsafe` block will access memory in a valid way.
But this makes me think about the `unsafe` keyword. If `unsafe` is used to mark things that are not strictly related to memory safety, it kind of loses its point and makes code review and reading more confusing.
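To make that concern concrete, here is a toy example (names are invented) of an `unsafe fn` whose only precondition is logical, not memory-related. Every caller is still forced to write an `unsafe` block, so a reviewer scanning for `unsafe` can no longer assume each block guards an actual memory invariant:

```rust
/// Safety (by convention only): `n` must be even.
/// Nothing here can actually violate memory safety.
unsafe fn half_of_even(n: u32) -> u32 {
    n / 2
}

fn main() {
    // This compiles only inside an `unsafe` block, even though the
    // precondition is purely about program logic.
    let h = unsafe { half_of_even(8) };
    assert_eq!(h, 4);
}
```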
What are your opinions on that? Is it clearly stated anywhere that `unsafe` can/can't/should/shouldn't be used there?