Apparently arrays (`[T; N]`) don't implement the `Deref`/`DerefMut` traits. Instead, it looks like the compiler inserts some magic so a `&[u8; 42]` will coerce to `&[u8]`... But at the same time, some arrays (`[T; 0]` to `[T; 32]` or so) do implement various traits directly.
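To illustrate what I mean, here's a quick check showing that the coercion happens without any `Deref` impl in sight (slice methods also resolve on arrays via the same mechanism):

```rust
fn takes_slice(s: &[u8]) -> usize {
    s.len()
}

fn main() {
    let arr = [0u8; 42];
    // Unsized coercion: &[u8; 42] becomes &[u8] with no Deref impl involved.
    assert_eq!(takes_slice(&arr), 42);
    // Slice methods still work on arrays, via auto-ref plus the same coercion.
    assert_eq!(arr.first(), Some(&0u8));
}
```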
Does anyone know why this is the case? I would have thought arrays are such an integral part of the language that the compiler could hard-code these trait definitions, even if true const generics aren't implemented/stable yet.
For context, I encountered this when trying to implement a combinator similar to `futures::join_all()` that's backed by an array whose length is determined at compile time by the number of items passed in.
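Roughly the shape I'm after, as a dependency-free sketch: plain thunks stand in for futures, and `join_all_array` is a hypothetical name, not a real `futures` API. It assumes const generics are available:

```rust
// Hypothetical: an array-backed "join" whose output length N is fixed at
// compile time by the input array's length. A real combinator would poll
// N futures to completion; here each thunk is simply called once.
fn join_all_array<T, F: FnOnce() -> T, const N: usize>(fs: [F; N]) -> [T; N] {
    fs.map(|f| f())
}

fn main() {
    // Non-capturing closures coerced to fn pointers so the array is homogeneous.
    let thunks: [fn() -> i32; 3] = [|| 1, || 2, || 3];
    assert_eq!(join_all_array(thunks), [1, 2, 3]);
}
```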