Apparently arrays ([T; N]) don't implement the Deref and DerefMut traits. Instead, it looks like the compiler inserts some magic so that a &[u8; 42] will coerce to &[u8]... But at the same time, arrays up to a certain length ([T; 0] through [T; 32] or so) do implement AsRef<[T]>?
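For what it's worth, the difference is visible from ordinary stable code: the array-to-slice conversion is a coercion applied at the use site, while AsRef<[T]> is a normal trait impl you can use as a generic bound. A minimal sketch (the function names are just made up for illustration):

```rust
fn take_slice(s: &[u8]) {
    println!("slice of len {}", s.len());
}

fn take_as_ref<T: AsRef<[u8]>>(v: T) {
    println!("as_ref of len {}", v.as_ref().len());
}

fn main() {
    let arr: [u8; 4] = [1, 2, 3, 4];

    // Unsized coercion: &[u8; 4] becomes &[u8] at the call site,
    // with no Deref impl involved.
    take_slice(&arr);

    // AsRef<[u8]> is an ordinary impl on the array type (historically
    // only provided for lengths 0 through 32), so it works as a bound.
    take_as_ref(arr);

    // By contrast, a bound like T: Deref<Target = [u8]> would not be
    // satisfied by &[u8; 4], which is where the missing impl shows up.
}
```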
Does anyone know why this is the case? I would have thought arrays are such an integral part of the language that the compiler could hard-code these trait implementations, even if true const generics aren't implemented/stable yet.
For context, I encountered this when trying to implement a combinator similar to futures::join_all() that's backed by an array whose length is determined at compile time by the number of items passed in.
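To make the goal concrete, here's a rough sketch of the kind of combinator I mean, written as if const generics were available (ArrayJoin and join_array are hypothetical names, and it only handles Unpin futures):

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// Hypothetical array-backed join_all: the output array length N is fixed
// at compile time by the number of futures passed in.
struct ArrayJoin<F: Future, const N: usize> {
    futures: [F; N],
    outputs: [Option<F::Output>; N],
}

fn join_array<F: Future + Unpin, const N: usize>(futures: [F; N]) -> ArrayJoin<F, N> {
    ArrayJoin {
        futures,
        outputs: [(); N].map(|_| None),
    }
}

impl<F, const N: usize> Future for ArrayJoin<F, N>
where
    F: Future + Unpin,
    F::Output: Unpin,
{
    type Output = [F::Output; N];

    fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
        let this = self.get_mut();
        let mut all_done = true;
        // Poll every future that hasn't produced its output yet.
        for (fut, out) in this.futures.iter_mut().zip(this.outputs.iter_mut()) {
            if out.is_none() {
                match Pin::new(fut).poll(cx) {
                    Poll::Ready(val) => *out = Some(val),
                    Poll::Pending => all_done = false,
                }
            }
        }
        if all_done {
            // Every slot is Some at this point; move the values out
            // into a fixed-size array of the same length.
            let mut i = 0;
            Poll::Ready([(); N].map(|_| {
                let v = this.outputs[i].take().unwrap();
                i += 1;
                v
            }))
        } else {
            Poll::Pending
        }
    }
}

// Usage would look something like:
//     let [a, b] = join_array([std::future::ready(1), std::future::ready(2)]).await;
```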
Just a shot in the dark, but could FixedSizeArray and Unsize<[T]> have something to do with this? (I'm not terribly familiar with compiler internals myself, so maybe it's a red herring.)
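One hint that it's the general unsizing machinery rather than array-only magic: the same coercion works through Box and Rc on stable, which goes via their CoerceUnsized impls (Unsize and CoerceUnsized themselves are unstable to name or implement, so this is just an observation, not proof):

```rust
use std::rc::Rc;

fn main() {
    // The same built-in unsizing that turns &[u8; 3] into &[u8]...
    let r: &[u8] = &[1u8, 2, 3];

    // ...also applies behind Box and Rc, because their CoerceUnsized
    // impls forward the Unsize relationship of the pointee.
    let boxed: Box<[u8]> = Box::new([1u8, 2, 3]);
    let rc: Rc<[u8]> = Rc::new([1u8, 2, 3]);

    println!("{} {} {}", r.len(), boxed.len(), rc.len());
}
```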
Of course, there's also the matter that arrays support a[i], a.len(), etc., which cannot be explained by coercions alone. This must also be hard-coded to "behave like" auto-deref.
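For example, both forms below compile on stable, and the array versions behave as if the array had first been unsized to a slice (the slice steps are spelled out explicitly for comparison):

```rust
fn main() {
    let a = [10u8, 20, 30];

    // These work directly on the array type, no explicit conversion needed.
    let first = a[0];
    let n = a.len();

    // Spelled out: unsize &[u8; 3] to &[u8] first, then use the slice's
    // Index impl and inherent len() method. The results are the same.
    let slice: &[u8] = &a;
    assert_eq!(first, slice[0]);
    assert_eq!(n, slice.len());

    println!("first = {}, len = {}", first, n);
}
```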