I defined a custom vector type:
```rust
pub enum LocalStorageVec<T, const N: usize> {
    Stack { buf: [T; N], len: usize },
    Heap(Vec<T>),
}
```
Then I wanted to implement the `Index` trait for it:
```rust
impl<T, const N: usize, I> std::ops::Index<I> for LocalStorageVec<T, N>
where
    I: std::slice::SliceIndex<[T]>,
    // [T]: std::ops::Index<I>, // Uncommenting this bound causes a compile error.
{
    type Output = I::Output;

    fn index(&self, index: I) -> &Self::Output {
        match self {
            LocalStorageVec::Stack { buf, len } => buf[0..*len].index(index),
            LocalStorageVec::Heap(data) => &data[index],
        }
    }
}
```
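For context, with the extra bound left commented out this compiles, and indexing behaves just like it does for slices. A quick usage sketch (the concrete values are mine, just to exercise both variants):

```rust
let v = LocalStorageVec::Stack { buf: [1, 2, 3, 4], len: 3 };
assert_eq!(v[0], 1);         // usize: SliceIndex<[i32]>
assert_eq!(v[0..2], [1, 2]); // Range<usize>: SliceIndex<[i32]>

let v: LocalStorageVec<i32, 4> = LocalStorageVec::Heap(vec![5, 6, 7]);
assert_eq!(v[1..], [6, 7]);
```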
First, the trait bound `[T]: std::ops::Index<I>` seems redundant, because it is already implied by a blanket impl in the standard library:
```rust
impl<T, I> Index<I> for [T]
where
    I: SliceIndex<[T]>,
{
    type Output = <I as SliceIndex<[T]>>::Output;
    // ...
}
```
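And indeed, the bound is provable from that blanket impl alone. For example, this free function (a minimal sketch I wrote to check; the name `slice_index` is my own) compiles with only the `SliceIndex` bound:

```rust
use std::slice::SliceIndex;

// No explicit `[T]: Index<I>` bound is needed here: the standard
// library's blanket impl already provides `Index<I>` for `[T]`
// whenever `I: SliceIndex<[T]>` holds.
fn slice_index<T, I>(slice: &[T], index: I) -> &I::Output
where
    I: SliceIndex<[T]>,
{
    &slice[index]
}
```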
So adding it should have no effect at all? But in fact, with the bound uncommented, the compiler reports an error pointing at the `buf[0..*len]` expression:

```
mismatched types
expected type parameter `I`
   found struct `std::ops::Range<usize>`
```
In addition, I thought trait bounds were supposed to add capabilities, not take them away. Why is the opposite happening in this case?