I'm trying to write a character decoder trait whose conversion algorithm has a single generic implementation, so that both the UTF-8 and UTF-16 instantiations of the conversion are generated at compile time. Additionally, I want the trait to be usable in a type-erasure setting, i.e. it has to work as a trait object (through a vtable).
Even though I've added a marker trait bound on the generic parameter to constrain the instantiations of the logically private generic method to a reasonable number (two), the compiler complains that the trait isn't object-safe. How should I promise the compiler that, really, the `decode` method only gets instantiated for `u8` and `u16`, so there's no need to worry about a large or unbounded number of instantiations?
// repr(C) so the enum is FFI-safe when returned from the extern fn below.
#[repr(C)]
enum DecoderResult {
    Overflow,
    Underflow,
    Malformed,
}
#[no_mangle]
pub extern "C" fn Decoder_decode_to_utf16(decoder: &mut dyn Decoder, src: *const u8, src_len: *mut usize, dst: *mut u16, dst_len: *mut usize, last: bool) -> DecoderResult {
    let src_slice = unsafe { std::slice::from_raw_parts(src, *src_len) };
    let dst_slice = unsafe { std::slice::from_raw_parts_mut(dst, *dst_len) };
    let (result, read, written) = decoder.decode_to_utf16(src_slice, dst_slice, last);
    unsafe {
        *src_len = read;
        *dst_len = written;
    }
    result
}
/// Marker trait meant to restrict the code-unit type to exactly u8 and u16.
trait UtfUnit {}

impl UtfUnit for u8 {}
impl UtfUnit for u16 {}

trait Decoder {
    // public
    fn decode_to_utf16(&mut self, src: &[u8], dst: &mut [u16], last: bool) -> (DecoderResult, usize, usize) {
        self.decode(src, dst, last)
    }
    // public
    fn decode_to_utf8(&mut self, src: &[u8], dst: &mut [u8], last: bool) -> (DecoderResult, usize, usize) {
        self.decode(src, dst, last)
    }
    // private: the generic method that makes the trait not object-safe
    fn decode<T: UtfUnit>(&mut self, src: &[u8], dst: &mut [T], last: bool) -> (DecoderResult, usize, usize);
}