char is basically a u32 with limits on which values it can take. This allows, for example, converting a char to 1 to 4 UTF-8 bytes without having to check whether the input is in an invalid range.
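For concreteness, here is a rough sketch of what that branch-free conversion looks like (std's char::encode_utf8 already does this; the hand-rolled version below just makes the point visible). Because char is guaranteed to be a Unicode scalar value, the only branching is on the length of the encoding, never on validity:

```rust
// Sketch: encode a char to UTF-8 by hand. The type invariant of `char`
// (0..=0x10FFFF excluding the surrogate range 0xD800..=0xDFFF) means no
// validity check is needed, only dispatch on the encoded length.
fn encode_utf8(c: char, buf: &mut [u8; 4]) -> &[u8] {
    let cp = c as u32;
    match cp {
        0..=0x7F => {
            buf[0] = cp as u8;
            &buf[..1]
        }
        0x80..=0x7FF => {
            buf[0] = 0xC0 | (cp >> 6) as u8;
            buf[1] = 0x80 | (cp & 0x3F) as u8;
            &buf[..2]
        }
        0x800..=0xFFFF => {
            // No surrogate check needed here: `char` excludes them by construction.
            buf[0] = 0xE0 | (cp >> 12) as u8;
            buf[1] = 0x80 | ((cp >> 6) & 0x3F) as u8;
            buf[2] = 0x80 | (cp & 0x3F) as u8;
            &buf[..3]
        }
        _ => {
            buf[0] = 0xF0 | (cp >> 18) as u8;
            buf[1] = 0x80 | ((cp >> 12) & 0x3F) as u8;
            buf[2] = 0x80 | ((cp >> 6) & 0x3F) as u8;
            buf[3] = 0x80 | (cp & 0x3F) as u8;
            &buf[..4]
        }
    }
}
```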
Is there a way, without writing a compiler plug-in, to specify similar range limits on other integer types? For example, is it possible to define a profile of u16 that promises the value can't be a surrogate, or one that promises the value can't be a surrogate or an ASCII value other than 0? Having the compiler check limits like these statically on literals assigned to such a profiled u16 would allow conversions to UTF-8 and UTF-16 without run-time branches for possibilities that have already been excluded at compile time.
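The closest approximation I know of is a newtype with a const fn constructor, so that bad literals at least fail to compile when used in const position. This is only a sketch (the type name NonSurrogateU16 is made up for illustration), and it falls short of what char gets: the optimizer is not taught the range the way it knows char's, and non-const call sites only panic at run time. The standard library's std::num::NonZeroU16 is a special-cased version of the same idea, but there seems to be no general mechanism:

```rust
// Hypothetical newtype over u16 that excludes surrogate code units.
#[derive(Clone, Copy)]
pub struct NonSurrogateU16(u16);

impl NonSurrogateU16 {
    pub const fn new(v: u16) -> Self {
        // In const evaluation, this panic becomes a compile-time error.
        assert!(!(v >= 0xD800 && v <= 0xDFFF), "surrogate code unit");
        NonSurrogateU16(v)
    }

    pub const fn get(self) -> u16 {
        self.0
    }
}

// Checked at compile time: this fails to build if the literal is a surrogate.
const SNOWMAN: NonSurrogateU16 = NonSurrogateU16::new(0x2603);
```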