Hex literals for signed integers

I am implementing a protocol that defines special values for certain integer types.
The specification defines those as hexadecimal representations of the signed integers as returned by the `{:#x}` format string, e.g. 0x80 for an i8.

However, I cannot initialize a signed integer with such a hexadecimal value:
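The original snippet is omitted here; a minimal reconstruction (the constant name `SIGNED` is hypothetical), together with the compiler's suggested cast, might look like this:

```rust
// The rejected form trips the deny-by-default `overflowing_literals` lint:
//     const SIGNED: i8 = 0x80; // error: literal out of range for `i8`
//
// The compiler's suggested workaround casts from the unsigned type:
const SIGNED: i8 = 0x80u8 as i8;

fn main() {
    // 0x80 reinterpreted as an i8 bit pattern is -128, i.e. i8::MIN.
    assert_eq!(SIGNED, i8::MIN);
}
```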

The compiler's suggestion works as intended, but I have three issues with it:

  1. It looks ugly and unnecessarily complicated.
  2. I don't trust `as` casting, as it may silently break my code somewhere due to e.g. truncation.
  3. I will need to add a clippy exemption above the definition in order to satisfy my strict linter config (see 2.).

I also find this error very surprising, since the conversion in the other direction results in the exact same hex literal:
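For illustration (the original snippet is omitted above), formatting i8::MIN with `{:#x}` produces exactly the literal the compiler rejects as an initializer:

```rust
fn main() {
    // Hex formatting of signed integers prints the two's-complement
    // bit pattern, so -128i8 formats as "0x80".
    let hex = format!("{:#x}", i8::MIN);
    assert_eq!(hex, "0x80");
    println!("{hex}");
}
```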

Why does Rust have this inconsistency?

Note that this is not an error, but a deny-by-default lint. You can just allow it if you prefer to write overflowing literals.
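For example, allowing the lint on the definition lets the literal stand, and it is reinterpreted as the two's-complement bit pattern:

```rust
// With the lint allowed, 0x80 wraps to the i8 bit pattern, i.e. -128.
#[allow(overflowing_literals)]
const SIGNED: i8 = 0x80;

fn main() {
    assert_eq!(SIGNED, -128i8);
}
```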


Ah. I didn't know that. Thanks.
I think in my particular use case, this is exactly what I want.


If you want to keep the lint for other literals, you could define a const helper function, but it is a little ugly:

```rust
const SIGNED: i8 = from_literal::<0x80>();

const fn from_literal<const N: u8>() -> i8 {
    N as i8
}
```

This does the same as the `as` cast and will silently truncate the value. I would do

```rust
const SIGNED: i8 = 0x80u8.cast_signed();
// OR
const SIGNED: i8 = u8::cast_signed(0x80);
```
