Why is there no `impl From<u32> for usize`?

Its mere existence won't make existing code stop compiling, but it may allow you to write usize::from(u32) that compiles fine when you test on a 32/64-bit machine and fails to compile on a 16-bit machine. What I'm saying is that this particular failure case is just one of many things that can go wrong when porting to a 16-bit platform, and a relatively small one compared to all the other inherent difficulties of a tiny platform. You'd try hard to avoid using u32 there anyway, which is why this is irrelevant in practice.
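To make the asymmetry concrete, here's a minimal sketch (the commented-out line is what the missing impl would enable; nothing here is from any real crate):

```rust
use std::convert::TryFrom; // needed on editions before 2021; harmless otherwise

fn main() {
    let n: u32 = 70_000;

    // This is what `impl From<u32> for usize` would allow. It would compile
    // on 32/64-bit targets but could never compile on a 16-bit target:
    // let i: usize = usize::from(n);

    // What std actually requires today, identically on every target:
    let i: usize = usize::try_from(n).unwrap();
    println!("{}", i); // 70000
}
```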

There are popular models of MSP430 that have 2KB of RAM. There are models that have 0.5KB. There are models that have up to 66KB (not a typo). Still, what I'm trying to say is that taking a typical crates.io crate that was not written with 2 or 64 or 66KB of RAM in mind is very, very unlikely to be useful. The imaginary scenario of taking a random crate and using it on a 16-bit platform, thanks to the portability of u32.try_into().unwrap() used instead of usize::from(u32), is a fantasy.

The existence of usize::from(u32) is only controversial in the context of "breaking" crates that were written for 32/64-bit platforms and did not explicitly consider a 16-bit platform (i.e. they don't compile-test for it and have no #[cfg] for it). What I'm trying to say is that a 32/64-bit crate like tokio or serde, or the remaining 99% of crates.io, is never going to work there anyway, so whether it breaks on a usize::from(u32) doesn't matter.

Writing for 16-bit platforms needs special care, specific designs, and a code style that is unidiomatic on 32/64-bit platforms. The existence of usize::from(u32) is only problematic if you assume these differences don't matter and code can be ignorantly written for 32/64-bit and moved to 16-bit without any changes. It can't.
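One common shape of that 16-bit-aware style is making the platform difference explicit with #[cfg]. A hedged sketch, where the helper name `to_index` and the panic message are made up:

```rust
use std::convert::TryFrom; // needed on editions before 2021

// Hypothetical helper: the fallibility exists only where usize is 16 bits,
// so only that target pays for the runtime check.
#[cfg(target_pointer_width = "16")]
fn to_index(x: u32) -> usize {
    usize::try_from(x).expect("index exceeds 16-bit address space")
}

#[cfg(not(target_pointer_width = "16"))]
fn to_index(x: u32) -> usize {
    x as usize // lossless here: usize is at least 32 bits wide
}

fn main() {
    println!("{}", to_index(40_000)); // 40000
}
```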

2 Likes

But nobody is talking about randomly choosing crates and trying them all out on 16-bit systems. This is a strawman argument.

Some (no-std) code that doesn't need much memory will work on 16-bit. It's a matter of how much code you're pulling in and how much data you're using, and there exists a certain kind of code that will easily fit in 64 KiB (≈ 66 KB).

C was used on PDP-11 and that's a 16-bit system.

The talk about programming for 2 KB or 0.5 KB of RAM is a useless distraction here because 16 bit usize is not inherently limited to that.

2KiB is a common RAM size on the only 16-bit chip that Rust currently supports. But even 64KB is still too little to take crates designed for 32-bit architectures and use them without modification. (BTW, 66KB is not a conversion mistake: it's the 2KiB of on-chip SRAM plus an additional 64KiB of RAM.)

I'm emphasising taking crates designed for 32-bit or larger systems and using them out of the box on a 16-bit target. This is not a strawman argument; it's the primary use case cited for the omission of usize::from(u32).

If you remove the "unprepared crate, no code changes" requirement, then crates designed with 16-bit targets in mind will prefer u8 and u16, dodging the From problem. Or, where they really have to use u32, they will do it very carefully, testing on actual targets, and won't accidentally ship a var.into() that suddenly doesn't compile for downstream users.
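The "prefer u8 and u16" point is easy to illustrate: std does provide From<u8> and From<u16> for usize, so a 16-bit-minded API never needs a fallible conversion. A sketch (the function name is made up):

```rust
// Lengths and indices kept in u16 widen infallibly on every target Rust
// supports, so there is no TryFrom, no unwrap, and no panic machinery.
fn nth(data: &[u8], i: u16) -> Option<u8> {
    data.get(usize::from(i)).copied()
}

fn main() {
    let data = [10u8, 20, 30];
    println!("{:?}", nth(&data, 1)); // Some(20)
    println!("{:?}", nth(&data, 9)); // None
}
```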

I want to emphasise how impractical the assumption of moving 32-bit code to a 16-bit platform with no design changes and no testing is. u32.try_into().unwrap() adds debug-printing machinery. It adds panic location data. It uses a needlessly large data type that may even require software emulation (not every MSP430 variant has 32-bit arithmetic). Even if it works in theory, having it work is highly undesirable in practice.

By analogy, going from 64-bit to 32-bit is like going from a truck to a car: no big deal, almost the same thing. But going from 32-bit to 16-bit is like going from a car to a bicycle. You could design a car seat, steering wheel, or aircon system that theoretically fits all three, but even if that could be made to work mechanically, it would make an awkward, heavy, unergonomic bicycle. The same goes for 32/64-bit code. Even if some random u32-using code could be made to compile without any changes on 16-bit targets, its overall size and weight, designed for 32/64-bit machines, is going to be unfit for purpose on 16-bit machines anyway, and will have little practical use.

If you want code to work on a 16-bit target, you have to design it for 16-bit limitations and test it. If you design and test it for 16-bit, you won't be surprised by non-portability of usize::from(u32).

3 Likes

I don't think so.

For instance, I am pretty sure that the arrayvec crate will work fine on 16-bit systems.

There is no specialized code for 16-bit there, and there are no 16-bit tests in its continuous integration, and yet I think it's likely it will work fine with no changes. That's based on the fact that the core APIs are identical.

1 Like

It's exactly the case of a crate that can work in theory, but is a poor fit in practice:

It's a total waste to use a large, expensive u32 for all array sizes on a machine that won't have arrays larger than u16 can index, where even a single array needing u16 lengths would likely be a program-dominating special case. You'd rather use u8 for all lengths, but arrayvec wasn't tailored for 16-bit, so it doesn't let you configure that.
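The overhead is easy to see with size_of on two hypothetical fixed-capacity vectors that differ only in the length field (these structs are illustrative, not arrayvec's actual layout):

```rust
use std::mem::size_of;

// Hypothetical layouts: same 16-byte buffer, different length types.
struct VecU32Len { len: u32, buf: [u8; 16] }
struct VecU8Len  { len: u8,  buf: [u8; 16] }

fn main() {
    // The u32 length costs 4 bytes and forces 4-byte alignment,
    // for a capacity that could never exceed 16 anyway.
    println!("{}", size_of::<VecU32Len>()); // 20
    println!("{}", size_of::<VecU8Len>());  // 17
}
```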

6 Likes

OK, it's not optimal, but it's going to work... which is a good thing. It can be optimized; that doesn't refute the value of it working. Maybe you don't care about this inefficiency in some particular program.

Or maybe you optimize and use tinyvec instead, which uses u16 for length (and also doesn't test on 16-bit platforms).

Or maybe you write your own replacement if you're trying to squeeze every last byte.

But you can't deny there is some value in portability.

I think that's an example of the negative value of such portability. We're worse off with it than without it. When the u32 conversion compiles, you get crates that unnecessarily use an inefficient type. If u32<->usize conversion weren't portable, you could easily detect these cases and improve the crates to use u16 instead.

These targets are so small that inefficient code is not a case of "phew, at least it works". It's "oh no, it just blew my code size and memory budget, and the stack has smashed into my data and everything crashed".

Rust's ecosystem is pretty open, so a compile failure is not a big deal. Changing u32 to u16 is a much better solution for 16-bit platforms than having u32.try_into().unwrap().
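A before/after sketch of the kind of API change meant here (the function names are made up):

```rust
use std::convert::TryFrom; // needed on editions before 2021

// Before: a 32/64-bit-minded signature forces a fallible conversion
// (and a potential panic path) on targets where usize might be 16 bits.
fn fill_before(buf: &mut [u8], count: u32) {
    for i in 0..usize::try_from(count).unwrap() {
        buf[i] = 0xAA;
    }
}

// After: u16 covers any buffer a 16-bit machine can hold, and converts
// infallibly on every target, so the unwrap disappears entirely.
fn fill_after(buf: &mut [u8], count: u16) {
    for i in 0..usize::from(count) {
        buf[i] = 0xAA;
    }
}

fn main() {
    let mut a = [0u8; 4];
    fill_before(&mut a, 2);
    fill_after(&mut a, 3);
    println!("{:?}", a); // [170, 170, 170, 0]
}
```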

The ecosystem is already dealing with std/no_std split, and I get PRs for removing unnecessary std uses in my crates. A similar removal of unnecessary u32 uses would be good for 16-bit platforms.

3 Likes

I am all for avoiding portability pitfalls, but as someone who has recently looked into making one of my libraries work on the MSP430 - I have to agree with kornel.

You have to be so careful with memory usage that either you will:
A) Use a crate specifically designed for 16-bit.
B) There is no B.

In the arrayvec example, the assert on u32::MAX for array size is incorrect regardless of the maximum memory available, because with a 16-bit usize, isize::MAX bytes is the largest object Rust will support. So, an array over 32K bytes? Broken.

1 Like

Not broken. The assert is a no-op on 16-bit and on 32-bit, but the code isn't broken; you're simply going to get a compile error if you try a capacity larger than isize::MAX bytes.
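For reference, here's why such an assert is a no-op on the smaller targets (a sketch of the pattern under discussion, not arrayvec's actual code):

```rust
const fn checked_capacity(cap: usize) -> usize {
    // On a 16-bit target `u32::MAX as usize` truncates to usize::MAX, and on
    // a 32-bit target it equals usize::MAX, so the assert can never fire
    // there; only on 64-bit targets is it a real check.
    assert!(cap <= u32::MAX as usize);
    cap
}

// Independently of this assert, the compiler itself rejects any type larger
// than isize::MAX bytes, which on a 16-bit target means roughly 32K.
const CAP: usize = checked_capacity(4096);

fn main() {
    println!("{}", CAP); // 4096
}
```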

Yes, the assert is a no-op, but my point was that any code using an array (or any struct) had better be under 32K or it will fail. How is that any different from impl From causing a compilation error? It's a pitfall you need to be aware of.

1 Like

This seems like a bad idea. I could maybe see a 16-bit Clippy lint for usize::try_from(u32_value).unwrap() and similar, but with this usize::from proposal, we'd be turning From from a fairly simple concept of infallible conversions on all platforms (hence the inability to convert usize to u32 through it) into one with pedantic, language-lawyer-style gotchas.

For example, how is this going to be documented? Will it only be at the site of impl From<u32> for usize, so that people then have to start checking every single trait-type combination they use in case it too might be an obscure exception? Will the list of exceptions be documented in one place? If so, will people have to read and memorize the exceptions so that they don't have to refer back to that one place every time they use a trait?
