Indexing Vec with i32 is necessary

On 32-bit platforms an object can be larger than isize::MAX bytes in size. It's not exactly common, but it's possible.

You cannot have a std::vec::Vec that is larger than isize::MAX bytes, but AFAIK nothing prevents you from allocating memory directly using low-level OS-provided routines and then viewing it through std::slice::from_raw_parts.

Having objects that large is not possible in Rust at all.

As it says in https://doc.rust-lang.org/std/slice/fn.from_raw_parts.html#safety,

The total size len * mem::size_of::<T>() of the slice must be no larger than isize::MAX.

See also https://github.com/rust-lang/rust/pull/95295#issuecomment-1097011792, which confirms that it's a Rust rule that objects cannot be larger than isize::MAX bytes.

(Now maybe there are some situations where you can use unsafe code and pointers and such to look at more than isize::MAX contiguous bytes. But you can't have a Rust object that large, not even as a slice. So you can never have a & to the whole thing.)
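The safety rule quoted above can be checked mechanically before calling from_raw_parts. This is a hypothetical helper (the name size_ok is not from std), just a sketch of the bound the documentation states:

```rust
use std::mem;

// Hypothetical helper: check the from_raw_parts size bound.
// The total size len * size_of::<T>() must not exceed isize::MAX.
fn size_ok<T>(len: usize) -> bool {
    len.checked_mul(mem::size_of::<T>())
        .map_or(false, |bytes| bytes <= isize::MAX as usize)
}

fn main() {
    // A small byte slice is fine.
    assert!(size_ok::<u8>(1024));
    // usize::MAX u64 elements would vastly exceed isize::MAX bytes
    // (the multiplication itself overflows, so checked_mul returns None).
    assert!(!size_ok::<u64>(usize::MAX));
}
```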


I also think allowing indexing slices with i32 or especially u32 would be nice. I don't think this is technically a breaking change to std, because it's just adding a trait implementation, which is allowed. But it would change the status quo with type inference.

The current approach essentially forces most of my integers to be typed as usize, even when I know they are in the range 0-1000, just because they usually end up being used as indices somewhere. If such a change were to be implemented, I would probably start using u16 (or u32?) for such integers.

In a way, it would make the code safer or more portable, because usize is a platform-dependent type. It's more suitable for things where you want to support "anything that fits in memory" than for values known to be in the range 0-1000.

You can new-type it (since Vec, [T; N], and [T] aren't fundamental).

Also, if indexing with u32 is allowed, then creating a Vec of u32 size should also be allowed:

let v = vec![0; 1000000u32];

If usize is 16-bit, this would panic, which is the right thing to do. If you simply want a 1000000-element vector, you would just do that and never have to think about platform-dependent things like usize.

However, this idea creates a lot of complications, because plenty of other std functions also deal in usize (e.g. len()).