`bytemuck::bytes_of()` rejects simple `[f32]` array

I'm using `bytemuck::bytes_of()` to prepare data for a wgpu buffer (following a compute shader 101 example):

```rust
let input_f = &[1.0f32; 8192];
let input: &[u8] = bytemuck::bytes_of(input_f);
```

and get this error:

```
the trait bound `[f32; 8192]: NoUninit` is not satisfied
the following other types implement trait `Pod`:
  [T; 0]
  [T; 1024]
  [T; 10]
  [T; 11]
  ...
```

It still works for an array size of 4096, but fails for many other sizes (for example, 100). Shouldn't the `NoUninit` bound be satisfied for a simple `f32` array of any size?

You could either switch to `cast_slice()`, which doesn't care about the input length (but also doesn't assume a `u8` output, so keep the type annotation), or enable the `min_const_generics` feature of bytemuck, which makes it work the way you'd expect (and probably even compiles a little faster).
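
For reference, a minimal sketch of both options. The array length and fill value are just placeholders taken from the original snippet, and the `Cargo.toml` line in the comment is only an illustration of how the feature might be enabled:

```rust
fn main() {
    let input_f = [1.0f32; 8192];

    // Option 1: cast_slice() accepts a slice of any length, so the array
    // size no longer matters. The output element type is generic, hence
    // the explicit &[u8] annotation.
    let bytes_a: &[u8] = bytemuck::cast_slice(&input_f);

    // Option 2: with bytemuck's `min_const_generics` feature enabled in
    // Cargo.toml (assumed here), e.g.
    //   bytemuck = { version = "1", features = ["min_const_generics"] }
    // [f32; N] implements Pod (and thus NoUninit) for every N, so the
    // original bytes_of() call compiles as-is.
    let bytes_b: &[u8] = bytemuck::bytes_of(&input_f);

    assert_eq!(bytes_a, bytes_b);
    assert_eq!(bytes_a.len(), 8192 * std::mem::size_of::<f32>());
}
```

Either way, the resulting `&[u8]` is what wgpu's buffer-creation APIs expect.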

2 Likes

That's it! Thanks a lot, @simonbuchan!

1 Like

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.