WebGPU: reinterpret u32 bits as f32 and vice versa

Is there something more direct / efficient than:

export function f32Bits(bits: number): Scalar {
  const arr = new Uint32Array([bits]);
  return new Scalar(TypeF32, new Float32Array(arr.buffer)[0], arr);
}

This notion of constructing an array for every conversion seems awfully expensive.
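One common way to avoid the per-call allocation is to reuse a single scratch buffer with two typed-array views over it. This is only a sketch of that technique, not the CTS's actual code; the function names here are made up for illustration:

```typescript
// One 4-byte buffer, allocated once, viewed as both u32 and f32.
const scratch = new ArrayBuffer(4);
const scratchF32 = new Float32Array(scratch);
const scratchU32 = new Uint32Array(scratch);

// Reinterpret a u32 bit pattern as an f32 (no allocation per call).
function bitsToF32(bits: number): number {
  scratchU32[0] = bits;
  return scratchF32[0];
}

// Reinterpret an f32 as its u32 bit pattern.
function f32ToBits(value: number): number {
  scratchF32[0] = value;
  return scratchU32[0];
}

console.log(bitsToF32(0x3f800000)); // 1 (the f32 bit pattern of 1.0)
console.log(f32ToBits(1.0).toString(16)); // "3f800000"
```

A `DataView` over the same buffer works too, and additionally lets you control endianness. The shared-scratch approach is not safe across concurrent callers, but JavaScript's single-threaded execution makes that a non-issue within one realm.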

[1] https://github.com/gpuweb/cts/blob/2a2d3a9/src/webgpu/util/conversion.ts#L918

In Rust code, you can use f32::to_bits and f32::from_bits. That repository is all TypeScript code, though, so this seems like the wrong forum for the question.

That's definitely my fault; an article linked to it, and I (incorrectly) thought it was WGSL code.

As an aside: is webgpu / wgpu / naga / wgsl on topic here? On one hand, it's a "different language"; on the other hand, most of the stack is Rust, and GL questions seem accepted here.

You'd be looking for bitcast for... bitcasting 🙂.
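For anyone landing here from the WGSL side, this is what the `bitcast` builtin mentioned above looks like in shader code (a minimal sketch; the function name `demo` is made up):

```wgsl
// bitcast reinterprets the bit pattern of a value without numeric conversion.
fn demo(bits: u32) -> f32 {
  let f: f32 = bitcast<f32>(bits); // e.g. 0x3f800000u -> 1.0
  return f;
}
```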


This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.