Rust support for AMD XDNA AI

AI workloads are gaining ground and are increasingly supported in hardware by upcoming CPUs. This is also the case with AMD Zen 5 (Ryzen 9000) and its XDNA accelerator. To what extent will Rust support this?

Who knows more about this?

Based on https://www.amd.com/en/developer/resources/ryzen-ai-software.html, it seems the expected way to use these accelerators is to define the computation you want in a framework like TensorFlow (essentially a DSL) and then have the framework JIT-compile a program that runs on the actual XDNA silicon. If my understanding is correct, then Rust itself doesn't have anything to do with supporting this. It would be up to Rust crates like Burn to add support for them.
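To make that "framework crate" model concrete, here is a rough sketch of what describing a computation in Burn looks like. This is only illustrative: it uses Burn's CPU ndarray backend because no XDNA backend exists as far as I know, and module paths and signatures like `burn::backend::NdArray` and `Tensor::from_floats` are based on recent Burn releases and may differ in other versions.

```rust
// Rough sketch only: there is no XDNA backend today, so this uses the
// CPU ndarray backend. A hypothetical XDNA backend would slot in by
// swapping the `B` type alias; the tensor code itself stays the same.
use burn::backend::NdArray;
use burn::tensor::Tensor;

fn main() {
    type B = NdArray;
    let device = Default::default();

    // Describe the computation with the framework's tensor API...
    let a = Tensor::<B, 2>::from_floats([[1.0, 2.0], [3.0, 4.0]], &device);
    let b = Tensor::<B, 2>::from_floats([[5.0, 6.0], [7.0, 8.0]], &device);
    let c = a.matmul(b);

    // ...and let the chosen backend decide how to execute it.
    println!("{}", c);
}
```

The point is that the program only states *what* to compute; dispatching it to CPU, GPU, or an NPU like XDNA is the backend's job, which is why this support would live in crates rather than in rustc.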

1 Like

Thank you for your thoughts on this. I still hope that something will be contributed by the Rust language itself (for example f16/f8/bfloat types).

IIRC, the last time someone asked the Rust team about f16 support a few years ago, the answer was that they were not opposed to it, but at least two things should happen first:

  • At least one of the backends (LLVM, Cranelift, ...) should have direct support for the type of interest, without rustc needing to lower everything into calls to a software emulation library.
  • There should be one widely available hardware target supported by that backend (ideally something that the rustc CI can easily run on) that implements native support for the type.

Now that Intel Sapphire Rapids is finally out, it's getting a lot easier to get access to full hardware FP16 compute (not just f16 <-> f32 conversions) via AVX-512_FP16 and AMX, so maybe today would be the right time to request f16 support again?

3 Likes

There's the half crate, which provides f16 and bf16 types.
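For reference, a minimal sketch of using it (assuming half 2.x, where the arithmetic operators are implemented; on most targets they work by converting through f32 in software rather than using native 16-bit hardware):

```rust
use half::{bf16, f16};

fn main() {
    // Conversions between f32 and the 16-bit formats.
    let a = f16::from_f32(1.5);
    let b = bf16::from_f32(2.25);

    // Arithmetic is available, but typically routed through f32.
    let sum = a + f16::from_f32(0.5);

    println!("f16: {} -> bits {:#06x}", sum, sum.to_bits());
    println!("bf16: {} -> f32 {}", b, b.to_f32());
}
```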

1 Like

The current status is that there is an accepted RFC for IEEE-standard f16 and f128 primitive types. The RFC specifies that arithmetic on them will be emulated in software where the hardware does not support those sizes natively.
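For completeness, those primitives are already usable on nightly behind feature gates, so you can experiment today. A nightly-only sketch, assuming a recent toolchain; library support (formatting, math functions) is still being filled in, which is why the example casts to f64 for printing:

```rust
// Nightly-only: the f16/f128 primitives still require feature gates.
#![feature(f16, f128)]

fn main() {
    let x: f16 = 1.5;
    let y: f128 = 2.5;

    // Arithmetic lowers to native instructions where the target has them
    // and to software (compiler builtins) routines everywhere else.
    let z = x + x;
    let w = y * y;

    // Cast to f64 for printing, since formatting for the new types
    // may not be fully available yet.
    println!("{} {}", z as f64, w as f64);
}
```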

3 Likes

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.