Is Rust a good AI language?

Sounds like I'm not understanding what you're actually trying to do, because that all sounds like it needs at most handing in an external raw device handle (though I'd probably provide a patch to wgpu to do it automatically based on available extensions) and maybe a memcpy between staging buffers. (Though that sounds like a pretty weird issue, honestly: it has multiple discrete adapters without OS-mediated output redirection? Not really my area, but it sounds like something is screwed up.)

Maybe not the first thing you want to tackle without any render API knowledge, but a year for that still sounds absurd.

But as I said, this sort of throwaway project is pretty much the ideal case for AI.

The requirements are at the top of the README.md here: GitHub - ZiCog/rust_embedded_wgpu: Example WebGPU "Hello triangle" running on headless (No X11 or such) Jetson or Raspberry Pi.

I might have guessed so as well.

If you can see anything in main.rs that is redundant or a simpler way to do this I would be very glad to hear it.

Looks like the DRM integration is already supported out of the box by wgpu, and the rest is hello world for the drm crate (maybe a bit uglier than it should be, eyeballing it, but heck, I dunno?)

The CPU fallback is kinda painful looking, but I'd need to actually play with the hardware to figure out why it's even needed. It seems like it should be able to use HOST_COHERENT memory if it's really embedded, that sort of thing; wgpu might be getting in the way here. I also don't like the double memory copy (out of the GPU, then into the DRM card), but it's probably nothing.
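To make the double-copy complaint concrete, here's a minimal sketch of the two data paths being contrasted. Plain byte slices stand in for GPU and DRM memory; the function names are illustrative only and are not wgpu or drm-rs API.

```rust
/// Double-copy path: GPU texture -> staging buffer -> DRM dumb buffer.
/// Every frame pays for two full-frame memcpys on the CPU.
fn present_double_copy(gpu_pixels: &[u8], staging: &mut [u8], dumb_buffer: &mut [u8]) {
    staging.copy_from_slice(gpu_pixels); // copy 1: readback out of the GPU
    dumb_buffer.copy_from_slice(staging); // copy 2: into the DRM scanout buffer
}

/// What HOST_COHERENT memory would allow on a unified-memory embedded part:
/// the GPU renders straight into memory the display controller can scan out,
/// so presenting is just the page-flip ioctl, with no CPU copies at all.
fn present_zero_copy(_shared_scanout: &mut [u8]) {
    // nothing to do here but the page flip
}

fn main() {
    let gpu = [10u8, 20, 30, 40];
    let mut staging = [0u8; 4];
    let mut dumb = [0u8; 4];
    present_double_copy(&gpu, &mut staging, &mut dumb);
    assert_eq!(dumb, gpu);
    present_zero_copy(&mut dumb);
    println!("frame reached the scanout buffer");
}
```

Whether the zero-copy path is actually reachable depends on what memory types the driver exposes and on whether wgpu lets you allocate the framebuffer there, which is exactly the "need to play with the hardware" caveat above.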

It's not just duplicating the shaders, by the way: it has two completely separate copies of a wgpu application, each with its own render flow, despite the fact that the whole point of abstracting the swapchain and image views is precisely so that display differences can be ignored by most rendering code. It's honestly the most embarrassing part of this code, and it makes the whole thing look way more complicated than it actually is.
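The abstraction being described can be sketched as a small trait: one render path, with the windowed-vs-DRM difference confined to acquire/present. The types here are placeholders standing in for wgpu's `Surface`/`TextureView` (this is not real wgpu API), just to show the shape that avoids maintaining two copies of the application.

```rust
/// Stand-in for a wgpu::TextureView.
struct FrameView {
    label: String,
}

/// The only thing the render loop needs to know: "give me a view, then present it".
trait PresentTarget {
    fn acquire(&mut self) -> FrameView;
    fn present(&mut self, view: FrameView);
}

/// Windowed path: in real code this would wrap the surface's current texture.
struct WindowTarget;
impl PresentTarget for WindowTarget {
    fn acquire(&mut self) -> FrameView {
        FrameView { label: "swapchain image".into() }
    }
    fn present(&mut self, _view: FrameView) {
        // surface_texture.present() would go here
    }
}

/// Headless DRM path: render to an offscreen texture, then copy/page-flip
/// the result to the DRM framebuffer inside present().
struct DrmTarget;
impl PresentTarget for DrmTarget {
    fn acquire(&mut self) -> FrameView {
        FrameView { label: "offscreen texture".into() }
    }
    fn present(&mut self, _view: FrameView) {
        // staging-buffer readback + dumb-buffer write would go here
    }
}

/// One render function, shared by both paths. Duplicating the whole
/// application instead of writing this is what the post is objecting to.
fn render_frame(target: &mut dyn PresentTarget) -> String {
    let view = target.acquire();
    let drew_into = view.label.clone();
    // ...encode the same render pass against `view` here...
    target.present(view);
    drew_into
}

fn main() {
    println!("windowed drew into: {}", render_frame(&mut WindowTarget));
    println!("headless drew into: {}", render_frame(&mut DrmTarget));
}
```

With this shape, the shaders, pipelines, and render-pass encoding live in one place, and only the two small `PresentTarget` impls differ between the X11 and headless builds.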

I wouldn't fire an intern for giving me this code, but I probably wouldn't hire them.

To repeat, just in case: AI is fine for this sort of thing, my problem is with the idea it would take anyone a year to learn this stuff.
