Rust, gstreamer and Apple's stuff

Hi there :wave:

I'm running a CoreML model inside a Rust app for macOS, and it's been quite painful so far. I need to interface the following:

SwiftUI → Rust (with a lot of gstreamer-rs) → CoreML (Swift again)

I went for swift-bridge after realising that bidirectional calls would be mostly painful with uniffi.
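For context, what swift-bridge ultimately generates under the hood is a C-compatible layer between the two languages. A hand-rolled version of the same idea looks like the sketch below; `process_frame` and its signature are made up for illustration, not anything from my actual app. Swift would call it through a generated or hand-written header.

```rust
// Minimal C-ABI surface a Swift app could call into. swift-bridge
// generates this kind of glue (plus the Swift-side wrappers) for you.
// `process_frame` is a hypothetical example function, not a real API.

#[no_mangle]
pub extern "C" fn process_frame(data: *const u8, len: usize) -> u32 {
    // SAFETY: the Swift caller must pass a valid buffer of `len` bytes.
    let frame = unsafe { std::slice::from_raw_parts(data, len) };
    // Placeholder "processing": sum the bytes as a stand-in for real work.
    frame.iter().map(|&b| b as u32).sum()
}
```

With swift-bridge you declare the same function inside a `#[swift_bridge::bridge]` module instead (with `extern "Rust"` and `extern "Swift"` sections for each direction) and let its build script emit the bridging code.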

Aside from, apparently, the Servo people with core-foundation-rs, has anyone had any kind of success interfacing with Apple's stuff?

Sorry if my post is a bit vague; I'm trying to understand whether I'm wasting my time. At this point I'm wondering whether I would be better off rewriting most things in Swift for this platform and reducing the Rust part to a library of core algorithms I can call from there.

For evaluating ML models, the easiest route is probably to convert them to a format that can be loaded by a Rust library (or a library with Rust bindings). For example, ONNX models can be created from CoreML models (among other sources), and they can be readily run from Rust.
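As a sketch of that route, assuming the tract-onnx crate and an already-converted `model.onnx` with a 1×3×224×224 float input (the path and shape are placeholders, not details from this thread):

```rust
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    // Load the ONNX model (path is a placeholder; point it at your conversion).
    let model = tract_onnx::onnx()
        .model_for_path("model.onnx")?
        .into_optimized()?
        .into_runnable()?;

    // Dummy input; 1x3x224x224 is a placeholder shape, match your model's.
    let input: Tensor = tract_ndarray::Array4::<f32>::zeros((1, 3, 224, 224)).into();
    let outputs = model.run(tvec!(input.into()))?;
    println!("output tensors: {}", outputs.len());
    Ok(())
}
```

Note that tract executes on the CPU, so while this sidesteps CoreML entirely, it also gives up any Apple-specific acceleration.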

My model comes from PyTorch, so I can even run it with TorchScript through tch. But the performance is really not the same: on my M1 MacBook Pro, it runs 3 times faster on the Neural Engine than on the GPU with Metal.

I am not aware of any other way than CoreML to run models on the NE.

I think George Hotz tried this at some point for tinygrad, but it doesn't look like he released anything.

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.