|
16 | use ort::tensor::TensorElementType;
| ^^^^^^ could not find `tensor` in `ort`
For more information about this error, try `rustc --explain E0432`.
error: could not compile `ultralytics-inference` (lib) due to 1 previous error
Here is my code:
use ultralytics_inference::{YOLOModel, InferenceConfig};
fn main() {
// Load your custom NCNN model (auto-detects format)
let mut model = YOLOModel::load("custom_ncnn_model/").unwrap();
// Configure exactly like Python predict params
let mut config = InferenceConfig::default();
config.conf = 0.6; // Confidence threshold
config.show = true; // Real-time display window
config.line_thickness = 2; // Box line thickness
config.save = false; // No output saving
config.stream = true; // Process video frame-by-frame
// Run prediction (handles video streaming + visualization automatically)
model.predict_config("video.mp4", config);
}
The problem is that ultralytics-inference pulls in ort as a git dependency without pinning a version, so you'll need to manually lock the ort crate to a revision that ultralytics-inference actually works with. Try running this command:
cargo update -p ort --precise 2de34065983a5c034f5afcc072b23b99479f465b
That git hash is copied from ultralytics-inference's Cargo.lock file.
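If you'd rather record the pin in your manifest instead of relying on a one-off `cargo update`, a `[patch]` section in your own Cargo.toml can redirect the git dependency to a fixed revision. This is a sketch, not a verified fix: the repository URL below is an assumption, so check which git URL ultralytics-inference actually names for `ort` in its Cargo.toml and use that one.

```toml
# Sketch: pin the transitive `ort` git dependency to a known-good revision.
# ASSUMPTION: the URL must match the git source ultralytics-inference declares.
[patch."https://github.com/pykeio/ort"]
ort = { git = "https://github.com/pykeio/ort", rev = "2de34065983a5c034f5afcc072b23b99479f465b" }
```

The advantage over `cargo update --precise` is that the pin survives a deleted Cargo.lock and is visible to anyone else building the project.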
Hi mate, I don't get that error anymore, thank you! But now I get this error:
thread 'main' (100562) panicked at src/main.rs:6:59:
called `Result::unwrap()` on an `Err` value: ModelLoadError("Failed to load model: Load model from custom_ncnn_model/ failed:Protobuf parsing failed.")
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace