How can I troubleshoot an abnormally slow cargo check?

Hello, I dual-boot Linux and macOS, and cargo check is 3 to 4 times slower on Linux. Someone else has run the exact same command on the exact same hardware and OS, and their time is on par with the macOS one, so it's slow only on my laptop (it's not due to the OS).

I suspect there is something wrong with my settings or installation: how can I troubleshoot this? I have run cargo +nightly check --timings, but the report is exactly the same as the one from the same hardware and OS; only the numbers are bigger.

I have checked with several projects, so it's not specific to one codebase.

Any help would be greatly appreciated, because having to wait ~20 s every time cargo check runs is not bearable.

Thanks and have a nice day

You can try

RUSTFLAGS=-Ztime-passes cargo check

to find out what is taking time within rustc's actual compilation work.
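Note that -Z flags are unstable and need a nightly compiler, so the full invocation would look something like this (a sketch; the exact pass names printed vary by rustc version):

RUSTFLAGS="-Ztime-passes" cargo +nightly check

The output lists the wall-clock time rustc spends in each compiler pass, which helps tell slow compilation apart from slow dependency tracking.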

There is something wrong with that command: it does a cold run every time, even if I only touch one file.

Do you have rust-analyzer or some other tool running? Any other build without the same RUSTFLAGS value set will invalidate the entire build cache.

If you're using rust-analyzer, run the “Stop server” command to temporarily stop builds while still being able to edit files.
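One way to keep the command line and rust-analyzer from invalidating each other's cache is to set the flags once in cargo's own config file instead of the shell environment, so every invocation sees the same value. A minimal sketch (the [build] table is standard cargo config; the flag value is just the one from above and you would normally leave it empty once done profiling):

```
# .cargo/config.toml
# Both `cargo check` on the CLI and rust-analyzer read this file,
# so they build with identical flags and share one fingerprint.
[build]
rustflags = ["-Ztime-passes"]
```

Remember to remove the flag afterwards, since it slows down and clutters every build.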

Does your project instruct cargo to touch the filesystem? I have seen bad performance from a buggy build script that caused a huge number of files to be stat-ed each time cargo was run.
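To check whether excessive filesystem traffic is actually the problem, you could count syscalls with strace (a general Linux debugging technique, assuming strace is installed; run it from the project root):

strace -f -c -e trace=stat,statx,lstat,fstat cargo check

The -c summary prints one row per syscall; a count of stat-family calls in the hundreds of thousands would point at filesystem churn rather than at rustc itself.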

It is not project-specific, because people with the same hardware and the same project have much faster check times. I also have much faster check times when I boot into macOS.

I wasn't using rust-analyzer, but for some reason it works now. There are a huge number of lines, though, something like several thousand: 1.05 MB file on MEGA

I use meilisearch for testing because it's a big enough project.

Hmm, after investigation, it appears this has nothing to do with Rust; I have a general performance problem. Sorry for the noise.

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.