In CI builds, especially Docker-based ones, Cargo often ends up re-downloading the whole index and all crates on every build (because `~/.cargo` is thrown away). This is problematic: clones from GitHub are slow, the index is quite large, and re-downloading the crate files isn't fast either.
The obvious solution seems to be to pre-load and cache the index, and maybe also the crates needed by the build, or at least the most popular ones.
What would be the best way to do it?
I've found a few ways, but the best ones are a bit too fiddly to fit nicely into the one-liners of Dockerfiles or the YAML definitions that CI platforms use:
- Running `cargo search whatever` populates the index as a side effect. It feels a bit hacky, and could stop working if Cargo changed its implementation.
- Taking a `Cargo.lock` from a project, replacing `lib.rs` with an empty file, and running `cargo build` has the nice effect of getting all required dependencies pre-built. The downside is that it's a bit fiddly to do in a bash one-liner, and it makes the container itself depend on a particular project built in it, which seems like a layering violation.
- Shipping `~/.cargo` in a custom distro package, e.g. `apt-get install my-cargo-index`. That integrates neatly with existing build systems that install and cache system dependencies anyway, but maintaining a package for the index is an extra hassle.
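For what it's worth, the dummy-build trick above is often written directly into the Dockerfile so that Docker's layer cache does the work. A rough sketch, assuming a binary crate with manifests at the build-context root (the base image, paths, and stub contents are my assumptions, not from the original post):

```dockerfile
FROM rust:latest
WORKDIR /app

# Copy only the manifests, so this layer stays cached until dependencies change.
COPY Cargo.toml Cargo.lock ./

# Build against a stub source file to fetch the index and pre-build all
# dependencies, then remove the stub so it can't leak into the real build.
RUN mkdir src \
    && echo "fn main() {}" > src/main.rs \
    && cargo build --release \
    && rm -rf src

# Copy the real sources; only these layers rebuild when code changes.
COPY . .
RUN cargo build --release
```

One caveat: since the stub and the real `src/main.rs` can end up with confusing mtimes, some variants of this pattern `touch` the sources or delete the stale binary from `target/` before the final build.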
Are there other approaches? How do you solve this problem?