It's great to see more people talking about GitLab CI configuration. When I first set up my projects, it took me forever to figure out.
Why are you hosting your own docs? Does https://docs.rs not work for you?
@japaric's Trust project provides templates for AppVeyor and Travis, but it'd be awesome to get some GitLab CI templates up there. I can help review if you submit a PR.
At work we have a bunch of internal projects using Rust and can't publish them to crates.io (they're part of our product). We used to use GitLab Pages to host documentation, but then switched to a private S3 bucket because GitLab Pages will serve content to anyone (i.e. you can't stop outsiders from seeing your internal docs).
I can't share the config we use at work, but here's a fairly similar one I've got for one of my personal projects:
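Something along these lines (a sketch rather than the exact file; the job names, stages, and the docs job are illustrative):

```yaml
# Sketch of a basic Rust pipeline: run the tests, then publish rustdoc
# output via GitLab Pages.
image: rust:latest

stages:
  - test
  - deploy

test:
  stage: test
  script:
    - cargo test --all

pages:
  stage: deploy
  script:
    - cargo doc --no-deps
    # GitLab Pages serves whatever ends up in the `public/` directory.
    # Note: `cargo doc` doesn't generate a root index.html, so you may
    # want to add a redirect page to your crate's docs.
    - mv target/doc public
  artifacts:
    paths:
      - public
  only:
    - master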
Something to keep in mind with CI is to take those extra couple of minutes to make sure you get caching set up right. It's a pain, but if you don't bother early on, you'll find that over time build times for trivial changes go through the roof.
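For a Rust project that usually means caching cargo's registry and the `target/` directory. A sketch (the cache key is one reasonable choice, not the only one; note that `CARGO_HOME` has to be redirected into the project directory, because GitLab CI can only cache paths inside it):

```yaml
# Cache cargo downloads and build artifacts, keyed per branch.
cache:
  key: "$CI_COMMIT_REF_SLUG"
  paths:
    - .cargo/
    - target/

variables:
  # GitLab's cache only covers paths under the project directory,
  # so move cargo's home there.
  CARGO_HOME: "$CI_PROJECT_DIR/.cargo"
```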
If your project has custom build dependencies (e.g. you interface with C code), you can create your own docker images and then push them to a private repo-specific registry provided by GitLab. It's super useful because it means there's no more need to install deps on every build.
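A sketch of that setup, assuming a `Dockerfile` in the repo root that installs the C dependencies (the `builder` image name is illustrative; the `$CI_REGISTRY*` variables are predefined by GitLab):

```yaml
# Build a custom image and push it to the project's own container registry.
build-image:
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE/builder:latest" .
    - docker push "$CI_REGISTRY_IMAGE/builder:latest"

# Later jobs then run inside that image, with all deps preinstalled.
test:
  image: "$CI_REGISTRY_IMAGE/builder:latest"
  script:
    - cargo test
```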
I used to use GitLab for cross-compiling and releasing binaries, but most of our customers use Windows and cross-compiling to Windows can be a pain when there are non-trivial C dependencies involved.
I'm using GitLab because my project has native dependencies, so docs.rs doesn't work. Also, @Michael-F-Bryan, builds are still really fast if I just use apt-get with the cache, and it feels a lot simpler, so I'm sticking with that for now. But custom Docker images are a good idea if I ever need a more consistent build environment.
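That approach can be as simple as a `before_script` (a sketch; `libfoo-dev` is a stand-in for whatever native library the crate actually links against):

```yaml
# Install native dependencies with apt-get before every build.
test:
  image: rust:latest
  before_script:
    - apt-get update -yqq
    - apt-get install -yqq libfoo-dev  # hypothetical native dependency
  script:
    - cargo test
```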
I like to test my crates with multiple Rust versions.
I made a repo with that:
I identified several ways to do that:
install the toolchains in the script/before_script and cache them (this was my first approach, see commit f14b86753513bfcb823d3eb20b89c2853f04e2df)
use an already available docker image with all the toolchains you need
use already available docker images, one for each toolchain
do one of the two, but manage the images yourself using the GitLab registry, building them with GitLab CI
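For comparison, the first of these options might be sketched like this (assuming the `rust` image, which ships with rustup; `RUSTUP_HOME` is redirected into the project directory so GitLab CI can cache the installed toolchains):

```yaml
# Option 1: install extra toolchains in before_script and cache them.
variables:
  RUSTUP_HOME: "$CI_PROJECT_DIR/.rustup"

cache:
  paths:
    - .rustup/

test-beta:
  image: rust:latest
  before_script:
    - rustup toolchain install beta
  script:
    # The +beta syntax selects the toolchain via the rustup shim.
    - cargo +beta test
```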
The solution I implemented is the last one: I have a GitLab CI job that builds a docker image with all the toolchains I need. This image is updated daily using a scheduled pipeline. The other jobs use this image.
For reference, here is the resulting .gitlab-ci.yml:
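A sketch of what such a file contains (job and image names are illustrative; the `only: schedules` rule restricts the image rebuild to the daily scheduled pipeline):

```yaml
# Rebuild the toolchains image only when the scheduled pipeline runs.
build-toolchains-image:
  image: docker:latest
  services:
    - docker:dind
  only:
    - schedules
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE/toolchains:latest" .
    - docker push "$CI_REGISTRY_IMAGE/toolchains:latest"

# The test jobs run inside the prebuilt image, one toolchain each.
test-stable:
  image: "$CI_REGISTRY_IMAGE/toolchains:latest"
  script:
    - cargo +stable test

test-beta:
  image: "$CI_REGISTRY_IMAGE/toolchains:latest"
  script:
    - cargo +beta test
```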
I'm trying to consolidate CI documentation after getting frustrated with posts that were out of date or didn't explain why they made different choices, which made it hard for me to decide which approach to adopt.