Project with build.rs → docs.rs problems

So, I have a project that uses a build.rs file for some custom build steps, which involve running bindgen to generate bindings for a "native" library that I need. This all works fine locally.

Now, when I publish the project on crates.io, docs.rs tries to build the documentation for my project, but that always fails, because the custom build steps in my build.rs fail in the environment where docs.rs executes the build process :anguished:

The required "native" library is missing in the docs.rs build anvironemnt.

Even if I somehow were able to detect, inside my build.rs, that we are running on docs.rs (BTW: how would I do that, if it is possible at all?), it wouldn't really solve the problem. Sure, I could simply skip all the bindgen stuff in my build.rs if we are running on docs.rs, but then the rest of the project wouldn't work (i.e. it would fail to compile) due to the missing bindings!

What is the "standard" way of dealing with this situation?

Note: I already have a working CI pipeline, implemented via GitHub Actions, that performs the build process and also generates the docs just fine. That is possible, because GitHub Actions allows me to set up an environment where the required dependencies are available.

Is there a way to just completely disable the involvement of docs.rs for my project?

(Sure, I could just ignore that it is failing, but I don't like that idea :thinking:)

Thank you!

#[cfg(docsrs)] in your code and the DOCS_RS environment variable in your build script:

https://docs.rs/about/builds#detecting-docsrs
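
For instance, a minimal build.rs sketch of the environment-variable half (what you actually do in each branch is up to your crate; the bodies here are just placeholders):

// build.rs (sketch): docs.rs sets the DOCS_RS environment variable
// for the build script, so you can branch on it.
fn main() {
    if std::env::var_os("DOCS_RS").is_some() {
        // Building on docs.rs: skip or stub out the steps that need
        // the native library (e.g. point bindgen at vendored headers).
    } else {
        // Normal build: locate and link the native library as usual.
    }
}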


If you publish your crate on crates.io, it will also be documented on docs.rs.


If you can install your dependencies from apt-get, you could add them to the container that docs.rs uses to build crates. Or maybe you could compile the dependencies from source when building in the docs.rs container?

2 Likes

If you can install your dependencies from apt-get, you could add them to the container that docs.rs uses to build crates.

If I understand this guide correctly, then I ultimately would have to make a pull request and hope that the Docs.rs guys would be willing to add "my" required package to their build environment – which I assume is unlikely to happen for just a single project that needs it.

I also tried installing the required package at build time, by detecting the Docs.rs build via DOCS_RS environment variable and then issuing an apt-get install from my build.rs script. But that also fails, because of insufficient permissions :face_with_raised_eyebrow:


My "solution" for now is to detect the Docs.rs build in my build.rs script, via DOCS_RS environment variable, and then manually setting up the paths to the to the required "header" files, so that the bindgen step can succeed. Since I can't actually install the package, I must keep around a copy of the required "header" files in my repo, so that I can fall back to these, in the special case of a Docs.rs build. That's kind of ugly, but at least seems to work...

While I've never done this myself, I wouldn't be too pessimistic about this. If you look at the file for Linux dependencies you'll notice that there are quite a few.

If you were to expand this and keep a copy of the whole source of your dependencies, so that you could actually build it with the tools available in the docs.rs image, you'd be at what I was trying to suggest with

Or maybe you could compile the dependencies from source when building in the docs.rs container?

and then you could call this a feature :slightly_smiling_face: (like openssl does with its vendored feature)

3 Likes

The docsrs about builds page will likely be useful.

See Forge for how to add a dependency [to crates-build-env or otherwise].

It may take a bit, but docsrs is very willing to provide the necessary build environment, so long as you aren't outright abusing the service.

For native dependencies that aren't guaranteed to exist on the system, it's reasonably common to provide a default experience that will work independent of the system state. I've often seen buildscript logic that's basically

// Cargo exposes enabled features to build scripts via CARGO_FEATURE_*
// env vars (cfg!(feature = "...") isn't set when compiling build.rs).
let build_lib = std::env::var_os("CARGO_FEATURE_BUILD_LIB").is_some();
let link_lib = std::env::var_os("CARGO_FEATURE_LINK_LIB").is_some();

match (build_lib, link_lib) {
    (true, true) => panic!("bad config"),
    (true, false) => build_from_source()?,
    (false, true) => find_link_to_system()?,
    // default: try the system library first, fall back to building it
    (false, false) => find_link_to_system()
        .or_else(|_| build_from_source())?,
}

There are of course tradeoffs to whatever behavior you choose here. Make sure to document it well! And take a look at some of the popular *-sys crates for how they handle it.

But, to more directly address the case of docsrs — if your native library's headers don't use #if to change between targets (i.e. the generated API is identical on every target), it's actually better to run bindgen ahead of time and commit the generated Rust file, instead of running bindgen at build time. Build-time bindgen is necessary if the headers are target-dependent, or if different supported versions of the headers aren't perfectly compatible with being used as if they were the main supported version; but if there's a reasonable option that can soundly generate bindings AOT[1], it's generally worth taking IMO.
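
To illustrate (the feature name and file names here are made up): pre-generated bindings usually means the crate includes a committed Rust file by default and only pulls in the build-time bindgen output from OUT_DIR when a feature explicitly asks for it:

// src/bindings.rs (sketch)

// Default: bindings generated ahead of time and committed to the repo.
#[cfg(not(feature = "run-bindgen"))]
include!("pregenerated_bindings.rs");

// Opt-in: bindings produced by bindgen from build.rs at build time.
#[cfg(feature = "run-bindgen")]
include!(concat!(env!("OUT_DIR"), "/bindings.rs"));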

If the library claims ABI stability (able to relink with newer versions of the library without rebuilding the downstream users) then using "old" headers is fine. You'll just maybe want some solution for conditional usage of newer API.

You can set the documentation field in your package manifest to change what cratesio links to, and configure the docsrs build (e.g. to fail builds quicker), but docsrs will always process every package uploaded to cratesio, and if you can get the docsrs build functional, that's preferred to self-hosting.
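
In manifest terms that's something like the sketch below; the URL, the feature name and the choice of target are placeholders, while the documentation field and the [package.metadata.docs.rs] table (with keys like targets and features) are what cratesio and docsrs actually read:

# Cargo.toml (sketch)
[package]
# what crates.io links to as the "Documentation" link
documentation = "https://example.com/my-crate/docs"

[package.metadata.docs.rs]
# constrain the docs.rs build, e.g. to a single target
targets = ["x86_64-unknown-linux-gnu"]
# features to enable when docs.rs builds the crate
features = ["vendored-headers"]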

cargo-docs-rs can be used to mimic the docsrs build process locally, although that doesn't mimic the build environment (for that, use crates-build-env).
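
Assuming the cargo-docs-rs subcommand from crates.io, local usage is roughly as follows (docs.rs builds with a nightly toolchain, hence the +nightly):

cargo install cargo-docs-rs
cargo +nightly docs-rs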


  1. E.g. I'm working on bindings to a commercial lib. Due to the nature of distribution, the lib crate must support a wide range of versions of the built library, distributed as a built binary, and the header version must exactly match the built version (it's validated at runtime). This sounds like a case for build-time bindgen (and legal reasons might push me to do so anyway), but my current plan is to publish multiple versions of the lib-sys crate, one per release of the commercial lib. The safe Rust wrapper can utilize whichever version, and binaries can use an = version constraint on the lib-sys crate to ensure that they get the appropriate version to match the lib distribution they're using. ↩︎

4 Likes

Since I’m currently “the guys”, I can tell you we add everything that is just an Ubuntu package.

Just adding the package is often the easiest way to make the build work.

If that doesn’t work, there are other options that @CAD97 nicely described.

3 Likes

It always slips my mind for a bit, but a significant reason to prefer not running bindgen from build.rs is supporting build script overrides. The binary package can use config.toml to declaratively set what gets linked (and how) for any crate that sets package.links, and doing so replaces running the build script. So, ideally, the buildscript of a *-sys package shouldn't do anything that can't be done via a build script override.
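
As a sketch of what such an override looks like in the consuming binary's .cargo/config.toml (the target triple, the links name and the paths are placeholders):

# .cargo/config.toml (sketch): replaces the build script of any crate
# whose manifest declares links = "mylib", for this target
[target.x86_64-unknown-linux-gnu.mylib]
rustc-link-lib = ["mylib"]
rustc-link-search = ["/opt/mylib/lib"]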

2 Likes

Since I’m currently “the guys”, I can tell you we add everything that is just an Ubuntu package.

Just adding the package is often the easiest way to make the build work.

If that doesn’t work, there are other options that @CAD97 nicely described.

So, is there any chance of adding the libtss2-dev and libjson-c-dev packages?

@jofas described it well above:

If you can install your dependencies from apt-get, you could add them to the container that docs.rs uses to build crates.

Yeah, I understand that I can clone crates-build-env, add the required packages to packages.txt, and then build the image myself. This will add the packages to my local variant of the image, so I can locally build my project using that modified image. And, in fact, I just did exactly that. But how does all that help me with the problem that the "official" Docs.rs service fails to build the documentation for my project when I publish my project on Crates.io? :confused:

After all, the image that is used by the "official" Docs.rs server would have to be modified, right? So that is why I asked whether you would consider adding these packages...

That's where you make a PR to crates-build-env; that's the official process for adding something to the image used by docsrs builds.

1 Like

So, I was following this guide:

In step "Add package" I deliberately did not add a package (for now), expecting that it would fail to build my project because of the missing package – in the same way that the "real" Docs.rs currently fails to build my project because of the missing package. However, when I ran the step "Testing the image", it apparently did do a whole lot of stuff, but nothing that looks like an error. Seems to complete without problem, when it clearly shouldn't :thinking:

Next, I edited the file "src/lib.rs" (in my $YOUR_CRATE directory) and deliberately added a syntax error. This way it should 100% fail, right? Nonetheless, when repeating the "Testing the image" step, it once again did a whole lot of stuff, but again nothing that looked like an error.

Does the described "Testing the image" step use any files from my $YOUR_CRATE directory at all? Yes, it does! If I delete/rename the "Cargo.toml", then it actually does fail with an error. So it does require a "Cargo.toml", but completely ignores build errors, including syntax errors.

Something with this procedure seems fishy, or I just don't get it... :face_with_raised_eyebrow:

rustdoc can be surprisingly lenient in some cases. (It replaces basically all code that isn't used in types or function signatures with loop {} to reduce the work required. Or rather, it did at one point; today it similarly avoids checking things that don't matter, but via a different mechanism that can support RPIT.) I'm not saying that's necessarily what you've run into, but it's one potential reason.

More likely, you encountered a "successful" run whose result is that the documentation build failed, which is different from an error due to failing to find the package at all.

I'm just hypothesizing, as I haven't used it, but you do need to check the output of the documentation build. I expect you'll find a docsrs page showing a failed doc build. The process of trying to build the docs creates the docsrs page, including ones where the doc build fails.

2 Likes

Okay, thanks! Meanwhile, I think I have come up with a simpler testing method:

Just build the crates-build-env image like this:

cd crates-build-env/linux
docker build --tag build-env .

Start an interactive shell in the container image we just created:

docker run -it build-env

In the container, install Rust:

curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

...and then check out my code via Git and just build it via cargo build.
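
Spelled out (the repository URL is a placeholder), that last step is just:

# make the freshly installed cargo available in this shell
source "$HOME/.cargo/env"
git clone https://github.com/USER/REPO.git
cd REPO
cargo build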


PR is on the way :smiling_face:
