Doc-Tests vs Docs -- Best Practice

So, as I understand, regular tests are run in the local environment in which they're written.
But doc-tests are run as external code that links against the compiled library, as if written by a downstream user.

So, if you have only a binary crate (just a `main.rs` and binaries), doc-tests just won't be run against code examples in documentation. But if you add a `lib.rs` with the mods declared there, doc-tests will be run against code examples in all the documentation.
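A minimal sketch of the distinction (the crate and function names here are hypothetical): the doc example below is only compiled and run by `cargo test` when the function lives in a library target, because doc-tests are built as external code linking the compiled library:

```rust
/// Doubles a number.
///
/// This fenced block is a doc-test: `cargo test` compiles and runs it,
/// but only if this code lives in a library target (e.g. `src/lib.rs`).
/// Note the example uses the public path, as a downstream user would:
///
/// ```
/// assert_eq!(mycrate::double(21), 42);
/// ```
pub fn double(x: i64) -> i64 {
    x * 2
}

fn main() {
    // Mirror of the doc example, runnable directly for this sketch.
    assert_eq!(double(21), 42);
}
```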

A (tragic!) result of this is that internal documentation code examples aren't generally tested -- as only public functions are accessible to doc-tests.
(Very Sad -- the distance between documentation and code is such a huge problem -- I loved the idea of tests in docs.)

BUT additionally, documentation for private functions is still compiled as doc-tests. So if you have code examples demonstrating the behavior or use of a private function (again, for other devs/yourself), these cause the tests to fail.

Is the only way around this telling the system not to test every private function's examples? (That seems fraught -- as nothing force-checks that you removed that do-not-test flag from the documentation if you later change a function to public -- giving readers false assurance.)

Are there other workarounds? (A few years ago there was a crate that could be used to modify visibility via traits, but it's unmaintained and a rather invasive solution.)

[Image: alternate documentation example assuming local scope]


Also, if there are no current workarounds: is there a good way to contribute to the Rust project's error messages? A new user learning how to properly path code from mods is, like me, likely to spend a frustrated while figuring out that their code isn't running because doc-tests are sourced completely differently than regular tests and are only for API documentation.

That can't be true. cargo test --bins runs tests in all binaries. Adding --workspace runs all tests in all packages in the workspace, etc.

In addition to unit tests (functions annotated with #[test]), there are also integration tests. And integration tests are run similarly to doctests; they exist in their own crate and can only use public APIs. Doctests are intended for testing public APIs in public API documentation. Test your private APIs in unit tests and document your private APIs without doctests.
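A sketch of that division of labor (function and module names are hypothetical): the private helper below can't be reached from a doctest or an integration test, but a unit-test module inside the same crate sees it fine:

```rust
// A private helper: not reachable from doctests or integration tests,
// because those compile as separate crates and only see public items.
fn clamp_percent(p: i32) -> i32 {
    p.max(0).min(100)
}

// Unit tests live inside the crate, so private items are in scope.
// This module is only compiled when running `cargo test`.
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn clamps_both_ends() {
        assert_eq!(clamp_percent(-5), 0);
        assert_eq!(clamp_percent(150), 100);
        assert_eq!(clamp_percent(42), 42);
    }
}

fn main() {
    // Direct demonstration of the helper for this sketch.
    assert_eq!(clamp_percent(150), 100);
}
```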

I edited to remove confusion. The tests won't be run against code in documentation. As in the images above.

The idea that doctests are only for public APIs is what I'm asking about. This:
(a) seems like a huge shortfall -- as a developer relies on documentation -- including just internally to understand and navigate the code.
(b) causes a direct conflict with internal documentation, as the doc-tests attempt to run code in private documentation but simply aren't able to (due to scoping / non-pub issues).

What I'm asking is (a) if I'm missing something (your response seems to indicate that as far as you're aware this is the state of things -- yes?) and (b) what is the best practice given all this. Do developers just not put code examples in internal documentation? Do they manually flag documentation not to be tested and hope that they remember to remove that flag if they change the pub/private status of a function later?

I'm very surprised at this corner of rust and wondering how to navigate best.
(Also curious if other people are as alarmed as I.)

I'm not sure consensus agrees with that. Documenting private APIs is a good practice but depending more heavily on doctests than unit tests for private APIs seems weird to me. Personally, I look for examples in documentation for public APIs, like most developers, but I don't expect to find them for private APIs. Primarily because accessing private API documentation requires jumping through some hoops for open-source projects. (See below for more on this.) If I want to find out how to use private APIs, I first look for unit tests. And failing that I will grep the codebase.

Unit tests already have the capability of testing private methods and functions, and they can generally be more thorough in what they test about the API compared to a small code example in documentation. Aren't unit tests the right tool for the job?

As far as I am aware, doctests are not intended to test private APIs at all. So yes, that is the current state of things as I understand it.

Best practice is to use unit tests.

In my experience, and as far as I know from code I've seen and worked on that is written by others, code examples don't have to appear in doctests for private APIs because they already appear in unit tests. Docs for private APIs are almost always slim anyway, because tools like docs.rs will generate only public API documentation. So that's really all there is to look at. AFAIK, there is no way to get docs.rs to generate documentation with private items.

Just getting cargo doc to output any documentation for private items requires a special CLI flag (--document-private-items). I suspect few people bother doing this locally. Maybe in CI? I don't know of any examples in the wild.

In other words, private API documentation of any kind is probably uncommon because there is some friction in the ecosystem.

It seems unusual that you would expect the visibility of functions and methods to change frequently enough for this to be a major concern. (With such an unstable API, wouldn't you want to limit how much effort goes into documenting internals?) But I think the same conclusion probably holds; the unit tests are functional examples, even if they don't appear in documentation.

All that said, I have no visibility into rustdoc. The best I can do is basically the same as you can: search for existing open issues to find out what the status may be, if any. This one is relevant: Private doc test flag ¡ Issue #60820 ¡ rust-lang/rust. The last reply to that thread links back to other relevant threads on this forum.

So, it seems you are not alone but there has been little effort to address this shortcoming. If I had to guess, that is probably because unit tests are adequate and more capable than doctests.


My intuition is that if they get tempted to do this, they split those particular internals out into a separate crate where they're crate-public, because if the internals are complex enough to need examples in their documentation, the module might be a crate in disguise.
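As a sketch of that split (all names hypothetical), the internals become their own workspace member, where the formerly private items are `pub` within that crate and therefore reachable from doctests, while the internals crate itself stays a private dependency of the main crate:

```toml
# Hypothetical Cargo workspace layout: `mycrate-internals` holds the
# complex internals as `pub` items (so doctests run against them),
# and `mycrate` depends on it without re-exporting it.
[workspace]
members = ["mycrate", "mycrate-internals"]
```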


If you put Rust code blocks in documentation comments for private functions, they would by necessity be marked ignore (to indicate that they are Rust code but should not be run as documentation tests). When documentation containing an ignore-tagged block is rendered (e.g. if you later made the function pub, or you used --document-private-items), the block gets an ⓘ icon, with a hover tooltip stating that the example is not tested.
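A sketch of what that looks like in practice (the function name is hypothetical): the `ignore` annotation keeps rustdoc from compiling the private function's doc example, while the function itself still works and can be exercised by ordinary tests:

```rust
/// A private helper, documented for other maintainers.
///
/// The example below is marked `ignore`, so rustdoc will not try to
/// compile or run it; if the docs are ever rendered (e.g. with
/// `--document-private-items`), the block gets the "not tested" marker.
///
/// ```ignore
/// let id = next_id();
/// assert!(id > 0);
/// ```
fn next_id() -> u64 {
    // Hypothetical implementation; a real one might use an atomic counter.
    1
}

fn main() {
    // The function is still exercised by ordinary code and unit tests.
    assert!(next_id() > 0);
}
```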

Generally, tests for the purpose of being tests should be written as unit tests (#[test] in the crate) or integration tests (#[test] in the tests folder), not as documentation tests. Documentation examples are collected and run as tests to ensure that the example compiles and is producing the result it's claiming to produce. But the purpose is to serve as a documentation example first, not as a test.

If a function is simple enough that examples written for the purpose of being basic usage examples are sufficient testing, then it doesn't need other tests... but it's also fairly trivial, in the sense that it doesn't want more testing coverage.

Even just filing issues is a great help, because it helps people like @ekuber who do a lot of work improving compiler diagnostics to identify places where high-impact improvements can be made.

If you're interested in going deeper, then @ekuber has a talk about doing just that:

The infrastructure of rustdoc is such that actually doing so won't be simple[1], but at a high level this specific issue could be addressed by tracking the module a documentation test comes from, and then if a name doesn't resolve, check if it's a member of the source module (even if private), and if so, show a specialized error message.

  1. rustdoc is, at a basic level, just invoking cargo run on the contents of the documentation test. ↩


Ah, we're thinking of doc-tests' purpose very differently.
I'm not interested in using them to test the code. I'm interested in testing the docs.

One of the big problems with documentation (and comments) is the semantics aren't bound to code logic. Eventually non-trivial drift tends to occur.

The exciting thing about simple `assert!(...)` calls + doc-tests is that you have a way of creating documentation that's natural for both people and computers to read.

It's just a nice way to fight documentation drift. And partial coverage is so much more than nothing! :slight_smile:

Also, when I say "documentation" I'm including (as in the pics above) the simple hover docs that pop up while writing code. (lmk if there's a better phrasing I could use.) The ability to press two keys and immediately see a signature, examples, and a few words of description is super helpful -- even when just going back to my own code, I think. And it would be great to have the examples tested, to fight doc-drift (and thereby also encourage documentation, even very simple documentation, by making it more trustworthy).


Interesting. I only see that when publishing docs though. But the way I tend to use docs most is just via the "hover" docs in the actual code. At least in neovim I don't see the ⓘ icon there, though perhaps that's in other IDEs(?).

Re: purpose of testing.
I mentioned this to parasyte above, but I'm definitely not interested in doctests to test code :slight_smile:, but to help fight documentation drift -- to ensure that at least the examples section of the docs is accurate. So when someone pops up a hover doc on some bit of code, to remind them of its use or to discover it, it will at least be example-wise accurate. (Many people advocate against comments, or sometimes even documentation, precisely because the cost of doc drift makes any commentary but the code itself untrustworthy.)

That's super helpful, thank you.
It'll be a minute at the soonest before I go that route, as I'm just learning Rust, but that's a really attractive contribution avenue and a really nice video. I'll start by looking at some of those compiler issues, regardless.

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.