Discussion: is cargo really a package manager?

Lately I've been thinking about Rust as a useful language for website development. There are a lot of libraries of varying quality scattered around; some are very well designed and tested, some less so. What I see lacking is some sort of framework for efficiently managing larger projects, composing packages together, and managing dependencies, and so cargo comes into question.

In a way cargo works like make: it compiles Rust code. It does so in a much friendlier manner, since the user does not have to specify each file to be compiled, but from a functionality perspective the two tools play the same role. On top of that, cargo automatically downloads dependencies, and in this regard it works somewhat like a package manager; it even allows installing packages. So: kind of like a package manager, but not quite.

Websites require HTML, CSS, and JavaScript files, and may also need images for the UI, or fonts. But cargo/crates in practice do not support packaging resources other than the library or application itself. While a crate is just an archive containing the crate's sources, it does not allow explicitly packaging images, fonts, or other resources, although technically that would be possible.

Yes, sure, there are the include_str! and include_bytes! macros, and currently a lot of "web-related" crates take that approach, but to me this does not seem like a viable approach for bigger projects.
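For readers unfamiliar with that approach, here is a minimal sketch of what it looks like. In a real crate the asset would live on disk and be embedded with something like `static STYLE: &str = include_str!("assets/style.css");`; since no asset file exists in this standalone sketch, a literal stands in for the embedded contents, and the serving logic is invented for illustration.

```rust
// Stand-in for an embedded asset; a real project would use:
//     static STYLE: &str = include_str!("assets/style.css");
static STYLE: &str = "body { margin: 0; }";

/// Build a minimal HTTP-style response for an embedded asset.
/// (Hypothetical helper, just to show how the static would be served.)
fn serve_css(body: &str) -> String {
    format!(
        "HTTP/1.1 200 OK\r\nContent-Type: text/css\r\nContent-Length: {}\r\n\r\n{}",
        body.len(),
        body
    )
}

fn main() {
    // The asset is guaranteed to exist once the binary is loaded:
    // no file I/O, no error handling at runtime.
    let response = serve_css(STYLE);
    println!("{}", response.lines().next().unwrap());
}
```

This works, but every asset change forces a recompile, which is exactly the workflow problem discussed below.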

For example, if I wanted to include bootstrap.js with UI elements, I would like to specify it in Cargo.toml as a dependency of the current project, since cargo deals with dependencies very well. I could even go so far as to create a crate for each UI element that contains only JS and CSS files (from a dependency perspective cargo handles this easily), but in practice it is not possible to do this cleanly, because a crate must be a library. Yes, I could use include_bytes! and somewhat achieve the goal, but that really seems more like a hack than a solution.
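To make this concrete, the kind of Cargo.toml I have in mind would look something like the fragment below. All crate names here are invented, and under today's cargo each of these would still have to ship as a Rust library that embeds its JS/CSS, rather than as a pure resource package.

```toml
[dependencies]
# Hypothetical resource crates; none of these exist on crates.io.
bootstrap-assets = "5"   # would carry bootstrap.js / bootstrap.css
ui-navbar = "0.3"        # one UI element per crate, just JS + CSS
ui-modal = "0.3"
```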

When building a crate that contains JavaScript, a JS minifier could be run, and a CSS minifier as well, and so on. But cargo gets in the way of doing this: 1. it does not allow packaging built artifacts into the package/crate, even if I use a build.rs file; 2. it does not allow redistributing the built result, since each project has to compile the dependency crate locally, so all build dependencies have to be built as well. Thus instead of just downloading a minified JS file, the user downloads the whole source and minifies it locally, which seems a waste of resources and time.
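As an illustration of what such a build step might look like, here is a sketch of a build.rs that "minifies" a JS string and writes the result into OUT_DIR. The minifier is a deliberately naive stand-in (it only strips blank lines and surrounding whitespace); a real project would invoke a proper tool such as terser or swc. The file names are invented.

```rust
use std::{env, fs, path::Path};

/// Naive stand-in for minification: drop blank lines and
/// leading/trailing whitespace. NOT a real JS minifier.
fn minify_js(src: &str) -> String {
    src.lines()
        .map(str::trim)
        .filter(|line| !line.is_empty())
        .collect::<Vec<_>>()
        .join("")
}

fn main() {
    // In a real build script, Cargo sets OUT_DIR; fall back to the
    // current directory so this sketch also runs standalone.
    let out_dir = env::var("OUT_DIR").unwrap_or_else(|_| ".".into());
    let src = "function f(x) {\n    return x + 1;\n}\n";
    fs::write(Path::new(&out_dir).join("app.min.js"), minify_js(src)).unwrap();
    // Rerun only when the (hypothetical) input file changes.
    println!("cargo:rerun-if-changed=assets/app.js");
}
```

The catch, as described above, is that this script runs on every downstream user's machine rather than once on a build machine.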

Another approach would be to use a different package manager to build the JS packages, but then I would have to build all the project's Rust crates with that package manager as well, because cargo does not allow specifying a different package manager for dependency crates. It is not possible to delegate dependency building to another tool; cargo forces "all or nothing".
So if I choose a different package manager, I'm on my own, which is not a nice place to be, because cargo has a lot of good features and already-solved problems.

So my questions are:

  1. What is cargo? Is it intended to become a full package manager capable of packaging resources and other artifacts as well, or is it intended only to compile small standalone projects? How do you see cargo's role/place?
  2. How do you deal with UI elements like images, fonts, etc.? It would be so nice if those artifacts could be managed like any other dependency.
  3. If not cargo, what are the alternatives? Crates.io is a nice package registry; it's very convenient to add a new dependency and it "just works". Is anyone using a different package/dependency manager to build Rust projects?
  4. How hard would it be to implement a "resource crate" or similar concept on top of current cargo? Cargo seems quite flexible, but I don't know its internals and limitations.

Cargo is a package manager. A package manager for building Rust libraries and programs. It does that very well.

I am not yet into developing Rust for use with the browser, so I don't deal with this. However, I feel it is outside the scope of Cargo. Cargo builds Rust. I don't think it should take on responsibility for building and bundling all kinds of other things, especially not foreign languages like JS.

I am not aware of any other Rust package repositories. I will be happy if Rust does not start to suffer from the chaos of the web world, with a hundred different packaging and build systems and new ones sprouting every week. I don't need that kind of fragmentation.

There is a way out... When Cargo builds Rust, if there is a build.rs file in the project, it is compiled and run first. It is used for building dependencies in C and that kind of thing. I guess build.rs could be used to do anything else you want, like transpiling TypeScript, bundling resources, or whatever whims web developers have that week.

Actually, I think it should be the other way around. Web developer tools like webpack, rollup, or whatever should use cargo to get the Rust code built, and then bundle up the resulting Wasm as they do everything else.


If I understand correctly, this is an exact use case for wasm-pack - it builds Rust with Cargo and packages the resulting Wasm into NPM format, which can be consumed by any JS packaging tool.


Well, this works nicely as long as the built result is only a command-line tool without any artifacts. Cargo would not have to deal with the JS code itself; it would just have to bundle it. I mean, if someone is programming a GUI application, they need icons, fonts, and maybe other artifacts bundled with the resulting package. At this point in time everything has to be compiled into the library, which does not seem right.

I'm not aware of any either; only that it is possible to create a private registry. But that does not help here, because cargo as a tool does not allow bundling other artifacts into the package. I agree on the fragmentation issue.

Yes, I know about build.rs, but it has a huge drawback: whenever a dependency is pulled in, the script is run on the given host. What I would like is that when a package is built, it happens once on a build machine, and everything that depends on that package does not have to build it again. In the web case, more often than not, JS code and images do not need to be compiled at all; they just have to be accessible to the web server as files, laid out into a specific directory.

I think the same would apply to someone creating a desktop environment or a native GUI application...

On this I can't agree. I believe that if I'm building a web service in Rust, then I want to pull the necessary dependencies through cargo. If I were extending an existing JS web server, then I would use those tools to call Cargo and pull in the Rust dependencies.

In this case, I am not focused on wasm so much. My intention is not to compile Rust or JS to wasm. What I need is a way to bundle images, JS, and CSS files for use as-is on a web page, and compiling everything into a library with the include_bytes! macro seems unmanageable for bigger projects. That's why I started to think about it more deeply: when building a package, a more generic package manager is needed, and in a way cargo is close to being one. But is it ever intended to be such?

There has never been an intention to make cargo more than Rust-focused. Many dream.
The "more generic package manager" does not exist and never will. The closest the world has come (IMO) to something software-related in the more-generic classification was Java.
Beyond Rust, you are likely to use the packaging tools of the platform you are deploying to. You then get into trouble (if you care about bugs) with web servers that have been designed for live updating.

You're into a whole different level. Some (myself included) like complete separation of the package manager from the build bots.

I could make a comparison to the C++ world. The big-name build system for C++ nowadays is CMake. Like Cargo, CMake only concerns itself with building libraries and executables from C++ sources.

When C++ users want to bundle executables and assets into a single file for distribution, they use something else to do that, and that something else likely uses CMake to get the executables built. (Does anyone here have experience with this kind of thing?)

Personally, I don't think Cargo should grow into a general-purpose bundler for desktop apps, games, web apps, etc., etc. That is an unbounded requirement.


Good point. The tool we have that is designed for all languages and all types of packages ("unbounded requirements") is Make. My impression is that it is still very useful for mixed-language projects, gluing together different build and packaging tools.


This kind of answers my question; but Cargo is so capable that it almost teases you into wanting to use it as a generic package manager.

I've never tried to implement a generic package manager, so I have no idea what kind of challenges there are. I guess dependency management is just one of them. In Rust's case, since SemVer is used, dependency resolution is somewhat "easier" than in the generic case; but for me SemVer works just fine.

No, I don't feel like I'm on a different level. I'm just searching for ways the process could be improved.

Yes, I would be interested if someone could share some experience regarding this as well.

I understand your point. Though in simple application cases, Cargo feels like a complete package manager: just run cargo install and you get the application, just like with any other well-known package manager.

While I agree that Cargo shouldn't take on unbounded extensibility, I think there are a lot of use cases that would benefit from packages being able to present outputs other than executables or libraries. If there were a place where the package files, or the build script, could put arbitrary files that can be named and used by dependents, this would enable a lot of use cases, such as:

  • including some JS or CSS for a web bundler or server
  • in situations where dynamic linking of a foreign library is mandatory, providing a copy of that library
  • data files for the program to read, in the case where include_bytes!() is not an acceptable solution for whatever reason (e.g. perhaps they are supposed to be editable by the end-user, or they need to be open()able)

Cargo would not do anything with the content of these files, only copy or delete them inside target/ when appropriate; they would be used either by whatever invoked cargo or by a dependent's build script.
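A small piece of this already exists in Cargo today via the `links` manifest key: if a dependency declares `links = "uiassets"` in its Cargo.toml and its build script prints `cargo:asset_dir=<path>`, the build script of a direct dependent sees that value as the environment variable `DEP_UIASSETS_ASSET_DIR`. The crate name `uiassets` and the key `asset_dir` are invented for this sketch of what the dependent's build.rs might do with the exported path.

```rust
use std::{env, fs, io, path::Path};

/// Copy every regular file from `asset_dir` into `out_dir`; returns how
/// many files were copied. This is the kind of thing a dependent's
/// build.rs could do with a path exported by a dependency.
fn copy_assets(asset_dir: &Path, out_dir: &Path) -> io::Result<usize> {
    let mut copied = 0;
    for entry in fs::read_dir(asset_dir)? {
        let entry = entry?;
        if entry.file_type()?.is_file() {
            fs::copy(entry.path(), out_dir.join(entry.file_name()))?;
            copied += 1;
        }
    }
    Ok(copied)
}

fn main() -> io::Result<()> {
    // In a real dependent build.rs, Cargo sets both of these variables;
    // outside Cargo this sketch simply does nothing.
    if let (Ok(assets), Ok(out)) = (env::var("DEP_UIASSETS_ASSET_DIR"), env::var("OUT_DIR")) {
        let n = copy_assets(Path::new(&assets), Path::new(&out))?;
        println!("copied {n} asset file(s)");
    }
    Ok(())
}
```

The limitation is that the metadata only flows to direct dependents' build scripts; it is still a convention layered on build.rs, not first-class resource packaging.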

There's already the unstable RFC 3028, "artifact dependencies", in which binaries produced by Cargo can be used by later build steps. This would be the same kind of thing, but with files that aren't specifically the output of the compiler.


Yes. This sums up nicely how I was seeing things. Before starting this discussion, I had read about artifact dependencies in the hope that they would allow packing arbitrary files.

I also understand the worries about "unbounded extensibility": the more capable the tool becomes, the more people start to use/abuse it for various things and request more features, so the line has to be drawn somewhere.

Do the kids of today...cough...web developers even know what make is?

As brilliant, simple, and versatile as make is, I'm not sure I want to wish it on anyone.


Most of the problems are social, not technical. And the core issue is "uncooperative upstreams". Cargo (or any other per-language package manager) can assume that its users will be cooperative.

But if you are building a general-purpose package manager, then you have to deal with questions like how to correctly handle an upgrade from version 42 to version 15.

These need some kind of technical solutions, of course, but they also need an army of packagers, and those have to be coordinated, too.

It quickly becomes a managerial nightmare, but technically there's nothing too complex about it.

Let me put it this way: Android's main branch includes about a million lines of Make in 10,000 .mk files, and a million and a half lines in around 20,000 of its own blueprint files.

There used to be more blueprint files, but some of that was already converted into around 3,000 new bazel files, while the make files continue to prosper.

The only change that happened was the replacement of GNU make with kati. The power of specialization: ninja processes the five million lines produced by kati almost a hundred times faster than GNU make processed its one million lines, but… it's not universal, and thus not a replacement for make.

It's possible that it feels like it works so well because it doesn't try to do all the other things, and thus is really good at what it does, rather than minimally-usable for everything.


One of the things I have always, always struggled with as a game developer, in every language and on every platform, is cleanly packaging resources. Textures, animations, audio, tile maps, models, scenes: no matter what the resource is, if I have to load it from the file system at runtime, I have to pay the error-handling tax.

I/O errors, file-not-found errors, file permission errors, errors caused by truncated files. My code is sprinkled with conditions to handle errors that should not exist. The only solution I've found for managing resources is compiling them into the binary with include_bytes! and friends. If the executable loaded, all of the file data it contains is guaranteed to exist, without permission errors, I/O errors, or any weird errors caused by file corruption. Leave it to the OS to correctly page out the .rodata section.

This makes Cargo an absolutely perfect package manager for game development. In the world of web servers, I'm also using the same trick for static CSS files. It's a bit harder to do this with TLS certs, because they need to change more frequently than you would want to rebuild. But for anything truly static, This Is The Way.


I don't believe this is a feasible approach for bigger games, but I am not a game dev. Nor does it protect against I/O errors due to corruption: if data on disk gets corrupted, the library can get corrupted too, and eventually the process will crash. And there are a lot of drawbacks to "packing in" data. For example, in web development you often just need to change some JS or CSS content, reload the browser, see how it goes, and continue like that until the job is done; fast reloading is essential to this kind of workflow. If I have to rebuild the crate every time just so my new JS file gets reloaded, it takes too much time. On the other hand, if the JS file is a separate file, I can rebuild the web server only when the server code changes.

Another case where this is unfeasible is when you have a packaged file that is in a "starting" position but must change over time. Drawing a parallel to games, I suppose that could be a user-created map, or a saved game state. In the end, you have to deal with the file system anyway. In the web server case, this is the server's configuration file: you want it accessible to the admin for tweaking the server in certain situations, and this file should have a "starting state" that works as a nice example and is then changed.

Yeah, a lot of crates do that, but do they do it because it's the optimal way, or because at the moment there is no better way to achieve this without introducing third-party tools (if they existed) or implementing a custom builder?

It could be a problem on Windows, where PE files have a rather small maximum size (4 GiB). iOS is even worse at 500 MiB. For Linux/ELF, the limit is 64-bit, so effectively there is no limit (given commonly available storage capacity).

But this is not the whole story. Compression should also be considered. And the most extreme form of compression (at least in games) is procedural content generation. I'm not saying every game needs to be 96 KiB, but if you need to fit a modern AAA game into 500 MiB, there are ways!

While true, it's beside the point. We can probably trust the file system not to just randomly cause havoc. But can we trust users not to accidentally delete or move files? Not that it matters anyway! The fs APIs we have in Rust all return Result types. We must unwrap them. Error handling is strongly ingrained. Move that error handling to compile time.

You can split the development workflow so that it loads from the FS in the normal way and bundles for release. That's literally how traditional web development is done anyway.
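A minimal sketch of that split, assuming a hypothetical `assets/style.css` path: read from disk in debug builds (so edits show up on reload), and use the embedded copy in release builds. A literal stands in for the `include_str!` result here, since no asset file exists in this standalone example.

```rust
use std::fs;

/// Debug builds: load the stylesheet from the working tree, so edits
/// show up on reload. Release builds: use the embedded copy.
fn load_style() -> String {
    if cfg!(debug_assertions) {
        // Dev path; fall back when the file is absent (e.g. when this
        // sketch runs outside a real project tree).
        fs::read_to_string("assets/style.css")
            .unwrap_or_else(|_| "/* missing assets/style.css */".into())
    } else {
        // Release path: in a real project this would be
        //     include_str!("assets/style.css").to_string()
        "body{margin:0}".to_string()
    }
}

fn main() {
    println!("{}", load_style());
}
```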

This is not a good reason to avoid the FS where it makes sense to avoid it. Of course you have to write things to disk. Of course you have to support community mods. But nothing forces your mod loader to read 1,000 files. Give me one distribution file that the game loads. Who cares if it's a zip file or something totally custom like Doom WAD files. Just move as much complexity to compile time as possible. That's what it's all about.
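The "one distribution file" idea above can be illustrated with a toy archive format, in the spirit of a WAD or zip but far simpler: concatenate files into one blob with length-prefixed entries. The format and names are invented for this sketch; a real game would use zip, tar, or a custom container.

```rust
use std::collections::HashMap;

/// Pack (name, bytes) pairs into one blob. Each entry is laid out as
/// [name_len: u32 LE][name][data_len: u32 LE][data].
fn pack(files: &[(&str, &[u8])]) -> Vec<u8> {
    let mut blob = Vec::new();
    for (name, data) in files {
        blob.extend_from_slice(&(name.len() as u32).to_le_bytes());
        blob.extend_from_slice(name.as_bytes());
        blob.extend_from_slice(&(data.len() as u32).to_le_bytes());
        blob.extend_from_slice(data);
    }
    blob
}

/// Recover the name -> bytes map from a packed blob.
fn unpack(blob: &[u8]) -> HashMap<String, Vec<u8>> {
    let mut out = HashMap::new();
    let mut i = 0;
    while i < blob.len() {
        let name_len = u32::from_le_bytes(blob[i..i + 4].try_into().unwrap()) as usize;
        i += 4;
        let name = String::from_utf8(blob[i..i + name_len].to_vec()).unwrap();
        i += name_len;
        let data_len = u32::from_le_bytes(blob[i..i + 4].try_into().unwrap()) as usize;
        i += 4;
        out.insert(name, blob[i..i + data_len].to_vec());
        i += data_len;
    }
    out
}

fn main() {
    let blob = pack(&[("map.txt", b"grass"), ("hero.png", &[1, 2, 3])]);
    let files = unpack(&blob);
    assert_eq!(files["map.txt"], b"grass");
    println!("unpacked {} files from one blob", files.len());
}
```

The point is not the format itself but that the game only ever opens one file, so the per-file error handling collapses into a single load.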

Have you heard of "the barrel file debacle"? Loading files from disk is kind of slow, actually. Especially when there are a lot of them. Especially when they are on NTFS [1]. Most especially when they are on NFS. And then there are the network concerns, since we're talking about web apps. You don't really want your clients sending hundreds of requests for individual files, do you?

Packing files is optimal.

  1. Try reading thousands of files from NTFS, then run the same test on the same hardware using ext4. Spoiler alert: https://pages.cs.wisc.edu/~bart/736/f2016/Akshay_Vaibhav_ntfs_ext4.pdf and Ubuntu Linux, Day 16: EXT4 vs. NTFS | PCWorld ↩︎

Yeah, it is. But in this case I have to write my own script to build the package. Seeing how capable cargo is, I was wondering whether this could be implemented by reusing existing functionality, whether people would even be interested in such a thing, or whether they think it's just plain the wrong way to do things.

My idea was that each component of a website could be like a crate, so that when I put it in the deps, cargo downloads it automatically and as a result provides me the JS, CSS files, etc. in the target directories. The community could then share web UI blocks just like crates, and when they are added to the deps, they just download the necessary artifacts/resources. The main benefit I wanted from Cargo here was dependency resolution.

What I also wanted to achieve is that downloaded crates would contain already-minified JS files, so that when someone adds a dependency on any of those crates, they do not have to wait at build time while build.rs runs for each dependency just to minify the JS files in it.

The package managers I've seen usually produce a tar or zip file that contains the contents, plus an instruction file which says which file must be extracted where. Even in cargo's case, you can easily extract a crate file with tar -xf. So for my intended use case, the only missing feature is allowing resource files to be bundled. It could be done, for example, by the build.rs file writing commands to stdout similar to the existing "cargo:"-prefixed ones, something like "cargo:package-this-file-in".

While playing around with build.rs, I tried to manually copy those files into the directory where cargo stages files before archiving, but it detects such "misuse" and complains (rightfully so).

I totally agree with this one; most likely, no one does.

I haven't heard about this exact debacle, but I'm fully aware of the situation. This is why JS minification is done: for a dev build I could run without minification, for release with it. But this digresses from the topic I had in mind, which is more or less focused on "should cargo allow embedding other artifacts in a crate, not just library files" and "should it be closer to a package manager, or should it be just a build system for the app, with packaging done manually". Something along those lines...


This is slightly tangential, but some recent discussions have come back around to shipping pre-built proc-macro crates for the same reasons. The idea seems to have a bit of momentum (but not necessarily enough to get off the ground).

I'm not sure whether a generalized approach would be useful for Cargo. It's an idea that has been around for a long time, however. Here's a blog post from 2018 that mentions downloading pre-built artifacts: Rust 2019: Think Bigger (fitzgeraldnick.com)

Technically, the crate (a tarball) does include everything in the source repo by default, including images, JSON, HTML, markdown, and anything that is not explicitly excluded. What you would be interested in is a way to somehow get at these files from a dependent. I don't think I would personally have any use for this, so I'll leave it there.

Every time I try to use make, I have to re-learn what a space and tab are.