Dependency Resolution with multiple versions?

Hi everyone,

I am reading Dependency Resolution - The Cargo Book and some things are not clear to me.

  1. What actually happens when two different versions of the same crate exist in the dependency tree? How does Cargo build the binary?
    For example, I just ran cargo tree -d in my sample project and found that several crates appear at more than one version.

  2. Let's say I have the following:

  - /main_app/
    - ([dependencies] lib1)
  - /lib1/
    - ([dependencies] crate_abc)

The question is: can I access crate_abc without actually specifying it in [dependencies] of main_app, and if so, how?

  3. It seems Cargo actually builds many different versions of the same crate into the application (as I can see even in my very simple app with only a few crate dependencies specified). I guess this consumes more space and probably decreases performance. Would it be better to have some kind of framework, so that everyone relies on the same frameworks, all those rand_chacha copies are of the same version, and only the framework version increases over time (providing long-term stability and security)? For example, such frameworks could exist (rustlib.core, rustlib.math, etc.). It may not reduce the number of version conflicts to zero, but it could reduce it significantly.

It builds just fine. Every crate will see the version of the dependency it specified (according to semver compatibility rules). When building a given crate, Cargo passes the paths of its dependencies to rustc explicitly, so each crate is built against the appropriate version.

You can't by default. Your library can publicly and explicitly reëxport crate_abc, though, and then you can access it as my_lib::crate_abc.

Why do you think that? And why do you worry about it if it's mere speculation (i.e. you haven't measured it and haven't found it to be an actual performance problem)? First of all, one piece of code doesn't run slower just because there's another piece of code sitting beside it in memory. Big binaries can increase cache pressure, but most Rust binaries hardly ever fit in the L1 instruction cache anyway, so binary size is not where you should look for optimizations in 99% of cases.

Have you considered that there is a good reason why it's not done like that? The fact that it's simply not possible should be a pretty great reason. Cargo only pulls in multiple versions of the same crate if they are semver-incompatible (as they are in your example above). Some crates depend on older versions, others depend on newer versions, and no amount of "frameworks" can solve this, because you simply can't force everyone to always upgrade their libraries' dependencies at the same time, throughout the entire ecosystem, immediately, and without exception.

If you depend on crates A and B, but crate A depends on rand 0.7 while crate B depends on rand 0.8, then the only thing Cargo can do is compile both. Or it could outright refuse to compile your code, but then you wouldn't be happy either, because that would make it impossible to use both of your required dependencies.
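As a sketch of how that situation arises (the crate names a and b are hypothetical), the manifests would look roughly like this:

```toml
# main_app/Cargo.toml
[dependencies]
a = "1.0"  # a's own Cargo.toml declares: rand = "0.7"
b = "1.0"  # b's own Cargo.toml declares: rand = "0.8"

# 0.7 and 0.8 are semver-incompatible (for pre-1.0 crates, a different
# minor version counts as a breaking change), so Cargo keeps both
# rand 0.7.x and rand 0.8.x in the dependency graph and compiles each.
# Had both a and b declared e.g. rand = "0.8", Cargo would unify them
# to a single newest compatible 0.8.x release.
```

You can see exactly which crates were duplicated this way, and why, with `cargo tree -d`.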

How can I do that? What is the syntax to reexport crates?

I could have multiple web applications (same approach, same tools) that rely on the same bunch of crates. Resolving all dependencies and testing only the one so-called "framework", then using it in all my applications, is one of my reasons.

Please note, it is not about compiling the entire framework into my application, but rather about picking up only those crates I need. On the other hand, if I create a shared library, everything will be compiled into the binary. Alternatively, that library could exist alongside all my applications as a shared, compiled DLL.

`pub use other_crate;`

