Why is there no convenient toolchain for C and C++ like Rust's?

If I want to create a new Rust project, I just have to execute cargo new <project_name>.
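For concreteness, this is roughly what cargo new hello scaffolds (a ready-to-build package; exact contents may vary slightly by Cargo version):

```rust
// src/main.rs as generated by `cargo new hello`.
// Cargo also writes a Cargo.toml (package name, version, edition) and
// initializes a git repository; `cargo build` and `cargo run` work immediately.
fn main() {
    println!("Hello, world!");
}
```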

In C or C++, I have to create the project manually and write a makefile, or use something like Qt Creator to create the project for me.

Is there a reason why C and C++ do not have a standardized toolchain like Rust's, despite being much older? Is there some technical barrier in these older languages?

2 Likes

I'd say it's not "despite", but "because of". They're older than the now-widespread practice of standardized toolchains, and everyone has already built their own.

15 Likes

I don't really know, but:

The C language was created when there was no networking everywhere, let alone the internet, let alone the World Wide Web, so there was no thought of anything like crates.io or git and such conveniences when building programs from the many components created by others.

Compilers back in the day had to work on machines that were orders of magnitude slower and had orders of magnitude less memory and storage space. There were not the resources to do all that Rust and Cargo do. Files had to be compiled, one by one, within the limited machine resources available, then the compiled objects linked together.

As computers improved, it became possible to have build systems and project management and IDEs that did all that. At that point compiler vendors devised their own solutions. Every IDE had some notion of a project build system. The world was fractured. That fracturing is still not fixed.

C++, having evolved from C, inherited all that build-system mess. Only now are "modules" coming to C++ in an attempt to get rid of old clunky header files and do something more like what modern language systems do. Of course C++ has to stay backwards compatible, so it has to support header files and all that junk, adding more complexity to an already deranged language.

In short, C and C++ are the way they are for historical reasons. The "technical boundary" comes from their requirement for backward compatibility, which makes progress very difficult. Nothing ever gets improved, just layers of complexity added.

Luckily C itself has managed to resist massive extension and remains simple and sweet as the day it was born (almost). C++ is doomed to collapse under its own weight.

22 Likes

Zig can be used as a compiler for C/C++. Probably not what you’re looking for though.

2 Likes

Side note: When I started getting into Rust, I hated having to use cargo. Now I love it and don't want to miss it.

I'm curious to know why that might have been. What language(s) were you coming from?

Having had to maintain C/C++ code that was expected to work on Windows, Linux, and Mac and to target embedded systems, it has always annoyed me to have to maintain all the different ways of doing that, from makefiles to Visual Studio projects. I dreamed of a Cargo-like thing that would just do all that tedious work for me.

3 Likes

I wanted to start by getting a better overview of the compilation process, so I could (better) see what's happening behind the scenes. Jumping into Rust using cargo, there are a lot of things that happen implicitly.

C, Lua, and other scripting languages.

I can't speak for @jbe, but my initial hesitation wrt cargo was that it felt like way too much magic. It stems from, among other things, having maintained packages for non-mainstream platforms over the years. One of the most important lessons I learned from that is that the more "helpful" a build system is, the more annoying it is when it inevitably fails.

I, naively, believed that build errors would be a semi-normal occurrence. I underestimated how portable Rust code would turn out to be.

Rarely have I been so pleased to have been proven so completely and utterly wrong.

7 Likes

There's no technical reason. In theory C++ even has modules that make the compilation model much more uniform and Cargo-like.

It's entirely a social, logistical, and business problem.

There's no de facto standard tool, because C has had many implementations on many platforms, and none has emerged as a clear winner.

There's no standard-standard tool, because traditionally this has been deemed out of scope for language standardization. On top of that, the standards around C are influenced by compiler vendors, who already have their own preferred tooling, interfaces, and userbases depending on them, so they're not eager to throw it all away for some new tool.

And finally, there's just a massive momentum behind existing projects. Everyone using C or C++ has already adopted some build system, has chosen some project layout, some configuration method. Moving to a different one is a lot of work, and they may not even want to switch if the new build tool doesn't support all the features exactly the way they like them.

9 Likes

Creating a good toolchain is much harder than most people would expect.

It is to the great credit of the Rust team and community that Rust tools seem to work so well, from the usually very helpful compiler messages to the usually smooth sailing with cargo.

In Scala, the standard build tool is SBT ("Scala Build Tool", formerly called "Simple Build Tool", but it's anything but simple). SBT is actually quite clever and powerful, but hard to learn beyond the basics, with an ever-changing DSL, often awkward, and sometimes buggy. Many community members love to hate it, and some have created alternatives.

There is a Scala project I have recently been contributing to which builds fine when I call SBT from the command line, but fails to build for me from inside IntelliJ IDEA with the Scala plugin.

In Python, there is pip, which for a long time could break existing dependencies when you installed something new, which is why conda was created. Just yesterday, I had a Python project where my script would complain about a missing module, even though conda insisted it was installed. After much fruitless debugging and googling, I fixed it by deleting the environment and rebuilding it from scratch.

The only minor gripe I have with Rust tooling is that I wish Rust tools would be available as up-to-date native packages, so that I could just update them with apt update, but otherwise, Rust tools are so cool.

2 Likes

Yeah... that won't work with any kind of self-proclaimed stable operating system (especially Debian Stable-based ones). It's really not an issue with Rust or Rust packaging. Of the languages I've used, I actually found Rust to be the easiest to install on every platform; perhaps that is also because once it is installed (which takes just one command), everything just works out of the box, with no manual configuration required to build a project you just cloned or created.

1 Like

Maybe you can't get the most recent Rust tools into the Debian repository, but you could set up a third-party package repository with the latest versions and tell people to set up apt to point to that repo.

2 Likes

That sounds like a great idea, but I guess the Rust teams themselves have no interest in doing that, because it would split the maintenance into so many more parts. Which package managers do you support? Only apt, or maybe also snap, pacman, brew?

I'm pretty sure Rust tools are “so cool” precisely because they are not available as up-to-date native packages.

Many distro systems impose insane delays and force year-long wait times on everything.

And if you have to support that, you need kludges. And these kludges, eventually, start failing too, so you need kludges for kludges… eventually it turns into an unholy mess.

Rust solves this problem with its "evergreen" approach, where instead of kludges you update the component that doesn't work… but that idea is fundamentally incompatible with "native packages".

2 Likes

Imagine if someone released a perfect Cargo equivalent for C tomorrow. If the problem were only technical, I assume the reaction would be a joyous "finally!" and it'd take the C world by storm. But what I really expect is that it wouldn't go further than build2 did, a self-described Cargo-like package manager for C that has fewer than 400 packages after 7 years.

Cargo is implemented well, but it was also bundled with Rust from 1.0, so Rust did not have time to develop a dependency mess like Python's (and it could have; I've used Rust 0.x with Makefiles and manual -L flags for dependencies!). Cargo had the luxury of learning from previous package managers, and now the winning formula is known, so it's easier to replicate it.

By the standards of C build tools, Cargo is very basic. It doesn't integrate well with other build systems. It doesn't support native dependencies except via opaque ad-hoc build scripts. It doesn't support configuration beyond boolean additive feature flags. It can't run post-build processes, so you can't even make a macOS app bundle or a Windows exe with an icon, nor install a man page. It can't build dynamic libs correctly with a specific soversion and rpath. It can't export test results as JUnit, and so on. Existing C projects would find a lot of deal-breakers in Cargo's model.
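To illustrate the "opaque ad-hoc build scripts" point, here is a minimal sketch of the kind of build.rs a crate writes to link a native library; the library name and search path are made up for the example:

```rust
// build.rs — minimal sketch of pulling in a native dependency from Cargo.
// "foo" and the search path are hypothetical; real crates often compile the
// C sources with the `cc` crate or locate the library via `pkg-config` here.
fn main() {
    // Tell rustc where to look for the prebuilt library and what to link.
    println!("cargo:rustc-link-search=native=/usr/local/lib");
    println!("cargo:rustc-link-lib=dylib=foo");
    // Re-run this script only when it changes.
    println!("cargo:rerun-if-changed=build.rs");
}
```

Everything beyond printing those directives (finding the library, building it, picking flags) is ordinary Rust code the crate author writes, which is exactly why it looks opaque and ad hoc from a packager's point of view.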

17 Likes

This recent blog post about integrating Rust projects into a non-Cargo build system is a decent read related to this topic.

3 Likes

And yet, it works a lot better. Maybe those "deal-breaker" features are not what actually matters, after all.

2 Likes

As usual, it's "due to historical reasons". Basically, C and C++ are old enough that there are so many solutions that none is considered standard. The closest thing to a standard is the Makefile, which is traditionally authored by hand, but e.g. the GNU project uses autoconf and automake to automatically generate a suitable Makefile to match the current operating system.

Most IDEs implement some kind of project setup, but the compiler toolchains for e.g. Linux vs Windows vs macOS are different enough that no common implementation exists. Autoconf and automake could work in theory, but Windows support for autoconf in particular is really poor. In practice, autoconf is often used on Linux-like systems, the Xcode IDE is used for macOS projects, and Windows projects use whatever the latest Microsoft solution for their compiler happens to be at the time.

1 Like

Actually, I think C/C++ kind of have a standard option for builds: CMake. The problem is that it is slow, so people created Ninja and other alternatives.

But CMake is supported on many platforms, so it is the nearest thing to a standard.

1 Like

In C and C++, static linking is relatively unusual. In Rust, it is the norm. I think this language design item matters a lot to the tooling around it. Because executables are single self-contained binaries, you can "just" put them where you want to install them. If they are dynamically linked, you have to organize the dependencies suitably, generate correct RPATHs or such, etc. You have to define what happens when the user "uninstalls X": does it also uninstall its dependencies? Are they shared? Stuff like that gets messy when you want to, e.g., put executables in Linux distros or (IIRC) macOS app bundles, which have their peculiar layout, so there are many cases to consider and a lot of complexity.

Something similar can be said of Python, as an interpreted language where the notion of "linking" doesn't even exist. Python's problems are multiplied by the fact that, since Python is too slow for writing performance-critical code, C extensions are everywhere. In Rust, the toolchain doesn't attempt (IIRC) to do anything with respect to non-Rust dependencies, like providing them prebuilt or compiling them automatically. In Python, all non-Python packages are expected by Python users to be "pip install"able without a C compiler, and this leads to distributing binaries on the package index, lots of questions about platform ABIs and which libraries should be vendored and how different packages share libraries, etc., which has also chiefly contributed to the Conda ecosystem being created as a parallel universe next to the more standard (PyPA) ecosystem (pip, build, etc.).

It also means Python needs to interact with C build systems, of which there are a great many, so there are also a lot of Python "build backends" that exist solely to provide an interface to a specific build system (setuptools has its own build system, meson-python interfaces with Meson, scikit-build interfaces with CMake, pymsbuild for MSBuild, sip for sip bindings to C/C++ libraries built with QMake, and so on and so forth). That is obviously a contributor to the well-known fragmentation of the Python packaging landscape, which in some ways reflects the underlying fragmentation of the C landscape.

Not to diminish the work of Cargo developers, but I just wanted to explain why, in my opinion, the design of Rust-the-language is a great enabler for simpler tools than what other languages allow for.

Also see this, slide 4, on C++: "Absurd compilation model, weak linkage and module system, nigh-impossible to write tools for."

With that being said, there are also good ideas in Cargo that, e.g., the Python community is taking inspiration from these days. For example, the idea that everyone should get the language from a single "manager" tool (Rustup) instead of from some vendor like a Linux distro, the Windows store, Homebrew/MacPorts, etc. (see PEP 711).

1 Like