Rust support in Buildroot (WIP)


#1

Hi!

In order to build programs written in Rust and/or Cargo crates for embedded
systems, I’m currently working on adding support for Rust in Buildroot.

Buildroot is a set of Makefiles which makes building Linux embedded systems
easy. It enables the user to build a cross-compilation toolchain (or use an
existing one), then a Linux kernel and a root filesystem populated with a
selection of applications.

It supports a wide variety of embedded devices (Raspberry Pi, etc.) and provides
configurations for many QEMU architectures (ARM, MIPS, x86_64, …).

Support for Rust and Cargo is available in the “feature/rust” branch of this
Buildroot repository. It may need some polishing before being upstreamed.

Here is the method used to add this feature, as well as the issues encountered.

Two packages have been added in Buildroot:

  • package/rust: this package fetches a snapshot of the Rust compiler to
    bootstrap the build of a host version of the Rust compiler, then
    cross-compiles the standard library for the configured target (e.g. an
    ARM device).
  • package/cargo: this package fetches a snapshot of Cargo to bootstrap the
    build of a host version of Cargo, then generates a configuration file
    suitable for the target (stored, along with the downloaded crates, in a
    dedicated $CARGO_HOME).

Whether using an internal or an external cross-compilation toolchain,
Buildroot refers to the target using a GNU triple of the form
<cpu>-buildroot-linux-<system> (e.g. “arm-buildroot-linux-gnueabihf”).
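As an aside, Rust’s built-in target names use “unknown” as the vendor field
where Buildroot uses “buildroot” (e.g. arm-unknown-linux-gnueabihf), which is
one reason the stock target names cannot be reused as-is. A minimal sketch of
the renaming, assuming a simple three-way split of the triple (the function
name is illustrative, not part of the actual tooling):

```python
def buildroot_to_rust_triple(triple):
    """Map a Buildroot GNU triple to the closest built-in Rust target
    name by swapping the vendor field ("buildroot" -> "unknown")."""
    cpu, _vendor, rest = triple.split("-", 2)
    return "-".join([cpu, "unknown", rest])

print(buildroot_to_rust_triple("arm-buildroot-linux-gnueabihf"))
# -> arm-unknown-linux-gnueabihf
```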

Furthermore, Buildroot allows the user to fine-tune the description of the
target architecture (e.g. the NEON instruction set is supported, the CPU type
is cortex-a57, etc.) and uses this information to tune some packages.

So, to properly configure the build of the host Rust compiler, a tool named
rust-target-gen is used. This Python script has two purposes:

  1. creation of a dedicated target configuration file by searching for a
    matching one among mk/cfg/*.cfg and patching it with the correct target
    name/cross-compiler prefix.
  2. creation of a dedicated target specification file (in JSON format), with the
    given target information and LLVM features.
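As a rough illustration of purpose 1, the patching boils down to a couple of
string substitutions. The .cfg excerpt below is a simplified assumption, not
Rust’s actual mk/cfg layout, and the function is a sketch rather than the real
rust-target-gen code:

```python
# Simplified, hypothetical excerpt of a stock mk/cfg/*.cfg file.
TEMPLATE = """\
CROSS_PREFIX_arm-unknown-linux-gnueabihf := arm-linux-gnueabihf-
CC_arm-unknown-linux-gnueabihf := gcc
"""

def patch_target_config(cfg, rust_triple, br_triple, stock_prefix, br_prefix):
    """Specialize a matching stock .cfg for the Buildroot target by
    plain search/replace: rename the target, then swap in Buildroot's
    cross-compiler prefix."""
    return cfg.replace(rust_triple, br_triple).replace(stock_prefix, br_prefix)

print(patch_target_config(
    TEMPLATE,
    "arm-unknown-linux-gnueabihf",
    "arm-buildroot-linux-gnueabihf",
    "arm-linux-gnueabihf-",
    "arm-buildroot-linux-gnueabihf-",
))
```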

The creation of the target configuration is easy (basic search/replace).
Creating the target specification file is trickier, as the information for the
“blessed” platforms is only available as Rust code in
src/librustc_back/target/*.rs files.

So, I decided to collect the information from the Rust source files in a CSV
file, that rust-target-gen uses internally. Maybe there is a better way to do
this.
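To give an idea of the shape of the data, here is a sketch of that CSV-to-JSON
step. The column names and values are assumptions modelled on the fields found
in src/librustc_back/target/*.rs, not the actual file used by rust-target-gen:

```python
import csv
import io
import json

# Hypothetical excerpt of the CSV of target information.
CSV_DATA = """\
triple,arch,os,env,pointer_width,features
arm-unknown-linux-gnueabihf,arm,linux,gnu,32,+v6;+vfp2
"""

def make_target_spec(row):
    """Turn one CSV row into a Rust target-specification dict, ready to
    be dumped as the JSON file rustc accepts via --target."""
    return {
        "llvm-target": row["triple"],
        "arch": row["arch"],
        "os": row["os"],
        "env": row["env"],
        "target-pointer-width": row["pointer_width"],
        "features": ",".join(row["features"].split(";")),
    }

rows = csv.DictReader(io.StringIO(CSV_DATA))
print(json.dumps(make_target_spec(next(rows)), indent=2))
```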

I also have a few patches for Rust (harmonization of the target configuration
files) as well as for Cargo.

Additionally, this tutorial explains how to properly add a Cargo crate as a Buildroot package. I
successfully ran a “Hello World” program on QEMU ARM, x86_64, i386, MIPSEL and
AArch64 systems.

Comments and suggestions are welcome.


#2

Very excited about this!

How much more would need to be done for this to make it into buildroot master?


#3

I sent a first patch for review upstream, which only adds the Rust compiler. If it makes it into master in one form or another, I’ll then send the one for Cargo (I’ve reworked the Cargo package slightly, to handle the Cargo bootstrap more cleanly).

Buildroot provides a target to download all the source tarballs of the packages to build, so an offline build can be performed. However, Cargo does not fit well in this case, as it has to fetch its huge list of dependencies from http://crates.io to build itself.

At first, I thought this “fetch” operation could be done after downloading the Cargo source tarball, by extracting only the Cargo.toml file and using it. But Cargo relies on more than one Cargo.toml file, so the whole archive has to be extracted. And extracting the same archive twice does not look sensible to me.

So, maybe for offline builds, we will need to add support for using a crates.io mirror.


#4

Have you seen cargo-vendor? Maybe we could integrate that to facilitate offline builds. An index can be built up for a project just by running the command; that index and all the associated libraries can then be tarred up and shipped together as a comprehensive package.