Rust 2020: Growth

Nodesource's curl | sh sucks too. But that script just adds a "deb https://deb.nodesource.com/$VERSION $DISTRO main" line to apt sources, which can be done manually. It is a real apt repository with a proper system-level deb package, so I can manage it with Ansible. Rustup is a total PITA with Ansible: it's not a real package, it's user-specific (the root vs. non-root sudo distinction is tricky), and it requires a special PATH, so the rest of the commands don't work with it.

2 Likes

@ZiCog A GAT is a generic associated type, which, to summarize, allows you to do the following:

trait Foo {
    type Bar<'a>;
    fn baz<'a>(&self, val: Self::Bar<'a>);
}
struct Qux;
impl Foo for Qux {
    type Bar<'a> = &'a [u8];
    fn baz<'a>(&self, val: &'a [u8]) {}
}

While it may not look like much, this enables more complex designs, including streaming iterators.
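To make the streaming-iterator point concrete, here is a minimal sketch (not from this thread) of a "lending" iterator built on a GAT. `LendingIterator` and `WindowsMut` are illustrative names, not anything in std, and this only compiles on a compiler with working GAT support:

```rust
// Sketch of a "lending" (streaming) iterator using a GAT.
trait LendingIterator {
    // The GAT: each item may borrow from the iterator itself.
    type Item<'a>
    where
        Self: 'a;

    fn next(&mut self) -> Option<Self::Item<'_>>;
}

// Overlapping *mutable* windows over a slice -- not expressible with the
// plain `Iterator` trait, because each item borrows from `self`.
struct WindowsMut<'t, T> {
    slice: &'t mut [T],
    start: usize,
    size: usize,
}

impl<'t, T> LendingIterator for WindowsMut<'t, T> {
    type Item<'a> = &'a mut [T] where Self: 'a;

    fn next(&mut self) -> Option<Self::Item<'_>> {
        let end = self.start.checked_add(self.size)?;
        // Returns None once the window runs off the end of the slice.
        let win = self.slice.get_mut(self.start..end)?;
        self.start += 1;
        Some(win)
    }
}
```

Each call to `next` hands back a `&mut` window that borrows from the iterator itself, so consecutive windows may overlap; plain `Iterator` can't express this because its `Item` has no way to name the lifetime of `&mut self`.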


Extra: Take a look at the bit on what kindedness is from the RFC, which explains it better than I do here.

5 Likes

I think this point bears reiterating. Rust already has a lot of really good documentation in things like TRPL and the standard library docs, but sometimes it's a lot easier to learn by watching a real human implement a project, make mistakes, and provide a running commentary on what they're doing and why.

I really enjoy @jonhoo's streams but they tend to be geared towards more technical topics (e.g. implementing Java's concurrent hashmap) which, while it is something I'm fairly comfortable with, isn't really something you want to show a newbie when teaching them the language.

5 Likes

I ascribe the secret of C's success to speed and simplicity.

1 Like

Odd, I have always come to the opposite conclusion. I'd rather that applications I use (a compiler is an application) not require installing into the OS from deb packages, unless they are provided by the OS vendor:

  1. I have had bad experiences where installing debs from odd places as you described has upset some dependency and hosed my OS. Or it has made a mess of the package system. I think very hard about tweaking my sources lists now.

  2. It always seemed like a security problem waiting to happen, what with needing root privs.

  3. I may not want to be using the same compiler and version for everything that goes on on my machine. I want to test new releases without going the Full Monty. A local install and the ability to revert are desirable.

As such I'm happy that things I use like Rust, node.js, NATS, CockroachDB etc all install very easily without ever having to touch root.

Of course I know nothing of Ansible. In the light of the above it sounds like it needs fixing 🙂

1 Like

I kinda agree. I think there are two main hindrances to growth at the moment:

  1. IDE support still kinda sucks.
  2. Documentation of non-std crates mainly sucks.

I know the first is being worked on by the rust-analyzer team, so I am hopeful that will improve a lot this year.

The second I think some people will find controversial. But many times I have seen crates announced on Reddit or wherever, and when I go and have a look there is nothing but the generated doc comments: no READMEs, no examples, no tutorials. A lot of stuff is really not very good, I am afraid.

The async/await documentation picture is particularly bad. Twice now in the last couple of days I have tried to asyncify a small program of mine and have run into trivial problems I could not solve even with Google. The bifurcation between async-std and Tokio is going to be annoying; and then there's the futures library itself, so I guess it's a trifurcation. It's annoying to start with async-std, then Google for a concept, only to get hits for Tokio, and vice versa.

2 Likes

feature(generic_associated_types) basically just checks syntax at this point. There is a lot of compiler legwork that needs to be done before it's good enough to use, even on nightly. I believe the code you wrote should compile under the RFC; that the compiler doesn't support it yet is "only" an implementation issue.

> I get the notion that GAT allows a trait to talk about types that will not be actually defined until some impl is written that defines them.

Not quite how I would put it; I'd say that they allow a trait to express types that are not fully defined until the associated type itself is used. The classic example is streaming iterators, which is in the RFC. (I could go on, but don't wish to derail this thread further.)

3 Likes

Got it. I think... thanks.

I worry that I'll end up having to read (let alone write) code that looks like the generic/template/metaprogramming stuff we see in C++.

1 Like

C certainly offered speed and was simple.

In my experience it was not simple to learn and/or do anything major with.

I was given an IBM PC, the Microsoft C compiler and its manuals in 1982, and a job to do with them. After much frustration I asked the fellow programmers around that office at Northern Telecom a question like: "Is it really normal that in C one can write a program that just disappears into darkness and crashes the whole machine, requiring a reboot? No compiler errors, no run-time error messages, nothing?"

You could have heard a pin drop. Mouths open in disbelief they all looked at me as if they had the most stupid maggot on the planet working for them.

Very soon I realized I should treat C like assembler, which I was used to. I got the job done and regained some respect around the place.

On reflection, years later, I came to realize it was not such a stupid question. You see I was introduced to programming with BASIC and later taught myself Algol. I was used to the idea that a high level language was, well, a high level language.

C/C++ became huge because they were pretty much the only choice when performance and/or a small memory footprint were primary requirements, and all those random bugs and crashes were considered to be the unavoidable price you had to pay to get them. Not to mention portability.

9 Likes

No, you need to consider use cases other than your own. Rustup is absolutely wonderful ... for workstation installs.

Ansible is a tool mostly used for provisioning fleets of servers. The server will probably only have one install of Rust, because you're not supposed to experiment with new compiler versions on your server; that's what workstations are for. And while I'm sure Ansible is capable of running arbitrary shell commands, you really want to be able to hook the package manager, and hooks for the system package manager already exist.

  • By hooking the package manager, you can gracefully handle things like network failure. Ansible knows that a package is supposed to be installed, and it can query the PM to check, and if it's not installed, it can retry.
  • By hooking the package manager, you can use a local mirror of the package repository, which makes provisioning new servers much faster.
  • And, most importantly, if something goes wrong, you're supposed to just reimage the server and start over, which means that you want the setup process to be as deterministic as possible. This is easiest if you have good, built-in support for version pinning; while rustup does support it, Ansible does it by default.

8 Likes

https://forge.rust-lang.org/infra/other-installation-methods.html

> If you prefer not to use the shell script, you may directly download rustup-init for the platform of your choice:

If you just don't want to use curl | sh, download rustup-init for your target directly.

> The official Rust standalone installers contain a single release of Rust, and are suitable for offline installation.

If you don't want to use rustup at all, you can directly get the Rust version you want.

1 Like

No doubt I should. Your description of Ansible brings some questions to mind:

  1. Why do you need a compiler, indeed the entire build chain, on all the members of the fleet of servers? Why not just push the pre-built and tested application executables, as we do for fleets of embedded systems?

  2. I can understand the desire to reimage a server and start over if it's gone bad. But if one is building and running everything as root, how can one do forensics on a hacked or otherwise broken system? You can no longer trust your logs, your tools may be compromised, etc. We want and have safety in Rust; why throw away the safety of the OS?

Anyway, we could probably chew on everyone's needs for deployment forever. For example here https://www.youtube.com/watch?v=qCB19DRw_60&t=6s is a recent talk by Ryan Levick of Microsoft discussing the issues they have in taking Rust into use with their existing build systems and so on.

Hey, get that: MS is looking at adopting Rust. If that happens we can expect a lot of growth all of a sudden.

2 Likes

To be fair, async/await is new and the ecosystem is still developing; those are the perils of being on the bleeding edge. That said, documentation in third-party crates could stand to be improved. Maybe we should encourage a short tutorial doc page. Usually some examples are included, but they're often too brief and sometimes assume knowledge that a new user may not have. A dedicated tutorial would perhaps focus minds on producing a newbie-friendly introduction.
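One low-effort convention that helps here: put the runnable examples in the doc comments themselves, since cargo test compiles and runs them. A minimal sketch (the `greet` function and the `greeter` crate name are made up for illustration):

```rust
// A hypothetical crate function; `greeter`/`greet` are invented names.
// The fenced example inside the doc comment is compiled and run by
// `cargo test`, so a tutorial written this way cannot silently go stale
// the way a README snippet can.

/// Returns a friendly greeting for `name`.
///
/// # Examples
///
/// ```
/// assert_eq!(greeter::greet("Rust"), "Hello, Rust!");
/// ```
pub fn greet(name: &str) -> String {
    format!("Hello, {name}!")
}
```

That gets you a newbie-visible example on docs.rs for free, and CI fails if the example ever stops compiling.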

> For sanity's sake, the method you're describing is the correct way to install packages on a system with Ansible (or any configuration management tool) and all other ways range from sub-optimal-but-fine to hacks.

I might be mistaken, but I have yet to be declared insane. I'm curious to know why what is described for Ansible is "correct". It flies in the face of a lot of old Unix advice, like not running one's Apache as root.

But this is the wrong place for such a tutorial.

As @bjorn3 linked, the standalone installer is what you want for deployment automation. In a homespun Ansible-like tool, it was easy enough to automate like so (ruby, sorry, Apache License):

    def download_and_install
      ifile = File.basename( installer_url )
      sudo <<-SH
        [ -e /tmp/src ] && rm -rf /tmp/src || true
        mkdir -p /tmp/src/rust
        cd /tmp/src/rust
        curl -sSL -o #{ifile} #{installer_url}
      SH

      hash_verify( hash, ifile, user: :root ) if hash

      # Re-enter the build dir; each sudo heredoc may start in a fresh shell.
      sudo <<-SH
        cd /tmp/src/rust
        tar -Jxf #{ifile}
        rm -f #{ifile}
        cd rust-#{rustc_version}-#{platform}
        bash ./install.sh
        cd / && rm -rf /tmp/src
      SH
    end

    # The URL to the installer tarball (xz compressed)
    def installer_url
      [ 'https://static.rust-lang.org/dist',
        "rust-#{rustc_version}-#{platform}.tar.xz"
      ].join( '/' )
    end

Note that this allows sha256 hash_verify on the tarball prior to installing, so I would think it would make most ops/security folks relatively happy?

Fair point. So the deployment automation for rustc is really just provisioning a well controlled, reproducible internal build host, which can be kept in sync for OS/distro updates on the "fleet of servers". Once you have binaries built you can distribute them as you suggest.

1 Like

Google gets confused by Rust. But it's not just the older versions of docs, books, etc. 🤷

5 Likes

Yes, there is also a game called Rust. I usually search for rust lang <problem>.

5 Likes