Are Rust crates secure?

Continuing the discussion from Rust beginner notes & questions:

In the post below, read from the else branch that begins "If this happened in scenario #2, that poor solo enterprise developer would not be a happy person:" through to Does Rust have severe shortcomings?.

Crate discoverability:

FYI, 2FA / email approval is being discussed for crate publishing here:

4 Likes

I'm finding the topic of this thread really vague. I accept that there is a problem in open source: dependencies that go rogue can cause havoc. But this doesn't seem specific to Rust at all. Indeed, because dependencies in Rust are chosen by the binary (subject to constraints imposed by libraries), as opposed to there being a single version of each library per system, the situation seems better in Rust than in .NET.

Is the topic of this thread perhaps "how can we design a safer open source world"?

3 Likes

:thinking:

“how can we design a safer open source world” ... using Cargo as an example.

Note that the question posed by @peter_bertok is "A dev in scenario #2 would just avoid Rust if he’s got any brains. I certainly would not use it as it stands, because there’s virtually zero protection from the kind of vulnerabilities that were not just predictable, but predicted, and have occurred for real. Why would I risk it? For what? Twice the runtime performance? Pfft… I could just request a server 2x as big and not risk my job and my career."

Note: I edited the introduction to reflect this.

1 Like

I feel that @peter_bertok is perhaps trying to argue for a fat stdlib, to reduce the insecure ecosystem problem.

Is there an open source ecosystem where the situation is better?

1 Like

"Is it secure" is too broad to answer other than "no", because nothing is absolutely secure. However, in practice, so far, I think the crates are secure enough.

The Cargo and crates.io teams follow best practices, e.g. all downloads are over HTTPS and package checksums are verified (and soon the checksums in the index will also be signed, creating a chain of trust).
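To make the checksum part concrete, here's a rough sketch of what that verification amounts to. The file name and expected checksum below are made up, and the sketch assumes the sha2 crate for hashing:

```rust
// Sketch only: the index records a SHA-256 checksum for every published
// .crate file, and Cargo recomputes it after download, refusing to build
// if the two differ. The file name and checksum here are invented.
use sha2::{Digest, Sha256};
use std::fs;

fn crate_matches_index(path: &str, expected_hex: &str) -> std::io::Result<bool> {
    let bytes = fs::read(path)?;
    let digest = Sha256::digest(&bytes);
    // hex-encode the digest so it can be compared with the index entry
    let actual_hex: String = digest.iter().map(|b| format!("{:02x}", b)).collect();
    Ok(actual_hex == expected_hex)
}

fn main() -> std::io::Result<()> {
    let ok = crate_matches_index(
        "serde-1.0.70.crate", // hypothetical downloaded package
        "d8856a7f67b22de6a4bd965a3d7eb86f7a95c37b3bd86d8f0f4a1d4f24a9f7c1", // hypothetical checksum
    )?;
    println!("checksum matches index: {}", ok);
    Ok(())
}
```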

The community is small enough that many top crates are by well-known, trusted authors.

Rust/Cargo is lucky to be a step behind JS/npm, so we get early warnings of problems that npm runs into.

3 Likes

Purely on the security front, I presume Linux distributions are doing better, because all packages are vetted and there are dedicated teams backporting security fixes.

I do wonder if there's demand for such an approach for Rust. Would you use an alternative Cargo registry that has only manually reviewed crates?

1 Like

For context: Haskell has a similar thing in Stackage. I believe it's a collaboration between some big companies.

I wonder if it would be hard to make the client-side part of something like Stackage for Rust.
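FWIW, on the client side Cargo's source replacement feature might already cover most of it: a project (or a whole machine) can be pointed at a curated index instead of crates.io. A sketch, with a made-up registry name and URL:

```toml
# .cargo/config — sketch only; "curated" and the URL are hypothetical.
# With source replacement, dependencies that would normally be fetched
# from crates.io are resolved from the curated index instead.
[source.crates-io]
replace-with = "curated"

[source.curated]
registry = "https://curated.example.com/index"
```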

1 Like

I think there would definitely be some demand. More and more big companies are experimenting with Rust.

1 Like

How would a new Rust user know who can be trusted? Would a malicious code injection by a user be caught and reversed soon enough?

2 Likes

This.

Secondly, how would anyone know to continue trusting crates, and all of their transitive dependencies? What if a "crate turns bad" because someone new takes over its maintenance, a password leaks, or someone just stops updating it and critical vulnerabilities go silently unpatched?

This whole issue may sound a bit academic, but there was a great post just recently on the /r/rust subreddit that highlights the many security issues around popular crates, such as a reluctance to merge pull requests related to DoS issues, few crates using #![forbid(unsafe_code)], and the lack of a clearly visible quality metric or other auditing system on crates.io:

https://www.reddit.com/r/rust/comments/8zpp5f/auditing_popular_crates_how_a_oneline_unsafe_has/
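On the #![forbid(unsafe_code)] point, the attribute is cheap to adopt and turns a whole class of review work into a compile error. A small, purely illustrative example (the function itself is made up):

```rust
// lib.rs — with this crate-level attribute, any `unsafe` block or function
// anywhere in the crate (including one slipped in by a later pull request)
// is a hard compile error instead of something a reviewer has to spot.
#![forbid(unsafe_code)]

// Purely illustrative: ordinary safe code compiles as usual.
pub fn length_prefixed(data: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(data.len() + 4);
    out.extend_from_slice(&(data.len() as u32).to_be_bytes());
    out.extend_from_slice(data);
    out
}

// Uncommenting this would make the whole crate fail to compile:
// pub unsafe fn peek(ptr: *const u8) -> u8 { *ptr }
```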

One option is to set up a bug bounty for high-profile zero-days in popular Rust crates. Possibly the bounty, rather than money, should be a Rust sticker signed by one of the core team :slight_smile: ?

FWIW, I'm not sure this fits in the "community" category. That category is more about community organizing; what you're asking is more closely related to Cargo.

Updated the tag

1 Like

Are these rhetorical questions for which you think there's a good answer in other open source communities but not Rust? Or are they simply open-ended questions about how any open source community can possibly create a trustworthy set of common libraries?

1 Like

It's a question about how Rust itself can be elevated to a highly secure programming environment. The requirements for that are that LLVM vulnerabilities are patched, the compiler is protected against someone injecting malicious code into compiled binaries, and the crates are not compromised.

The premise is: if I or anyone else use Rust as-is to build an IoT network, can I be sure that thread safety by itself will ensure my devices and network are secure? Given that Rust is likely to make it harder to run remote exploits against the binaries themselves, governments and bad actors around the world would focus more on compromising the programming environment, to introduce vulnerabilities higher up in the chain.

So, is this being mitigated in the Rust environment specifically, and how can it be further improved?

3 Likes

On the issue of trusting crates on crates.io, how is trusting them different from trusting any set of open-source dependencies that come from anywhere? Say you didn't have crates.io and instead were using libs downloaded directly from various upstream sources. What if it were all in the standard lib?

I see absolutely no difference in these trust scenarios. In either case, the only way I can truly trust all the dependencies is to audit them and then vendor/lock them down into my build for that version of my build. If I upgrade any of those dependencies to a newer version, I must audit them again. Every time. That is the ONLY way to have true trust in ANY scenario, whether the dependencies come from the standard lib, a repository like crates.io, or third-party downloads directly from upstream. This even applies to commercial/closed-source software, with the added downside that, in most cases, I don't even have the option to audit the source; I simply have to trust that, because I gave them money, they are doing their job properly and have my best interests at heart (which has been proven time and again not always to be the case, unfortunately).
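To be concrete about the "vendor/lock them down" part: committing Cargo.lock and pinning exact versions makes an upgrade a deliberate, re-auditable step. A sketch (the crate names and versions are just examples):

```toml
# Cargo.toml (sketch) — exact "=" requirements mean an upgrade is always a
# deliberate edit, and the committed Cargo.lock records the checksums of
# the dependency tree that was actually audited. Names/versions invented.
[dependencies]
serde = "=1.0.70"
rand  = "=0.5.5"
```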

So what I'm saying is that way too much is being made of the whole "I can't trust crates.io" issue. It's pretty much a non-issue if you really consider what "trust" in dependencies means.

That being said, things like adding signing or 2FA to crates.io are worth pursuing. The notion of curated crates.io alternatives is not a bad idea either. In fact, I can imagine a scenario where a company or consortium of companies creates a curated alternative to crates.io that you pay a membership fee for. This could include certain "guarantees" around testing standards, documentation standards, auditing, chain of custody, and so on, backed by official audits, perhaps even adhering to things like FDA or FIPS regulations. My guess is this would be costly (as it should be). Seems like a business opportunity?

5 Likes

So your last paragraph is what this discussion is about. The fact of the matter is, we can say that trusting 95% of crate authors is fine, based on previous audits of their work. But money now has to be spent to verify that the other 5% didn't muck with some small dependency that my updated crate relies on. So how can one ensure some level of open-source security for Rust?

Imagine this scenario:
A vulnerability has been discovered in a security drone controller that allows remote login and spying. The drone is used at a multinational R&D facility to patrol the perimeter and verify everyone on the premises signed in at the gate with their ID.

The news articles report that a security researcher says the LLVM-based Rust compiler was compromised 5 years ago, when this vulnerability was introduced.

The security researcher says that, because the Rust website advertises "thread safety", the contracted programmer for that part of the code assumed that remote buffer overflows would be contained. Unfortunately, the Rust community rested on its laurels and let the security of its dependencies and compiler dependencies slip, and today Rust can only be used after a thorough audit of every single dependency. Hence, most companies are using dependencies that are 10 years old and have some otherwise minor but well-known exploits, simply because of how long those dependencies have been in circulation. That is because half of the newest dependencies were shown by Shor et al. to be heavily infiltrated by a foreign government to ensure exploits can be run on Rust-compiled code, something that could have been prevented had the community not rested on its laurels.

1 Like

This isn't coming from nowhere either, mind you. I was a security researcher at a reputable firm for a short while, and a colleague working next to me in the lab was putting together a mitigation procedure for fixing up a couple of thousand point-of-sale terminals. A third-party library had several buffer overflows in it; the code had been compiled long ago and the source long since forgotten. It was written in C, of course. Some other R&D guys had managed to run a game of Tetris on a terminal by programming the chip of a credit card so that it would run the exploit when the card was inserted. With the same exploit they could also make the terminal print a completely fake 'payment approved' receipt.

2 Likes