Rust is definitely safer than average C or C++ code, not only in terms of memory safety and data-race safety, but also in its very robust error handling, so I'd feel safer with Rust than with typical C/C++.
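To make the error-handling point concrete, here's a trivial sketch (my own made-up example, not from any particular codebase): the failure mode is part of the function's type, so a caller can't silently ignore it the way an unchecked C return code can be ignored.

```rust
use std::num::ParseIntError;

// The error is part of the signature: callers must handle it.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    let port: u16 = s.trim().parse()?; // `?` propagates the error upward
    Ok(port)
}

fn main() {
    match parse_port("8080") {
        Ok(port) => println!("listening on port {port}"),
        Err(e) => eprintln!("bad port: {e}"),
    }
}
```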
However, AFAIK real safety-critical C code is written in specific ways, e.g. without dynamic allocation or recursion. Rust can theoretically be used with similar restrictions, but I don’t know if anyone has seriously looked at writing such guidelines and tooling for Rust.
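For illustration, a restricted style might look something like this heap-free sketch; the FixedQueue type below is just something I made up, not from any real guideline. No Box/Vec/String, no recursion, and a capacity fixed at compile time, so allocation failure is impossible and overflow is an explicit, checkable error:

```rust
// Illustrative only: a fixed-capacity queue in the style of heap-free code.
struct FixedQueue<const N: usize> {
    buf: [u32; N],
    head: usize,
    len: usize,
}

impl<const N: usize> FixedQueue<N> {
    fn new() -> Self {
        Self { buf: [0; N], head: 0, len: 0 }
    }

    // Returns Err instead of growing: no silent overwrite, no allocation.
    fn push(&mut self, value: u32) -> Result<(), u32> {
        if self.len == N {
            return Err(value);
        }
        let tail = (self.head + self.len) % N;
        self.buf[tail] = value;
        self.len += 1;
        Ok(())
    }

    fn pop(&mut self) -> Option<u32> {
        if self.len == 0 {
            return None;
        }
        let value = self.buf[self.head];
        self.head = (self.head + 1) % N;
        self.len -= 1;
        Some(value)
    }
}

fn main() {
    let mut q: FixedQueue<4> = FixedQueue::new();
    assert!(q.push(1).is_ok());
    assert_eq!(q.pop(), Some(1));
    assert_eq!(q.pop(), None);
}
```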
And then there's the question of whether LLVM and its optimizer can be trusted to compile the code correctly. If you'd allow clang at -O2, then Rust should be fine too. I suppose you'd want to stick to one version of the compiler and never update it; Rust is still evolving quickly and doesn't do "LTS" versions.
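Mechanically, at least, pinning the compiler is easy: rustup honours a rust-toolchain.toml checked into the repository, something like this (the version number is just a placeholder):

```toml
# rust-toolchain.toml — every build of this project uses exactly this compiler
[toolchain]
channel = "1.68.2"
components = ["clippy", "rustfmt"]
```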
The lack of a formal specification might be a problem too. I've got no experience in such safety-critical domains, but I immediately thought of that awesome formally verified kernel where they prove the correctness of the code; that kind of thing isn't possible with Rust as far as I understand it. Not sure how important this turns out to be in real-life security scenarios, though.
Safety comes from many things; one important source is tooling. If you look at the wide choice of tools for C and C++ (like http://frama-c.com/ ), you see that Rust isn't there yet, but it will get better.
In real life people are putting MS Windows inside medical equipment, which I consider horrifying, so to some degree the bar is quite low.
I have worked in avionics, and that was better than an MRI machine running MS Windows, but it wasn't as good as they pretend.
Compared to the scary examples, I think Rust must already be better, but Rust has higher aspirations than that. Compared to the better examples (avionics), I suspect Rust is also up to snuff, but Rust is still new.
How bad could internal Rust bugs be at this point?
Another consideration is libraries. How bug-free is the standard library? How bug-free are the other libraries one might expect to use?
I can see your point about tooling, but a lot of the tooling needed around C++ exists to make up for shortcomings that Rust doesn't have. Still, I can see that other tools would help.
I hadn't thought of that; that might be the biggest problem. Rust makes it easy to pin library versions; what practical problems would arise from pinning the compiler version?
Thanks,
-kb
P.S. An aside: I have a book somewhere about provably correct software. It described a formal specification language, and software could be checked against it! Um, what's to stop anyone from writing new bugs in this new language? (The language looked compilable, certainly complete enough to write bugs in.) Making people write the program twice, once in each of two different languages, has some appeal, but a lot of the bugs I have written over the years were bugs in my understanding, not in my typing. That kind of bug is easy to write twice.
You'll notice that bug-free dependencies are a requirement for secure code, and the efforts towards this should be gaining momentum.
You're right about the very low bar that is set for safe code; my experience is that critical code usually just means very old code that was written by an excellent programmer 20+ years ago, and everyone else is simply terrified of modifying it. Hence, I'd say that Rust is ready to be investigated as a safety-critical language. Such code would likely require much individual review and testing before being approved.
Most university work focuses on esoteric security topics. In practice, safe code is code that does not contain bugs and executes reliably, or, if not, has a reliable failsafe. The RustBelt initiative gives a provably safe subset of the Rust language to use, but in the field Rust is proving to be safe under less stringent requirements; see Sergey "Shnatsel" Davidoff's posts on Medium.
I have limited experience in building software for trains and railway signalling, and while I love Rust for general systems programming, I am somewhat skeptical about its appeal from a functional-safety point of view at the present time.
In these applications, domain-specific, limited-expressiveness, high-readability languages like IEC 61131-3 or SCADE are often used, and complex, powerful programming languages like C++ are often avoided. The idea behind this is that powerful tools tend to enable people to build complicated systems with hard-to-understand failure modes. If languages like C++ are used, they tend to be restricted via coding guidelines and linter-level tooling.
Of course, Rust avoids many of the issues that plague complicated C++ programs, but it is still a complex language that is difficult to learn, and it also has a reputation for being hard to read. So without a tooling-supported limited-complexity dialect, I see Rust having just as hard a time as C++ being taken up in safety-oriented fields that originate more in electrical engineering than in computer science.
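Some building blocks for such a dialect do already exist in standard tooling, though. A minimal sketch (the clippy lints named below are real restriction lints, but this particular selection is purely illustrative, not an established safety profile):

```rust
// Crate-level lints as a crude approximation of a restricted dialect:
// no unsafe code, and no panicking shortcuts like unwrap/expect or
// unchecked indexing — fallibility must be spelled out in the types.
#![forbid(unsafe_code)]
#![deny(clippy::unwrap_used, clippy::expect_used, clippy::indexing_slicing)]

fn first_even(values: &[i32]) -> Option<i32> {
    values.iter().copied().find(|&v| v % 2 == 0)
}

fn main() {
    println!("{:?}", first_even(&[1, 2, 3]));
}
```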
That said, when I see safety-critical C/C++ code, it is often not of magically superior quality compared to other C/C++ code. To me it seems that the difference is mostly due to process, i.e. more rigorous requirements engineering and comprehensive verification and validation based on it. So an organization that is willing to apply C++ to these problems under these conditions can almost surely benefit from using Rust, IMHO.
But then the language and the compiler are probably just not old enough, as the age of a code base is often taken as an indicator of maturity and a prerequisite for trustworthiness. My guess is that some of the organizations developing safety-critical systems will just have to spend some time with the tooling in the fringes of their product lines to establish that trust and the testing that underpins it.
It's theoretically possible that Rust proves to be much better than previous high-level languages for critical code. The results from RustBelt do seem to give a subset of code that is provably safe. That paper is rather long, though, and I've only read parts of it.
I only have some experience in critical hardware design for military applications, and the same qualification holds. The design margins are massively increased (over-designed) and everything is checked three times at least. From a high-level view, it just looks like a bigger system was used to do fewer things.
Unfortunately true. Age is a proxy for having been tested, so most people assume age is a prerequisite for correct functioning. It's not: testing is a prerequisite for correct functioning, and age is an unintended correlation. That being said, Rust will have to get some grey hairs before most managers will go near it for critical code.
For the near term, the fact that language features are still being added does reduce the ROI slightly, given that the safe subset of Rust will lack some otherwise good features.
I am not sure this applies here, at least not with respect to the concerns I voiced personally: even if Rust's type system is proven sound and the implementation is considered trustworthy, it is still a complex language, and organizations concerned with functional safety could avoid it for this reason alone.
They usually want languages that are easy to reason about, even if they are less productive or possibly even less memory safe. Also, the whole point of the various graphical notations is that the resulting programs are easier to explain than Rust's type system, even if the underlying infrastructure that turns these programs into executable machine code is just as complex, e.g. SCADE.
No, for such projects, safety is one criterion and life-cycle is another.
Usually in a safety-critical context we use a proven compiler: either the code it generates is trusted on the basis of millions upon millions of hours of use, or the compiler is backed by formal methods. The first case usually requires a very old compiler; the latter is not yet available for Rust.
In terms of life-cycle, the project may need to be maintained for at least 20 years, which means you have to be reasonably sure you'll still be able to find someone to maintain the code then...