Adoption without traditional/formal standard

This is kind of a continuation of this post:
https://users.rust-lang.org/t/specification-standardization-and-independent-implementations/32764

I currently work in the embedded space and raised the topic of potentially adopting Rust.

A colleague of mine sent me this blog post (see the "The Rust Programming Language" section).

Not knowing much about traditional/formal programming language standards and knowing very little about Rust in general, my question is:

Is the author's (of the blog post) hesitancy to adopt Rust well-founded, or does Rust not apply (as the original forum post seems to be implying)?

In other words: how does one convince the old guard to adopt Rust (when appropriate) without formal/traditional standards (e.g. ANSI C)?

I have zero knowledge of the railroad industry. But Python, one of the most popular languages in the IT industry, does not have a formal specification. Can you describe more about the employers you have in mind?

You have to keep in mind that in the embedded world, it's the total opposite of "move fast and break things". It's very much "move slow and don't break anything", mainly because it's often difficult to update/patch deployed embedded firmware.

It's not so much a question of the standard on its own, but how (in the C case) the standard aims to (among other things) prevent compiler incompatibility.

The employers are not the focus of my question.
I'm just asking in general how the Rust community should convince the old guard to adopt something better.

There are efforts to bring about a Rust specification: Ferrocene

It seems to me, from what I've heard, having a language specification is more about reassuring the insurers and lawyers than actually having quality guarantees.

2 Likes

Thanks, that Ferrocene link looks like a good read.

It seems to me, from what I've heard, having a language specification is more about reassuring the insurers and lawyers than actually having quality guarantees.

I hear you, but how about from a technical point of view?
Will Rust written for embedded projects now still be compilable 10+ years from now?

I think that this is both (1) a big deal for certain people and (2) not as big a deal as some people try to make it.

Rust is a young language. It takes time (and a certain kind of external pressure) for a language to be standardized. C took roughly 19 years. Rust is only 11. I think it's perfectly fine to say "yeah, if you're doing something where a standard is required, Rust might not be ready for that yet". It doesn't reflect poorly on Rust.

At the same time, applications which require the use of a particular version of a particular programming language standard are pretty rare, even in contexts that require high reliability. Reproducibility of builds is better guaranteed by simply refusing to update, and reliable behavior by writing thorough tests. I've heard more people hand-wringing over Rust's lack of a standard than I've actually heard of people who wanted to use Rust but couldn't because it doesn't have a standard.
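To make "simply refusing to update" concrete: one common practice is to pin the compiler per project with a `rust-toolchain.toml` checked into the repository, so every rustup-managed build uses exactly the same toolchain. The version number below is purely illustrative:

```toml
# rust-toolchain.toml, committed at the repository root.
# rustup reads this file and builds the project with exactly this compiler.
[toolchain]
channel = "1.70.0"                 # illustrative pinned version
components = ["rustfmt", "clippy"] # pin the tooling too, for reproducible CI
```

Combined with a committed `Cargo.lock`, this keeps builds reproducible for as long as that toolchain binary remains available.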

To the question,

Does your employer actually care about language standards? Or do they care about creating robust, performant and maintainable software? If it's the first thing, you shouldn't try to sell them on Rust, because they don't care about it. If it's the second thing, the non-existence of a language standard probably isn't the make-or-break factor. Let's not forget there are still plenty of reasons not to choose Rust that don't have anything to do with standardization.

Additional previous discussions:

7 Likes

Absolutely, my intention was not to do that at all.

I agree and I think this is the angle I would probably take with employers.

Definitely the latter.
I'll try to edit my original question, because my intention was not to focus on my current/future employers, but more a general "how to convince the embedded world in general to adopt Rust (where appropriate, of course)".

Of course, always favour the right tool for the job.
Yet for greenfield projects when Rust is a choice, this is where the crux of my question arises.

Let me bring another point to the table: the existence of a standard is not necessarily a good or desirable thing. There are all sorts of stupid things in standards, think unnecessary bureaucracy, which actually end up making code much worse, e.g. because they impose unfounded stylistic restrictions on programmers, while missing the big picture.

The #1 problem in the programming industry is probably security vulnerabilities related to memory unsafety. Rust solves that efficiently, and no other language does it as well as Rust manages to, period. So I'd board a train whose control software is written in Rust any day.
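To make the memory-safety point concrete, here is a minimal sketch (the function names are made up for illustration) of the ownership rule the compiler enforces. The commented-out line is the kind of use-after-move that a C compiler would happily accept, but that rustc rejects at compile time:

```rust
// Takes ownership of `s`; after the call, the caller can no longer use it.
fn consume(s: String) -> String {
    s.to_uppercase()
}

fn main() {
    let s = String::from("signal");
    let t = consume(s); // ownership of `s` moves into `consume`
    // println!("{}", s); // compile error: borrow of moved value: `s`
    println!("{}", t); // prints "SIGNAL"
}
```

The guarantee is enforced before the program ever runs, which is exactly the class of assurance that no amount of MISRA-style rule checking gives you in C.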

C has a Real™ International™ ANSI™ Standard™. Is that a good thing? Yes. Do compilers implement the standard? No, they don't, they all introduce their own subtle incompatibilities because they (actually or allegedly) know better.

And most importantly: Do people obey the standard? No, they don't. Most "C programmers" I know have no idea that there is such a thing as a C standard (let alone multiple versions of it!), or that the behavior of C programs is specified by the C abstract machine, as opposed to whatever combination of buggy hardware, buggy compilers, and misinformed gut instincts about a simplistic, 40-year-old mental model of computers they happen to hold.

And then there's the thing that is MISRA C, with its arbitrary rules like "thou shalt not use recursion". That's what I call missing the forest for the trees – MISRA is also just a specification that people either uphold or do not, and enforcement is flaky. There are 3rd-party static checkers, and some very niche compilers that have built-in support, but neither of these options are nearly reassuring enough in the case of a blatantly and inherently unsafe language like C.

So, if you try to convince someone of using Rust, tell them about the real merits of the language, the tooling, and the ecosystem. Don't look for imaginary merits. Of course, you will not be able to convince lawyers and regulators, but if someone professional is in charge of making the decision, they'll surely appreciate real, practical benefits.

9 Likes

We've done some work in high-assurance projects, and we've had this discussion a few times.

My argument has always been that I think there's a valid and good idea behind formal standards, but there's a point at which the purpose of having a formal standard begins to be hollowed out. The example I usually use is C++ and its standard library. We have a C++ project in which we use a few C++ features for which the standard originally specified one semantics, but in reality there is another -- and they've had to publish errata to rectify these issues.

In my view, the true spirit of a formal standard is that if a problem is detected, the problem will turn out to be in the implementation. If an issue is discovered to be in the formal specification, then it may be a sign that the standard is either too vague or too unwieldy, and thus cannot serve its proper purpose. (Generally speaking, of course -- we'll never be able to completely get rid of the human error factor.)

In a roundabout way I've been saying "Why demand formal standards that implementations can't/won't follow anyway?".

It's not an argument I'm completely convinced by myself, to be honest (as I said, I see the point in formal standards), but it has allowed us to use Rust in a few more places than we otherwise would have.

3 Likes

In years gone by I used to argue in favour of standards. I did not want to use languages that did not have an authoritative standard like ISO, multiple vendors working to that standard, and wide user-base support. Why? Because I like the idea that software I write, or the companies I work for write, can be deployed on a wide range of operating systems and hardware architectures. Can be used with compilers from multiple vendors. This has all kinds of advantages in the durability and longevity of one's creations. Clear economic benefits. As such Ada, C, C++, JavaScript were in; almost anything else was out.

It might be useful to think about where such industrial standardisation evolved from. As far as I can tell it all started at the dawn of the industrial revolution. It became clear that it would be more efficient and profitable for everybody if they worked to common standards of measurement and produced parts to common design standards. One of the earliest examples was the standardisation of threads on nuts and bolts. With a standard like the "British Standard Whitworth" for screw threads, a manufacturer could order nuts and bolts from different nut and bolt makers and be sure they would fit together. An obvious advantage in ensuring supply and promoting competition. Then we get into the whole metrication thing. And so it grows.

My view of standards was based on that history. Except the nuts and bolts were now language compilers and users programs. Clearly having multiple vendors building compilers to the same standard was a good thing for the language users and compiler vendors alike.

In recent years though I have begun to doubt that view of standards in software, especially language standards.

For a start, software is not a physical thing like nuts and bolts. And Open Source software in particular is very different from closed source. A single Open Source implementation of a language is usable by everyone; things like Clang, GCC and Rust support just about every platform one can imagine. Arguably it would be better, more efficient, to have everyone working on perfecting that single implementation than many closed-source vendors all trying to provide standards-compliant offerings. Already there we have all the advantages of cross-platform, multi-architecture support. And the longevity assurance.

Then I see we have international standards for things that are pretty much never used. BASIC, Pascal, and the like. Languishing, unmaintained, far behind what the users of those languages actually use.

Then I wonder what is the point of a C or C++ standard? Those standards leave so much that is "implementation defined" or downright undefined. The ongoing train wreck that is the ISO C++ standard is painful to watch. Languages like Ada and JavaScript fare better in that respect.

And of course, the standards don't ensure quality. A bolt made to a modern metric thread standard does not ensure that the materials of which it is made are up to the job.

From my experience of embedded and safety-critical software, those who are serious about safety and correctness don't just get their compilers from any old vendor that claims standards compliance. No, they go to Green Hills and the like, with long track records of conformance and quality assurance. Those who are not so serious advertise their standards compliance as marketing material.

So what about Rust? Despite my former insistence on standards, here I am using Rust.

I have no worries about the platform and hardware support. I have no doubt Rust will be with us for a long time, especially with the likes of MS and Intel getting on board.

I suspect it would be good to have a formal specification. Such that the likes of GCC and Microsoft can build Rust support into their compilers correctly.

Who should be the guardians of that standard? Isn't that what the Rust Foundation is for? Things like ISO, I think not.

Sorry for what grew into a rambling essay folks.

2 Likes

So I was just wondering...

I might ask the "old guard" a couple of questions:

How does having a formal/traditional, ANSI/ISO, standard ensure the correctness of their C/C++ compilers?

How does having a formal/traditional, ANSI/ISO, standard ensure the correctness of any software they write in those languages?

Assuming correctness in safety-critical, security-critical and other software is a major concern.

As far as I can tell it does not.

They know that. That is why they get their compilers from trusted vendors and employ a plethora of procedures, techniques and tools to check out their creations.

I was also wondering... what would it mean to try and demonstrate the correctness of Rust? Even if it had a formal specification/standard. I mean, between the source code I write and the executable I get out there is a billion lines of Rust compiler and LLVM intermediate representations, transforms, optimisers, code generators, linkers (I might be exaggerating a bit).

It boggles my mind that it works at all!

I mean, when it comes to correctness, there are two questions:

  1. Would it be correct if it did what we intended?
  2. Does it do what we intended?

Tools such as a standard address the first part.

Edit: well or maybe there's a third part, which is really what standards address: what did we intend for it to do?

My naive logic suggests that is impossible to prove/demonstrate that the executable I get out of my Rust source does what it should, and nothing more, all the time.

Assuming we had a formal specification of what it should do of course.

I mean, to do so would require that LLVM is compiling code correctly. LLVM is written in C/C++ and therefore, as far as I understand, that is impossible.

As Microsoft has been saying recently, despite writing their own C++ compiler, employing the best programmers, using all kind of coding standards, review/test procedures and analysis tools, they still have an ocean of bugs and security vulnerabilities to contend with.

It's hopeless.

So, is there any possibility we might one day see a Rust compiler entirely written in Rust? Self hosting as it were. Then we would have something to talk about.

Yes, it's possible that eventually we will get a pure Rust backend, and Cranelift is making great progress on this front, but I would not place such high hopes on it. Formally proving everything from source to binary would require fundamentally changing how almost all programmers approach software engineering, and arguably Rust itself would not be a good fit in this endeavour, since it is a child of the same "irresponsible" practices.

Unfortunately, formal verification in its current state is economically unfeasible for mass software production (even for systems programming). I have an idealistic hope that this situation will eventually change and that the next generation of systems programming infrastructure (i.e. a "replacement" for Rust, in the same way as Rust "replaces" C/C++) will focus on ergonomic formal verification. In this ideal world we would prove properties of our code important for the domain in which we work, and we would be able to guide compiler optimizations if necessary (e.g. by proving properties which the compiler was unable to auto-derive, or by writing assembly for hot code and proving it functionally equivalent to the high-level source code).

SPARK is certainly a step in this direction, but still quite far from the described ideal. There are also aspiring developments such as the formally verified seL4 kernel, and Fiat-Crypto, which implements cryptographic primitives with formal proofs and achieves much better performance compared to "classic" implementations. But unfortunately they are quite "expensive" and really hard to apply at scale. Even in an optimistic scenario, it would take at the very least several decades for this generational change to happen, and looking at the dynamics of software development in general, I am not sure it will happen at all...

Coincidentally came across this (Tim Sweeney: “ISO obstructs adoption of standards by paywalling them”):
https://news.ycombinator.com/item?id=26390040

1 Like

Nah. If you are in the business of writing a C++ compiler coughing up a couple of hundred bucks for the standard document is nothing.

If not you can get the final draft for free. See https://www.youtube.com/watch?v=fWWBG7zekL8

A standard you have to pay for, is a standard worth ignoring.

1 Like

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.