Why are some people against the Rust-Lang?

Hello All,
How about this article? Why are some people against the Rust-Lang? For example:

Rust is not a useful systems language, for the most part. While it supports "asm!", the explicit use of registers is awkward, and the assembly instructions do not include the protected-mode instructions necessary to implement certain kernel functionality:
Inline assembly - Rust By Example
I think it'd be more useful to link it in as C language bindings instead.
Linux has an apt-get for Rust, as a language. They didn't actually "add it", or they'd accept kernel submissions written in Rust.

Thank you.


c-rusted sounds like an interesting experiment from a university research group: annotate normal C so that a prover can verify many of the properties of the code that Rust does, while still being able to compile it with regular C compilers. Without seeing how this looks in practice it is impossible to say whether one would want to use it. But:

  1. I imagine the annotated code is hard to write and horrible to look at. Would one really want to write new code in that?

  2. Nobody is going to annotate millions of lines of old C code such that the prover can run on it.

As for the quote you give, wherever that came from, saying that "Rust is not a good systems language" is an odd statement given that many systems built on "bare metal" have been created in Rust, including at least one capable Linux-like operating system. Use of Rust in the Linux kernel is in its early days.

That complaint about not being able to use protected mode instructions from Rust seems very minor to me, considering that there is a vanishingly small amount of such code in the Linux kernel.

The author of Linux, Linus Torvalds, is famous for being extremely critical of almost everything. For him to say that use of Rust in the kernel "is not altogether a bad idea" is high praise indeed.

This is programmers and programming languages we are talking about here. Programmers are famous for getting attached to whatever language(s) they use and loudly hating on all others for all kind of silly reasons.

Finally, most sensible Rust developers and users do not expect Rust to replace the billions of lines of existing C code, or code in any other language. That would be unreasonable and unnecessary. They take a more symbiotic view: Rust and C can live together.


For clarification of your question: you are using the quotation markup

My own text

> some quotation
> multiple lines

My own text again

which renders something like:

My own text

some quotation
multiple lines

My own text again

However, it is not clear where you are quoting from. The linked article does not feature any of the sentences in your quotation block. Perhaps these are your own criticisms, or perhaps you are quoting them from somewhere else I couldn’t find; which one is the case might be (at least somewhat) relevant for how we want to answer your question/post.

In case these are your own words, it wouldn’t fit with the initial question “Why are some people against the Rust-Lang?” though. (Which I’ve understood as a [not rhetorical] question and made the title of your post,[1] hopefully that’s accurate?) Or maybe you want to say you are paraphrasing common criticism of Rust? If they aren’t quotes, maybe you wanted to create a listing instead, e.g.

My own text

* my own list entry
* another point which can also
span multiple lines

Still my own text

My own text

  • my own list entry
  • another point which can also
    span multiple lines

Still my own text

  1. For other readers: this topic’s OP was originally a relatively off-topic reply in another topic. ↩︎


To answer the question from the topic title, there are many reasons:

  • Tech people tend to be very against "hype", and anything they see as too good to be true, they counter with very harsh anti-hype criticism. Rust claims to solve memory safety, which has been a pain point for as long as systems programming has existed, so it sounds too good to be true.

  • It's in human nature to dislike change. C has been around for a very long time. For some people it hasn't changed a bit over their entire career. C users are also somewhat self-selected to like C exactly the way it is. Rust takes quite a different approach, has many features not present in C, and a similar-but-different syntax. It's disruptive and weird, and that brings resistance.

  • There's a prevalent view that unsafety and vulnerabilities are not the language's fault but bad programmers' fault. This means there's nothing wrong with C or C++, and programmers just need to try harder. From that point of view Rust doesn't solve anything, and it insults good classic languages.

Of course these cases don't mean all criticism of Rust is irrational or unfair. Rust has its limitations and downsides. "Rewrite it in Rust" is not practical for every project, and it's a bit of an obnoxious meme.


The paper is a bit confusing due to what seems like machine translation at points, but the gist is pretty common: here's our safety annotation system for C. The distinction seems to be that it borrows some concepts from Rust, though it's not clear if it's using a full stacked/tree borrowing system (or how it could without much more annotation noise).

Their points are not really against Rust at all, but simply, as far as I can tell, against the idea that you have to rewrite C code to get the safety it promises.

I'm unconvinced by the paper alone that they can offer anything like the safety promises of Rust, let alone the simple ergonomic improvements and the features unrelated to safety, like macros and the cargo/crate ecosystem.

However, that's hardly an argument not to try to make C code annotation better! The checks they show are quite nice, and the comparison they make to the JavaScript-to-TypeScript gradual adoption seems pretty convincing to me.

I would personally like to see such an approach steal a bit more from the cppfront / "C++2" concept, though: giving themselves a fresh syntax space with better, safer defaults on a per-file basis. Ergonomics matter a lot, defaults even more so.


I've looked at the annotated code in the paper - it's not that horrible to look at, but without significant investment in tooling, it's going to be hard to write, because the analyzer will warn if any of your dependencies aren't already annotated.

And trying to apply the annotations to an existing chunk of code myself reveals the other issue - the complexity of annotating existing C code without reworking the code is comparable to that of writing Rust FFI bindings for that same C code. If the annotations could be added automatically, and then human checked, that'd be a different matter, but for now, it's not clear to me that it's better than using c2rust on the same code and then refactoring and rewriting to remove unsafe.


And at the same time, Rust upsets people because Rust's solution to memory safety doesn't require anything particularly modern in terms of theoretical underpinnings. All of the theoretical concepts needed to understand what Rust does to solve memory safety were in place by 1981 (arguably a lot earlier, but the earliest reference I've found that explains everything is a paper from 1981 talking about Algol 60); it's just that Rust is the first mainstream language to put the pieces together.

If you've been telling people that we've been doing our best for your entire career and this is simply a hard problem to solve, and then you discover that we've been missing something that would have made it simpler for a few decades, it's embarrassing. It's easier to suggest that the new thing is somehow flawed than to admit that we've missed something simple for so long.


ALGOL certainly had robustness as a priority. I spent some months teaching myself ALGOL as a young student in 1976. As far as I recall, a program never randomly crashed or produced random results. Which was just as well, as compilation was a batch job from punch cards, run on a mainframe at night by computer operators (and people complain of Rust build times!). Sadly I had to stop that as my actual studies (physics) started to consume all my time. Fast forward to learning C for work in 1983: I was horrified to find that almost anything would compile and produce an executable, with no errors, which then crashed, produced garbage, or even locked up the whole machine, leaving one scratching one's head for hours. Never let anyone tell you a programming neophyte should start with C rather than Rust.

Tony Hoare tells of how his company built a FORTRAN compiler. It compiled FORTRAN to ALGOL and then compiled that to machine code. The customers hated it because there was a performance hit (array bounds checks at run time, for example), and a lot of their existing code failed to compile or run due to its safety errors. They had to remove all the safety checks to sell the compiler!

Later I encountered Ada. Great, a proper language with an emphasis on correctness. Well, as you may have noticed, few programmers have even heard of Ada today. It is still used in safety-critical software. Nobody liked it. It's verbose, they said, or it's too complex, or it underperforms, or the binaries are too big...

So I see a trend here. The world of programmers has consistently rejected languages that emphasised correctness. It's not like they missed something; they actively rejected it.


Actually this describes precisely the valuable lesson to be learned from first learning C and then Rust: it really teaches the value of memory safety.

Those languages weren't rejected because of the correctness focus.
The acceptance of Rust suggests that they were rejected because of a dev experience that wasn't good enough, and if that's the case it doesn't matter anymore whether the tech has other virtues. Dev experience pretty much trumps everything else from a human interface POV.


There's an engineering tradeoff underlying this, and the balance point on that tradeoff varies over time. Rust isn't particularly good at correctness of programs compared to something like Agda, but it is much easier to write Rust programs than Agda programs.

The sides of the tradeoff are problem domain complexity, runtime cost, programmer effort, and compile-time cost. Sensible engineers recognise that the right balance between those sides varies over time. Ada in part failed because the compile-time cost of Ada code was high compared to C code, and at the time, it was cheaper to put more programmer effort into C code than to put more compile-time effort into Ada code.

For example, in 1988, I was using a computer that, adjusted for inflation, would cost $3,400 in today's money (not including monitor); that set my compile-time cost budget as allowing for what I could reasonably do on a single core 8 MHz ARM2 CPU with 1 MiB RAM. Today, a comparable spend comfortably gets me 16 cores at 3 GHz, and 64 GiB RAM. That, in turn, means that I can afford considerably more compile-time cost for the same productivity, because the compute resources at compile time are much higher than they were 35 years ago. Similar applies to runtime costs; you can get a 200 MHz 32-bit ARM core with hundreds of kilobytes of embedded RAM for less than I used to pay for a 4 MHz Z80 and 256 bytes of RAM; I can thus afford much higher runtime cost.

Countering that, programmer effort hasn't become cheaper, and problem domain complexity has risen - we expect to do more with our systems than we could do in the early 1990s, and we don't have an army of cheap programmers who can implement anything you ask for quickly.

As a result, the optimum tradeoff has moved - we want less programmer effort, since that's still expensive, but we can afford to spend a lot more at compile-time to make that happen. We can also afford a little more at runtime, but not as much as we can afford compile-time cost, because runtime cost is also being consumed by problem domain complexity.

And this shift has been happening continuously over the existence of computing - when you learnt on a batch programming mainframe, the cost of a runtime crash was high, because you'd obliterate not just your own work, but also many other people's work. When the cost of runtime crashes fell (because you had a machine to yourself), it was cheaper to let the machine crash and reduce programmer effort than it was to maintain the high-effort, high-correctness world. And now the compile-time cost is falling, too, so it's cheaper to reduce programmer effort by having more checks done by the compiler.


Linus Walleij put it quite nicely in his Rust in Perspective writeup:

> The ambition of Rust is, as I perceive it, and whether the people driving it even know it or not, to finish what the ALGOL committee as primus motor started in 1958, and what the Garmisch NATO conference concluded was necessary in 1968: to develop a language for systems programming that relies on formal logic proof, and to fulfil what ALGOL never could, what Pascal never could, and what the whole maybe-not-700 functional programming languages never could: a language that joins the disciplines of computer science and software engineering into ONE discipline, where the scholars of each can solve problems together.

From the same article, in response to the various "C with a borrow checker" concepts, such as the one mentioned in the OP:

> The C programming language cannot be subject to the same scrutiny as Rust, simply because of all the (ab)use it allows, and which was mentioned by Wirth in his historical perspective: if a type can be changed by a cast and array indexing is not even part of the language, there is nothing much to prove.

The entire article is not only worth a read but is necessary reading material. Even if you get nothing out of it other than "these problems have all been thought about before, deeply, by the very people who are quoted time and again when the topic of correctness in programming languages comes up."


Thank you! I feel a bit foolish with that misattribution.


Thanks for the heads up on that article. Most interesting look at language history.

From that article:

> A notable contributor to this codebase, apart from Hoare, is Brendan Eich, one of the founders of the Mozilla project and the inventor of JavaScript.

My top two favourite languages of all time are Rust and JavaScript, despite them being polar opposites in so many ways. That fact suggests there is something they have in common that attracts me, though I have no idea what it might be.

Perhaps. However, I had already learned about memory safety, having been expected to master assembly language as a teenager in a technical college in 1973. So finding that C did nothing to help with that, after using ALGOL, was a shocking step backwards.

Whilst I agree with you that programmers should be aware of memory, pointers and their safe use, I think that is better done with exposure to assembler rather than C.

I don't see that. What I have seen is that the various teams I have worked with that used Ada loved it. I don't recall anyone ever complaining about the Ada "dev experience". There is no way I can see the "dev experience" of C or C++ as being better.


2 points I want to make about that:

  1. How can you be sure about that? Have you also spoken to programmers who rejected Ada in favor of C, and heard their reasons for doing so? Without that information, all you have are experiences from a (necessarily small) sample that is quite possibly subject to selection bias, i.e. you might've spoken to people who happened to like Ada enough to stick with it and then tell you about it, perhaps even despite its inevitable shortcomings. I'm not saying the people you spoke to were fanboys (I have no way of knowing that), but if they were, I imagine your conversations with them wouldn't have been much different in terms of takeaways, making it difficult to tell the two possibilities apart.

  2. Its adoption also failed because it was missing something else every new programming language needs: a killer application, i.e. a hook to get people to use the language in the first place.
    The killer application of C was Unix, that of Perl and Bash was scripting, JavaScript had the Web, and Rust has the killer combination of versatility of deployment (meaning it can worm its way into existing systems) plus safety and performance (the latter two meaning people actually want Rust to worm its way into their existing systems, though of course Rust has much more to offer than just that).
    Since every rule has its exception, I'll also include the counterexample of Go here: the reason it became popular wasn't because you can write servers in it (that could already be done in plenty of languages), or even goroutines. The reason it became big is Google and its marketing power.
    But what of Ada? It had no killer application aside from safety, and from what I can tell there wasn't a lot of cultural overlap with the business or FOSS worlds either, meaning it wasn't likely to jump over into those worlds on its own. Yet it's not a counterexample either, since that would require it to have been mass-adopted first, which is counterfactual as of writing.

As for the more on-topic question, I personally haven't so much seen informed people being against Rust as seen them making nuanced decisions (and then blogging about those decisions) based on their use case.
If I see any opinion at all that is just categorically against Rust, generally I find that it comes from an uninformed place, and once that has been remedied, said opinion often behaves like fresh snow exposed to a midday desert sun.


You are right, I cannot be sure. Can anyone?

In my world of programming, since 1980, programmers did not get to choose their language. Perhaps even the companies they worked for did not get to choose. For example, Coral and Ada were used because the military customers mandated them. Later, in the Windows application world, C and C++ were used because, well, that is what MS supported. In the embedded systems world it was all C, because device vendors supported it well for their platforms. In the web world, well, you know, JavaScript. And so on. I guess if I had moved in the financial world it would have all been COBOL for a long time. My point is nobody talked of "developer experience" much; they just got on with it.

Mind you, it is notable that all kinds of ways were found to wriggle out of the Ada mandate, so much so that it is not mandated anymore. So I guess there was resistance to Ada.

Having used a bunch of these languages myself, I don't see much of a difference in "developer experience". I'd say they all sucked equally but in different ways. :slight_smile:

Of course it's all different today in the wide world of Open Source and a million start-ups created by programmers; they do get to choose their language. A situation I only found myself in four years ago. All of a sudden I had the choice. I chose Rust. Perhaps out of nostalgia for my first love, ALGOL. Perhaps out of fear of going back to the nightmare of C++.

I have to agree with that.

I also agree with what you say about the opinions of the "uninformed". Every time I come across a story on Slashdot or elsewhere about Rust, there are a ton of comments throwing mud at Rust. Mostly the reasons given for being against it are not based on any reality. It's shocking how irrational programmers can be. I enjoy a good "language war" debate as much as anyone, but I'd rather points were argued based on actual facts.


Somewhat ironic wording, considering Ada is still used in high-assurance applications where a failure may lead to literal death. Killer applications is exactly the type of thing that Ada is avoiding! :slight_smile:


Hmm.... Ada was created and mandated for military projects by the American Department of Defense, and hence used in lots of "killer apps" like the ones I worked on in the UK in the late '80s and early '90s.


*nod* Given that Ada is used in military contexts, I think the phrase "killer application" takes on a new and much more sombre meaning.


Rust is what Jonathan Blow calls a "big idea" language. He's not against it, but as a games developer I think he feels it has too much "friction". He talks about it in this old video, which I found quite interesting.

Not sure how he feels today, but he did go on to implement his own language (in beta, but not open yet).
