The paper is a bit confusing due to what seems like machine translation at points, but the gist is pretty common: here's our safety annotation system for C. The distinction seems to be that it borrows some concepts from Rust, though it's not clear if it's using a full stacked/tree borrowing system (or how it could without much more annotation noise).
Their points are not really against Rust at all; as far as I can tell, they're simply against the idea that you have to rewrite C code to get the safety it promises.
I'm unconvinced by the paper alone that they can offer anything like the safety promises of Rust, let alone the simple ergonomic improvements and the safety-unrelated features like macros and the cargo/crate ecosystem.
I would personally like to see such an approach steal a bit more from the cppfront / "C++2" concept, though: giving themselves a fresh syntax space with better, safer defaults on a per-file basis. Ergonomics matter a lot, defaults even more so.
I've looked at the annotated code in the paper - it's not that horrible to look at, but without significant investment in tooling, it's going to be hard to write, because the analyzer will warn if any of your dependencies aren't already annotated.
And trying to apply the annotations to an existing chunk of code myself reveals the other issue - the complexity of annotating existing C code without reworking the code is comparable to that of writing Rust FFI bindings for that same C code. If the annotations could be added automatically, and then human checked, that'd be a different matter, but for now, it's not clear to me that it's better than using c2rust on the same code and then refactoring and rewriting to remove unsafe.
And at the same time, Rust upsets people because Rust's solution to memory safety doesn't require anything particularly modern in terms of theoretical underpinnings. All of the theoretical concepts needed to understand what Rust does to solve memory safety were in place by 1981 (arguably a lot earlier, but the earliest reference I've found that explains everything is a paper from 1981 talking about Algol 60); it's just that Rust is the first mainstream language to put the pieces together.
If you've been telling people for your entire career that we've been doing our best and this is simply a hard problem to solve, and then you discover that we've missed something that would have made it simpler for a few decades, it's embarrassing. It's easier to suggest that the new thing is somehow flawed than to admit that we've missed something simple for so long.
ALGOL certainly had robustness as a priority. I spent some months teaching myself ALGOL as a young student in 1976. As far as I recall, a program never randomly crashed or produced random results. Which was just as well, as compilation was a batch job from punch cards that were run on a mainframe at night by computer operators (and people complain of Rust build times!). Sadly I had to stop that as my actual studies (physics) started to consume all my time. Fast forward to learning C for work in 1983: I was horrified to find that almost anything would compile and produce an executable, with no errors, which then crashed, produced garbage, and even locked up the whole machine. Leaving one scratching one's head for hours. Never let anyone tell you a programming neophyte should start with C rather than Rust.
Tony Hoare tells of how his company built a FORTRAN compiler. It compiled FORTRAN to ALGOL and then compiled that to machine code. The customers hated it because there was a performance hit (array bounds checks at run time, for example), and a lot of their existing code failed to compile or run due to its safety errors. They had to remove all the safety checks to sell the compiler!
Later I encountered Ada. Great, a proper language with an emphasis on correctness. Well, as you may have noticed, few programmers have even heard of Ada today. It is still used in safety critical software. Nobody liked it. It's verbose, they said, or it's too complex, or it underperforms, or the binaries are too big...
So I see a trend here. The world of programmers has consistently rejected languages that emphasised correctness. It's not like they missed something; they actively rejected it.
Actually this describes precisely the valuable lesson to be learned from first learning C and then Rust: it really teaches the value of memory safety.
Those languages weren't rejected because of the correctness focus.
The acceptance of Rust suggests that they were rejected because of a dev experience that wasn't good enough, and if that's the case it doesn't matter anymore if the tech has other virtues. It pretty much trumps everything else from a human interface POV.
There's an engineering tradeoff underlying this, and the balance point on that tradeoff varies over time. Rust isn't particularly good at correctness of programs compared to something like Agda, but it is much easier to write Rust programs than Agda programs.
The sides of the tradeoff are problem domain complexity, runtime cost, programmer effort, and compile-time cost. Sensible engineers recognise that the right balance between those sides varies over time. Ada in part failed because the compile-time cost of Ada code was high compared to C code, and at the time, it was cheaper to put more programmer effort into C code than to put more compile-time effort into Ada code.
For example, in 1988, I was using a computer that, adjusted for inflation, would cost $3,400 in today's money (not including monitor); that set my compile-time cost budget as allowing for what I could reasonably do on a single core 8 MHz ARM2 CPU with 1 MiB RAM. Today, a comparable spend comfortably gets me 16 cores at 3 GHz, and 64 GiB RAM. That, in turn, means that I can afford considerably more compile-time cost for the same productivity, because the compute resources at compile time are much higher than they were 35 years ago. Similar applies to runtime costs; you can get a 200 MHz 32-bit ARM core with hundreds of kilobytes of embedded RAM for less than I used to pay for a 4 MHz Z80 and 256 bytes of RAM; I can thus afford much higher runtime cost.
Countering that, programmer effort hasn't become cheaper, and problem domain complexity has risen - we expect to do more with our systems than we could do in the early 1990s, and we don't have an army of cheap programmers who can implement anything you ask for quickly.
As a result, the optimum tradeoff has moved - we want less programmer effort, since that's still expensive, but we can afford to spend a lot more at compile-time to make that happen. We can also afford a little more at runtime, but not as much as we can afford compile-time cost, because runtime cost is also being consumed by problem domain complexity.
And this shift has been happening continuously over the existence of computing - when you learnt on a batch programming mainframe, the cost of a runtime crash was high, because you'd obliterate not just your own work, but also many other people's work. When the cost of runtime crashes fell (because you had a machine to yourself), it was cheaper to let the machine crash and reduce programmer effort than it was to maintain the high-effort, high-correctness world. And now the compile-time cost is falling, too, so it's cheaper to reduce programmer effort by having more checks done by the compiler.
The ambition of Rust is, as I perceive it, and whether the people driving it even know it or not, to finish what the ALGOL committee as primus motor started in 1958, and what the Garmisch NATO conference concluded was necessary in 1968: to develop a language for systems programming that relies on formal logic proof, and to fulfil what ALGOL never could, what Pascal never could, and what the whole maybe-not-700 functional programming languages never could: a language that joins the disciplines of computer science and software engineering into ONE discipline, where the scholars of each can solve problems together.
From the same article, in response to the various "C with a borrow checker" concepts, such as the one mentioned in the OP:
The C programming language cannot be subject to the same scrutiny as Rust, simply because of all the (ab)use it allows, and which was mentioned by Wirth in his historical perspective: if a type can be changed by a cast and array indexing is not even part of the language, there is nothing much to prove.
The entire article is not only worth a read but is necessary reading material. Even if you get nothing out of it other than "these problems have all been thought about before, deeply, by the very people who are quoted time and again when the topic of correctness in programming languages comes up."
Thanks for the heads up on that article. Most interesting look at language history.
From that article:
Perhaps. However, I had already learned about memory safety, having been expected to master assembly language as a teenager in a technical college in 1973. So finding that C did nothing to help with that after using ALGOL was a shocking step backwards.
Whilst I agree with you that programmers should be aware of memory, pointers, and their safe use, I think that is better done with an exposure to assembler rather than C.
I don't see that. What I have seen is that the various teams I have worked with that used Ada loved it. I don't recall anyone ever complaining about that Ada "dev experience". There is no way I can see the "dev experience" of C or C++ as being better.
How can you be sure about that? Have you also spoken to programmers who rejected Ada in favor of C, and heard their reasons for doing so? Because without that information, all you have is experiences based on a (necessarily small) sample that is quite possibly subject to selection bias, i.e. you might've spoken to people who happened to like Ada enough to stick with it and then tell you about it, perhaps even despite its inevitable shortcomings. I'm not saying the people you spoke to were fanboys (I have no way of knowing that), but if they were, I imagine your conversations with them wouldn't have been much different in terms of what to take away from them, making it difficult to tell the difference between the two possibilities.
Its adoption also failed because it was missing something else every new programming language needs: a killer application, i.e. a hook to get people to use the language in the first place.
Since every rule has its exception, I'll also include the counterexample of Go here: the reason it became popular wasn't because you can write servers in it (that could already be done in plenty of languages), or even goroutines. The reason it became big is Google and its marketing power.
But what of Ada? It also had no killer application aside from safety, and from what I can tell there wasn't a lot of cultural overlap with the business or FOSS worlds either, meaning it wasn't likely to jump over into those worlds on its own. Yet it's not a counterexample either, since that would require it to have been mass-adopted first, which is counterfactual as of writing.
As for the more on-topic question, I personally haven't so much seen informed people being against Rust as I've seen them making nuanced decisions (and then blogging about those decisions) based on their use case.
If I see any opinion at all that is just categorically against Rust, generally I find that it comes from an uninformed place, and once that has been remedied, said opinion often behaves like fresh snow exposed to a midday desert sun.
Mind you, it is notable that all kind of ways were found to wriggle out of the Ada mandate, so much so that it is not mandated anymore. So I guess there was resistance to Ada.
Having used a bunch of these languages myself, I don't see much of a difference in "developer experience". I'd say they all sucked equally, but in different ways.
Of course it's all different today in the wide world of Open Source and a million start-ups created by programmers: they do get to choose their language. A situation I only found myself in four years ago. All of a sudden I had the choice. I chose Rust. Perhaps out of nostalgia for my first love ALGOL. Perhaps out of fear of going back to the nightmare of C++.
I have to agree with that.
I also agree with what you say about the opinions of the "uninformed". Every time I come across a story on Slashdot or elsewhere about Rust, there are a ton of comments throwing mud at it. Mostly the reasons given for being against it are not based in any reality. It's shocking how irrational programmers can be. I enjoy a good "language war" debate as much as anyone, but I'd rather points were argued based on actual facts.
Hmm.... Ada was created and mandated for military projects by the American Department of Defense. And hence used in lots of "killer apps" like the ones I worked on in the UK in the late 80's early 90's.
Rust is what Jonathan Blow calls a big idea language. He's not against it, but as a games developer I think he feels it has too much "friction". He talks about it in this old video, which I found quite interesting.
Not sure how he feels today - but he did go on to implement his own language (beta, but not open yet).
The only language I've seen demonstrated by its developer that didn't seem like pointless surface syntax shuffling of some other, much better supported language is, somewhat ironically, Herb Sutter's cppfront, which he regularly disclaims as being "nothing but better defaults and simpler syntax for C++, no seriously I took out things I thought were too much like a new feature".
The moral of the story seems to be to neg your own language as hard as you can (or be good enough a language that someone else wants to demonstrate it)
Herb is amazing and a brave man. Here is a guy high up and well respected in the C++ world, long time secretary and convener of the ISO C++ standards committee, who stands up and basically says "Hey guys, you know that language we have been working on for 40 years? Well I think it is a disaster, a train wreck. It's accreted tons of overly complex features that nobody can live long enough to learn all of, let alone how they all interact with each other. Meanwhile it has done nothing to fix the problems it inherited from C, in fact it goes out of its way not to. I think we should do this...".