I am not trained in Computer Science, nor am I a historian, so all of my assumptions may be wrong, and I do not need a scientific paper as an answer. I am just curious.
From what I have read in the past, I was under the impression that at universities around the world, lots of computer scientists invent new languages to try out new and fascinating concepts, and that most of these languages are never picked up by any significant number of people. I remember reading that "the only program ever written in many of these languages" was "their own compiler", or something along those lines... But every once in a while there was a concept that proved to be great and became known and used. Usually not in the academic language that was designed to investigate the new feature, but in some "real world" language that profits from the academic origin without being purely academic itself.
If any of that is true, I wonder whether the "ownership" model has such a parent language (or two, or many). Was ownership developed with Rust, on Rust, or is/was there some this-made-Jane-a-PhD language? If so, does it deserve to be recognized by the Rust community?
Ownership already exists in C++; it's just terribly implemented (automatic cloning and manual moving, combined with a confusing mix of reference types). But you can relatively easily write C++ in the style of Rust w.r.t. ownership – it will just be very verbose.
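To make the contrast concrete, here's a minimal Rust sketch (my own toy example, not from anyone's post) of what "ownership by default" looks like: moves are implicit, and cloning must be spelled out – roughly the inverse of C++'s defaults:

```rust
fn main() {
    let s = String::from("hello");
    let t = s; // ownership moves to `t`; no copy happens
    // println!("{}", s); // compile error: `s` was moved out of

    let u = t.clone(); // a deep copy must be requested explicitly
    assert_eq!(t, u);
    println!("{} {}", t, u);
}
```

In C++, the analogous `std::string t = s;` silently deep-copies, and you have to write `std::move(s)` to get the move.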
Type theorists also talk about "linear types", which must be used exactly once, and "affine types", which may be used at most once. So there has been theoretical work in this area, too.
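Rust's move semantics are essentially affine. A small sketch (the `Token` type is just an illustration) of "at most once" in practice:

```rust
struct Token(u32); // neither Copy nor Clone: it can only be moved

fn consume(t: Token) -> u32 {
    t.0 // takes ownership; the caller's binding is unusable afterwards
}

fn main() {
    let t = Token(42);
    let n = consume(t); // first (and only possible) use
    // consume(t);      // compile error: use of moved value `t`
    assert_eq!(n, 42);

    let _ignored = Token(7); // never using a value is fine: affine, not linear
}
```

A truly linear type system would additionally flag the unused `Token(7)` as an error; Rust doesn't enforce that.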
The big idea I haven't seen elsewhere is single-threaded exclusive mutability. Mostly because, for a long time, the only language dealing with both ownership (value types) and mutable pointers (indirection) was C++. Type theorists' languages are usually purely functional and don't allow mutability, while most mainstream languages don't care about value semantics and make Every Object a Reference™ – and they couldn't care less about the shared mutability (read: single-threaded memory corruption) that comes with it, either.
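The rule, as a small sketch of my own: at any point you can have either any number of shared references or exactly one mutable reference, never both at once:

```rust
fn main() {
    let mut v = vec![1, 2, 3];

    let m = &mut v; // exclusive: while `m` is live, no other borrow of `v` may exist
    m.push(4);
    // let r = &v; // error[E0502] if uncommented together with...
    // m.push(r[0]); // ...this later use of `m`

    let (a, b) = (&v, &v); // the &mut is finished, so many shared borrows are fine
    assert_eq!(a.len(), 4);
    assert_eq!(b, &[1, 2, 3, 4]);
}
```

This "shared XOR mutable" discipline is what rules out single-threaded aliasing bugs (like iterator invalidation) and, extended across threads, data races.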
That said, the initial, very early development of Rust can be pinpointed very accurately; the author was then-Mozilla employee Graydon Hoare. The development of modern Rust is a community effort, though.
Of other recent non-research languages, the only one I know of that takes this approach of controlling aliasing is ParaSail. Not coincidentally, I would argue, both Rust and ParaSail guarantee data-race freedom, while most languages do not.
Though when searching for this citation, a number of academic papers did indeed crop up. (I didn't bother checking any of them to see if they were talking about the same thing.)
W.r.t. ownership, this list – if anything – only quotes “move semantics” from C++, though.
AFAIK, the borrow checker system is something new. AFAIR, there are academic efforts to analyze the system and show its soundness after the fact. That means the practical implementation came first (with practical goals in mind, and probably driven by intuition) and the theoretical analysis came afterwards (by other people, I guess). It showed – perhaps surprisingly – that even though it's a novel system, it's not full of logical holes and inconsistencies, but can actually be made sense of.
Don't quote me on any of this; it's just what I feel I might have read somewhere (and I don't remember where exactly). The “academic efforts to analyze the system” I'm referring to are things like Stacked Borrows.
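For a flavor of what the checker enforces (again a toy example of mine, not from any of those papers): a returned reference is tied to the lifetime of what it borrows, so the borrowed value cannot be freed or moved out from under it:

```rust
// The output lifetime is tied to the input's (lifetimes elided here).
fn first(v: &Vec<i32>) -> &i32 {
    &v[0]
}

fn main() {
    let v = vec![10, 20];
    let r = first(&v);
    // drop(v); // error[E0505]: cannot move out of `v` because it is borrowed
    assert_eq!(*r, 10);
}
```

In C or C++ the equivalent use-after-free compiles silently; here it's a type error.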
Perhaps you could have guessed this part, but the rest of Niko's blog also occasionally references research papers that are influential, e.g. here:
For this reason, we need to accept something called “first-order hereditary harrop” (FOHH) clauses – this long name basically means “standard Horn clauses with forall and if in the body”. But it’s nice to know the proper name, because there is a lot of work describing how to efficiently handle FOHH clauses. I was particularly influenced by Gopalan Nadathur’s excellent “A Proof Procedure for the Logic of Hereditary Harrop Formulas”.
Also, I think the (research) language Cyclone deserves a special / direct mention, since it was an important source of inspiration for Rust w.r.t. lifetime-tracked pointers (the Rust reference(s)). It does not look like the unique-vs-shared duality appears in that language, though, which is a pity, since that duality plays so marvellously well with lifetime tracking:
While reading this -- if you're foolish enough to try -- keep in mind that I was balanced between near-total disbelief that it would ever come to anything and minuscule hope that if I kept at experiments and fiddling long enough, maybe I could do a thing.
I had been criticizing, picking apart, ranting about other languages for years, and making doodles and marginalia notes about how to do one "right" or "differently" to myself for almost as long. This lineage represents the very gradual coaxing-into-belief that I could actually make something that runs.
As such, there are long periods of nothing, lots of revisions of position, long periods of just making notes, arguing with myself, several false starts, digressions into minutiae that seem completely absurd from today's vantage point (don't get me started on how long I spent learning x86 mod r/m bytes and PE import table structures, why?) and self-important frippery.
The significant thing here is that I had to get to the point of convincing myself there was something there before bothering to show anyone; the uptick in work in mid-to-late 2009 is when Mozilla started funding me on the clock to work on it, but it's significant that there were years and years of just puttering around in circles, the kind of snowball-rolling that's necessary to go from nothing to "well .. maybe .."
I'd encourage reading it in this light: delusional dreams very gradually coming into focus, not any sort of grand plan being executed.
This presentation was an initial vision statement, written before Rust had even become an open-source project. Nearly everything in the presentation has changed in the eleven years since it was given.
This is a statement of relative priorities, not dogma. It's saying that Rust will not sacrifice performance and correctness for ease of rapid prototyping. In practice, for example, it means that longer compile times are acceptable in exchange for better optimizations or more complete static analysis. A language can't be the best at everything, so it has to make these choices. While Rust is better than Python in some dimensions, Python is better than Rust in others.
In the intervening years, one common theme of Rust development has been finding positive-sum solutions where we don't need to take a loss in one dimension to get a win in another. For example, there's a lot of past and ongoing work on compiler performance to get us to a place where we can have quick development cycles without losing the other benefits of the Rust compiler.