Here's a dump of some possibly relevant reddit discussions:
And personal blogs:
If nothing else, I highly recommend reading the iconic and extensive post, fireflowers - The Rust Programming Language in the words of its practitioners.
As for my own opinion, there are too many things to list, so I'll just touch on a few briefly.
Compile time guarantees: When I think of a "statically typed language", I think of Java, or C#, or something like TypeScript. They give compile time assurances that your code has the correct types, and move a set of errors from runtime to compile time. Rust goes an order of magnitude further! Compile time checking extends to thread safety, to ownership semantics, to validation.
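To make the thread-safety point concrete, here's a minimal sketch (the function name is my own, for illustration): ownership of the vector moves into the spawned thread, so touching the original binding afterwards is a compile error rather than a potential data race.

```rust
use std::thread;

// Sum the data on another thread. Ownership of `data` moves into the
// closure, so the caller's binding can no longer be used afterwards --
// the compiler enforces this, no runtime check involved.
fn sum_on_thread(data: Vec<i32>) -> i32 {
    let handle = thread::spawn(move || data.iter().sum::<i32>());
    handle.join().unwrap()
}

fn main() {
    let data = vec![1, 2, 3];
    let total = sum_on_thread(data);
    // println!("{:?}", data); // compile error: `data` was moved above
    println!("{}", total);
}
```

Uncommenting the last `println!` doesn't produce a race or a crash at runtime; it simply doesn't compile.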
Enums mean that I can declare, at compile time, the set of variants my data can exist as, and when accessing that data I'm forced to consider all possibilities. Traits mean that when my code is templated/generic, I can state the exact capabilities I require.
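A quick sketch of both points (the types here are made up for illustration): `match` must cover every variant of the enum, and the trait bound spells out exactly what the generic function is allowed to do with its argument.

```rust
// An enum declares the complete set of shapes this data can be.
enum Shape {
    Circle { radius: f64 },
    Rect { w: f64, h: f64 },
}

fn area(s: &Shape) -> f64 {
    // Omitting a variant here is a compile error, not a runtime surprise.
    match s {
        Shape::Circle { radius } => std::f64::consts::PI * radius * radius,
        Shape::Rect { w, h } => w * h,
    }
}

// The trait bound is the exact capability this generic function requires:
// `T` must be `Debug`-formattable, and nothing more is assumed.
fn describe<T: std::fmt::Debug>(value: &T) -> String {
    format!("{:?}", value)
}

fn main() {
    println!("{}", area(&Shape::Rect { w: 2.0, h: 3.0 }));
    println!("{}", describe(&vec![1, 2, 3]));
}
```

If you later add a `Shape::Triangle` variant, every `match` that forgot about it fails to compile, which is exactly the "forced to consider all possibilities" guarantee.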
As a library author, the rich trait/generic system means I can craft intricate, yet easy to use interfaces. These interfaces cannot be used incorrectly, which means I don't have to perform runtime checks, and my users don't even have to think about problem cases, because they can't write them.
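One common way library authors achieve "can't be used incorrectly" is the typestate pattern. Here's a minimal sketch (all names are hypothetical, not any real library's API): a builder whose `send` method only exists once the required URL has been supplied, so forgetting it is a type error rather than a runtime check.

```rust
// Marker types tracking, in the type system, whether the URL was set.
struct Missing;
struct Present(String);

struct RequestBuilder<Url> {
    url: Url,
}

impl RequestBuilder<Missing> {
    fn new() -> Self {
        RequestBuilder { url: Missing }
    }
    // Setting the URL changes the builder's type from Missing to Present.
    fn url(self, url: &str) -> RequestBuilder<Present> {
        RequestBuilder { url: Present(url.to_string()) }
    }
}

impl RequestBuilder<Present> {
    // `send` is only defined for builders that already have a URL, so a
    // caller who forgets `.url(..)` gets a compile error, not a panic.
    fn send(self) -> String {
        format!("GET {}", self.url.0)
    }
}

fn main() {
    let response = RequestBuilder::new().url("https://example.com").send();
    println!("{}", response);
    // RequestBuilder::new().send(); // compile error: no `send` on Missing
}
```

The invalid call sequence isn't checked at runtime; it simply doesn't type-check, which is the whole point.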
As a library consumer, the rich and extensive crates ecosystem enables me to write code in vastly different domains without needing to dig into the specifics of each one. I don't need to know how a JSON parser or writer works to use serde - and my lack of knowledge won't ever be a source of bugs, because I get compile time errors rather than runtime ones.
Lastly, I really appreciate how Rust has been designed for backwards compatibility. Rust is extremely backwards compatible: I can still run code written for Rust 1.0. Equally, I can write code which I know I won't have to update next time I update Rust.
And it's not just Rust itself - the language enables libraries to have the same guarantees. Consider two facts:
- In typechecking, only the signatures of functions are considered. There's no relying on the implementation to determine whether callers are correct (as you can in Scala, or Haskell)
- The extensive type system means incorrect usages of interfaces become type errors, not runtime errors
Because of these two things, libraries can be sure that they maintain backwards compatibility when releasing a new version. In Python, or Java, or JavaScript, or even Scala or Haskell, you need to pay extra attention to implementations if you want to ensure you maintain backwards compatibility. In Rust, it's free: if you haven't changed the function signature, the contract remains the same. It's incredibly hard to break an interface accidentally.
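A tiny sketch of why "signature is the whole contract" holds (the function is my own example, not from any library): callers type-check against the signature alone, so rewriting the body can never break them.

```rust
// The public signature below is the entire contract with callers.
// Swapping the body for, say, a version that trims whitespace first
// cannot break any caller's types, because type-checking never looks
// at the implementation.
pub fn parse_port(s: &str) -> Option<u16> {
    s.parse().ok()
}

fn main() {
    println!("{:?}", parse_port("8080"));
    println!("{:?}", parse_port("not a port"));
}
```

Contrast this with languages where inferred types or implementation details leak into the public surface: there, an innocent-looking refactor of a function body can change what callers are allowed to write.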
This means that as a consumer of Rust libraries, I have no fear of upgrading. As long as they've kept the major version the same, my code works with the newer version. And Cargo takes advantage of this: `cargo update` keeps me on the same major version, but otherwise upgrades all of my dependencies for free.
In conclusion, I just feel really taken care of when I'm using Rust. There are so many trivial annoyances I'm completely free of: package upgrades that break things, type errors surfacing at runtime, passing a string an interface doesn't expect and getting a random runtime error. I can just think about the algorithms! That's why I love Rust.
Edit: just to emphasize it, I really, really recommend the fireflowers post. It's the result of a whole bunch of research into why different people like Rust - it gathers a few key insights into what Rust offers, contains many user testimonials, and at the end links to a number of other response blog posts which give even more insight. Disregard my own post, this is the useful thing: fireflowers - The Rust Programming Language, in the words of its practitioners
The only downside is that it's from three years ago - though if you're doing research into why Rust has tipped the scales so many years in a row, maybe that's even more relevant?