Are there any particular reasons for Rust having to maintain backwards compatibility and not deprecating std APIs?

I just started learning Rust and don't know much about it. I did some looking up, and the reasons I found are so that old Rust code can be compiled with newer compiler versions, and to avoid the issues and inconveniences of breaking changes that other languages have had.

1 Like

The software industry has learnt that breaking changes between versions of programming languages cause a lot of churn, so avoiding breaking changes is not only desirable but mandatory if you want to attract the type of programmers and companies that require that level of stability.

6 Likes

Yep, that's about it.

If a language wants to be adopted and used in the mainstream, it's incredibly important that it commits to backwards compatibility.

Code can survive a surprisingly long time. For example, there are some banks which still use COBOL programs from the 80s, and it's pretty common in established companies to have a C++ codebase that is multiple decades old. In a previous job, I worked on a Delphi Pascal codebase that was about 30 years old. Often, these legacy projects are critical to the business, but are hard to change because of accumulated technical debt and a lack of automated tests.

In projects this old and/or large, removing parts of the standard library forces companies to either 1) expend a lot of engineering effort to update their code to the newest APIs, or 2) pin themselves to an older version of the language and stagnate.

Neither of those options are ideal, so they go with option 3) use a programming language that is committed to backwards compatibility.

6 Likes

For an old post about this, see Stability as a Deliverable | Rust Blog

The biggest reason is that Rust really wants people to update to the new version of the compiler -- in large part because then it doesn't have to release patches to old versions of the compiler -- and that's only a reasonable expectation to have on people if updating to new versions of the compiler is smooth.

Thus people working on rustc put lots of work into keeping existing stuff working, even when they're de jure allowed to break things under the details of the stability promise.

9 Likes

Just as a clarification, Rust does occasionally deprecate things in std (for example, Error's description method: Error in std::error - Rust). Doing so does not break backwards compatibility because it's all still there; you just get a warning if your code uses them. The warning does not prevent compiling and running the code, and the warnings are easily silenced if necessary.
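To make that concrete, here's a minimal sketch of how this plays out (the `old_name`/`new_name` functions are made up, but `#[deprecated]` and `#[allow(deprecated)]` are the actual mechanisms std uses):

```rust
// Marking an item deprecated keeps it fully available; callers just get a lint.
#[deprecated(since = "1.0.0", note = "use `new_name` instead")]
pub fn old_name() -> u32 {
    new_name()
}

pub fn new_name() -> u32 {
    42
}

fn main() {
    // This compiles and runs fine, but rustc emits a deprecation warning here.
    let a = old_name();

    // The warning can be silenced explicitly where the old API is still needed.
    #[allow(deprecated)]
    let b = old_name();

    assert_eq!(a + b, 84);
}
```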

4 Likes

Can you explain this logic? rustc tends not to be running on a public internet-facing port. What would even provide the motivation to patch old versions of the compiler?

Let's say (for the sake of discussion) that my codebase has a binary containing well-debugged Rust 2015 code; we believe it to be bug-free, and we've not touched it for 8 years as a result, so nobody is current on this chunk of code.

A hypothetical Rust changes things so that editions stop being supported after 10 years; so, in 2025, the Rust toolchain will start refusing to compile that chunk of code. This is no problem for us, since the code is stable, and we simply use the existing compiler version whenever we need to make small changes to the code - it doesn't need to be updated to a new edition.

In 2026, somebody discovers a bug in rustc that affects all code that's been compiled with any released version of the compiler - maybe a codegen bug, maybe a Thompson trusting trust attack, details don't matter - which means that we need to recompile with a new compiler to remove the bug from our binary. In the current world, that's fine - we update to the current version of rustc, rebuild, and we're done. In our hypothetical world, I have a choice to make; do I port my (well-debugged and stable) code to Rust 2024 edition from Rust 2015, or do I backport the needed fix to an older compiler so that I can continue using my Rust 2015 codebase without major changes?

This is the sort of thing that motivates patching old versions of the compiler; if I'm unable to update to a new version, I'd like the old version to be patched whenever a relevant bug bites me. Rust's stability guarantee obviates the need to patch old versions; if my current compiler version has a bug that bites me, I can upgrade to the latest compiler and I'm happy.

5 Likes

That's a nice theory, but it doesn't hold in practice.

Software packages that are supported for decades do exist (dBase, OpenServer, OS/2, etc).

But if you dig inside, you'll find out that “stagnation” is the only word one may use: they are, essentially, frozen in time, and changes happen at glacial speed; fewer changes happen there in 10 years than happen in the Rust world in one year. The industry standard for support is ten years, plus 3-4 years of extended support. The Windows XP target is no longer supported, and Windows XP was an oddity: it lived much longer than it should have.

At some point deprecated items should become unavailable, because what's the point of deprecating them if they will stay available forever?

The only realistic question is whether to start with Rust 2024 or with Rust 2027. I think making certain functions unavailable in code compiled for Rust 2027 would be sensible: it would give Rust developers four years to bikeshed the details.

Indeed, and the package I described has stagnated - it's been unchanged for a long time, because it does what's needed of it. Eventually, something external will change, and we'll want to update it significantly, but until then, why risk breakage just to move from Rust 2015 to Rust 2024 and get no other benefit?

None of what I've said rules out removing items from future editions, as long as the newer compiler also supports the older edition so that it can compile the old and new code together. For example, if you couldn't use str::lines_any in Rust 2024 or later, but could use it in Rust 2015 through 2021 editions, and the toolchain still supports Rust 2015 through 2021, there's no issue; either I'm modernising a crate and can replace it with str::lines, or I'm leaving the crate to stagnate, and it can stay in Rust 2015 edition when I come by to make a small change.
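To make the example concrete, here's a sketch of that exact migration; it assumes a toolchain where the deprecated `lines_any` is still present (which, per the above, it is, since deprecated items stay available):

```rust
fn main() {
    let text = "alpha\r\nbeta\ngamma";

    // The deprecated API still compiles; the #[allow] silences the warning.
    #[allow(deprecated)]
    let old: Vec<&str> = text.lines_any().collect();

    // Its replacement: lines() also strips a trailing '\r' these days.
    let new: Vec<&str> = text.lines().collect();

    assert_eq!(old, new);
    assert_eq!(new, ["alpha", "beta", "gamma"]);
}
```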

2 Likes

The Python community paid a decade (2008–2019) for the transition from Python 2 to Python 3, which was a breaking, incompatible change. It split the ecosystem for years, and during the transition they couldn't take on any more significant innovations on the language side.

Learning from that past, the Rust core team made a different decision. Instead of a single incompatible big bang, the edition system was invented to make potentially breaking but still compatible changes more regularly. Do you want to use the fancy language features of edition 2024 while a behemoth of a library like serde is still on edition 2018? No problem, just use it as usual. If rustc didn't guarantee compatibility, you would need either to fork that massive library, update its code, and maintain it yourself, or stick to an old version and eventually fork the compiler itself, like those Python 2.8 people.
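A minimal sketch of why this works, using a made-up `my-app` crate: `edition` is a per-crate setting in Cargo.toml, so rustc compiles each crate under its own edition and links them together.

```toml
# my-app/Cargo.toml: this crate opts into the 2024 edition...
[package]
name = "my-app"
version = "0.1.0"
edition = "2024"

[dependencies]
# ...while serde keeps whatever edition its own Cargo.toml declares;
# the two editions coexist in one build without any forking.
serde = "1"
```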

Edit: post mistakenly added as a reply but failed to fix anything XD

2 Likes

At some point that would have to go, too. You couldn't run, or even compile, code written for MS-DOS 1.0 under Windows XP. You couldn't run, or even compile, code written for MS-DOS 2.0 under Windows 11. The same thing happened before with K&R C, Fortran 66, and many other languages. It's inevitable.

Expecting code that was written 20, 30, or 50 years ago to be compilable with the “latest and greatest” version of anything is unrealistic.

But the important thing is continuity: you often need to keep one codebase which can be compiled by compilers that are 5 or 10 years apart.

The Python 2 to Python 3 transition wasn't a disaster because they did it too fast; in fact, it dragged on for far longer than needed.

No, it was a disaster because of someone's bright idea to make sure that code for Python 2 couldn't be compiled by Python 3, and code for Python 3 couldn't be compiled by Python 2 either!

Only when six arrived did the transition start to become feasible.

Before that, Python was routinely dropping features and adding new ones, too, but that wasn't a story you would tell at a party, because the transition was gradual: it was always possible to write code which would support both Python 2.2 and 2.4, 2.4 and 2.6, 2.6 and 2.7… Then suddenly Python 3.0 was not compatible with Python 2 in either direction. That was “a disaster”, sure.

Rust should offer a similar experience, but it doesn't support 100% of all code ever written for older versions, and it shouldn't try to.

The question is about how and when to make incompatible changes. You either need to do that, or the entity in question will stagnate and die entirely or, in the best scenario, become “frozen in time” legacy.

The Linux kernel keeps deprecated things around for about a decade, but then drops them. All other popular OSes and systems do the same thing.

Only those that more or less stop development (and advancement) completely can support compatibility “forever”. But then they become zombies: something that neither dies entirely nor advances, and is not picked up by new projects.

Who are you talking about? AFAIK all attempts to create something like this have failed. Keeping up development of a language is a massive undertaking, and a very expensive one.

Thus such a fork invariably becomes something that doesn't go anywhere. At most it would get some security fixes, if even that.

Because deprecating them sends a message to people writing new code, pointing them to the better approach.

It also lets us make the deprecated thing less efficient, so long as it still works, if that can help with safety or maintenance. For example, mem::uninitialized was changed, after it had been deprecated for a while, to something that's not as fast but is slightly less of an elephant-foot-gun. Now if you want the full optimization power of uninitialized memory, you have to write things with MaybeUninit instead.
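For anyone wondering what the replacement looks like, here's a minimal sketch of the MaybeUninit pattern (the unsafe conversion at the end is sound only because every element is written first):

```rust
use std::mem::MaybeUninit;

fn main() {
    // Old style, deprecated and now deliberately pessimized:
    // let mut buf: [u32; 4] = unsafe { std::mem::uninitialized() };

    // New style: the type system tracks that the memory starts uninitialized.
    let mut buf: [MaybeUninit<u32>; 4] = [MaybeUninit::uninit(); 4];
    for (i, slot) in buf.iter_mut().enumerate() {
        slot.write(i as u32 * 2);
    }

    // SAFETY: every element was initialized by the loop above, and
    // [MaybeUninit<u32>; 4] has the same layout as [u32; 4].
    let buf: [u32; 4] = unsafe { std::mem::transmute(buf) };
    assert_eq!(buf, [0, 2, 4, 6]);
}
```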

6 Likes

The changes from Python 2 to Python 3 were, for the most part, quite minor. It's not unreasonable to look at them and assume that making the update wouldn't be a problem, and it isn't much of a problem if you control your whole code stack. But there are two issues which made it into an insurmountable wall. First, code generation and metaprogramming, which are devilishly hard to debug, update, or change. Second, all code had to use the same Python version, which meant that you couldn't update your code until all of your dependencies had upgraded. The time to upgrade is thus proportional to the depth of your dependency tree, and if any dependency drags its feet for whatever reason (upgrade complications, or just lack of maintenance), it can stall the upgrades for all of its dependents. That's why it's so critical that libraries using different editions can be compiled together.

The Scala community made a similar mistake, well after Python 2-to-3, with similar consequences. Gratuitous syntax changes, backwards compatibility breaks, and the vague status of the new compiler mean that plenty of projects are stuck on Scala 2.13, and many opt to rewrite in a different language rather than endure the transition to Scala 3. Scala adoption took a nosedive around the time the Scala 3 project officially became public.

4 Likes

At work we have a MATLAB library that is ~20 years old and only on the 2023 version did we get a few deprecation warnings that some functions will be removed in the future. We can't just rewrite it because it's super complex and nobody that initially worked on the suite works at the company anymore.

It's the same with Windows. If you have a 32-bit Win10 (which exists), you can install the NTVDM to get WOW16 for 16-bit programs. After all, 16-bit runs on 32-bit the same way 32-bit runs on 64-bit (WOW64). IIRC, there's nothing preventing you from compiling a .NET Framework 1.0 program with .NET Framework 4.8.1. That said, MS is, AFAIK, the only company where you can expect your programs to have a lifecycle of 30+ years.

Is keeping old stuff around the best course of action? No, of course it's not. However, the alternative for many businesses is to keep old OS/compiler versions around and that's far more dangerous.

2 Likes

Yes. That's why the transition succeeded at all. If they had made Python 3 a bit more incompatible, then we would hear about Python 3 as much as we hear about Perl 6 Raku: that cool, yet weird, language that some people know, but nobody uses.

Beyond a certain size, “flag day” upgrades stop working. The transition from NCP to IPv4 happened in one day, but a similar transition from IPv4 to anything else couldn't, and thus wouldn't, happen in one day.

The Python 3 developers showed a complete lack of wisdom, but thankfully they were saved by the perseverance of some other people.

Except no one does. Not even huge companies with lots of resources, like Google or Microsoft, control the whole stack and can attempt a flag-day transition.

Make up your mind, please! Either this was a critical deficiency, or it was reasonable to look at the changes and assume that making the update wouldn't be a problem, but not both simultaneously!

The big question is: how many versions do you have to support simultaneously?

Experience shows that two versions are enough: that's how the Mac evolved (from MacOS Classic on M68K, to MacOS Classic on PowerPC, to MacOS X on PowerPC, to MacOS X on IA-32, to MacOS X on x86-64, to MacOS X on AArch64), and it remained quite popular, for a platform from a single vendor, for all that time.

Windows supported more “editions”, with weird corner cases where, e.g., the MS-DOS 1.0 API was conditionally supported on Windows 95 OSR2.

But keeping all “editions” forever is not, really, a viable strategy.

Well, sure: this effectively breaks continuity. If you can't adopt a new version of the language in a piecemeal fashion, then there is no advantage to continuing with the same language (or the same series of languages).

That's how the powerhouse of 50 years ago evolved into oblivion.

But all projects that try to keep compatibility forever also evolve into oblivion, just for a different reason!

And what do you plan to do when those functions are actually removed?

Again: what do you plan to do after October 25, 2025?

Have you actually tried to do that? I did. It's… not entirely trivial, to say the least. And .NET Framework 4.8.1 is deprecated, too (although it will probably be supported for as long as Windows 11 is supported).

Not really. If you look at the MCP, you'll see:

The end result was precisely and exactly like in all other such cases: Microsoft got a grip on some [pretty lucrative] market, but lost all other markets except game consoles (where it managed to, essentially, sell a version of the desktop PC in the guise of a game console).

Rust will need to make that decision at some point, too: adopt “something” which may force it to break backwards compatibility, or decide that it's better to leave that “something” to other languages.

We don't know yet what that “something” will be (CHERI? Some kind of AI platform? Something else?), but sooner or later there will be a need to break compatibility; the only question is when and why.