Very interesting read! I really wish we could be unburdened by backwards compatibility so that big changes like these would be possible, but I realize that would open its own can of worms.
Editions still have to interoperate with each other, they're not a free license to change anything and everything. In fact the post goes into detail on how they might or might not work for these specific changes!
Honestly, backwards compatibility in Rust is already half-assed anyway. Trying to use a fixed version of the compiler (important in some contexts, such as security software) is a huge pain in the ass.
Rust has an excellent story for running old code on new compilers.
For example, I've been using Rust in production since just after 1.0, so about 8 years now. I maintain 10-20 libraries and tools, some with hundreds of dependencies. If I update a project with 200 dependencies that hasn't been touched in 3 years, that's 3 × 200 = 600 "dependency-years" worth of old code being run on a new compiler. And usually it either works flawlessly, or it can be fixed in under 5 minutes. Very occasionally I'm forced onto a new library version that requires some code changes on my end.
I've also maintained long-lived C, C++, Ruby and Python projects. Those projects tend to have far more problems with new compilers or runtimes than Rust, in my experience. Updating a large C++ program to the newest version of Visual C++, for example, can be a mess.
However, Rust obviously does not support running new code on old compilers. And because stable Rust is almost always a safe upgrade, many popular libraries don't worry too much about supporting 3-year-old compilers. (This is super irritating for distros like Debian.) So if you're working on a regulated embedded system that requires you to use a specific old compiler, well, it's a problem. Realistically, in that case you'll want to vendor and freeze all your dependencies and back-port specific security fixes as needed. If you're not allowed to update your compiler, you probably shouldn't be mass-updating your dependencies, either.
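For what it's worth, a minimal sketch of what that freeze can look like in practice (the pinned version here is just an example, not a recommendation):

    # rust-toolchain.toml -- rustup pins the whole project to this compiler
    [toolchain]
    channel = "1.70.0"

    # Vendor every dependency into ./vendor and commit it; cargo vendor
    # prints a .cargo/config.toml snippet that points builds at those copies.
    $ cargo vendor
    $ cargo build --offline

Security back-ports then become patches to the vendored sources rather than version bumps pulled from crates.io.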
Basically, Rust got so exceptionally good at running old code on new compilers that the library ecosystem developed the habit of dropping support for old compilers too quickly for some downstream LTS maintainers. And as a library maintainer, I'm absolutely part of the problem—I don't actually care about supporting 5 year old compilers for free. On some level, I probably should, but...
> Updating a large C++ program to newest version of Visual C++, for example, can be a mess.
That's an MSVC problem. MSVC ignored the C++ specification for decades. Now it does follow the spec. So a lot of non-standard code broke.
The transition was pretty ugly, as I'm pretty sure you've figured out. Permissive mode never gave warnings for non-standard code, so you couldn't just fix warnings as they came up. You had to do a fix-the-world update, which is a dickpain in large codebases. The transition was relatively fast, though: VS2017 introduced the standard-compliant parser, and C++20 mode requires it.
Honestly, MSVC is such a mess. I have a hunch that before this decade is out, Microsoft will just replace it with Clang, a la Internet Explorer/Edge/Chromium. That's why the switch to the standard-compliant parser was so rushed; they're trying to force everyone to write standard-compliant code so that Clang can compile it.
This is quite under-appreciated. I've lost count of how many times npm or pip broke with a 2000-line error dump. The only time it happens with Rust has to do with TLS, and I learned to handle that one quickly. Otherwise, "cargo install" is the best package manager out there.
I think the answer is that when you break compatibility for an old compiler, you rev the major version. The folks on the old compiler then get to use the old library forever. Yeah, they can pin (and should), but a major bump makes it a little cleaner.
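One thing that helps either way: the supported compiler can be declared in the manifest, so people on an older toolchain get a clear error instead of a mysterious parse failure. A rough sketch (crate name and versions are made up):

    # Cargo.toml of the library
    [package]
    name = "some-library"    # hypothetical
    version = "2.0.0"        # major bump alongside the compiler-support change
    edition = "2021"
    rust-version = "1.65"    # declared MSRV; newer cargo refuses to build on anything older

Whether dropping an old compiler really deserves that major bump is exactly what's being debated here, but at least the requirement becomes machine-checkable.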
Except that so few people are using older compilers that tracking exactly when features were introduced becomes burdensome for the developer. For languages that have a few large releases every decade, such as C++, this is reasonable. As a C++ developer, I can remember that `std::unique_ptr` requires C++11, `std::make_unique` requires C++14, structured bindings require C++17, etc. It's tedious, but doable, and can be validated against a specific compiler in CI.
For languages with more frequent releases, that just isn't feasible. The let-else construct [0] was stabilized in 1.65 (Nov 2022), default values for const generics were stabilized in 1.59 (Feb 2022) [1], and the const evaluation timeout [2] was removed in 1.72 (Aug 2023). These are all reasonable changes, released on a reasonable timescale. However, when writing `struct Position<const DIM: usize = 3>`, I don't check whether my library already uses language features that post-date 1.59, and I don't consider it a backwards-compatibility breakage to do so.
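To make that concrete, here's a small made-up snippet where each marked line quietly raises the effective minimum compiler version, without anything in Cargo.toml changing:

    // Default values for const generic parameters: needs Rust 1.59+
    struct Position<const DIM: usize = 3> {
        coords: [f64; DIM],
    }

    fn first_coord(input: Option<Position>) -> f64 {
        // let-else: needs Rust 1.65+
        let Some(pos) = input else {
            return 0.0;
        };
        pos.coords[0]
    }

    fn main() {
        let origin = Position { coords: [1.0, 2.0, 3.0] };
        println!("{}", first_coord(Some(origin)));
    }

Nothing about this feels like an MSRV decision while you're writing it, which is the point.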
Incrementing the major version for a change that is source-compatible on most compiler versions (e.g. the first use of let-else in a project) just isn't justified.
Is it a breaking change to not support an old compiler when the new compiler has the same major version as the old one? Transitive dependency upgrades that bump a minor version don't trigger major version bumps for downstream users. Why should compilers be different?
I've never had any notable issues. I've had far fewer compatibility issues with compiling crates than I have compiling C++ libraries or installing Python packages.