How are range expressions implemented "under the hood"

My question boils down to how it is possible that range expressions map to the std::ops range types, as shown here:

https://doc.rust-lang.org/reference/expressions/range-expr.html

For example:

1..2;   // std::ops::Range

My natural thought would be that they are overloaded; however, this page:

https://doc.rust-lang.org/book/appendix-02-operators.html

does not show the range operators .. and ..= as being "Overloadable".

I'd appreciate any clarity that could be provided on this topic.

Thank you!

The mapping is here.


As Hyeonu implied, the Range* types are special and known to the compiler. If you check the definition of, say, RangeFrom, you can see that there's a "magic" #[lang = "RangeFrom"] annotation which tells the compiler that this type corresponds to a built-in language item of the same name.

Note that this works because Range has a #[lang] (lang item) attribute, which marks standard library types that interact with built-in compiler features.

Thank you very much!

I did notice the "lang" attribute and thought there might be some "magic" going on there. I'm curious why this tactic was taken over an approach that leverages the standard operator-overload machinery (i.e., making .. and ..= overloadable and doing it that way).

My guesses are that it's one or more of the following: historical, i.e. operator overloads were not available when the feature was developed; it enables an optimization that would otherwise be impossible; or it permits broader flexibility in the syntax than operator overloads would allow.

I'm also wondering what is the likelihood of a future Rust version that removes the reliance on the lang attribute?

Thanks again!


Note that the operator-overload traits are magic in the exact same way the range types are: they too have #[lang] attributes, which tell the compiler how to lower an operator application into a trait method call. But .. and ..= are not really operators in the sense that it would make sense to allow them to be overloaded. For any a, b: T, the expression a..b must construct a range over type T.

Indeed, the range constructors are much closer to the array constructors [a, …] and [a; N] and the tuple constructor (a, …). Arrays and tuples don't currently have (edit: library-based) types the way ranges do, but in principle they could some time in the future: there could be an Array&lt;T, const N: usize&gt;, which requires const generics, and a Tuple&lt;T, ...Ts&gt;, which would require variadic generics. But they would, again, have to be known to the compiler so that it knows how to lower the […] and (…) syntaxes to their desugared forms.


One big problem with that is that it'd need a whole lot more type annotations in common situations.

If I've overloaded .. so it could also return CoolerRange, then what type would a[i..j] use? Especially if I'd implemented Index for it?

They definitely exist, just as language constructs, called [T; N] and (T, ...Ts). Ranges don't have such a construct, so they use library types.

Yes, obviously the types exist, and it's nice that the type-level syntax mirrors the term-level syntax. I just meant that in principle they could have almost-pure library implementations, or at least library interfaces with compiler magic inside. Another interesting example of compiler–library interaction is the Fn* traits, where the magic syntax FnMut(T, U) -> R desugars to FnMut&lt;(T, U), Output = R&gt;, although user code cannot (yet) use the latter form on stable.

There is a difference between the Fn* traits and arrays & tuples: the syntax people are used to for functions is Fn(arg1, arg2, ...) -> Ret, but it doesn't compose with generics (there is no way to express "any function" this way, short of macros), which is why we need the version that takes a tuple. Arrays and tuples don't have that problem. (They have related problems: arrays did until const generics landed, and tuples will until some form of variadic generics exists, which has no obvious solution yet.)

I wanted to extend an overdue thank-you for all the fine responses to my inquiry. Everyone's input revealed areas in which I need to further my learning.

Thanks again!