Type inference with `&'static [T]`

I'm confused about some issues with type inference and slices with the 'static lifetime, which are best illustrated by an example.

Compiling the following code:

trait MyTrait {}
impl MyTrait for &'static [f64] {}

fn takes_trait(arg: impl MyTrait) {}

fn main() {
    takes_trait(&[4.0]);
}

Produces the output:

error[E0277]: the trait bound `&[{float}; 1]: MyTrait` is not satisfied
  --> src/main.rs:19:17
   |
19 |     takes_trait(&[4.0]);
   |                 ^^^^^^ the trait `MyTrait` is not implemented for `&[{float}; 1]`
...
28 | fn takes_trait(arg: impl MyTrait) {}
   |                          ------- required by this bound in `takes_trait`
   |
   = help: the following implementations were found:
             <&'static [f64] as MyTrait>

I have a vague understanding of what is going on here; somehow the trait implementation does not influence the resolution of the type of the argument, and rustc defaults to assuming that &[T] is a reference to some fixed-size array.

I'm really curious if someone could provide a more detailed explanation, though. In particular, I'm curious if this is something that might ever improve, or if there is some fundamental reason that this cannot work within rust's type system.

Thanks in advance for any guidance!

No, that's incorrect. There's no "assumption" going on; &[4.0] is a reference to a fixed-size array, &[f64; 1].

What you are confusing this with is deref coercions. A reference-to-array can be coerced to a reference-to-slice. Similarly, String can be coerced to &str, etc. In general, for any T: Deref, &T can be coerced to &<T as Deref>::Target. This does not make the types identical, though.
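The distinction is easy to see in a non-generic context. A minimal sketch (the `takes_slice` helper is made up for illustration):

```rust
// Hypothetical helper for illustration: takes a slice, not an array.
fn takes_slice(s: &[f64]) -> usize {
    s.len()
}

fn main() {
    // &[4.0] has type &[f64; 1]. Because the parameter type &[f64]
    // is known concretely (no generics involved), the unsized
    // coercion &[f64; 1] -> &[f64] is applied automatically.
    let n = takes_slice(&[4.0]);
    println!("{n}");

    // Deref coercion works the same way: &String -> &str.
    let owned = String::from("hello");
    let s: &str = &owned;
    println!("{s}");
}
```

The coercion succeeds here because the target type is spelled out in the signature; nothing needs to be inferred.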

One more thing is that deref coercions aren't applied if generics are involved, which is exactly why your example doesn't work. Use [4.0].as_ref() instead, and it will compile.

This is because the compiler does not do any coercions when generics are involved, but your code is only valid if you perform the coercion from &[f64; 1] to &[f64].
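One way to make the original example compile is to perform the coercion before the generic call, where the target type is known. A sketch (the `make_slice` helper is hypothetical, added just to show static promotion):

```rust
trait MyTrait {}
impl MyTrait for &'static [f64] {}

fn takes_trait(_arg: impl MyTrait) {}

// Hypothetical helper: static promotion lets a literal array of
// constants be borrowed for 'static.
fn make_slice() -> &'static [f64] {
    &[4.0]
}

fn main() {
    // The coercion &[f64; 1] -> &[f64] happens at the annotated
    // `let`, before the generic call, so inference has nothing to guess.
    let slice: &'static [f64] = &[4.0];
    takes_trait(slice);
    takes_trait(make_slice());
}
```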

The reason Rust doesn't perform coercions automatically in this context is that, if it did, your main function's behavior would change if somebody adds an impl MyTrait for &[f64; 1] in the future, and the authors of Rust decided that this kind of change in behavior would be bad, and therefore required you to be unambiguous about what you want.
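To illustrate the hazard, here is a sketch of what would happen if that second impl existed: the call then resolves to it directly, with no coercion at all (the generic `takes_trait` here returns `type_name` just to show which type was inferred; that's an addition for illustration):

```rust
trait MyTrait {}
impl MyTrait for &'static [f64] {}
// The hypothetical future impl:
impl MyTrait for &'static [f64; 1] {}

// Variant of takes_trait that reports the inferred argument type.
fn takes_trait<T: MyTrait>(_arg: T) -> &'static str {
    std::any::type_name::<T>()
}

fn main() {
    // Now this compiles -- but it uses the array impl, not the
    // slice impl, and no coercion happens.
    println!("{}", takes_trait(&[4.0]));
}
```

If the compiler had silently coerced before, adding this impl would have changed which implementation runs, which is exactly the kind of behavior change the rule avoids.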


I was about to say that this is the answer I was looking for, but then I confused myself again with the following case, which compiles fine:

fn main() {
    let arg = 5;
    takes_trait(arg);
}

trait MyTrait {}
impl MyTrait for u64 {}

fn takes_trait(arg: impl MyTrait) {}
Are we 'coercing' arg from the default i32 integer type to u64? My understanding is that we're not; the trait implementation is influencing type resolution.

Indeed, in this case if we add an impl MyTrait for u8 {}, compilation starts to fail, because inference is now ambiguous. Is this different from the possible issue of a future impl for &[f64; 1] in the first example?

There is no coercion in that example. The type of arg is u64, and there are no values of type i32 anywhere in your program.

In general, numeric types are a bit of a special case in the type inference algorithm. If only one choice works, it picks that one. If several (or no) choices work, it picks i32, even if i32 is not one of the choices that work.
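Both behaviors can be observed directly; a sketch (the generic `takes_trait` returning `type_name` is an addition for illustration, and `type_name_of_val` is available on recent stable Rust):

```rust
use std::any::type_name_of_val;

trait MyTrait {}
impl MyTrait for u64 {}

// Variant of takes_trait that reports the inferred argument type.
fn takes_trait<T: MyTrait>(_arg: T) -> &'static str {
    std::any::type_name::<T>()
}

fn main() {
    // Only one choice (u64) satisfies MyTrait, so `5` is inferred
    // as u64; no i32 value ever exists in this program.
    println!("{}", takes_trait(5)); // u64

    // With no constraint at all, the integer literal falls back to i32.
    let x = 5;
    println!("{}", type_name_of_val(&x)); // i32
}
```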


Okay, I guess my intuition would be that the "if only one choice works" heuristic would apply more generally. I still don't have a good mental model for where this might cause problems if it were possible, but I do have a better understanding of the mechanics, at least. Thank you!

If you want to compare to your original example, then the two choices are [f32; 1] and [f64; 1]. Neither choice works, so it picks the default choice f64, and indicates that it has done this by using the placeholder {float} in the error message.
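The float fallback can be observed the same way; a small sketch (again using `type_name_of_val` just to display the inferred type):

```rust
use std::any::type_name_of_val;

fn main() {
    // Nothing constrains this literal, so it falls back to f64 --
    // the same default the error message writes as {float}.
    let x = 4.0;
    println!("{}", type_name_of_val(&x)); // f64
}
```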
