Another deref question: the priority of deref coercion

    fn main3() {
        use std::ops::Deref;
        let r = &&1;

        // x: i32
        let x = *(r.deref());
        // y: &i32
        let y = *r;
    }

Given the code above, why do `x` and `y` have different types?

The r.deref() call returns a &i32, so if you dereference that, you get an i32.

The question is why `impl Deref for &i32` is selected for the `r.deref()` call,
and why `impl Deref for &&i32` appears to be selected for `let y = *r`.

There is no deref coercion at all in the `let y = *r` line; dereferencing a reference is a built-in operation.
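To spell this out, here is a small sketch of what each line resolves to (the fully qualified call is what the method-call syntax desugars to):

```rust
fn main() {
    let r: &&i32 = &&1;

    // Method-call syntax picks `impl Deref for &i32`: `r` already has
    // type `&&i32`, i.e. `&Self` for `Self = &i32`, so no auto-ref is
    // needed to match the `&self` receiver.
    let x: i32 = *<&i32 as std::ops::Deref>::deref(r);

    // `*r` on a reference is a built-in dereference; it never calls
    // `Deref::deref` at all.
    let y: &i32 = *r;

    assert_eq!(x, 1);
    assert_eq!(*y, 1);
}
```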

But doesn't `*r` match this impl?

```rust
impl<T: ?Sized> Deref for &T {
    type Target = T;

    #[rustc_diagnostic_item = "noop_method_deref"]
    fn deref(&self) -> &T {
        *self
    }
}
```
and the book says:

That substitution doesn't happen for references because references are special. Notice that the deref function you listed is actually implemented as *self, which would be an infinite recursive loop if the compiler actually made that substitution.
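For comparison, here is a sketch of the book's Box case. With a Box there is only one `Deref` candidate, reached by auto-ref, so the `*(b.deref())` substitution is unambiguous:

```rust
use std::ops::Deref;

fn main() {
    let b = Box::new(5);

    // The only candidate is `impl Deref for Box<T>`; the compiler must
    // auto-ref the receiver, effectively calling `(&b).deref()`.
    let v: i32 = *b.deref();
    assert_eq!(v, 5);

    // `*b` on a Box is exactly what the compiler rewrites to
    // `*(b.deref())`, so it also yields an i32 (copied out, since
    // i32 is Copy).
    let w: i32 = *b;
    assert_eq!(w, 5);
}
```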

Anyway, I see why you are confused. There's still a question of why our *(r.deref()) example behaves differently from the *(y.deref()) in the book. To be explicit, the call the compiler actually makes is the following:

let x = *Deref::deref(r);

The compiler prefers the candidate that does not require inserting an & in front of the receiver (auto-ref) over the candidate that does. In the book's example, however, there is no Deref impl whose receiver matches the box without auto-ref, so the compiler has no such choice to make there.
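A sketch showing both candidates side by side as explicit calls; the first is what `r.deref()` resolves to, the second is the auto-ref alternative the compiler passes over:

```rust
use std::ops::Deref;

fn main() {
    let r: &&i32 = &&1;

    // Self = &i32: `r` is passed as-is, no `&` inserted.
    // This is the candidate method-call syntax selects.
    let a: &i32 = Deref::deref(r);

    // Self = &&i32: the compiler would have to insert an `&`
    // (auto-ref), so this candidate loses to the one above.
    let b: &&i32 = Deref::deref(&r);

    assert_eq!(*a, 1);
    assert_eq!(**b, 1);
}
```

Both calls are valid Rust; the preference only matters when the compiler has to pick one for you.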
