Is using _ any kind of optimization?

There's a not-insignificant difference between let _foo = something() and let _ = something(). In the former, the object keeps existing until it is explicitly dropped or goes out of scope. In the latter, the object is dropped immediately.
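
For example, with a made-up type that prints on drop (a minimal sketch), the difference is visible:

struct Noisy;
impl Drop for Noisy {
    fn drop(&mut self) {
        println!("dropped");
    }
}

fn main() {
    let _foo = Noisy; // stays alive until the end of main
    println!("_foo still alive");
    let _ = Noisy; // nothing is bound, so the value is dropped right here
    println!("after `let _`");
}

This prints "_foo still alive", then "dropped" (for the let _ line), then "after `let _`", and finally "dropped" again when _foo goes out of scope.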

Are there optimization reasons for choosing one over the other in cases like closures? That is, do |_foo| and |_| follow the same pattern as in let bindings with regard to keeping the object around vs. dropping it immediately?

Whenever I use a closure that takes parameters I don't need -- are there any compelling reasons to choose _ over _foo, apart from "drops earlier"?

(Asking because I happen to prefer _foo over _, since it lets the reader know what's being ignored.)

2 Likes

There are situations where using _ allows something to compile that would not compile without it.

fn example(&(_, n): &(String, u32)) {
    // note that n is u32, not &u32, because we used & in the pattern
}

If you change the _ to _string, you will get “error[E0507]: cannot move out of a shared reference”, unless you change it to ref _string instead.
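
For illustration (bad and ok are made-up names), a sketch of the failing and the working variant:

// error[E0507]: `_string` would try to move the String out of the shared reference
// fn bad(&(_string, n): &(String, u32)) {}

// `ref _string` borrows instead of moving, so this compiles
fn ok(&(ref _string, _n): &(String, u32)) {}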

However, if you are using _ exactly as the entire pattern for a function parameter, the only difference between the two is in the drop timing and consequences of the drop timing (such as the period over which a lock is held, and peak memory usage).
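
(The lock-holding consequence is easiest to see with a let binding; a minimal sketch:)

use std::sync::Mutex;

fn main() {
    let m = Mutex::new(0);

    let _guard = m.lock().unwrap(); // lock stays held until the end of this scope
    // let _ = m.lock().unwrap();   // guard dropped immediately: lock released at once
}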

6 Likes

For a function parameter, especially when _ is the entire parameter's pattern, _ should not change the drop timing, because function parameters (if not moved out of) are not dropped before the end of the function anyway.

struct LoudDrop;
impl Drop for LoudDrop {
    fn drop(&mut self) {
        println!("dropping");
    }
}

fn foo() {
    println!("calling");
    bar(LoudDrop);
    println!("done\n");
}
fn bar(_: LoudDrop) {
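    // note: the ignored argument is still dropped only at the end of bar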
    println!("in body");
}
fn foo2() {
    let f = |_x| {
        println!("{}", "in f = |_x| { … }");
    };
    let g = |_| {
        println!("{}", "in g = |_| { … }");
    };
    println!("calling f");
    f(LoudDrop);
    println!("done\n");
    println!("calling g");
    g(LoudDrop);
    println!("done\n");
}

fn main() {
    foo();
    foo2();
}
Output:

calling
in body
dropping
done

calling f
in f = |_x| { … }
dropping
done

calling g
in g = |_| { … }
dropping
done

I'm not sure whether there are ever cases where _ affects the relative drop order between different function parameters, or certain cases of _ in the interior of a more complex overall pattern (assuming both variants compile in the first place).

1 Like

Also note that let _ = … does not really mean “drop immediately” in all circumstances. It means “drop immediately” if the right-hand side is a value expression; but if it’s a place expression, it doesn’t really do anything. So the effect on drop timing can also be the opposite. E.g.

let long_lived = String::from(…);
{
  // short block
  let _inner = long_lived;
  // drops _inner
}
// now long_lived is gone, and its value already dropped
exec_other_code(); // runs after destructor call on the value long_lived used to have

vs

let long_lived = String::from(…);
{
  // short block
  let _ = long_lived; // binds nothing, no-op
  // no drops happen
}
// now long_lived is still initialized, and its value not dropped
exec_other_code(); // runs before destructor call on the value long_lived has

So really, the meaning of _ patterns is consistently the same; it simply isn’t “drop immediately” but rather “don’t move the value in the first place”. For value expressions, this means the value stays in a temporary and is dropped “immediately” for let after all, because those temporaries in a let are scoped only to the (evaluation of the) let statement itself. For function parameters, the question of “how long does the function parameter (or parts of it) live if not moved into an explicitly named binding” is answered by looking into the reference…

Here is the most relevant section – ah! – and it even provides the example for “a case where _ does affect relative drop order between parts of function parameters, for _ in a larger overall pattern”!

struct PrintOnDrop(&'static str);
impl Drop for PrintOnDrop {
    fn drop(&mut self) {
        println!("drop({})", self.0);
    }
}

// Drops `y`, then the second parameter, then `x`, then the first parameter
fn patterns_in_parameters(
    (x, _): (PrintOnDrop, PrintOnDrop),
    (_, y): (PrintOnDrop, PrintOnDrop),
) {}

fn main() {
    // drop order is 3 2 0 1
    patterns_in_parameters(
        (PrintOnDrop("0"), PrintOnDrop("1")),
        (PrintOnDrop("2"), PrintOnDrop("3")),
    );
}

So I guess technically we could argue that writing |_foo| instead of |_| does have the effect of introducing the binding/variable _foo, which is dropped immediately before the implicit actual/original “function parameter” itself is dropped (which is then a no-op, because it had been moved out of); whereas |_| does not create such a binding, and at the end of the function just the implicit place logically holding the “function parameter” itself is dropped. But those two behaviors are effectively the same [and will always be the same when matching the whole function argument against a single binding pattern or _ pattern].

7 Likes

I stand corrected. It never occurred to me to think about what the drop order is for the anonymous places function parameters are kept in before they are matched against the parameter list; I just unconsciously assumed that they would behave like let <user-specified pattern> = { magic!() }; and drop after the match but before the body. But that is not so, and it's even documented in the reference. And this way is more flexible, because it lets you use patterns that borrow as well as patterns that move (though I can't immediately think of a case where this would be valuable).
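
For instance, something like this compiles precisely because the anonymous parameter place outlives the body (takes_pair is a made-up name; a minimal sketch):

// `ref a` borrows from the anonymous place holding the parameter,
// while `b` is moved out of that same place
fn takes_pair((ref a, b): (String, String)) {
    println!("borrowed: {a}, owned: {b}");
}

fn main() {
    takes_pair((String::from("x"), String::from("y")));
}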