Why is Clippy set up to convert a simple if to a map

A strange Clippy suggestion:

warning: manual implementation of `Option::map`
   --> src/viewer/regionindex.rs:265:13
    |
265 | /             match vlinkopt {
266 | |                 Some(vlink) => Some((*vlink).clone()),
267 | |                 None => None
268 | |             }
    | |_____________^ help: try this: `vlinkopt.map(|vlink| (*vlink).clone())`
    |
    = note: `#[warn(clippy::manual_map)]` on by default
    = help: for further information visit https://rust-lang.github.io/rust-clippy/master/index.html#manual_map

Why would you want to take a clear, simple conditional expression and complicate it with a map and a lambda? This is less efficient unless the compiler is really clever. It looks like too-cute functional programming.

I would definitely expect the compiler to be that level of "really clever", and be very surprised if those compiled to different assembly.

While Rust is not a purely functional programming language, there are a lot of functional constructs that would be considered idiomatic, including this one. If you don't agree with the suggestion, you can configure clippy not to emit it.
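If you'd rather keep the match, individual Clippy lints can be allowed per item or per crate. A minimal sketch (the function name and the Option<&String> type are assumptions here, since the thread doesn't show them):

#[allow(clippy::manual_map)]
fn copy_link(vlinkopt: Option<&String>) -> Option<String> {
    // With the allow attribute in place, Clippy no longer flags this match.
    match vlinkopt {
        Some(vlink) => Some((*vlink).clone()),
        None => None,
    }
}

The same lint can also be silenced crate-wide with #![allow(clippy::manual_map)] at the crate root.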

9 Likes

The compiler should be plenty clever enough for that. Rust/LLVM is very aggressive about inlining and monomorphizing, so functional style is often just as fast as imperative. Rust iterators are often the fastest way of writing loops, and they routinely use far more complicated lambdas than this. Iterators can also use tricks to reduce bounds checks and other iterator-specific optimizations, typically in their fold implementations, which is what can let them perform better than a basic imperative loop.
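As a hedged illustration of that point (a toy function, not code from the thread), the iterator form below typically compiles down as well as, or better than, a hand-written index loop, because sum is built on the iterator's fold and the slice iterator knows its own bounds:

fn halved_total(values: &[u32]) -> u32 {
    // No per-element bounds checks are needed; the closure is inlined into
    // the iterator's fold just like the closure in the lint suggestion above.
    values.iter().map(|&v| v / 2).sum()
}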

Arguably clippy should have suggested Option::cloned for this particular example.
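To make the comparison concrete, here is a hedged sketch of the three equivalent spellings, assuming vlinkopt is an Option<&String> (the thread only shows that *vlink gets cloned, not its exact type):

fn demo(vlinkopt: Option<&String>) -> Option<String> {
    // 1. The original match from the warning.
    let _a = match vlinkopt {
        Some(vlink) => Some((*vlink).clone()),
        None => None,
    };
    // 2. Clippy's suggested replacement.
    let _b = vlinkopt.map(|vlink| (*vlink).clone());
    // 3. The dedicated combinator for cloning out of an Option of a reference.
    vlinkopt.cloned()
}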

11 Likes

Small nit: The monomorphization isn't something where the compiler would even have a choice. Generic code in Rust is always monomorphized; there isn't really any sensible alternative the compiler could pick besides monomorphization. Calling a generic function like map with a closure like this always creates a fresh, monomorphized instance of map for that closure.

Also, after this monomorphization, the call to map is the only call to this particular monomorphized instance of map, and the single call to the closure inside of map is the only call to that particular closure type. Furthermore, the closure type here is not exposed in any public interface so both the closure and monomorphized map are private to the crate. If rustc/LLVM ever doesn't inline a private function when there's only a single call throughout the entire crate to it, then I wouldn't consider rustc an “optimizing” compiler at all.

That's, like, the most basic, least debatable kind of optimization ever, so I disagree with the need for a “very aggressive”ly optimizing compiler here.
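A small sketch of what the monomorphization point means in practice (toy code, not from the thread): apply below is generic over the closure type, so each call site with a distinct closure produces its own monomorphized copy of apply, and each copy has exactly one caller, which makes inlining it trivial.

fn apply<T, F: FnOnce(T) -> T>(value: T, f: F) -> T {
    f(value)
}

fn main() {
    // Two different closure types, hence two separate monomorphized
    // instances of `apply`, each called exactly once.
    let a = apply(1, |x| x + 1);
    let b = apply(2.0, |x| x * 3.0);
    println!("{a} {b}");
}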

7 Likes

I probably should have phrased it better, since "always" is distinctly beyond "very aggressive". Always monomorphizing was a design choice of the Rust language, which is what I meant by the "Rust" in "Rust/LLVM", though I grant that reading it as just the Rust compiler would be the obvious interpretation.

There aren't many languages other than Rust where FP-style code is a zero-cost abstraction, which makes new users jump to the conclusion that it has overhead. For example, Python, being an interpreted language, won't inline a lambda, so it carries additional overhead, and the FP poster child Haskell can add overhead to enable lazy evaluation.

I also don't think a very aggressive optimizer is needed here, so the fact that Rust has one should make these trivial cases a non-concern. It does a great job with far more complicated examples.

1 Like

This still doesn't answer the original question of why Clippy would be tuned that way by default.
I guess the simple answer would be: Rust may look like a C-family descendant, but you have to remember that's all a ruse: its constructs may look almost like a weird C++, but this doesn't change the fact that Rust started from OCaml, which is very much a functional language.
Thus, of course, functional programming would be an idiomatic way of writing Rust. It's shorter and more concise, after all.
So I guess the answer to the original question is: for most experienced Rustaceans, reading an expression with map and a lambda is easier, not harder (if you see a map, you know the expression will do something with the interior of vlinkopt, and it's then easy to see what that something is; there is no need to read the whole expression at once to understand what it does).
But, of course, if you prefer to write code differently, you can always disable that suggestion.

9 Likes

Note that clippy is intentionally highly opinionated. There's a reason that rustc doesn't have this by default. It's totally reasonable to allow clippy lints you don't like.

That said, that looks exactly like the kind of place where I'd encourage the use of map in a code review.

8 Likes

I’ve recently been thinking that we are missing some syntax here. Options and Results are ubiquitous, and they are one of the biggest sources of lambdas. Lambdas are not ergonomic: as a higher-order construct, they create extra cognitive load, and they also don’t compose nicely with ? and .await.

This state seems unsatisfactory to me: using basic types shouldn’t require higher-order functions, unless you are in a purely functional language where fns are syntactically lightweight.

I wish we had an analog of the for loop for Option/Result. Kotlin/TypeScript’s ?. safe-call operator is one such bit of syntax, but it is very biased towards OO method-call syntax. The more general (Kotlin) ?.let {} is a mouthful similar to Rust’s map.
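As a hedged illustration of the composition problem (the function and identifiers here are made up): a ? inside the closure applies to the closure, not to the enclosing function, so the "obvious" map version doesn't compile and you end up reaching for and_then or a match instead.

fn lookup(key: Option<&str>) -> Option<i32> {
    // This does not compile, because `?` would require the closure itself to
    // return an Option and it cannot early-return from `lookup`:
    //     key.map(|k| k.parse::<i32>().ok()?)
    // So you switch combinators (or write out the match):
    key.and_then(|k| k.parse::<i32>().ok())
}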

3 Likes

Scala’s for comprehensions are pretty nifty. At least in Scala 2 they don’t even require HKTs or other advanced type system features but rather are a purely syntactic transformation. With try_*/ControlFlow it would even be possible to support break inside a comprehension. (So, technically, for comprehensions are just syntactic sugar for the usual bunch of higher-order functions, so they might not readily convince someone who’s sceptical of the performance of HoFs…)

Well, the match in the OP is try { vlinkopt?.clone() }†, which I think is pretty good, actually.

† Assuming nightly and either a context that infers Option, or a hypothetical future that fixes the annotation problem for try. And it might also need another *, but hopefully autoderef will do the right thing for it.
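For completeness, a minimal sketch of how that might look on a nightly toolchain, assuming vlinkopt is an Option<&String> (in which case no extra * seems to be needed) and letting the function's return type provide the Option context mentioned in the footnote:

#![feature(try_blocks)] // nightly only

fn copy_link(vlinkopt: Option<&String>) -> Option<String> {
    try { vlinkopt?.clone() }
}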

6 Likes
