I've just learned to use scan! What exciting times! But I am curious, have I used it well?
That is, I claim this is the "best way to sum all the Fibonacci numbers less than 4 million that are divisible by 2."
fn main() {
    let sum: u32 = (1..)
        .scan((1, 1), |state, _| {
            let temp = state.0;
            state.0 = state.1 + state.0;
            state.1 = temp;
            //println!("debug: {}, {}", state.0, state.1);
            Some(state.0)
        })
        .take_while(|&x| x < 4000000) // && x % 2 == 0)
        .filter(|x| x % 2 == 0)
        .sum();
    println!("{}", sum);
}
You're not using the original iterator (1..), so I think you could do better. Also, scan allows for early termination, which is never going to happen with an infinite sequence. I think iter::repeat_with makes more sense here:
let sum: u32 = std::iter::repeat_with({
    let mut state = (1, 1);
    move || {
        let next = (state.1, state.0 + state.1);
        std::mem::replace(&mut state, next).0
    }
})
.take_while(|&x| x < 4000000)
.filter(|x| x % 2 == 0)
.sum();
It's not needed per se. Using the three assignments you had before works fine too. But I think this more clearly represents the intent of the closure being a function (state_in) -> (state_out, result).
let sum: u32 = std::iter::repeat_with({
    // create a new scope where we declare all captures;
    // this is to prevent polluting the enclosing scope
    // with variables that should only be used by the closure

    // the last two fibonacci numbers;
    // this needs to be maintained across calls to the closure,
    // so it must be captured
    let mut state = (1, 1);
    // move closure so we don't have references to this scope
    move || {
        // calculate next fibonacci number
        let next = (state.1, state.0 + state.1);
        // update the state with the next fibonacci number
        // and get the old value;
        // we only care about the first number in the pair
        // in order to get the whole fibonacci sequence
        std::mem::replace(&mut state, next).0
    }
})
// take all fibonacci numbers smaller than 4,000,000
.take_while(|&x| x < 4000000)
// keep only even fibonacci numbers
.filter(|x| x % 2 == 0)
// sum up what's left (even fibonacci numbers smaller than 4 million)
.sum();
As someone who is familiar with iterators, I actually think it's quite elegant. As for the performance, it's probably the same as the corresponding loop.
And, referencing @zicog and @Alice's comments, as someone beginning to understand iterators:
After a little time with them, I find them to be sort of like collecting Pokemon.
They're pretty weird to look at at first.
You don't really know what they do until you go catch one and try to use it (presumably from the documentation, or your buddies here).
And, perhaps most important, each has a few different moves you can use, and other iterators and concepts it teams up well with.
An example of that would be the way jethrogb used repeat_with in combination with the move keyword and a closure to capture state. That's a higher level move (pun intended) than I would have thought to use, but now that I understand it, I think I could explain it fairly simply.
Personally, I find using mem::replace to return the previous iteration's .0 to be a bit too subtle readability-wise. I find starting state one step before and assigning directly to state a bit easier to read.
fn main() {
    let sum: u32 = std::iter::repeat_with({
        let mut state = (0, 1);
        move || {
            state = (state.1, state.0 + state.1);
            state.0
        }
    })
    .take_while(|&x| x < 4000000)
    .filter(|x| x % 2 == 0)
    .sum();
    println!("{}", sum);
}
Having done some trawling through the Rust documentation I now see what is going on in that code.
Sadly, I see no elegance there. It's about 30% more lines of code than are required to do the job. It's weighed down with obfuscation.
What do I mean by "weighed down with obfuscation"? Well, by that I mean the number of facts and concepts a reader needs to know and understand in order to figure out what the code does, over and above those required to get the job done, and over and above those familiar from almost any other language. In this example the conceptual baggage is:
iter::repeat_with
move ||
mem::replace
take_while
filter
sum
Not to mention the use of anonymous functions and closures in the first place.
With my newfound understanding of these Rust things, I would at least like to get rid of the "move ||" and the "mem::replace", which seem to be totally redundant here, and make the thing a bit shorter and clearer:
let mut state = (0, 1);
let sum: u32 = std::iter::repeat_with(|| {
    state = (state.1, state.0 + state.1);
    state.0
})
.take_while(|&x| x < 4000000)
.filter(|x| x % 2 == 0)
.sum();
Or is there something significantly different in doing that I don't know about?
If you want elegance I have it in spades, the old school way:
let mut sum: u32 = 0;
let mut state = (0, 1);
sum = loop {
    state = (state.1, state.0 + state.1);
    if state.0 >= 4000000 { break sum; }
    if state.0 % 2 == 0 { sum += state.0 }
};
Short, sweet, blindingly obvious to anyone familiar with any language. Of course meaningful variable names would be helpful.
The variant where you store the previous two sequence elements in the state is definitely more elegant than what I came up with. And yes, in this toy example you can get away without the scoping / move closure, but if you wanted to create a function like fn fibonacci() -> impl Iterator<Item = u32>, you would still have to do that.
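To make that concrete, here is a sketch of the function-returning-iterator case mentioned above: once the iterator must outlive the enclosing function, the closure has to own its state, so `move` is no longer optional (the function body here is assembled from the snippets in this thread, not taken from the blog post):

```rust
// A standalone Fibonacci iterator: the returned iterator outlives this
// function's scope, so the closure must `move` (own) its state.
fn fibonacci() -> impl Iterator<Item = u32> {
    let mut state = (0u32, 1u32);
    std::iter::repeat_with(move || {
        state = (state.1, state.0 + state.1);
        state.0
    })
}

fn main() {
    let sum: u32 = fibonacci()
        .take_while(|&x| x < 4_000_000)
        .filter(|x| x % 2 == 0)
        .sum();
    println!("{}", sum); // 4613732
}
```

Without the `move`, the closure would borrow `state` from the function's stack frame and the compiler would reject returning it.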
I actually think the straight-line version that @ZiCog wrote is less understandable. I think the functional program more clearly describes the intent of the algorithm. But that's usually a difference of opinion between proponents of functional and imperative programming.
The task only requires run-of-the-mill, everyday language features: assignment, arithmetic, loops, and conditionals. As such it just seems odd to me that requiring the reader to know and understand 8 or 9 concepts/language features, over and above what is required to do the job, is preferred by many. It just looks like complexity for the sake of it to me.
I can well see that things like anonymous functions, closures, repeat_with and so on have valid uses where they will make code simpler and more expressive. This does not seem to be one of them.
But, that's just my opinion. Maybe one day I'll get used to this in the Rust world, when it stops tripping me up I'll forget it was ever a problem.
I feel that somewhat misses my original goal, to find good functional approaches to the problem. The blog post I wrote is actually an exploration of imperative, functional, recursive, and mathematical styles of approaching problems.
You could, for instance, observe that every third Fibonacci number is even, and that the even terms follow the recurrence
E_i = 4E_{i-1} + E_{i-2}
Thereby shortening the computation substantially. Or you could simply multiply by the cube of the golden ratio, 1.618^3 (since every third number is even), and round, getting rid of the stateful computation altogether. The point is not that functional programming is better, but that different problems admit different styles of thought, and dismissing any of them out of hand for their conceptual baggage can be shortsighted.
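The even-only recurrence can be sketched as below. Note the coefficient is 4, which the even subsequence 2, 8, 34, 144, ... satisfies (34 = 4·8 + 2, 144 = 4·34 + 8); with it, the filter step disappears entirely:

```rust
// Sum even Fibonacci numbers below 4,000,000 using the recurrence
// E_i = 4*E_{i-1} + E_{i-2}, where E_1 = 2 and E_2 = 8.
fn main() {
    let mut sum: u32 = 0;
    let (mut prev, mut curr) = (2u32, 8u32);
    while prev < 4_000_000 {
        sum += prev;
        // advance to the next even Fibonacci number
        let next = 4 * curr + prev;
        prev = curr;
        curr = next;
    }
    println!("{}", sum); // 4613732
}
```

This visits only a third of the sequence and needs no parity test, at the cost of a small mathematical derivation up front.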
Ah yes, I think that is an excellent idea. Sadly I could not read the blog, as the code formatting is all a mess.
Of course there are many ways to tackle any given problem, some with different software/language approaches, some with fundamentally different algorithms. I feel that those Fibonacci shortcuts are mathematical tricks rather than programming-style tricks. Perhaps we should not confuse the two.
Generally I'm on the look out for performance and clarity of code. As such I'm not won over by the functional style yet. So far it has proved less clear and slower for me.
Now, if some kind soul could show how to perform the "loop blocking" problem in Rust in a way that is a) readable and b) comparable in performance to C, I might start to warm up to the functional approach.
There are a whole bunch of different Rust approaches to the loop blocking problem in that repo. The fastest and clearest are not Functional. They are all slower than C. Any suggestions are welcome.
Now you have state "polluting" the outer namespace / scope. That is indeed a very classic C pattern, and a very annoying thing to do: we as humans prefer local reasoning. When I see a (mut) binding captured by a move closure within a scope, I know I can forget about state outside that closure. Your code, on the other hand, still has that "trailing" state, which one can use after the iteration is done. That can be quite interesting to have in other contexts, but the cognitive burden of having to check whether state is used afterwards (when it isn't) adds up with each and every such pattern.
Scoping is thus elegant in that it reduces the number of variables in (the outer) scope.
mem::replace becomes more natural once you use it more, but some people could argue that we are kind of back to the x = (y = z) pattern (note: that's not what mem::replace does), whereby within one line / assignment there are two values being moved around; it does require a bit more mental effort.
Mathematical elegance usually rhymes with conciseness, but since in the programming world one should indeed avoid being excessively terse, I can see why the mem::replace pattern might legitimately be perceived as counter-productive.
I agree that the direct-assignment version (state = (state.1, state.0 + state.1) followed by state.0) is clearer than the mem::replace alternative. However, I do not think that the explicit loop is clearer than the first snippet I quoted. I mean, you have to consider both the ordering of the statements inside the loop and the quite weird placement of the break condition: in fact, this cannot easily be rewritten as a while loop without repeating one of the lines. If you try to rewrite it as a while loop, you can rather easily end up with the two ifs reordered, which results in a quite subtle difference in behavior where exactly one Fibonacci number >= 4000000 is included in the sum.
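To make the line-duplication point concrete, here is a sketch of one correct while-loop rewrite of that loop; note that the advance step now has to appear twice, once before the loop and once at the bottom of each iteration:

```rust
fn main() {
    let mut sum: u32 = 0;
    let mut state = (0u32, 1u32);
    // the advance step must run once before the loop...
    state = (state.1, state.0 + state.1);
    while state.0 < 4_000_000 {
        if state.0 % 2 == 0 {
            sum += state.0;
        }
        // ...and again at the bottom of each iteration
        state = (state.1, state.0 + state.1);
    }
    println!("{}", sum); // 4613732
}
```

Moving the parity check after the advance step, or folding the bound test into the wrong spot, is exactly the kind of reordering that silently changes which numbers get summed.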
I think perhaps the difference is that the iterator version is more about saying "what I want to do" rather than "how I want to do it": I want the sum of the even Fibonacci numbers less than 4000000, and in my mind, the number of steps to translate that objective into code is shorter for the iterator.