# Feedback wanted: Countdown solver

Hello. After the requisite hair-pulling, I have successfully implemented my first non-trivial algorithm with Rust, and I thought I'd present it here for feedback. First the code, then a few points where I think it ought to be possible to do better.

``````rust
use std::rc::Rc;
use std::cell::RefCell;
use std::cell::Ref;

type Link<T> = Option<Rc<RefCell<Node<T>>>>;

struct Node<T> {
elem: T,
next: Link<T>,
}

trait Stack<T> {
fn push(&mut self, elem: T);
fn push_node(&mut self, node: Rc<RefCell<Node<T>>>);
fn pop_node(&mut self) -> Option<Rc<RefCell<Node<T>>>>;
}

impl<T> Stack<T> for Link<T> {
fn push(&mut self, elem: T) {
let new_node = Rc::new(RefCell::new(Node {
elem: elem,
next: None,
}));

self.push_node(new_node);
}

fn push_node(&mut self, node: Rc<RefCell<Node<T>>>) {
(*node).borrow_mut().next = self.take();
*self = Some(node);
}

fn pop_node(&mut self) -> Option<Rc<RefCell<Node<T>>>> {
self.take().map(|node| {
*self = (*node).borrow_mut().next.take();
node
})
}
}

enum Op {
Empty,
Additive(Link<Contribution>),
Multiplicative(Link<Contribution>),
}

impl Op {
fn neutral_value(&self) -> u32 {
match self {
Op::Empty => panic!(),
Op::Additive(_) => 0,
Op::Multiplicative(_) => 1
}
}

fn combine(&self, a: u32, b: u32) -> u32 {
match self {
Op::Empty => panic!(),
Op::Additive(_) => a+b,
Op::Multiplicative(_) => a*b
}
}

fn realize(&self, a: u32, b: u32) -> Option<u32> {
match self {
Op::Empty => panic!(),
Op::Additive(_) => if a>b { Some(a-b) } else { None },
Op::Multiplicative(_) => if a%b == 0 { Some(a/b) } else { None }
}
}
}

struct ReadyValue {
value: u32,
op: Op,
}

impl ReadyValue {
fn new_primitive(val: u32) -> Self {
ReadyValue { value: val, op: Op::Empty }
}

fn new_composed(val: u32, is_multiplicative: bool) -> Self {
ReadyValue { value: val, op: if is_multiplicative { Op::Multiplicative(None) } else { Op::Additive(None) } }
}
}

struct Contribution {
reverse: bool,
value: ReadyValue,
}

struct ResolveContext {
goal: u32,
available: Link<Contribution>,
free_list: Link<Contribution>,
}

#[derive(Clone)]
#[derive(Copy)]
struct AvailabilityIndexing {
count: usize,
composed_count: usize,
other_composed_count: usize,
}

fn resolve(goal: u32, primitives: &Vec<u32>) {
let mut local_context = ResolveContext { goal: goal,
available: None,
free_list: None};

let mut count = 0usize;

for prim in primitives {
local_context.available.push(Contribution { reverse: false, value: ReadyValue::new_primitive(*prim) });

local_context.free_list.push(Contribution { reverse: false, value: ReadyValue::new_composed(0, false) });
count += 1;
}

let available = AvailabilityIndexing {count: count,
composed_count: 0,
other_composed_count: 0};

for phase in 0..=1 {
explore_addition_of_right_node(&mut local_context, &available, None, if phase == 1 { Op::Multiplicative(None) } else { Op::Additive(None) });
}
}

struct RightNodeContext<'a> {
inherited_recruitment_bookmark: &'a Rc<RefCell<Node<Contribution>>>,
}

fn explore_addition_of_right_node(resolve_context: &mut ResolveContext,
available: & AvailabilityIndexing,
called_recruitment_bookmark: Option<&Rc<RefCell<Node<Contribution>>>>,
operation: Op) {
let mut local_availability = *available;
let no_other_left = local_availability.other_composed_count == 0;
let neutral_value = operation.neutral_value();

let mut work_node = resolve_context.free_list.pop_node().unwrap();

let mut op_case = |is_same| {
if is_same {
local_availability.composed_count += 1;
} else {
local_availability.other_composed_count = local_availability.composed_count;
local_availability.composed_count = 1;
}
};

let local_context = RightNodeContext { inherited_recruitment_bookmark:
if let Some(concrete) = called_recruitment_bookmark.as_ref() {
match resolve_context.available.as_ref().unwrap().borrow().elem.value.op {
Op::Empty => panic!(),
Op::Additive(_) => match operation {
Op::Empty => panic!(),
Op::Additive(_) => { op_case(true); concrete }
Op::Multiplicative(_) => if no_other_left {
op_case(false);
concrete
} else {
// TODO: figure out how to remove duplication
resolve_context.free_list.push_node(work_node);
return;
}
}
Op::Multiplicative(_) => match operation {
Op::Empty => panic!(),
Op::Additive(_) => if no_other_left {
op_case(false);
concrete
} else {
// TODO: figure out how to remove duplication
resolve_context.free_list.push_node(work_node);
return;
}
Op::Multiplicative(_) => { op_case(true); concrete }
}
}
} else if no_other_left {
op_case(false);
} else {
// TODO: figure out how to remove duplication
resolve_context.free_list.push_node(work_node);
return;
}
};

local_availability.count += 1;
if (local_availability.count - local_availability.composed_count) < 2 { // don't bother
// TODO: figure out how to remove duplication
resolve_context.free_list.push_node(work_node);
return;
}

(*work_node).borrow_mut().elem.value.op = operation;

resolve_context.available.push_node(work_node);
iterate_possible_left_nodes(resolve_context, &local_context,
&local_availability,
local_context.inherited_recruitment_bookmark,
0, 0, neutral_value, neutral_value);
work_node = resolve_context.available.pop_node().unwrap();

resolve_context.free_list.push_node(work_node);
}

fn iterate_possible_left_nodes(resolve_context: &mut ResolveContext,
right_node_context: &RightNodeContext,
available: &AvailabilityIndexing,
recruitment_loc: &Rc<RefCell<Node<Contribution>>>,
forward_count: usize, reverse_count: usize,
forward_total: u32, reverse_total: u32) {
let mut local_availability = *available;
let mut local_recruitment_loc = recruitment_loc;
let mut backing: Rc<RefCell<Node<Contribution>>>;

local_availability.count -= 1;

loop {
let mut to_discard = (*local_recruitment_loc).borrow_mut();
if let Some(mut candidate) = to_discard.next.pop_node() {
drop(to_discard);
local_availability.other_composed_count
= available.other_composed_count - match (*candidate).borrow().elem.value.op {
Op::Empty => 0,
Op::Additive(_) | Op::Multiplicative(_) => 1,
};
for phase in 0..=1 {
let reverse = phase != 0;
(*candidate).borrow_mut().elem.reverse = reverse;

let new_forward_count = forward_count + if reverse { 0 } else { 1 };
let new_reverse_count = reverse_count + if reverse { 1 } else { 0 };
let new_forward_total = if reverse { forward_total } else { resolve_context.available.as_ref().unwrap().borrow().elem.value.op.combine(forward_total, (*candidate).borrow().elem.value.value) };
let new_reverse_total = if reverse { resolve_context.available.as_ref().unwrap().borrow().elem.value.op.combine(reverse_total, (*candidate).borrow().elem.value.value) } else { reverse_total };

match &mut resolve_context.available.as_ref().unwrap().borrow_mut().elem.value.op {
Op::Empty => panic!(),
Op::Additive(list) | Op::Multiplicative(list) => list.push_node(candidate),
};

iterate_possible_left_nodes(resolve_context, right_node_context,
&local_availability, local_recruitment_loc,
new_forward_count, new_reverse_count,
new_forward_total, new_reverse_total);

for _ in 0..1 {
if (new_forward_count == 0) || ((new_forward_count + new_reverse_count) < 2) {
break; }

let raw_res = resolve_context.available.as_ref().unwrap().borrow().elem.value.op.realize(
new_forward_total,
new_reverse_total);

if let Some(res) = raw_res {
resolve_context.available.as_ref().unwrap().borrow_mut().elem.value.value = res;

if local_availability.count == 1 {
if res == resolve_context.goal {
print_result(&resolve_context.available.as_ref().unwrap().borrow().elem.value);
}
} else {
for phase2 in 0..=1 {
explore_addition_of_right_node(resolve_context, &local_availability,
Some(right_node_context.inherited_recruitment_bookmark),
if phase2 == 1 { Op::Multiplicative(None) } else { Op::Additive(None) });
}
}
}
}

candidate = match &mut resolve_context.available.as_ref().unwrap().borrow_mut().elem.value.op {
Op::Empty => panic!(),
Op::Additive(list) | Op::Multiplicative(list) => list.pop_node().unwrap(),
};
}

let temp = Rc::clone(&candidate);
(*local_recruitment_loc).borrow_mut().next.push_node(candidate);
backing = temp;
local_recruitment_loc = &backing;
} else { break; }
}
}

fn print_result(value: &ReadyValue) {
println!("{}", string_recursive(value));
}

fn string_recursive(value: &ReadyValue) -> String {
let continuation = |link: Option<&Rc<RefCell<Node<Contribution>>>>, is_multiplicative| {
let mut res = String::new();
let mut is_first = true;
let mut num_rev = 0usize;

res.push_str("(");

let mut backing: Option<Rc<RefCell<Node<Contribution>>>>;
let mut iter = link;

loop {
if let Some(concrete) = iter {

if (*concrete).borrow().elem.reverse {
num_rev += 1;
} else {
if !is_first {
res.push_str(if is_multiplicative { " *" } else { " +" });
}
is_first = false;
res.push_str(" ");
res.push_str(&string_recursive(&(*concrete).borrow().elem.value));
}

let step1 = (*concrete).borrow();
let step3 = step1.next.as_ref();
let step4 = step3.map(|inner| { Rc::clone(inner) });
drop(step3);
drop(step1);
backing = step4.as_ref().map(|inner| { Rc::clone(inner) });
iter = backing.as_ref();
} else { break; }
};

if num_rev > 0 {
res.push_str(if is_multiplicative { " /" } else { " -" });
if num_rev > 1 {
res.push_str(" (");
}
}

is_first = true;
iter = link;

loop {
if let Some(concrete) = iter {

if (*concrete).borrow().elem.reverse {
if !is_first {
res.push_str(if is_multiplicative { " *" } else { " +" });
}
is_first = false;
res.push_str(" ");
res.push_str(&string_recursive(&(*concrete).borrow().elem.value));
}

let step1 = (*concrete).borrow();
let step3 = step1.next.as_ref();
let step4 = step3.map(|inner| { Rc::clone(inner) });
drop(step3);
drop(step1);
backing = step4.as_ref().map(|inner| { Rc::clone(inner) });
iter = backing.as_ref();
} else { break; }
};

if num_rev > 1 {
res.push_str(" )");
}
res.push_str(" )");

res
};

match &value.op {
Op::Empty => value.value.to_string(),
Op::Additive(link) => continuation(link.as_ref(), false),
Op::Multiplicative(link) => continuation(link.as_ref(), true),
}
}

fn main() {
resolve(4471u32, &vec![75u32, 10u32, 2u32, 6u32, 7u32, 1u32, 1u32]);
}
``````

In case it's unclear, this solves the Countdown numbers round (except here I put a 4-digit goal and 7 primitives, because computers are just too fast to be able to properly profile this in the 3-digit goal and 6 primitives case) while taking into account the operations being commutative and associative to limit redundancies in the output. The algorithm internally uses linked lists, with somewhat tangled ownership: this is why the indirection between cons cells relies on `Rc<RefCell<>>`.
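For contrast, the brute-force approach mentioned further down the thread (combine any two remaining numbers with one of the four operations and recurse) can be sketched in a few lines. This is my own illustrative sketch of the problem being solved, not the algorithm above:

```rust
// My own sketch (not the algorithm above): brute-force Countdown
// reachability -- combine any two remaining numbers with the four
// operations, keeping only results that stay positive integers.
fn reachable(goal: u32, nums: &[u32]) -> bool {
    if nums.iter().any(|&n| n == goal) {
        return true;
    }
    for i in 0..nums.len() {
        for j in 0..nums.len() {
            if i == j {
                continue;
            }
            let (a, b) = (nums[i], nums[j]);
            // The numbers not consumed by this combination.
            let mut rest: Vec<u32> = nums
                .iter()
                .enumerate()
                .filter(|&(k, _)| k != i && k != j)
                .map(|(_, &v)| v)
                .collect();
            let mut candidates = vec![a + b, a * b];
            if a > b {
                candidates.push(a - b); // stay positive
            }
            if b != 0 && a % b == 0 {
                candidates.push(a / b); // stay integral
            }
            for c in candidates {
                rest.push(c);
                if reachable(goal, &rest) {
                    return true;
                }
                rest.pop();
            }
        }
    }
    false
}

fn main() {
    assert!(reachable(100, &[25, 4])); // 25 * 4
    assert!(!reachable(947, &[1, 1, 2]));
    println!("ok");
}
```

The layered algorithm in the post exists precisely to avoid the redundant orderings this naive search revisits (it would try `a+b` and `b+a` as distinct paths).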

The weak points, in no particular order:

• At line 41, I have an integrated `enum Op` rather than, say, `Option<(bool, Link<Contribution>)>` because it's simpler at times, but it also results in some complexity where I'm sure there could be better ways.

• At line 145 you can see an attempt to repeat myself less often, but it's partial (I couldn't manage to factor the `recruitment_bookmark` itself into it), and even then it's more awkward than I'd like (I'd prefer two closures depending on the case, rather than one with an additional Boolean parameter).

• At line 230 I had to shorten the lifetime of the `Ref` I get from the `RefCell`, because otherwise I couldn't perform the switcheroo at the end of the loop; and even then that code is more complicated than I'd like. I'm looking for more elegant solutions.

• At line 257 you can see one of the techniques I pulled off to avoid repeating my cleanup code; unfortunately I couldn't apply it in other places, as then rustc complained of immutable variables being assigned twice and other consequences of interpreting this as a loop. I'm sure I could use a custom type implementing `Drop` instead, but is it worth it for the bookkeeping of my local variables?

• At line 265 you can see I had to split in two what could otherwise be a single-line condition, because if it's in a single `if`, the compiler seems to assume the temporaries (in particular, the `Ref` from the explicit `borrow()`) have to last for the entirety of the `if` block, while the same compiler happily accepts that they don't need to when the same temporaries are computed as part of an earlier statement.

• At line 311, the loop has more lines managing the loop induction variables than it does performing actual work! Now, this case is more complicated than the earlier ones where we'd pop and push, because we browse in read-only fashion and so have to contend with multiple ownership, but even then I didn't expect I'd have to `Rc::clone` twice per loop! But if I don't, the borrow checker won't let me overwrite `backing` unless I've dropped everything else (apparently owing to the funky lifetime of `Ref`), at which point I can't update `backing` any more. Doubling the `Rc::clone` was an act of desperation, and I'm still surprised it worked; thankfully this isn't hot code, but what if it were?


A couple of observations:

• Formatting is sometimes weird/non-idiomatic.
• `&Vec<T>` is useless and actually harmful in interfaces; use a slice `&[T]` instead. (With a `&Vec`, you have to allocate a vector just to be able to pass a reference if you have anything but a vector. But this is completely unnecessary if you have something that you can convert into a slice without an intermediate allocation.)
• The constructors of `ReadyValue` hurt readability; instead of going through an intermediate boolean, you should just construct an `Op` because it has more descriptive variant names.
• I changed `Link` from being an optional to just an `Rc<RefCell<_>>`; this allows it to be reused at other places, removing some cruft.
• You have very long lines. You should break up expressions more so that the code is easier to read. You are also explicitly dereferencing the `Rc` unnecessarily before calling `borrow()` and `borrow_mut()`.
• Closures such as `.map(|inner| { Rc::clone(inner) })` are unnecessary. This is the same as `.map(Rc::clone)`.
• The representation of operations is not very elegant. A better one would be to split primitives and binary operations into variants of an `enum`.
• For pretty-printing expressions, you could just use a general integer precedence level, instead of manually checking every level (literal numbers, multiplicative, and additive operations).
• Your logic for generating expressions seems very complicated – does it use some sort of optimization, or is it pure brute force?
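To illustrate the `&[T]` point above with a minimal sketch (my own example): one slice-taking signature accepts vectors, arrays, and sub-slices alike, with no allocation at the call site.

```rust
// My own example: one slice-taking signature serves vectors, arrays,
// and sub-slices alike, with no allocation at the call site.
fn sum(values: &[u32]) -> u32 {
    values.iter().sum()
}

fn main() {
    let v = vec![1u32, 2, 3];
    let a = [4u32, 5];
    assert_eq!(sum(&v), 6); // &Vec<u32> coerces to &[u32]
    assert_eq!(sum(&a), 9); // so does a reference to an array
    assert_eq!(sum(&v[..2]), 3); // and any sub-slice
    println!("ok");
}
```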

Anyway, I have a slightly improved version here, and I've written up a substantially cleaner and around 3x shorter version here.
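The precedence-level idea for pretty-printing can be sketched like this (a toy `Expr` with my own names, not the solver's types): each node reports its precedence, and parentheses appear only when a sub-expression binds less tightly than its context.

```rust
// My own sketch of precedence-based printing: parenthesize a
// sub-expression only when its operator binds less tightly than the
// surrounding context.
enum Expr {
    Num(u32),
    Add(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

fn show(e: &Expr, parent_prec: u8) -> String {
    let (s, prec) = match e {
        Expr::Num(n) => (n.to_string(), 2),
        Expr::Add(a, b) => (format!("{} + {}", show(a, 1), show(b, 1)), 1),
        Expr::Mul(a, b) => (format!("{} * {}", show(a, 2), show(b, 2)), 2),
    };
    if prec < parent_prec {
        format!("({})", s)
    } else {
        s
    }
}

fn main() {
    // (1 + 2) * 3 keeps its parentheses; 1 + 2 * 3 needs none.
    let e1 = Expr::Mul(
        Box::new(Expr::Add(Box::new(Expr::Num(1)), Box::new(Expr::Num(2)))),
        Box::new(Expr::Num(3)),
    );
    assert_eq!(show(&e1, 0), "(1 + 2) * 3");
    let e2 = Expr::Add(
        Box::new(Expr::Num(1)),
        Box::new(Expr::Mul(Box::new(Expr::Num(2)), Box::new(Expr::Num(3)))),
    );
    assert_eq!(show(&e2, 0), "1 + 2 * 3");
    println!("ok");
}
```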


"solves": What about solutions not using all numbers. That involve internal fractions or negatives to get goal.

Here's a playground after running `rustfmt` and removing an unused import, in case others are interested. The line numbers changed, so I left comments like `"weak point one" line 41`.

Any particular reason why? If the code wasn't so complicated, the first thing I'd do for review is try to replace your stack-like linked list with a `Vec`. I also suspect you could then get rid of the `Rc<RefCell<_>>`s and a lot of your borrow dances.
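As a minimal sketch of that suggestion (my own example, not the reviewed code): `Vec` already behaves like the stack the solver needs.

```rust
// My own sketch: Vec as a stack -- push/pop at the end are O(1),
// with no Rc<RefCell<_>> indirection between elements.
fn main() {
    let mut stack: Vec<u32> = Vec::with_capacity(8); // one up-front allocation
    stack.push(75);
    stack.push(10);
    assert_eq!(stack.pop(), Some(10)); // LIFO, like pop_node
    assert_eq!(stack.last(), Some(&75)); // peek without removing
    assert_eq!(stack.pop(), Some(75));
    assert_eq!(stack.pop(), None);
    println!("ok");
}
```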

Speaking of complicated, your functions are quite long and have no comments. It would take quite a while to figure out what they're doing and offer more in-depth advice that maintained your algorithm.

In general, I think you're pushing too hard on borrowing no matter what and reusing variables. Part of the appeal of `Rc` is that it's cheap to clone. You could also do with some helper methods to avoid lots of repetition, e.g.

``````rust
impl ResolveContext {
fn available_op(&mut self) -> RefMut<'_, Op> {
let rm = self.available.as_ref().unwrap().borrow_mut();
RefMut::map(rm, |x| &mut x.elem.value.op)
}
}
``````
Then elsewhere...
``````rust
                let new_forward_total = if reverse {
forward_total
} else {
resolve_context
.available_op()
.combine(forward_total, (*candidate).borrow().elem.value.value)
};
let new_reverse_total = if reverse {
resolve_context
.available_op()
.combine(reverse_total, (*candidate).borrow().elem.value.value)
} else {
reverse_total
};

match &mut *resolve_context.available_op() {
Op::Empty => panic!(),
};
``````

The rest of this comment are just from spot-checking the six areas you called out, not a more comprehensive review.

I think this is similar to the line 311 case below, but I have to admit I didn't suss out how exactly the items are being iterated.

It's an aspect of `if let`, not `if`. `if let` acts like a `match` and the temporaries last the entire block (including any `else` blocks).
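A small self-contained illustration of that rule, my own example (note that this describes editions up to 2021; the 2024 edition shortens these temporary lifetimes so they no longer span the `else` block):

```rust
// My own illustration: the temporary `Ref` created in an `if let`
// scrutinee lives until the end of the whole `if let`/`else`
// expression (editions up to 2021).
use std::cell::RefCell;

fn main() {
    let cell = RefCell::new(Some(1));
    if let Some(n) = *cell.borrow() {
        assert_eq!(n, 1);
        // `cell.borrow_mut()` here would panic at runtime:
        // the `Ref` from the scrutinee is still alive...
    } else {
        // ...and it is still alive in this `else` block too.
    }
    // After the `if let`, the temporary is gone and borrow_mut is fine.
    *cell.borrow_mut() = Some(2);
    assert_eq!(*cell.borrow(), Some(2));
    println!("ok");
}
```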

Mostly you're juggling drop order in combination with borrows that need to share a lifetime (because you assign them to `iter`) and thus can't be constrained to the loop (so you need to get them into `backing`).

Here's a version that avoids the double-cloning.

``````rust
        while let Some(concrete) = iter {
let concrete_borrow = (*concrete).borrow();
// Use `concrete_borrow` instead of `(*concrete).borrow()`
// everywhere else in the loops, then...
let step3 = concrete_borrow.next.as_ref();
let step4 = step3.map(|inner| Rc::clone(inner));
drop(concrete_borrow);
backing = step4;
iter = backing.as_ref();
}
``````

But better yet would be to define `Iterator`.

``````rust
fn step<T>(r: &Rc<RefCell<Node<T>>>) -> Option<Rc<RefCell<Node<T>>>> {
r.borrow().next.as_ref().map(Rc::clone)
}

struct IterExample<T>(Option<Rc<RefCell<Node<T>>>>);
impl<T> Iterator for IterExample<T> {
type Item = Rc<RefCell<Node<T>>>;
fn next(&mut self) -> Option<Self::Item> {
let this = self.0.take()?;
self.0 = step(&this);
Some(this)
}
}

// ... back in that closure ...

// iter and backing go away
// ...
// all the dance at the end goes away
``````

I don't know how good I would call it, but

``````rust
    let maybe_crb = called_recruitment_bookmark.as_ref().and_then(|concrete| {
let rc_op = &resolve_context
.available
.as_ref()
.unwrap()
.borrow()
.elem
.value
.op;
match (rc_op, &operation) {
(Op::Empty, _) | (_, Op::Empty) => unreachable!(),
(Op::Additive(_), Op::Additive(_))
| (Op::Multiplicative(_), Op::Multiplicative(_)) => {
local_availability.composed_count += 1;
Some(concrete)
}
_ => None,
}
});

let early_return =
!no_other_left || local_availability.composed_count >= local_availability.count;

let inherited_recruitment_bookmark = match maybe_crb {
Some(concrete) => concrete,
_ if early_return => {
resolve_context.free_list.push_node(work_node);
return;
}
_ => {
local_availability.other_composed_count = local_availability.composed_count;
local_availability.composed_count = 1;
}
};

let local_context = RightNodeContext {
inherited_recruitment_bookmark,
};
``````

In response to the easier questions:

• First, I apologize for the lack of comments. The thing is, what this needs is not so much inline comments as a big comment block at the top with multiple ASCII-art diagrams to detail the design, because everything falls out from that. For instance, the main redundancy-avoidance mechanism is that intermediate results (i.e. the `ReadyValue`s where `Op != Empty`) are built as part of layers: two intermediate results are in the same layer when the same number of intermediates will ultimately sit between either of them and the final result. This implies that they have the same `Op`, as the `Op` always alternates when going from one operation to its non-primitive constituents, if any. As a result, for instance, building an intermediate value with an `Op` different from the one at the top of the `available` stack implies we start a new layer; and that cannot happen if there are intermediate results left to recruit two layers down (the `other_composed_count != 0` case), because then they wouldn't "really" be two layers down. Primitives can be recruited at any time and so are not pre-organized as being part of a layer, even if they end up being part of one eventually.

• I also apologize for the haphazard indenting. I strove to be consistent, especially as the compiler uses indenting as a hint of my intent when reporting on e.g. mismatched braces. But when late in the process I had to convert `while` loops to `loop` + `if {} else { break; }` (and sometimes back, when that did not help), I have to admit I did not always update the indenting.

• The ancestor of this algorithm worked directly with binary operations and built binary trees where each node could be any of +-*/, and this is what I consider brute force; the algorithm I present is subtler, but it does perform an exhaustive search of the solution space without attempting to e.g. check more likely paths first or whatever. But the goal is to present all the possible solutions for… reasons.
• Good point, this is actually a special-case solver (all primitives must be consumed); for a true solver, you'd need a parent function that tries this algorithm with every possible subset of the primitives vector. You could instead modify the algorithm to have it compare every intermediate with the goal, but then you'd have massive duplication of results.

As for negative and fractional intermediates, that could very much be supported by making the algorithm generic over an `Algebraic` trait instead of hardcoding `u32` (in my experience, allowing negative intermediates isn't very interesting: you can always map those solutions back to all-positive ones; but fractional is worthwhile; you need the numerator and denominator to be `u64` in practice).
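A hedged sketch of what that genericity could look like (`Algebraic` as named above; every method name and the `Fraction` type here are my own illustration, not the actual code):

```rust
// My own sketch of the `Algebraic` idea: the solver's value type
// becomes a trait, so a u64-backed fraction could slot in later.
#[derive(Clone, Copy, PartialEq, Debug)]
struct Fraction {
    num: u64,
    den: u64,
}

trait Algebraic: Copy + PartialEq {
    fn add(self, other: Self) -> Self;
    fn mul(self, other: Self) -> Self;
    // Inverse operations may fail (negative or non-integral results).
    fn sub(self, other: Self) -> Option<Self>;
    fn div(self, other: Self) -> Option<Self>;
}

impl Algebraic for u32 {
    fn add(self, o: Self) -> Self { self + o }
    fn mul(self, o: Self) -> Self { self * o }
    fn sub(self, o: Self) -> Option<Self> {
        if self > o { Some(self - o) } else { None }
    }
    fn div(self, o: Self) -> Option<Self> {
        if o != 0 && self % o == 0 { Some(self / o) } else { None }
    }
}

fn gcd(a: u64, b: u64) -> u64 {
    if b == 0 { a } else { gcd(b, a % b) }
}

impl Fraction {
    fn new(num: u64, den: u64) -> Self {
        let g = gcd(num, den); // keep fractions reduced
        Fraction { num: num / g, den: den / g }
    }
}

impl Algebraic for Fraction {
    fn add(self, o: Self) -> Self {
        Fraction::new(self.num * o.den + o.num * self.den, self.den * o.den)
    }
    fn mul(self, o: Self) -> Self {
        Fraction::new(self.num * o.num, self.den * o.den)
    }
    fn sub(self, o: Self) -> Option<Self> {
        let (l, r) = (self.num * o.den, o.num * self.den);
        if l > r { Some(Fraction::new(l - r, self.den * o.den)) } else { None }
    }
    fn div(self, o: Self) -> Option<Self> {
        if o.num != 0 {
            Some(Fraction::new(self.num * o.den, self.den * o.num))
        } else {
            None
        }
    }
}

fn main() {
    assert_eq!(3u32.div(2), None); // not integral, as in the solver
    let half = Fraction::new(1, 2);
    let third = Fraction::new(1, 3);
    assert_eq!(half.mul(third), Fraction::new(1, 6));
    assert_eq!(half.sub(third), Some(Fraction::new(1, 6)));
    println!("ok");
}
```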

• Initially (in another language I wanted to experiment with, too) it was because "to see if I could"; but as I implemented it, I found out it delivered two key properties: I can perform all allocations at once in the setup and then save myself the trouble of allocating anything afterwards, and all actions performed are O(1) (even here, with reference counting and runtime borrow checking). Granted, N is never going to get big anyway (though with multithreading I hope to get it up to 9, maybe 10), but for some targets that matters a lot.
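The allocate-everything-up-front idea can also be sketched with an index-based pool (my own illustrative sketch; the actual code reuses list nodes via `free_list` instead):

```rust
// My own sketch: a pre-allocated pool where "allocation" afterwards is
// just popping an index off a free list -- every operation is O(1).
struct Pool {
    values: Vec<u32>,
    free: Vec<usize>, // indices of unused slots
}

impl Pool {
    fn with_capacity(n: usize) -> Self {
        Pool { values: vec![0; n], free: (0..n).rev().collect() }
    }
    fn alloc(&mut self, v: u32) -> Option<usize> {
        let idx = self.free.pop()?; // O(1), no system allocation
        self.values[idx] = v;
        Some(idx)
    }
    fn release(&mut self, idx: usize) {
        self.free.push(idx); // O(1)
    }
}

fn main() {
    let mut pool = Pool::with_capacity(2);
    let a = pool.alloc(75).unwrap();
    let b = pool.alloc(10).unwrap();
    assert_eq!(pool.alloc(2), None); // pool exhausted: no hidden allocation
    pool.release(a);
    assert!(pool.alloc(2).is_some()); // slot reused
    assert_eq!(pool.values[b], 10);
    println!("ok");
}
```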

Formatting is important, but so is not spending time getting it right yourself. Just run `cargo fmt` on your project and all files will be reformatted to the standard Rust formatting.

Thanks for these; in particular, "scanning the topmost op" is so common I ought to have factored it out. However, I have now done so as an impl on (a new trait on) `Link` (with two functions: one that returns `RefMut` and one `Ref`) rather than on `ResolveContext`, as in the latter case the borrow is on `ResolveContext` and that interferes with borrows of `ResolveContext::free_list`; the parts of `ResolveContext` needn't be any more consistent with one another than two local variables of a given function, so better not to implement anything on `ResolveContext` itself.

As for `Rc`, I was attempting to lean on borrows as being "the Rust way" and only clone when strictly necessary (especially as I assume the compiler is not going to collapse clones with drops… though maybe the LLVM backend will), but indeed, why bother when `Rc::clone` is there… any inefficiency there is going to be dwarfed by the runtime checking of borrows in `RefCell` anyway.

At line 311 I indeed should have considered an iterator; the thing is, I eschewed one at line 230 because I modify the linked list as part of the loop, which wouldn't be compatible with iterator semantics… and then I kept trying to do the same thing at line 311, where I definitely browse read-only, and failed to reconsider an iterator. Or at the very least a helper function, which would have better contained the lifetimes (how do you recognise a C programmer? He's the one who tries to solve a mess by piling on statements, rather than moving them out).

That's neat already. I realise now I shouldn't have stopped before Chapter 18 (though in my defense, when I got to "Fearless Concurrency" I thought "Well, maybe the rest I can get to later…").

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.