Sorry for the vague title; I couldn't think of a good way of phrasing it. Basically, I have the following setup:
A trait T implemented for a struct S, and a struct Node, set up in the following way:
trait T {
    fn foo(&self);
}

struct Node {
    objects: Vec<Box<dyn T>>,
    start: usize,
    end: usize,
    left: Box<dyn T>,
    right: Box<dyn T>,
}

impl T for Node {
    fn foo(&self) {
        println!("T implemented for Node");
    }
}

impl Node {
    fn new(objects: Vec<Box<dyn T>>, start: usize, end: usize) -> Self {
        let l = end - start;
        let (left, right) = match l {
            1 => (objects[0], objects[0]),
            2 => (objects[0], objects[1]),
            _ => {
                let mid = objects.len() / 2;
                (Box::<Node>::from(Node { objects: objects, start: 0, end: mid, left: objects, right: objects }),
                 Box::<Node>::from(Node { objects: objects, start: mid, end: l, left: objects, right: objects }))
            }
        };
        Node { objects: objects, start: 0, end: l, left: left, right: right }
    }
}
So, the Node struct basically tries to implement a balanced tree. The most generic type of child of a given node (held in the left and right fields) is Box<dyn T> (for leaf nodes), but it will usually be Box<Node>.
Assuming I can guarantee that objects always has length at least 1, the issue is in the match expression. The first two arms return (Box<dyn T>, Box<dyn T>) (or references to them; I'm not worried about that just yet), but the default arm returns (Box<Node>, Box<Node>).
My question is: how can I get the compiler to recognise that T is implemented for Node, and that these therefore aren't really distinct types?
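For reference, outside of Node::new I can get the coercion I'd expect when the target type is spelled out explicitly. A minimal standalone sketch (here S is a unit struct standing in for my real leaf type, and foo returns a &'static str purely so the sketch is checkable; neither detail is from my actual code):

```rust
trait T {
    fn foo(&self) -> &'static str;
}

// Stand-in for the leaf struct S from the question.
struct S;

impl T for S {
    fn foo(&self) -> &'static str {
        "leaf"
    }
}

struct Node {
    left: Box<dyn T>,
    right: Box<dyn T>,
}

impl T for Node {
    fn foo(&self) -> &'static str {
        "node"
    }
}

// Because the return type is written out as Box<dyn T>, an unsizing
// coercion turns the Box<S> and Box<Node> produced by the two arms
// into the trait-object type, so the arms unify and this compiles.
fn make_child(len: usize) -> Box<dyn T> {
    match len {
        1 => Box::new(S),
        _ => Box::new(Node {
            left: Box::new(S),
            right: Box::new(S),
        }),
    }
}

fn main() {
    assert_eq!(make_child(1).foo(), "leaf");
    assert_eq!(make_child(3).foo(), "node");
    println!("ok");
}
```

So the coercion clearly exists when the target type is annotated; what I can't see is how to get the same unification to happen for the tuple produced by the match in Node::new.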