The compiler throws the error:
expected trait object dyn A, found struct B
for the line _b: Vec::from([Box::new( B {} )]), yet it accepts the very same type when I switch to the vec! macro. I can't begin to wrap my head around why this is. Ideas?
BTW, I'm doing this in the process of creating a GUI system inspired by Flutter, where layout widgets, like Row and Column, have a vector of children.
trait A {}

struct B {}

impl A for B {}

struct C {
    _b: Vec<Box<dyn A>>,
}

fn fails() {
    let _c = C {
        _b: Vec::from([Box::new(B {})]),
    };
}

fn works() {
    let _c = C {
        _b: vec![Box::new(B {})],
    };
}

fn main() {
    fails();
    works();
}
The short version is that Vec::from gives you Vec<Box<B>>, and vec! gives Vec<Box<dyn A>>. You need to unsize the Box before creating the Vec.
The vec! macro works because it expands to <[_]>::into_vec, which resolves, based on the required output type, to <[Box<dyn A>]>::into_vec. Now each Box in the slice unsizes to match that element type, and everything works out.
When you call Vec::from, type information flows strictly from the inside out, so the array is inferred as [Box<B>; 1], and no unsizing coercion happens.
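To make the difference concrete, here is a sketch of the two call shapes (the real vec! expansion goes through internal macros, but into_vec is the call that matters):

```rust
trait A {}
struct B {}
impl A for B {}

fn main() {
    // What vec![Box::new(B {})] effectively becomes: the expected element
    // type Box<dyn A> flows inward, so the inner Box::new(B {}) is coerced.
    let _ok: Vec<Box<dyn A>> = <[_]>::into_vec(Box::new([Box::new(B {})]));

    // With Vec::from, inference runs inside-out: the array is [Box<B>; 1],
    // and Vec<Box<B>> is not coercible to Vec<Box<dyn A>>.
    // let _err: Vec<Box<dyn A>> = Vec::from([Box::new(B {})]); // mismatched types
}
```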
If you specify a concrete type of Box<dyn A> somewhere, rather than letting type inference fail in this case, it will all work out.
I believe this should be considered a type inference failure, and it could be reported as such on the Rust repo.
Thank you @CAD97 and @RedDocMD for shedding some light on the matter. I see now that the reason for the error isn't obvious. I do not, however, understand terms like unsizing Boxes and type coercion, so I'll spend some time reading through the docs and possibly create an issue with the Rust team afterwards.
Well, Box<B> and Box<dyn A> are two separate types, and converting from Box<B> to a Box<dyn A> is called an unsizing coercion. It sometimes happens automatically, and sometimes not.
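A minimal illustration of where the coercion does and doesn't fire, reusing the trait and struct from the question:

```rust
trait A {}
struct B {}
impl A for B {}

fn main() {
    // A let binding with an explicit type is a coercion site,
    // so Box<B> unsizes to Box<dyn A> automatically:
    let _a: Box<dyn A> = Box::new(B {});

    // Annotating the array's element type also works: the coercion
    // happens before Vec::from ever sees the array.
    let arr: [Box<dyn A>; 1] = [Box::new(B {})];
    let _v: Vec<Box<dyn A>> = Vec::from(arr);
}
```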
trait A {}

struct B {}

impl A for B {}

struct C {
    _b: Vec<Box<dyn A>>,
}

fn fails() {
    let _c = C {
        _b: Vec::from([Box::new(B {}) as Box<dyn A>]),
    };
}

fn works() {
    let _c = C {
        _b: vec![Box::new(B {})],
    };
}

fn main() {
    fails();
    works();
}
Using an explicit type cast got rid of the error. Thanks guys.
Array decay in C is a kind of unsizing coercion. When you have an array of a definite size, char a[5];, and you pass it to a function void f(char *a) that takes a pointer to an array of indefinite size, the compiler implicitly converts a to a pointer to its first element. (This, as you may know, causes many confused people to say silly things like "arrays and pointers are the same in C".)
In doing this implicit conversion (=coercion), the C compiler is "erasing" the array's size. Before the coercion the size is known at compile time; after the coercion, it is only known at runtime (and that only if you actually keep track of the size in a separate variable). A similar thing happens in Rust when you coerce &[u8; 5] to &[u8] - the major difference being that Rust actually keeps track of the runtime-known size for you, by making &[u8] a fat pointer that contains both the data pointer and the size.
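The Rust side of that analogy, as a quick sketch:

```rust
fn main() {
    let arr: [u8; 5] = [1, 2, 3, 4, 5];

    // Unsizing coercion: &[u8; 5] -> &[u8]. The compile-time size is
    // "erased" from the type, but the fat pointer keeps it at runtime:
    let slice: &[u8] = &arr;
    assert_eq!(slice.len(), 5);
}
```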
Coercing Box<B> to Box<dyn A> is similar, too: B has a compile-time-known size, and Box<B> is a pointer to it, which when necessary can be converted to a fat pointer to the unsized type dyn A. (Slices [T] and trait objects dyn Tr are the two kinds of unsized types in Rust.) The compiler "erases" (forgets) the concrete type B and only remembers that the type is something that implements A. The actual implementation (along with the size of B) is stored in a vtable, a pointer to which is added to the fat pointer Box<dyn A>.
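You can even observe the thin-vs-fat pointer difference in the pointer sizes themselves (the exact byte counts depend on the target, but the one-word vs two-word ratio holds):

```rust
use std::mem::size_of;

trait A {}
struct B {}
impl A for B {}

fn main() {
    // Box<B> is a thin pointer: just the address of the B.
    assert_eq!(size_of::<Box<B>>(), size_of::<usize>());

    // Box<dyn A> is a fat pointer: data pointer plus vtable pointer.
    assert_eq!(size_of::<Box<dyn A>>(), 2 * size_of::<usize>());
}
```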