fn first_word(s: &str) -> &str {
    let bytes = s.as_bytes();
    for (i, &item) in bytes.iter().enumerate() {
        if item == b' ' {
            return &s[..i];
        }
    }
    &s[..]
}

fn main() {
    let my_string = String::from("hello world");
    // first_word works on slices of `String`s
    let word = first_word(&my_string[..]);

    let my_string_literal = "hello world";
    // first_word works on slices of string literals
    let word = first_word(&my_string_literal[..]);
    // Because string literals *are* string slices already,
    // this works too, without the slice syntax!
    let word = first_word(my_string_literal);
}
My question is, why does this one still work?
This next one is not included in the example code, but I tried it myself and it compiled without any issue:
fn main() {
    let size_of_ref_to_string = std::mem::size_of::<&String>();
    let size_of_slice = std::mem::size_of::<&str>();
    println!("the size of &String: {}", size_of_ref_to_string); // 8 bytes
    println!("the size of &str: {}", size_of_slice); // 16 bytes
}
We can answer this question from the perspective of memory size. If you have learned C or C++, you can regard a &String as a plain pointer, which just stores the starting address of the memory allocated for that String.
A string slice, on the other hand, is a reference to part of a String.
As for the slice, we call it a fat pointer. "Fat" means it has something extra besides the address info, namely the length (the size of the memory) of the part referred to by this slice.
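To make the "fat" part concrete, here is a minimal sketch (my own example, not from the Book) that prints the two components a &str carries:

fn main() {
    let s = String::from("hello world");
    let slice: &str = &s[0..5];
    // A &str bundles the address of the data and the length of the
    // part it refers to; both components are observable directly.
    println!("address: {:p}", slice.as_ptr());
    println!("length: {}", slice.len()); // 5
}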
Thanks for your reply, but what makes me confused is "expecting a &str but received a &String". Why does the compiler let this happen? Deref coercions should be the correct answer.
I don't know where the "expecting a &str but received a &String" error you mentioned came from, but such an error can happen when generics are involved. Generally, Rust will only perform a deref coercion if the target type is not generic.
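As an illustration, here is a sketch using a toy trait of my own (Word, not a standard library item) showing that a concrete &str parameter accepts a &String through deref coercion, while a generic parameter does not trigger the coercion:

trait Word {
    fn word(&self) -> &str;
}

impl Word for str {
    fn word(&self) -> &str {
        self.split_whitespace().next().unwrap_or("")
    }
}

// Concrete target type: the compiler coerces &String to &str here.
fn concrete(s: &str) -> &str {
    s.word()
}

// Generic target type: T is inferred from the argument as-is,
// and no deref coercion is attempted.
fn generic<T: Word + ?Sized>(s: &T) -> &str {
    s.word()
}

fn main() {
    let owned = String::from("hello world");

    println!("{}", concrete(&owned)); // fine: &String coerces to &str

    // This would NOT compile: T is inferred as String, which does not
    // implement Word, and no &String -> &str coercion is performed
    // for a generic parameter.
    // println!("{}", generic(&owned));

    // Passing an explicit &str works: T is inferred as str.
    println!("{}", generic(owned.as_str()));
}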