Unexpected behaviour in egui with [Option<Struct>; 2]

Consider these 3 simplified wrapper cases:

#[derive(Default, Clone)]
struct Xample {
    field: String,
}

fn take_xample(x: &mut Xample) {
    x.field = "Hello".to_string()
}

fn main() {
    let x = Xample::default();
    // Case 1: plain array of structs.
    let mut wrapper1: [Xample; 2] = [x.clone(), x.clone()];
    // Case 2: array of Options.
    let mut wrapper2: [Option<Xample>; 2] = [Some(x.clone()), Some(x.clone())];
    // Case 3: vector of Options.
    let mut wrapper3: Vec<Option<Xample>> = vec![Some(x.clone()), Some(x)];
    
    for i in 0..2 {
        take_xample(&mut wrapper1[i]);
        take_xample(&mut wrapper2[i].as_mut().unwrap());
        take_xample(&mut wrapper3[i].as_mut().unwrap());
    }
    for i in 0..2 {
        println!("{}", wrapper1[i].field);
        println!("{}", wrapper2[i].as_ref().unwrap().field);
        println!("{}", wrapper3[i].as_ref().unwrap().field);
    }
}

The actual function is an egui interface with widgets that modify the data in the struct.
The first case worked properly, but I wondered whether it would be better to wrap the structs in Options to save memory. Case 2, however, didn't work: the widgets in the interface could no longer be modified. Changing the array to a vector made it work again as expected.
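
Roughly, the real function has this shape (a sketch only, assuming the egui crate; the actual code has more widgets than this):

// Sketch, not the actual code: same shape as take_xample, but the
// widget mutates the struct through the &mut reference.
fn xample_ui(ui: &mut egui::Ui, x: &mut Xample) {
    ui.text_edit_singleline(&mut x.field);
}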

I don't understand why. Is there an explanation for this behaviour?

If you use reduced example code, please ensure that at least the type signatures illustrate the actual erroneous case; otherwise, you'd better provide the actual code in question.

Without the actual code shown, I can only guess, but if the function is generic, auto-deref coercion might interfere with type inference. In case you didn't realize, in your example the &mut in front of the arguments is redundant for case 2 and case 3 (as_mut() already gives the type Option<&mut Xample>, and unwrap() then gives &mut Xample, so the extra &mut produces &mut &mut Xample). It works because of auto-deref, but it might behave differently if take_xample() were generic.
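
A minimal sketch of what I mean (Editable and take_generic are hypothetical names, not from your code): with a concrete &mut Xample parameter the compiler can insert a deref coercion, but with a generic &mut T it infers T = &mut Xample and the trait bound fails.

#[derive(Default)]
struct Xample {
    field: String,
}

trait Editable {
    fn edit(&mut self);
}

impl Editable for Xample {
    fn edit(&mut self) {
        self.field = "Hello".to_string();
    }
}

// Generic counterpart of take_xample: T is inferred from the call site.
fn take_generic<T: Editable>(x: &mut T) {
    x.edit();
}

fn main() {
    let mut wrapper: [Option<Xample>; 2] = [Some(Xample::default()), Some(Xample::default())];

    // OK: as_mut().unwrap() yields &mut Xample, so T = Xample.
    take_generic(wrapper[0].as_mut().unwrap());

    // Does NOT compile if uncommented: the argument is &mut &mut Xample,
    // which matches &mut T exactly with T = &mut Xample, so no deref
    // coercion happens, and &mut Xample does not implement Editable.
    // take_generic(&mut wrapper[1].as_mut().unwrap());

    println!("{}", wrapper[0].as_ref().unwrap().field);
}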

Depending on what type T you put into Option<T>, it may or may not actually save memory. For example, an empty String does not allocate on the heap, and String has a niche (its internal pointer is never null), so Option<String> and String consume exactly the same amount of memory in practice.
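
You can check this yourself (the sizes in the comments assume a 64-bit target):

use std::mem::size_of;

fn main() {
    // String's non-null internal pointer gives Option a niche to store
    // the None case in, so no extra discriminant is needed.
    println!("String:         {}", size_of::<String>());         // 24
    println!("Option<String>: {}", size_of::<Option<String>>()); // 24

    // A type without a niche pays for the discriminant plus padding.
    println!("u64:            {}", size_of::<u64>());            // 8
    println!("Option<u64>:    {}", size_of::<Option<u64>>());    // 16
}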

Use the type system for its semantic implications, not for supposed "micro-optimizations". Except for a very few low-hanging fruits, optimization should always be based on measured performance data.


I suspected as much. Anyway, thanks for looking into it.
