Ref to Ref conversion for nested enums

Hi all :slight_smile:

I'm in a situation where I have multiple nested enums, like this:

enum Aenum {
    B(Benum),
    C(Cdata),
}

enum Benum {
    F(Fdata),
    G(Gdata),
    H(Hdata),
}

struct Fdata {
    number: usize,
    char: char,
    payload: String,
}

struct Gdata {
    number: usize,
}

struct Hdata {
    number: usize,
    payload: String,
}

struct Cdata {
    payload: String,
}

impl From<Cdata> for Aenum {
    fn from(value: Cdata) -> Self {
        Aenum::C(value)
    }
}

impl From<Benum> for Aenum {
    fn from(value: Benum) -> Self {
        Aenum::B(value)
    }
}

impl From<Fdata> for Benum {
    fn from(value: Fdata) -> Self {
        Benum::F(value)
    }
}

impl From<Gdata> for Benum {
    fn from(value: Gdata) -> Self {
        Benum::G(value)
    }
}

impl From<Hdata> for Benum {
    fn from(value: Hdata) -> Self {
        Benum::H(value)
    }
}

and now I can convert an Fdata into an Aenum in this way:

let f = Fdata {
    number: 1,
    char: 'c',
    payload: "Hello!".to_string(),
};

let a_example = Aenum::from(Benum::from(f));

Now I want to convert from &Fdata to &Aenum in this fashion:

fn convert(f_ref: &Fdata) -> &Aenum {
    todo!()
}

but the only solution (in safe Rust, at least) that I found is to build a mirrored enum structure like this:

enum AenumRef<'a> {
    B(BenumRef<'a>),
    C(&'a Cdata),
}

enum BenumRef<'a> {
    F(&'a Fdata),
    G(&'a Gdata),
    H(&'a Hdata),
}

impl<'a> From<&'a Cdata> for AenumRef<'a> {
    fn from(value: &'a Cdata) -> Self {
        AenumRef::C(value)
    }
}

impl<'a> From<BenumRef<'a>> for AenumRef<'a> {
    fn from(value: BenumRef<'a>) -> Self {
        AenumRef::B(value)
    }
}

impl<'a> From<&'a Fdata> for BenumRef<'a> {
    fn from(value: &'a Fdata) -> Self {
        BenumRef::F(value)
    }
}

impl<'a> From<&'a Gdata> for BenumRef<'a> {
    fn from(value: &'a Gdata) -> Self {
        BenumRef::G(value)
    }
}

impl<'a> From<&'a Hdata> for BenumRef<'a> {
    fn from(value: &'a Hdata) -> Self {
        BenumRef::H(value)
    }
}

And now I can convert from ref to ref like this:

fn convert_r<'a>(f_ref: &'a Fdata) -> AenumRef<'a> {
    AenumRef::B(BenumRef::from(f_ref))
}
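
For example, a quick usage check with the same data as before:

let f = Fdata {
    number: 1,
    char: 'c',
    payload: "Hello!".to_string(),
};

if let AenumRef::B(BenumRef::F(f_ref)) = convert_r(&f) {
    assert_eq!(f_ref.payload, "Hello!");
}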

My question is: is this the best way to convert from ref to ref in a situation like this, or is there a better, less verbose way to do it?

Thanks in advance for your time, guys

I don't have a direct answer to your question. But I wonder whether you're creating unnecessary abstractions by using the From trait here. If the body of each conversion is really just calling the constructor, I suggest using the constructor directly instead of the from or into methods. This is no more verbose to use, and avoids the creation of all the From implementations, for both refs and values.
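
For instance, a minimal sketch with the types from the original post:

let f = Fdata {
    number: 1,
    char: 'c',
    payload: "Hello!".to_string(),
};

// Direct construction: no From impls required.
let a_example = Aenum::B(Benum::F(f));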

Thank you for your answer!
I have to go a little off topic to explain why I used that abstraction: I want to be able to use the value structs in this way too:

fn use_enum<I : Into<Aenum>>(value: I) {
    let a_enum = value.into();
    // here i use a_enum 
}
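
For example, with the impls from my first post (note that the blanket Into comes from From, and From is not transitive, so a bare Fdata would additionally need a From<Fdata> for Aenum impl):

use_enum(Cdata { payload: "Hello!".to_string() });
use_enum(Benum::G(Gdata { number: 2 }));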

I see. I was about to suggest macros as the only solution, and then I remembered that there are crates that provide these macros. In particular:
https://docs.rs/derive_more/latest/derive_more/derive.From.html#example-usage

This seems to work (it compiles):

use derive_more::From;

#[derive(From)]
enum Aenum {
    B(Benum),
    C(Cdata),
}

#[derive(From)]
enum Benum {
    F(Fdata),
    G(Gdata),
    H(Hdata),
}

struct Fdata {
    number: usize,
    char: char,
    payload: String,
}

struct Gdata {
    number: usize,
}

struct Hdata {
    number: usize,
    payload: String,
}

struct Cdata {
    payload: String,
}

fn main() {
    let f = Fdata { number: 1, char: 'c', payload: "Hello!".to_string() };
    let _a_example = Aenum::from(Benum::from(f));
}

#[derive(From)]
enum AenumRef<'a> {
    B(BenumRef<'a>),
    C(&'a Cdata),
}

#[derive(From)]
enum BenumRef<'a> {
    F(&'a Fdata),
    G(&'a Gdata),
    H(&'a Hdata),
}

fn convert_r(f_ref: &Fdata) -> AenumRef<'_> {
    AenumRef::B(BenumRef::from(f_ref))
}

with this dependency in Cargo.toml:

derive_more = { version = "2", features = ["from"] }
There is no solution to go from actual &Fdata to actual &Aenum, as they have different layouts and sizes, etc.
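
For instance, you can compare the sizes (a sketch using the types from the original post; the exact numbers are unspecified):

use std::mem::size_of;

fn main() {
    // The sizes may even happen to coincide; either way, field offsets and
    // discriminant placement are not guaranteed to be compatible under the
    // default repr.
    println!("Fdata: {} bytes", size_of::<Fdata>());
    println!("Benum: {} bytes", size_of::<Benum>());
    println!("Aenum: {} bytes", size_of::<Aenum>());
}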

Thanks! This is very nice, since it spares me a lot of boilerplate for the struct-to-enum and enum-to-enum conversions, but the problem of the twin enum definitions remains, and that is what has the most impact on my model.

Hacky but working:

use std::mem;

enum Aenum {
    B(Benum),
    C(Cdata),
}

enum Benum {
    F(Fdata),
    G(Gdata),
    H(Hdata),
}

struct Fdata {
    number: usize,
    char: char,
    payload: String,
}

struct Gdata {
    number: usize,
}

struct Hdata {
    number: usize,
    payload: String,
}

struct Cdata {
    payload: String,
}

const HELLO: &str = "ciao";

fn main() {
    let f_data = Fdata {
        number: 0,
        char: 'a',
        payload: HELLO.to_string(),
    };

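    // Reinterpret &Fdata as &Aenum: this gambles that the Fdata payload and
    // the enum discriminants happen to line up in memory (the default repr
    // makes no such guarantee).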
    let a_enum_ref: &Aenum = unsafe { mem::transmute(&f_data) };

    let payload = match a_enum_ref {
        Aenum::B(Benum::F(data)) => data.payload.as_str(),
        _ => "no result",
    };

    assert_eq!(payload, HELLO);
}

PS: In more complex situations this approach doesn't work

This is definitely UB, although Miri apparently doesn't catch it? I guess the enum tag is put in the padding bytes, and they apparently get zeroed, which is exactly the variant you want. (Miri should still catch it.)
Without the String in the struct, this segfaults.

Thanks for the reply! I agree with that; in fact I could add unused fields to make the layouts match every time, but for obvious reasons that's not exactly what I want! :slightly_smiling_face:

Yep, definitely unsound. Slap repr(C) in there and run Miri (under Tools, upper right, on the playground), and recognize that the default representation is unspecified and subject to change (e.g. it could be the same as repr(C)).

Not with the default representation.

With the deterministic representations, you could perhaps line up some padding bytes to be where the discriminants are, but that would still be unsound as padding is generally uninitialized data. Or you could expand a reference on the assumption it's inside the nested enum, but that's also UB via violating provenance (not to mention obviously invalid when the assumption is false).

If you can say more about exactly how you will use the ref->ref conversions at a somewhat higher level, there will probably be a new set of suggestions for how to meet your needs without using unsafe.

I very much agree: this approach is not advisable, and you'd need a deterministic representation to make it work every time. That wasn't the point here, though; I was just experimenting in the direction of a more creative working solution, despite the fact that it's horrible from many points of view. I think I will mark your previous post as the solution. It seems the twin enums approach is the only acceptable way to proceed here (in my case, at least).

Sadly, I can't. At this point it would essentially be enough for me to maintain a type equivalence (or implicit conversion) between &Aenum and AenumRef (as happens between &String and &str, for example), but this again seems impossible to do (if you think I'm wrong, please let me know). So I have to use both &Aenum and AenumRef (think, for example, of filtering from an iterator) and make use of a conversion method between &Aenum and AenumRef.
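
For reference, the conversion method I have in mind is something like this sketch (reusing the AenumRef/BenumRef mirror types from my first post):

impl<'a> From<&'a Benum> for BenumRef<'a> {
    fn from(value: &'a Benum) -> Self {
        match value {
            Benum::F(f) => BenumRef::F(f),
            Benum::G(g) => BenumRef::G(g),
            Benum::H(h) => BenumRef::H(h),
        }
    }
}

impl<'a> From<&'a Aenum> for AenumRef<'a> {
    fn from(value: &'a Aenum) -> Self {
        match value {
            Aenum::B(b) => AenumRef::B(b.into()),
            Aenum::C(c) => AenumRef::C(c),
        }
    }
}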

Commonly, you use a reference whenever you don't need ownership, and a value only when you do need ownership. This can reduce the number of APIs needed.

It also helps quite a bit if you can use a Copy bound (or Clone, if it is not too expensive), since then an API that takes references can convert to an owned value when necessary.
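
As a sketch of what this can look like (hypothetical helper, and it assumes #[derive(Clone)] is added to Cdata):

// Borrow for inspection; clone only at the point where ownership is needed.
fn collect_cdata(value: &Aenum, out: &mut Vec<Cdata>) {
    if let Aenum::C(c) = value {
        out.push(c.clone()); // owned copy happens only here
    }
}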
