How do I declare that two generic types are different?

I'm trying to "inherit" an upstream struct by using "composition", like:

use std::ops::{Deref, DerefMut};

struct Patched<O, E>(O, E);

// as_ref to O and E respectively

impl<O, E> AsRef<O> for Patched<O, E> {
    fn as_ref(&self) -> &O {
        &self.0
    }
}

impl<O, E> AsMut<O> for Patched<O, E> {
    fn as_mut(&mut self) -> &mut O {
        &mut self.0
    }
}

// won't compile, conflicting implementation for `Patched<_, _>`
// impl<O, E> AsRef<E> for Patched<O, E> {
//     fn as_ref(&self) -> &E {
//         &self.1
//     }
// }


// deref to origin type to make existing code work without modify
impl<O, E> Deref for Patched<O, E> {
    type Target = O;

    fn deref(&self) -> &Self::Target {
        &self.0
    }
}

impl<O, E> DerefMut for Patched<O, E> {
    fn deref_mut(&mut self) -> &mut Self::Target {
        &mut self.0
    }
}
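For context, here is a runnable sketch of that Deref part (my own, with placeholder types String and u32), showing how deref coercion lets existing code call the inner type's methods on the wrapper unmodified:

```rust
use std::ops::{Deref, DerefMut};

struct Patched<O, E>(O, E);

impl<O, E> Deref for Patched<O, E> {
    type Target = O;

    fn deref(&self) -> &O {
        &self.0
    }
}

impl<O, E> DerefMut for Patched<O, E> {
    fn deref_mut(&mut self) -> &mut O {
        &mut self.0
    }
}

fn main() {
    let mut p = Patched(String::from("hi"), 42u32);
    // Deref coercion: method calls on the wrapper fall through to String.
    assert_eq!(p.len(), 2);
    p.push('!');
    assert_eq!(p.as_str(), "hi!");
}
```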

But the following code works as expected.

struct Pair(i32, u32);

impl AsRef<i32> for Pair {
    fn as_ref(&self) -> &i32 {
        &self.0
    }
}

impl AsRef<u32> for Pair {
    fn as_ref(&self) -> &u32 {
        &self.1
    }
}

How can I declare that the two generic types O and E are different?
A proc_macro might work, but is there any solution without macros? Something like `where O != E`?

Update1:

I worked around this by wrapping O and E in two different container types, both of which deref to the inner object.

macro_rules! container_struct {
    ($container_name: ident) => {
        #[derive(Debug, Clone)]
        #[repr(transparent)]
        struct $container_name<T>(T);

        impl<T> Deref for $container_name<T> {
            type Target = T;

            fn deref(&self) -> &Self::Target {
                &self.0
            }
        }

        impl<T> DerefMut for $container_name<T> {
            fn deref_mut(&mut self) -> &mut Self::Target {
                &mut self.0
            }
        }
    };
}

Then the struct Patched turns into:

container_struct!(TO);
container_struct!(TE);


struct Patched<O, E>(TO<O>, TE<E>);

impl<O, E> AsRef<TE<E>> for Patched<O, E> {
    fn as_ref(&self) -> &TE<E> {
        &self.1
    }
}

impl<O, E> AsRef<TO<O>> for Patched<O, E> {
    fn as_ref(&self) -> &TO<O> {
        &self.0
    }
}

Deref on the container types then yields O and E respectively.
Since TO<O> and TE<E> are distinct types even when O == E, the two as_ref impls no longer conflict.
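As a quick check (my own sketch, with the macro expanded by hand and only Deref kept), the two AsRef impls stay unambiguous even when O and E are the same type, because naming the wrapper type picks the impl:

```rust
use std::ops::Deref;

#[repr(transparent)]
struct TO<T>(T);

#[repr(transparent)]
struct TE<T>(T);

impl<T> Deref for TO<T> {
    type Target = T;
    fn deref(&self) -> &T {
        &self.0
    }
}

impl<T> Deref for TE<T> {
    type Target = T;
    fn deref(&self) -> &T {
        &self.0
    }
}

struct Patched<O, E>(TO<O>, TE<E>);

impl<O, E> AsRef<TO<O>> for Patched<O, E> {
    fn as_ref(&self) -> &TO<O> {
        &self.0
    }
}

impl<O, E> AsRef<TE<E>> for Patched<O, E> {
    fn as_ref(&self) -> &TE<E> {
        &self.1
    }
}

fn main() {
    // O and E are both i32 here, yet TO<i32> and TE<i32> are distinct,
    // so each as_ref call resolves to exactly one impl.
    let p = Patched(TO(1i32), TE(2i32));
    let o: &TO<i32> = p.as_ref();
    let e: &TE<i32> = p.as_ref();
    assert_eq!(**o + **e, 3);
}
```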

You can't. In general, negative reasoning (negative trait bounds, type-level inequality, negative impls) is Hard™.

The newtype pattern you solved this with is probably the best you can do.

