Can I "ignore" associated types when implement a trait?

Please check the P.S. part for more details.

Let's say we have a trait like this:

trait Foo {
    type MyType;
    fn test(&self, v: &Self::MyType);
}

and in my case, the v parameter will not be used for most types; in other words, most implementations of Foo will act the same no matter what MyType is:

impl Foo for char {
    type MyType = ??? ;

    fn test(&self, _v: &Self::MyType) {
        println!("hello");
    }
}

So I modified my code like this:

struct MyTypeImpl {
    id: usize,
}

trait Foo<T> {
    type MyType;
    fn test(&self, v: &Self::MyType);
}

impl<T> Foo<T> for u32 {
    type MyType = MyTypeImpl;

    fn test(&self, v: &Self::MyType) {
        println!("hello my type {}", v.id)
    }
}

impl<T> Foo<T> for char {
    type MyType = T;

    fn test(&self, _v: &Self::MyType) {
        println!("hello");
    }
}

fn main() {
    test_fn::<MyTypeImpl>();
}

fn test_fn<T>() {
    let t1 = MyTypeImpl { id: 1 };
    let i: u32 = 1;
    i.test(&t1);

    let c: char = 'c';
    c.test(&t1);
}

but this code will not compile:

error[E0282]: type annotations needed
  --> src/main.rs:33:7
   |
33 |     i.test(&t1);
   |       ^^^^ cannot infer type for type parameter `T`

So is there any way I can just ignore the associated type? Will the new GATs feature help with this?

P.S. The code alone doesn't seem to make clear what I ran into and what I want, so here are more details:
I have a serialization crate, which declares a trait that decodes bytes into certain types:

pub trait DeSerialization: Sized {
    fn decode(ptr: &mut *const u8, ctx: &DecodeContext) -> Result<Self, DecodeError>;
}

And I have already implemented it for a lot of basic and common types, so there's no need for users to implement those themselves; none of these types need ctx to decode the byte stream.

But there are cases where users want to implement the trait manually for a type, usually a custom struct. The DecodeContext stores some extra information for types that cannot be decoded from the bytes alone (this is weird, but I won't explain it here; it's too complicated and has nothing to do with this question).

But I found that the DecodeContext I defined will never meet all the requirements, so I want to let the user define it. That's why I want to use an associated type:

pub trait DeSerialization: Sized {
    type DecodeContext;
    fn decode(ptr: &mut *const u8, ctx: &Self::DecodeContext) -> Result<Self, DecodeError>;
}

Then it turns into the problem I asked about. Remember that I implemented the trait for a lot of basic and common types before; if I change the trait, all of those implementations will need an associated type, too. Since all of these trait implementations are pre-defined by me, the associated type cannot be one specific type; it somehow has to be generic.
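
To make the dilemma concrete, here is a minimal sketch (DecodeError is reduced to an empty struct and the u32 body is stubbed out; only the choice of DecodeContext matters here):

pub struct DecodeError;

pub trait DeSerialization: Sized {
    type DecodeContext;
    fn decode(ptr: &mut *const u8, ctx: &Self::DecodeContext) -> Result<Self, DecodeError>;
}

// A pre-defined impl now has to commit to *some* concrete context type,
// even though it never reads ctx.
impl DeSerialization for u32 {
    type DecodeContext = ();

    fn decode(_ptr: &mut *const u8, _ctx: &Self::DecodeContext) -> Result<Self, DecodeError> {
        Ok(0) // actual byte reading elided
    }
}

Whatever concrete type the pre-defined impls pick here won't match whatever DecodeContext a user chooses for their own types.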

Macros would help here, but that's the last route I want to take.

Another way is to give up on associated types and pass a pointer instead, then cast it to the type the user defines. That's also a way I don't want to go; I'm trying to avoid unsafe code as much as possible.
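
For clarity, here is a rough sketch of that pointer-cast alternative (UserCtx and UserType are just invented names for illustration); the unchecked cast is exactly the unsafety I'd rather avoid:

pub struct DecodeError;

pub trait DeSerialization: Sized {
    // The context is erased to a raw pointer; every impl has to know what it
    // "really" points to.
    fn decode(ptr: &mut *const u8, ctx: *const ()) -> Result<Self, DecodeError>;
}

struct UserCtx {
    lookup: Vec<String>,
}

struct UserType(String);

impl DeSerialization for UserType {
    fn decode(_ptr: &mut *const u8, ctx: *const ()) -> Result<Self, DecodeError> {
        // Nothing stops a caller from passing a pointer to the wrong context type.
        let ctx = unsafe { &*(ctx as *const UserCtx) };
        Ok(UserType(ctx.lookup[0].clone()))
    }
}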

You don't really want to make your trait generic. You're not using the parameter in the u32 case, and that's why the compiler had to ask you which T you meant. The fact that you put a parameter on test_fn too doesn't matter; generics aren't unified just because they share a name, similar to how your variables and the parameters of the functions you call aren't tied together just because they have the same name.

Something to not do

You could make your example work by doing this, but again, it's not really what you want. You'd have to specify that unused parameter everywhere.

fn test_fn<T>() {
    let t1 = MyTypeImpl { id: 1 };
    let i: u32 = 1;
    <_ as Foo<T>>::test(&i, &t1);

    let c: char = 'c';
    c.test(&t1);
}

You can just use a dummy type like () when you don't care. Sadly, associated type defaults are not stable yet, but you can use a helper trait for a similar effect.
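
As a sketch of the helper-trait idea (SimpleFoo is just a name invented here), assuming the "don't care" implementers all go through it: a blanket impl fills () in for them.

trait Foo {
    type MyType;
    fn test(&self, v: &Self::MyType);
}

// Helper trait for implementers that never look at the value.
trait SimpleFoo {
    fn simple_test(&self);
}

// Blanket impl: every SimpleFoo type gets a Foo impl with () filled in.
impl<S: SimpleFoo> Foo for S {
    type MyType = ();

    fn test(&self, _v: &Self::MyType) {
        self.simple_test();
    }
}

impl SimpleFoo for char {
    fn simple_test(&self) {
        println!("hello");
    }
}

With these in place, 'c'.test(&()) resolves through the blanket impl; keep in mind that coherence limits which other Foo impls can coexist with a blanket impl like this.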

You don't really want to make your trait generic.

In my case I really do want my trait to be generic in some way, so I can't use a dummy type even if associated type defaults were stable, because I do want to pass a MyTypeImpl (or any other type) as the parameter, even though the char implementation doesn't use it. If I set the char implementation's associated type to (), then the parameter type won't match.

Your example does compile, and I agree the syntax is too weird; is there any better solution to make the code less odd?

Just use the unit type () in impls which do not use this associated type? You can even add a method to improve ergonomics for such cases:

trait Foo {
    type MyType;
    fn test(&self, v: &Self::MyType);
    fn test_unit(&self)
    where Self: Foo<MyType = ()>
    {
        self.test(&());
    }
}
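
As a usage sketch of that suggestion, applied to the char example from the question:

impl Foo for char {
    type MyType = ();

    fn test(&self, _v: &Self::MyType) {
        println!("hello");
    }
}

fn main() {
    let c = 'c';
    c.test_unit(); // no dummy value at the call site; allowed because MyType = ()
    c.test(&());   // still available if you want to pass the unit explicitly
}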

I am not sure I understand your use case. How about introducing your own wrapper type and using it instead of the unit type?

struct Ignore<T>(T);
trait Foo {
    // Note: an associated type default like this needs the unstable
    // `associated_type_defaults` feature (nightly only).
    type MyType = Ignore<Self>;
    fn test(&self, v: &Self::MyType);
}

I suspect this is prioritizing the wrong things and I still don't really recommend it, but you could just ditch the associated type and use a generic trait parameter instead. Then you can call implementers that don't care with a reference to anything.

// Generic but no associated type 
trait Foo<T: ?Sized> {
    fn test(&self, v: &T);
}

// Some types implement for specific things
impl Foo<MyTypeImpl> for u32 { /* ... */ }

// Other types implement for any thing
impl<T: ?Sized> Foo<T> for char { /* ... */ }

// Because `&T` is a function parameter now, the compiler can figure it out
fn test_fn() {
    let t1 = MyTypeImpl { id: 1 };
    let i: u32 = 1;
    i.test(&t1);

    let c: char = 'c';
    c.test(&t1);
}

Playground. This pattern does make sense if, say, u32 needed to test more than one specific type; but if all your use cases are "test one specific type" or "test no specific type", the associated type that might be a dummy () makes more sense IMO.
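
For concreteness, a sketch of that "u32 tests more than one specific type" case (OtherCtx is just an invented stand-in):

struct MyTypeImpl { id: usize }
struct OtherCtx { name: &'static str }

trait Foo<T: ?Sized> {
    fn test(&self, v: &T);
}

// With the generic parameter, u32 can test against several context types.
impl Foo<MyTypeImpl> for u32 {
    fn test(&self, v: &MyTypeImpl) {
        println!("hello my type {}", v.id);
    }
}

impl Foo<OtherCtx> for u32 {
    fn test(&self, v: &OtherCtx) {
        println!("hello {}", v.name);
    }
}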

But I could just be missing context.


Hi, thanks for helping, but the generic trait doesn't help: in my example I need to access the field id in MyTypeImpl, and if the parameter is generic, I can't access that field.

The question seems weird, so I'm adding more details to explain what I ran into and what I want.

In @quinedot’s example, you can freely access the fields of MyTypeImpl within the impl Foo<MyTypeImpl> for u32 block because it’s defined as specific to that particular type.

You won’t be able to do that with the more generic implementations (e.g. impl<T: ?Sized> Foo<T> for char) because the type passed in won’t necessarily be MyTypeImpl, but I don’t think that’s necessary given your problem description.
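
As a quick sketch filling in the elided bodies from that example, just to show where field access is and isn't possible:

impl Foo<MyTypeImpl> for u32 {
    fn test(&self, v: &MyTypeImpl) {
        // v is known to be MyTypeImpl here, so its fields are usable.
        println!("hello my type {}", v.id);
    }
}

impl<T: ?Sized> Foo<T> for char {
    fn test(&self, _v: &T) {
        // T is opaque here: no fields can be touched, which is fine because
        // this impl ignores the value anyway.
        println!("hello");
    }
}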


I'm sorry, I seem to have misunderstood your suggestion. As @2e71828 explains, it does solve my problem in the example; I'll need to test it in my real project to see whether it works there as well.
