Language inconsistency when implementing a trait?

Consider the code below, which defines a trait FooOps, an associated type FooOps::Bar, a type RcBar that can be substituted for the associated type, and two TryFrom impls to interconvert between Rc<T> and RcBar<T>. This code compiles fine.


use serde::{Deserialize, Serialize};
use serde_derive;
use std::convert::{TryFrom, TryInto};
use std::rc::Rc;

pub enum FooError { }

pub trait FooOps: Sized + PartialEq {
    type Bar: PartialEq + Clone + std::fmt::Debug
        + Serialize
        + for<'de> Deserialize<'de>
        + TryFrom<Self, Error = FooError>
        + TryInto<Self, Error = FooError>;
}


impl<T> FooOps for Rc<T>
where T: FooOps + PartialEq + Clone + std::fmt::Debug
    + for<'de> serde::Deserialize<'de>
    + serde::Serialize
{
    type Bar = RcBar<T>;
}

impl<T> TryFrom<RcBar<T>> for Rc<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(delta: RcBar<T>) -> Result<Self, Self::Error> {
        delta.0.try_into().map(Rc::new)
    }
}


#[derive(Clone, Debug, PartialEq)]
#[derive(serde_derive::Deserialize, serde_derive::Serialize)]
pub struct RcBar<T: FooOps>(<T as FooOps>::Bar);

impl<T> TryFrom<Rc<T>> for RcBar<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(rc: Rc<T>) -> Result<Self, Self::Error> {
        rc.as_ref().clone().try_into().map(Self)
    }
}

// impl<T> FooOps for Box<T>
// where T: FooOps + PartialEq + Clone + std::fmt::Debug
//     + for<'de> serde::Deserialize<'de>
//     + serde::Serialize
// {
//     type Bar = BoxBar<T>;
// }

// impl<T> TryFrom<BoxBar<T>> for Box<T>
// where T: Clone + PartialEq + FooOps + std::fmt::Debug {
//     type Error = FooError;

//     fn try_from(delta: BoxBar<T>) -> Result<Self, Self::Error> {
//         delta.0.try_into().map(Box::new)
//     }
// }


// #[derive(Clone, Debug, PartialEq)]
// #[derive(serde_derive::Deserialize, serde_derive::Serialize)]
// pub struct BoxBar<T: FooOps>(<T as FooOps>::Bar);

// impl<T> TryFrom<Box<T>> for BoxBar<T>
// where T: Clone + PartialEq + FooOps + std::fmt::Debug {
//     type Error = FooError;

//     fn try_from(rc: Box<T>) -> Result<Self, Self::Error> {
//         rc.as_ref().clone().try_into().map(Self)
//     }
// }


fn main() { }

The inconsistency comes in when I try to do the same for Box<T> as I do for Rc<T>:


use serde::{Deserialize, Serialize};
use serde_derive;
use std::convert::{TryFrom, TryInto};
use std::rc::Rc;

pub enum FooError { }

pub trait FooOps: Sized + PartialEq {
    type Bar: PartialEq + Clone + std::fmt::Debug
        + Serialize
        + for<'de> Deserialize<'de>
        + TryFrom<Self, Error = FooError>
        + TryInto<Self, Error = FooError>;
}


impl<T> FooOps for Rc<T>
where T: FooOps + PartialEq + Clone + std::fmt::Debug
    + for<'de> serde::Deserialize<'de>
    + serde::Serialize
{
    type Bar = RcBar<T>;
}

impl<T> TryFrom<RcBar<T>> for Rc<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(delta: RcBar<T>) -> Result<Self, Self::Error> {
        delta.0.try_into().map(Rc::new)
    }
}


#[derive(Clone, Debug, PartialEq)]
#[derive(serde_derive::Deserialize, serde_derive::Serialize)]
pub struct RcBar<T: FooOps>(<T as FooOps>::Bar);

impl<T> TryFrom<Rc<T>> for RcBar<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(rc: Rc<T>) -> Result<Self, Self::Error> {
        rc.as_ref().clone().try_into().map(Self)
    }
}



impl<T> FooOps for Box<T>
where T: FooOps + PartialEq + Clone + std::fmt::Debug
    + for<'de> serde::Deserialize<'de>
    + serde::Serialize
{
    type Bar = BoxBar<T>;
}

impl<T> TryFrom<BoxBar<T>> for Box<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(delta: BoxBar<T>) -> Result<Self, Self::Error> {
        delta.0.try_into().map(Box::new)
    }
}


#[derive(Clone, Debug, PartialEq)]
#[derive(serde_derive::Deserialize, serde_derive::Serialize)]
pub struct BoxBar<T: FooOps>(<T as FooOps>::Bar);

impl<T> TryFrom<Box<T>> for BoxBar<T>
where T: Clone + PartialEq + FooOps + std::fmt::Debug {
    type Error = FooError;

    fn try_from(rc: Box<T>) -> Result<Self, Self::Error> {
        rc.as_ref().clone().try_into().map(Self)
    }
}


fn main() { }

It rejects the TryFrom impls with these compile errors:

error[E0119]: conflicting implementations of trait `std::convert::TryFrom<BoxBar<_>>` for type `std::boxed::Box<_>`:
  --> src/main.rs:59:1
   |
59 | / impl<T> TryFrom<BoxBar<T>> for Box<T>
60 | | where T: Clone + PartialEq + FooOps + std::fmt::Debug {
61 | |     type Error = FooError;
62 | |
...  |
65 | |     }
66 | | }
   | |_^
   |
   = note: conflicting implementation in crate `core`:
           - impl<T, U> std::convert::TryFrom<U> for T
             where U: std::convert::Into<T>;
   = note: downstream crates may implement trait `std::convert::From<BoxBar<_>>` for type `std::boxed::Box<_>`

error[E0119]: conflicting implementations of trait `std::convert::TryFrom<std::boxed::Box<_>>` for type `BoxBar<_>`:
  --> src/main.rs:73:1
   |
73 | / impl<T> TryFrom<Box<T>> for BoxBar<T>
74 | | where T: Clone + PartialEq + FooOps + std::fmt::Debug {
75 | |     type Error = FooError;
76 | |
...  |
79 | |     }
80 | | }
   | |_^
   |
   = note: conflicting implementation in crate `core`:
           - impl<T, U> std::convert::TryFrom<U> for T
             where U: std::convert::Into<T>;
   = note: downstream crates may implement trait `std::convert::From<std::boxed::Box<_>>` for type `BoxBar<_>`

error[E0210]: type parameter `T` must be covered by another type when it appears before the first local type (`BoxBar<T>`)
  --> src/main.rs:59:6
   |
59 | impl<T> TryFrom<BoxBar<T>> for Box<T>
   |      ^ type parameter `T` must be covered by another type when it appears before the first local type (`BoxBar<T>`)
   |
   = note: implementing a foreign trait is only possible if at least one of the types for which is it implemented is local, and no uncovered type parameters appear before that first local type
   = note: in this case, 'before' refers to the following order: `impl<..> ForeignTrait<T1, ..., Tn> for T0`, where `T0` is the first and `Tn` is the last

error: aborting due to 3 previous errors

Some errors have detailed explanations: E0119, E0210.
For more information about an error, try `rustc --explain E0119`.
error: could not compile `playground`.

I haven't yet been able to figure out why, especially since, apart from the names, the code for Box<T> is a verbatim clone of the code for Rc<T>, which is accepted.

From this I infer that there is a difference between Box<T> and Rc<T> somewhere, maybe even in the blanket TryFrom impl, but I can't pin it down exactly. This is frustrating because at the moment it looks like an inconsistency in Rust's behavior.
Can anyone tell me precisely what is going on that makes rustc accept the code for Rc<T> but reject the same code for Box<T>?

I can tell you what the difference is, but not why:

#[fundamental]

Box has it, Rc doesn't.

RFC 1023 describes it, but it's a bit beyond me at the moment. All I remember is that it changes the coherence rules in some way.
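To make the effect of #[fundamental] concrete, here's a minimal sketch (the type Local and the Display impl are purely illustrative, not from the code above): because Box is #[fundamental], Box<LocalType> counts as a local type, so you may implement foreign traits for it, while the analogous impl for Rc is rejected.

```rust
use std::fmt;

struct Local;

// Accepted: Box is #[fundamental], so Box<Local> counts as a local
// type and may receive impls of foreign traits such as Display.
impl fmt::Display for Box<Local> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "boxed local")
    }
}

// Rejected with E0117 if you try the same for Rc, because Rc is not
// #[fundamental] and Rc<Local> is therefore not a local type:
// impl fmt::Display for std::rc::Rc<Local> { /* ... */ }

fn main() {
    assert_eq!(format!("{}", Box::new(Local)), "boxed local");
}
```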


Thank you for this answer. I'm going to escalate this to internals.rust-lang.org because I never would have guessed this, and the RFC isn't overly clear on why it exists in the first place.

Basically, #[fundamental] allows foreign generic types to be treated as types you created, for the purpose of writing impls, under some conditions.

If there is one type parameter, then as long as you created that type, you can treat fundamental types as your own for the purposes of writing impls.

I.e., Box<MyType> is considered a local type, the same as &mut MyType.

Local types also get some negative reasoning for unimplemented traits, which is why your example works for Box.

How I wish that were true. If Box<T> were treated as a local type, then the issue shouldn't have popped up in the first place: the trait is foreign but the type is local, so where exactly is the orphan rule violation?

No, it treats Rc<T> and Arc<T> as local, but definitely not Box<T>. And therein lies the inconsistency.
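The asymmetry can be reproduced without serde in a minimal sketch (Bar and BarError are my stand-in names for RcBar and FooError):

```rust
use std::convert::TryFrom;
use std::rc::Rc;

#[derive(Debug)]
pub struct BarError;

pub struct Bar<T>(T);

// Accepted: Rc is not #[fundamental], so Rc<T> covers the parameter T,
// and no downstream crate could ever treat Rc<TheirType> as local in
// order to add a conflicting From<Bar<_>> impl.
impl<T> TryFrom<Bar<T>> for Rc<T> {
    type Error = BarError;

    fn try_from(bar: Bar<T>) -> Result<Self, Self::Error> {
        Ok(Rc::new(bar.0))
    }
}

// Rejected with E0119 + E0210 if Rc is replaced by Box: because Box is
// #[fundamental], Box<T> is transparent to coherence, leaving T
// uncovered, and a downstream crate owning a type D could legally write
// `impl From<Bar<D>> for Box<D>`, which would collide with the std
// blanket `impl<T, U: Into<T>> TryFrom<U> for T`.
// impl<T> TryFrom<Bar<T>> for Box<T> { /* ... */ }

fn main() {
    let rc = Rc::<i32>::try_from(Bar(7)).unwrap();
    assert_eq!(*rc, 7);
}
```

So, counter-intuitively, it is precisely because #[fundamental] gives downstream crates extra power over Box that the upstream impl for Box is rejected.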

Ahhh, looks like my understanding of fundamental was wrong, or at least incomplete!

This is also why I opened the issue on IRLO.

It's incredibly unclear to me:

  1. why #[fundamental] is needed in the first place (I found the RFC rather unhelpful; it alludes to some kind of balance between the crate that defines a fundamental type and the code that consumes it, but this is never made all that explicit)
  2. how I can get around it. Newtyping Box<T> won't cut it, because that would be tantamount to saying "in order to use this library, you need to treat Box<T> as toxic and use OtherBox<T> instead", which can cause an impedance mismatch in consuming code.
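One possible workaround sketch (not from the thread; the trait TryFromBar and its method name are mine, purely illustrative): sidestep std's TryFrom, and with it the blanket impl, by defining a local conversion trait. Because the trait itself is local, the orphan rule no longer constrains which types it can be implemented for:

```rust
#[derive(Debug)]
pub enum FooError {}

// A local stand-in for std::convert::TryFrom. Since this trait is
// defined in the current crate, coherence allows implementing it for
// any type, including Box<T> with a generic T.
pub trait TryFromBar<T>: Sized {
    fn try_from_bar(value: T) -> Result<Self, FooError>;
}

pub struct BoxBar<T>(T);

// This is exactly the impl that E0210/E0119 rejected for TryFrom,
// but it is fine here because TryFromBar is local.
impl<T> TryFromBar<BoxBar<T>> for Box<T> {
    fn try_from_bar(value: BoxBar<T>) -> Result<Self, FooError> {
        Ok(Box::new(value.0))
    }
}

fn main() {
    let b: Box<i32> = Box::try_from_bar(BoxBar(5)).unwrap();
    assert_eq!(*b, 5);
}
```

The cost is that consumers must import the local trait instead of using the std conversion machinery, but Box<T> itself stays usable as-is.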
