Could you please explain why the trait doesn't work for OsStr/OsString/Path/PathBuf?
My understanding is that OsString has fewer requirements than String, so...
use std::ffi::{OsStr, OsString};
use std::path::{Path, PathBuf};

pub trait AnyIsFull<T>
where
    Self: AsRef<[T]>,
{
    fn is_full(&self) -> bool;
}

// Blanket impl for anything that can be viewed as a slice.
impl<T, U> AnyIsFull<T> for U
where
    Self: AsRef<[T]>,
{
    fn is_full(&self) -> bool {
        !self.as_ref().is_empty()
    }
}

fn main() {
    let x = &[1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = vec![1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = "abc";
    assert_eq!(x.is_full(), true);
    let x = String::from("abc");
    assert_eq!(x.is_full(), true);
    let x = OsStr::new("abc");
    // error: the trait bound is not satisfied
    // assert_eq!(x.is_full(), true);
    let x = OsString::from("abc");
    // error: the trait bound is not satisfied
    // assert_eq!(x.is_full(), true);
    let x = Path::new("abc");
    // error: the trait bound is not satisfied
    // assert_eq!(x.is_full(), true);
    let x = PathBuf::from("abc");
    // error: the trait bound is not satisfied
    // assert_eq!(x.is_full(), true);
}
I don't see any implementation of AsRef<[T]> for OsStr, OsString, Path or PathBuf. OTOH, str and String do implement AsRef<[u8]>, satisfying the trait bounds of your generic trait implementation. Not sure if AsRef<[u8]> was left out on purpose for the aforementioned types or if it just hasn't been implemented (yet) due to lack of interest.
Edit: It seems to me that going from OsStr[ing] to a byte slice and back is not as easy as I thought it might be, which is probably why there is no AsRef<[u8]> implementation for those types.
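For what it's worth, the round trip is only half-easy on stable Rust: OsStr::as_encoded_bytes gives you a byte view, but going back from arbitrary bytes is either unsafe (OsStr::from_encoded_bytes_unchecked) or platform-specific (std::os::unix::ffi::OsStrExt). A minimal sketch, not from the original post, just to illustrate the asymmetry:

use std::ffi::OsStr;

fn main() {
    let os = OsStr::new("abc");

    // Easy direction: view the OsStr as bytes in its internal, OS-specific
    // encoding (WTF-8 on Windows, arbitrary bytes on Unix).
    let bytes: &[u8] = os.as_encoded_bytes();
    assert!(!bytes.is_empty()); // i.e. "is full" in this thread's terms

    // Hard direction: arbitrary bytes are not guaranteed to be valid in that
    // encoding, so the unchecked constructor is unsafe. It is sound here only
    // because `bytes` came straight from as_encoded_bytes.
    let back: &OsStr = unsafe { OsStr::from_encoded_bytes_unchecked(bytes) };
    assert_eq!(back, os);
}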
Specify the generic parameter in the implementation. It doesn't really matter what it is, because it isn't used (this is also why type annotations are required to call the method).
The AnyIsFull implementation for OsStr doesn't have to be generic. Making it non-generic will get rid of the inference error (same as parasyte suggested whilst I was still typing):
use std::ffi::OsStr;

pub trait AnyIsFull<T> {
    fn is_full(&self) -> bool;
}

impl<T, U> AnyIsFull<T> for U
where
    Self: AsRef<[T]>,
{
    fn is_full(&self) -> bool {
        !self.as_ref().is_empty()
    }
}

impl AnyIsFull<()> for OsStr {
    fn is_full(&self) -> bool {
        !self.as_encoded_bytes().is_empty()
    }
}

fn main() {
    let x = &[1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = vec![1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = "abc";
    assert_eq!(x.is_full(), true);
    let x = String::from("abc");
    assert_eq!(x.is_full(), true);
    let x = OsStr::new("abc");
    // no more "type annotations needed" error
    assert_eq!(x.is_full(), true);
}
The same idea with u8 as the type parameter, extended to cover the Path types as well:

use std::ffi::{OsStr, OsString};
use std::path::{Path, PathBuf};

pub trait AnyIsFull<T> {
    fn is_full(&self) -> bool;
}

impl<T, U> AnyIsFull<T> for U
where
    Self: AsRef<[T]>,
{
    fn is_full(&self) -> bool {
        !self.as_ref().is_empty()
    }
}

// OsString reaches this impl through Deref<Target = OsStr> in method calls.
impl AnyIsFull<u8> for OsStr {
    fn is_full(&self) -> bool {
        !self.as_encoded_bytes().is_empty()
    }
}

// PathBuf reaches this impl through Deref<Target = Path> in method calls.
impl AnyIsFull<u8> for Path {
    fn is_full(&self) -> bool {
        self.as_os_str().is_full()
    }
}

fn main() {
    let x = &[1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = vec![1, 2, 3];
    assert_eq!(x.is_full(), true);
    let x = "abc";
    assert_eq!(x.is_full(), true);
    let x = String::from("abc");
    assert_eq!(x.is_full(), true);
    let x = OsStr::new("abc");
    assert_eq!(x.is_full(), true);
    let x = OsString::from("abc");
    assert_eq!(x.is_full(), true);
    let x = Path::new("abc");
    assert_eq!(x.is_full(), true);
    let x = PathBuf::from("abc");
    assert_eq!(x.is_full(), true);
}
How is it possible that you don't get a conflicting implementations error?
EDIT:
If I change this to a tuple, for example, I do get the error: note: upstream crates may add a new impl of trait std::convert::AsRef<[u8]> for type (&std::ffi::OsStr, _) in future versions
U is Sized in the first implementation and OsStr is not Sized in the second, so they don't overlap today. And Sized is fundamental, so if it isn't implemented for a type, downstream code can assume it won't be implemented in the future.
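To make that concrete, here is a sketch of how the coherence check plays out (the tuple impl is only a guess at the shape that produced that note; any Sized type without a local AsRef<[u8]> impl behaves the same way):

use std::ffi::OsStr;

pub trait AnyIsFull<T> {
    fn is_full(&self) -> bool;
}

// Blanket impl: the implicit `U: Sized` bound means it can never apply to an
// unsized type such as OsStr.
impl<T, U> AnyIsFull<T> for U
where
    Self: AsRef<[T]>,
{
    fn is_full(&self) -> bool {
        !self.as_ref().is_empty()
    }
}

// Accepted: OsStr is unsized, so it is excluded from the blanket impl, and
// the compiler can rely on that when checking for overlap.
impl AnyIsFull<u8> for OsStr {
    fn is_full(&self) -> bool {
        !self.as_encoded_bytes().is_empty()
    }
}

// Rejected as a conflicting implementation: the tuple is Sized, and a future
// version of std could add `impl AsRef<[u8]> for (&OsStr, u8)`, which would
// make it overlap with the blanket impl above.
//
// impl<'a> AnyIsFull<u8> for (&'a OsStr, u8) {
//     fn is_full(&self) -> bool {
//         !self.0.as_encoded_bytes().is_empty()
//     }
// }

fn main() {
    assert!(OsStr::new("abc").is_full());
}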