I think we need an official standards organization

We need an official standards organization to standardize many common APIs, such as logging APIs, collection APIs, relational database APIs, and async APIs, similar to PSR in PHP.

The standard APIs would provide only traits, annotations, common macros, and small metadata structs, but no implementations; the implementations would be provided by third-party libraries.
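To make the split concrete, here is a toy sketch of the idea. The names (`KvStore`, `MemStore`) are invented for illustration and don't correspond to any real crate; the point is only the division of labor between an interface-only "standard" crate and a third-party implementation.

```rust
use std::collections::HashMap;

/// What a standards body might publish: just the trait, no implementation.
/// (`KvStore` is a hypothetical name for this sketch.)
pub trait KvStore {
    fn set(&mut self, key: String, value: String);
    fn get(&self, key: &str) -> Option<&String>;
}

/// What a third-party crate would provide: a concrete implementation.
pub struct MemStore {
    map: HashMap<String, String>,
}

impl MemStore {
    pub fn new() -> Self {
        MemStore { map: HashMap::new() }
    }
}

impl KvStore for MemStore {
    fn set(&mut self, key: String, value: String) {
        self.map.insert(key, value);
    }
    fn get(&self, key: &str) -> Option<&String> {
        self.map.get(key)
    }
}

fn main() {
    // Application code depends only on the trait; the backing
    // implementation can be swapped for any other `KvStore`.
    let mut store = MemStore::new();
    store.set("lang".to_string(), "rust".to_string());
    assert_eq!(store.get("lang").map(String::as_str), Some("rust"));
    println!("ok");
}
```

Any other crate implementing `KvStore` could be dropped in without changing the application code.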


The reasons are:

  • It is convenient for upper-layer library authors. For example, suppose I want to write a higher-level network library. For the whole community to be able to use it, I need to provide feature flags implementing the traits of each mainstream async runtime. Today that means tokio, actix, and async-std, and there may be more in the future. But if there were standard traits, and tokio, actix, and async-std implemented them, then my network library would only need to use the standard traits instead of adapting to each runtime through separate features.

  • It is convenient for underlying library authors. Today, many underlying libraries are self-contained. If they want to be widely used by the community, they must build influence and establish their own de-facto standards, which is difficult. But if there were a set of standard traits that library authors implemented, then any library using those traits could easily switch to a better underlying library without major changes.

  • It is user-friendly. First, with one official standard you don't need to learn a separate convention for each third-party library, which greatly reduces the cost of learning. Second, it makes libraries easier to choose between: because all upper-level libraries would use the same official traits, you would only need to compare third-party implementations on qualities like performance, not on their interfaces.

For logging, the log crate is basically this: it is just an interface that any logging engine can hook into.
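As a rough illustration of the facade pattern the log crate uses (the real crate's `log::Log` trait, `log::set_logger`, and logging macros are considerably richer; everything below is a simplified stand-in):

```rust
use std::sync::OnceLock;

/// The "interface" side: a trait any logging engine can implement.
/// (Simplified stand-in for the real `log::Log` trait.)
trait Logger: Send + Sync {
    fn log(&self, message: &str);
}

/// A single global slot for whichever engine the application installs.
static LOGGER: OnceLock<Box<dyn Logger>> = OnceLock::new();

/// What library code calls, without knowing which engine is installed.
fn log(message: &str) {
    if let Some(logger) = LOGGER.get() {
        logger.log(message);
    }
}

/// A third-party "engine": this one just writes to stdout.
struct StdoutLogger;

impl Logger for StdoutLogger {
    fn log(&self, message: &str) {
        println!("LOG: {message}");
    }
}

fn main() {
    // The application picks the engine once...
    LOGGER.set(Box::new(StdoutLogger)).ok();
    // ...and all library code logs through the shared interface.
    log("hello from a library");
}
```

Libraries depend only on the facade; the application decides at startup which engine actually receives the messages.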

For async, the standard library defines a Future trait that can be executed by third-party libraries like Tokio and Runtime.
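A minimal, self-contained example of that split: the trait (`std::future::Future`) lives in the standard library, while driving it to completion is left to an executor. Here a trivially-ready future is polled by hand with a no-op waker, which is roughly the boilerplate an executor like Tokio hides from you:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

/// A future that is immediately ready with a value.
struct Ready(i32);

impl Future for Ready {
    type Output = i32;
    fn poll(self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<i32> {
        Poll::Ready(self.0)
    }
}

/// Build a waker that does nothing when woken; real executors
/// use the waker to reschedule the task.
fn noop_waker() -> Waker {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
}

fn main() {
    let mut fut = Ready(42);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    // Poll once; this future resolves immediately.
    assert_eq!(Pin::new(&mut fut).poll(&mut cx), Poll::Ready(42));
    println!("the future resolved");
}
```

Because the trait is in std, any executor can poll any future; neither side needs to know about the other.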

Not sure about equivalents for the others.

The idea is generally good. But who should be responsible for this, and how do you think it should be organized?

The Rust project was initiated by Mozilla Research. Do you think Mozilla should be responsible for this? As far as I know, Mozilla actually tries to minimize its responsibility and expects Rust to be community-driven.

So, what are your further thoughts on this standardization organization?

Edit:
In case you are looking for categorized crates, you should give lib.rs (formerly crates.rs) a try. It is a community-driven alternative to crates.io.

Maybe this helps you find suitable crates.

I know about it, but it is too scattered and looks unofficial.

I kind of get the feeling that this is suggesting that certain crates become a kind of "standard library" without actually being in std. The problem with this, as discussed time and time again, is that the standard library and compiler are developed by a team whose focus is just that; if we pile on another 10 crates (where the whole purpose of a crate is to be flexible in comparison to the standard library), then those crates become as stability-bound as the standard library itself.

4 Likes

The RustCrypto org aims to develop a set of traits describing cryptographic primitives; see:
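To show the general shape of such trait-based interfaces, here is a toy sketch. This is NOT the actual RustCrypto API, and `XorChecksum` is NOT a real hash (it has no security value whatsoever); both are invented purely to illustrate one trait with interchangeable primitives behind it.

```rust
/// A simplified digest-style trait (invented for this sketch).
trait Digest {
    fn update(&mut self, data: &[u8]);
    fn finalize(self) -> Vec<u8>;
}

/// A toy "primitive" implementing the trait: a single XOR byte.
/// Not cryptographic in any way; illustration only.
struct XorChecksum(u8);

impl Digest for XorChecksum {
    fn update(&mut self, data: &[u8]) {
        for b in data {
            self.0 ^= b;
        }
    }
    fn finalize(self) -> Vec<u8> {
        vec![self.0]
    }
}

/// Generic code that works with any primitive implementing the trait;
/// swapping in a real hash would not change this function.
fn hash_with<D: Digest>(mut d: D, data: &[u8]) -> Vec<u8> {
    d.update(data);
    d.finalize()
}

fn main() {
    let out = hash_with(XorChecksum(0), b"abc");
    println!("{:?}", out);
}
```

The value of shared traits here is that protocol code can be written once, generically, and instantiated with whichever primitive a given application needs.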

For image processing I suggested a new image-core crate, which could provide a generic image buffer struct and image encoder/decoder traits, but so far this idea hasn't found much traction.
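A hypothetical sketch of what such an image-core-style crate might expose: a shared buffer type plus encoder/decoder traits. All names and the toy "format" below are invented for illustration; this is not any existing crate's API.

```rust
/// A minimal shared image buffer: 8-bit grayscale for simplicity.
struct ImageBuffer {
    width: u32,
    height: u32,
    pixels: Vec<u8>,
}

/// Traits a codec crate would implement against the shared buffer.
trait ImageEncoder {
    fn encode(&self, image: &ImageBuffer) -> Vec<u8>;
}
trait ImageDecoder {
    fn decode(&self, bytes: &[u8]) -> Option<ImageBuffer>;
}

/// A toy "format": little-endian width, height, then raw pixels.
struct RawCodec;

impl ImageEncoder for RawCodec {
    fn encode(&self, image: &ImageBuffer) -> Vec<u8> {
        let mut out = Vec::new();
        out.extend_from_slice(&image.width.to_le_bytes());
        out.extend_from_slice(&image.height.to_le_bytes());
        out.extend_from_slice(&image.pixels);
        out
    }
}

impl ImageDecoder for RawCodec {
    fn decode(&self, bytes: &[u8]) -> Option<ImageBuffer> {
        if bytes.len() < 8 {
            return None;
        }
        let width = u32::from_le_bytes(bytes[0..4].try_into().ok()?);
        let height = u32::from_le_bytes(bytes[4..8].try_into().ok()?);
        let pixels = bytes[8..].to_vec();
        if pixels.len() != (width * height) as usize {
            return None; // truncated or corrupt payload
        }
        Some(ImageBuffer { width, height, pixels })
    }
}

fn main() {
    let img = ImageBuffer { width: 2, height: 2, pixels: vec![0, 255, 255, 0] };
    let bytes = RawCodec.encode(&img);
    let back = RawCodec.decode(&bytes).unwrap();
    assert_eq!(back.pixels, img.pixels);
    println!("round-trip ok");
}
```

With a shared buffer type and traits like these, codec crates and processing crates could interoperate without depending on each other directly.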

I'm not sure what problem this is solving. Rust already has de-facto standards for these things, and they're doing well.

Why would an organization do it better? Why would anyone follow the recommendations set by the org?

Why should users take a usability hit from extra layers of abstraction? Why should users and crate authors accept the limitations of officially-blessed APIs? (Standards can't move quickly, and universal APIs struggle to be more than the lowest common denominator.)

9 Likes

Maybe the problem is one of documentation? There are some ad-hoc collections of current community standards (and alternatives), but they aren't always easy to find and are often left to grow stale.

If you're unfamiliar with a particular area, it can take a while to discover what's available and how it all fits together.

2 Likes

For many years now I have been determined not to invest any time and effort in learning or creating significant work in any language that does not meet a few criteria:

  1. Must be available from multiple vendors.

  2. Must be in use by a large community.

  3. Must conform to a standard actively supported by the vendors and that large community.

  4. Must be cross-platform: available for use on common operating systems and machine architectures, all the way down to bare metal.

That pretty much leaves me only C/C++ and JavaScript nowadays.

Why is all this?...

Because over the years I have had to learn many languages that have come and gone: Algol, Coral, PL/M, Lucol, Java... It seemed that every project I worked on required learning a different language. Personally, I came to the conclusion that this was a massive waste of time and mental effort on my part. Mostly those languages brought nothing conceptually new to the table, just a whole new syntax, semantics, and environment that did the same stuff in a slightly different way, all to be learned before getting on with the job in hand. I have also seen how it becomes a massive expense as companies end up having to rewrite things when their language becomes obsolete, or abandon products altogether.

But then came Rust... For a long time I ignored it. It's new, and it does not meet most of the criteria above. Another flash in the pan? So why am I here at all?

Well, Rust is the first language I have seen in ages that actually offers something new. Never mind all that high-falutin' functional programming from Haskell or whatever; no, the important thing is correctness and safety, something that other programming languages have shamefully ignored despite decades of development and ever-increasing complexity, cough, C++, while at the same time being understandable and usable. Something we have not seen since Ada.

As such, I think it would be great if Rust were available from multiple vendors: GCC, Microsoft, others. At that point perhaps some kind of "standard", like C/C++'s, becomes necessary, or at least desirable.

Given that Rust is still pretty new and things are in a state of flux, I suspect it's a bit too early to be thinking of such a standard. I think standards should arise as a consensus among vendors and users, codifying best practice and what works, rather than being imposed from the top down out of the blue.

A premature standard could be a killer to progress, freezing in APIs that turn out to be not so good (see the spmc debate), or it could simply be ignored as progress occurs anyway...

Err... sorry for the long essay.

3 Likes

So it seems not everyone agrees with this suggestion.