In years gone by I used to argue in favour of standards. I did not want to use languages that did not have an authoritative standard like ISO, multiple vendors working to that standard, and wide user base support. Why? Because I like the idea that software I write, or the companies I work for write, can be deployed on a wide range of operating systems and hardware architectures. Can be used with compilers from multiple vendors. This has all kinds of advantages for the durability and longevity of one's creations. Clear economic benefits. As such Ada, C, C++, JavaScript were in; almost anything else was out.
It might be useful to think about where such industrial standardisation evolved from. As far as I can tell it all started at the dawn of the industrial revolution. It became clear that it would be more efficient and profitable for everybody if they worked to common standards of measurement and produced parts to common design standards. One of the earliest examples was the standardisation of threads on nuts and bolts. With a standard like "British Standard Whitworth" for screw threads, a manufacturer could order nuts and bolts from different makers and be sure they would fit together. An obvious advantage in ensuring supply and promoting competition. Then we get into the whole metrication thing. And so it grows.
My view of standards was based on that history. Except the nuts and bolts were now language compilers and users' programs. Clearly having multiple vendors building compilers to the same standard was a good thing for language users and compiler vendors alike.
In recent years though I have begun to doubt that view of standards in software, especially language standards.
For a start, software is not a physical thing like nuts and bolts. And Open Source software in particular is very different from closed source. A single Open Source implementation of a language is usable by everyone; things like Clang, GCC and Rust support just about every platform one can imagine. Arguably it would be better, more efficient, to have everyone working on perfecting that single implementation than many closed source vendors all trying to provide standards-compliant offerings. Already we have all the advantages of cross-platform, multi-architecture support. And the longevity assurance.
Then I see we have international standards that are pretty much never used. BASIC, Pascal, and the like. Languishing, unmaintained, far behind what the users of those languages actually use.
Then I wonder what is the point of a C or C++ standard? Those standards leave so much that is "implementation defined" or downright undefined. The ongoing train wreck that is the ISO C++ standard is painful to watch. Languages like Ada and JavaScript fare better in that respect.
And of course, the standards don't ensure quality. A bolt made to a modern metric thread standard does not ensure that the materials of which it is made are up to the job.
From my experience of embedded and safety-critical software, those who are serious about safety and correctness don't just get their compilers from any old vendor that claims standards compliance. No, they go to Green Hills and the like, with long track records of conformance and quality assurance. Those who are not so serious advertise their standards compliance as marketing material.
So what about Rust? Despite my former insistence on standards, here I am using Rust.
I have no worries about its platform and hardware support. I have no doubt Rust will be with us for a long time. Especially with the likes of MS and Intel getting on board.
I suspect it would be good to have a formal specification, so that the likes of GCC and Microsoft can build Rust support into their compilers correctly.
Who should be the guardians of that standard? Isn't that what the Rust Foundation is for? Things like ISO? I think not.
Sorry for what grew into a rambling essay folks.