This one looks handy!
cargo-about, because it lets you create the file most licenses legally oblige you to ship with your executable. It also checks crates against your list of approved licenses, and where a crate offers a choice of license, it includes only the one you've approved. By default it generates HTML, but with some fiddling with the about.hbs template it can produce nice plain-text output too. Saved me a lot of time.
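For the plain-text output, the trick is a custom Handlebars template. A rough sketch of what a minimal about.hbs might look like (field names are from memory of cargo-about's template context, so check the about.hbs that `cargo about init` generates for the exact names):

```handlebars
{{! One section per license, listing which crates use it, then the full text }}
{{#each licenses}}
{{name}}
{{#each used_by}}
 * {{crate.name}} {{crate.version}}
{{/each}}

{{text}}

{{/each}}
```

Then something like `cargo about generate about.hbs > THIRD_PARTY.txt` writes the attribution file.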
When you use std::fs, you may get very unhelpful errors like
The system cannot find the file specified. (os error 2)
that don't show the path of the file on which the error occurred. fs-err wraps the functions in std::fs and adds the path information to the error while staying compatible with the std::fs API. Saved me the pain of adding the error context myself everywhere.
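To illustrate the pain being saved: this is the kind of context you'd otherwise have to bolt on by hand everywhere, shown here as a std-only sketch (the helper `read_with_path` is made up for illustration, not fs-err's API):

```rust
use std::fs;
use std::path::Path;

// Manually attach the path to the error, which is roughly what
// fs-err does for you on every std::fs function it wraps.
fn read_with_path(path: &Path) -> Result<String, String> {
    fs::read_to_string(path).map_err(|e| format!("{}: {}", path.display(), e))
}

fn main() {
    // The error message now includes the offending path.
    let err = read_with_path(Path::new("no-such-file.txt")).unwrap_err();
    assert!(err.contains("no-such-file.txt"));
    println!("{}", err);
}
```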
Nominating https://github.com/lukaslueg/volkswagen/tree/HEAD/volkswagen because it's funny.
Not sure if this has been featured before:
For concatenating idents!
From the readme: "A user-friendly PDF generator written in pure Rust."
That's from the official Unicode Consortium!
That definitely gives me some ideas for the time crate! The formatting is (unsurprisingly) based on the Unicode standard, but it is something to look at.
0.1 release was just announced: https://github.com/EmbarkStudios/rust-gpu/releases/tag/v0.1
Online demo can be found here:
Allows applications to better declare their intentions programmatically for harder-to-abuse interfaces.
Vec but with strongly typed indices: instead of Vec&lt;CpuUsage&gt; indexed by CPU core, you can do TiVec&lt;CpuCoreId, CpuUsage&gt;. It prevents accidentally using the wrong index type, which is a common source of bugs.
Despite how useful it is (and IMO should be in std), it has 0 stars and only 300 downloads.
Rustc uses something similar: IndexVec in rustc_index::vec.
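The idea is easy to sketch in plain std Rust. The crate does this generically and more ergonomically; everything below (`TypedVec`, `Id`, the marker types) is a hypothetical illustration, not the crate's actual API:

```rust
use std::marker::PhantomData;
use std::ops::Index;

// A usize index tagged with the type of thing it indexes.
#[derive(Clone, Copy)]
struct Id<K>(usize, PhantomData<K>);

// A Vec that only accepts indices tagged with its key type K.
struct TypedVec<K, V> {
    items: Vec<V>,
    _marker: PhantomData<K>,
}

impl<K, V> TypedVec<K, V> {
    fn new() -> Self {
        Self { items: Vec::new(), _marker: PhantomData }
    }
    // push hands back a typed index instead of a bare usize.
    fn push(&mut self, v: V) -> Id<K> {
        self.items.push(v);
        Id(self.items.len() - 1, PhantomData)
    }
}

impl<K, V> Index<Id<K>> for TypedVec<K, V> {
    type Output = V;
    fn index(&self, id: Id<K>) -> &V {
        &self.items[id.0]
    }
}

// Two distinct index spaces.
#[derive(Clone, Copy)]
struct CpuCore;
struct Process;

fn main() {
    let mut usage: TypedVec<CpuCore, f64> = TypedVec::new();
    let core0 = usage.push(0.25);
    // An Id<Process> cannot index `usage` -- the compiler rejects it:
    // usage[some_process_id]; // error: expected Id<CpuCore>, found Id<Process>
    assert_eq!(usage[core0], 0.25);
    println!("core0 usage: {}", usage[core0]);
}
```

The marker type never exists at runtime; the whole check is compile-time only, which is why this pattern is essentially free.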
Somehow this has never been suggested (or I didn't find it). A neural network inference library, written purely in Rust, for models in ONNX, NNEF and TF formats. Very extensive support for many different operators, and comparable speed to the onnxruntime Python package.
Also compiles to WebAssembly and has JS bindings (https://github.com/bminixhofer/tractjs).
IMO the most important crate for working with Neural Networks in Rust. Support for inference of other ML models such as Decision Trees is also coming up.