What's everyone working on this week (38/2020)?

New week, new Rust! What are you folks up to?

I am wrapping up my color blindness simulation project.
The CLI application has been released to crates.io, but the GTK GUI still needs more attention and a DEB package.


Picking back up work on my Rust bindings to TensorRT for GPU-accelerated AI model inference.

Really pushing to have an MVP soon that's usable by the community!

@themoose5: This looks interesting. Can you explain what TensorRT provides? I'm looking at https://docs.nvidia.com/deeplearning/tensorrt/api/index.html#api and https://docs.nvidia.com/deeplearning/tensorrt/api/c_api/index.html and I can't figure out what it does.

Is TensorRT (1) a library that only works with TensorFlow / PyTorch, or (2) a library that lets us do tensor ops on CUDA without TensorFlow / PyTorch, or (3) ... ?

I am working on RStats, my first public crate, which does statistics, machine learning and data analysis from the ground up. Early days yet.

More or less all of the above :grinning:.

TensorRT lets just about any model run on the GPU via CUDA; it isn't restricted to a specific framework. It can parse most of the common model serialization formats (UFF, ONNX, Caffe) to create a TensorRT engine, and that engine goes through some optimization during creation so it runs better on the GPU: things like layer fusion, conversion to FP16 or INT8, kernel tuning, etc.
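
To make that flow concrete, here is a tiny Rust sketch of the pipeline just described. Every type and method in it is a made-up stand-in, not the actual tensorrt-rs or TensorRT API; it only illustrates the sequence of parsing a serialized model, building an optimized engine, and running inference.

```rust
// Illustrative stand-ins only; these are NOT the tensorrt-rs (or TensorRT) API.
// They just make the parse -> optimize -> execute flow concrete.

/// Stand-in for a parsed network definition (what the ONNX/UFF/Caffe parsers produce).
struct Network {
    source: String,
}

/// Stand-in for an optimized TensorRT engine.
struct Engine {
    fp16: bool,
}

impl Network {
    /// Pretend to parse a serialized ONNX model into a network definition.
    fn from_onnx(path: &str) -> Network {
        Network { source: path.to_string() }
    }
}

impl Engine {
    /// Pretend to build an optimized engine; in real TensorRT this is where
    /// layer fusion, FP16/INT8 conversion and kernel tuning happen.
    fn build(network: &Network, fp16: bool) -> Engine {
        println!("building engine from {}", network.source);
        Engine { fp16 }
    }

    /// Pretend to run inference; a real engine would execute on the GPU via CUDA.
    fn infer(&self, input: &[f32]) -> Vec<f32> {
        let _ = self.fp16;
        input.to_vec()
    }
}

fn main() {
    let network = Network::from_onnx("model.onnx"); // 1. parse the serialized model
    let engine = Engine::build(&network, true);     // 2. optimize it for the target GPU
    let output = engine.infer(&[0.0; 4]);           // 3. run inference
    println!("got {} output values", output.len());
}
```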

I'm doing the "Acronym" exercise in Rust on exercism.io.

It seems simple at first: "write a function that takes a string and returns the acronym for that string". But then there are all these edge cases it has to support!

For example:

"" --> ""
"Portable Network Graphics" --> "PNG"
"HyperText Markup Language" --> "HTML"
"___SomeWacky Thing!" --> "SWT"
"all lowercase too" --> "ALT"

Anyway, let me know if anyone wants to join virtually and do exercises together! :grinning: