Neuronika was developed by a colleague of mine and me for a university project (we are both master's students in CS). The framework offers automatic differentiation and dynamic neural networks, much like PyTorch.
We have implemented some of the most common layer components, such as dense layers, dropout, GRU, LSTM, and 1D/2D/3D convolutions. At the moment, however, we lack pooling layers (and much more).
Neuronika also offers loss functions, optimisers, computational graphs, tensors and data utilities.
We had so much fun developing it.
Speed-wise, we measured performance comparable to PyTorch's. Do note that our benchmarks were run with the matrixmultiply-threading feature flag enabled. For example, this plot was obtained by performing 1000 forward and backward passes on an MLP with topology (in_size, 75, 50, 1) and ReLU non-linearities, averaging times over 10 runs.
GPU support is far on the horizon.
Feel free to benchmark it, and also check out the docs.
We are not Rust experts by any means, and this project really made us try our best.