Hi fellow rustaceans

I've recently developed a gradient-based optimization technique in Python which I now want to translate to Rust. In Python land, I used:

- numpy for array operations
- scipy for the optimization routines
- autograd for vectorized automatic differentiation

As expected, when moving to Rust I want to implement as few things from scratch as possible. In place of numpy I plan to use the great nalgebra, and in place of scipy I plan to use argmin, which looks like it will do the trick. However, I can't find any crate similar to autograd that offers vectorized automatic differentiation. There are a couple of non-vectorized packages available in Rust, but I fear that running my optimization in non-vectorized form will be too computationally expensive, so I want to avoid that.
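For context on what I mean by forward mode: the non-vectorized baseline I'd be falling back on is essentially dual-number differentiation, where each value carries one tangent along with it. A minimal sketch (std only, all names like `Dual` are my own, not from any crate) looks like:

```rust
// Minimal forward-mode AD via dual numbers: each value carries its
// derivative with respect to one chosen input variable.
#[derive(Clone, Copy, Debug)]
struct Dual {
    val: f64, // primal value
    der: f64, // derivative (tangent)
}

impl Dual {
    // The variable we differentiate with respect to (tangent seeded to 1).
    fn var(v: f64) -> Self { Dual { val: v, der: 1.0 } }
    fn sin(self) -> Self {
        // chain rule: (sin u)' = u' * cos(u)
        Dual { val: self.val.sin(), der: self.der * self.val.cos() }
    }
}

impl std::ops::Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        Dual { val: self.val + rhs.val, der: self.der + rhs.der }
    }
}

impl std::ops::Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // product rule: (uv)' = u'v + uv'
        Dual { val: self.val * rhs.val, der: self.der * rhs.val + self.val * rhs.der }
    }
}

fn main() {
    // f(x) = x * x + sin(x), so f'(x) = 2x + cos(x)
    let x = Dual::var(2.0);
    let y = x * x + x.sin();
    println!("f(2) = {}, f'(2) = {}", y.val, y.der);
}
```

The catch, and the reason I want a vectorized package, is that this gives one directional derivative per pass: a full gradient in n dimensions means n passes like this.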

Do you know of any vectorized automatic differentiation crates that I may have missed (or perhaps an extension to nalgebra)? Reverse-mode differentiation would be preferable, but forward mode would also be OK.
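The reason I'd prefer reverse mode: for a scalar cost f: R^n -> R it yields the whole gradient in a single backward sweep, independent of n. A toy tape sketch of what I mean (std only, all names hypothetical, nothing from an existing crate):

```rust
// A minimal reverse-mode tape: record each operation during the forward
// pass, then propagate adjoints backwards to get all partials at once.
#[derive(Clone, Copy)]
enum Op {
    Input,
    Add(usize, usize),
    Mul(usize, usize),
}

struct Tape {
    ops: Vec<Op>,
    vals: Vec<f64>,
}

impl Tape {
    fn new() -> Self { Tape { ops: Vec::new(), vals: Vec::new() } }

    fn input(&mut self, v: f64) -> usize {
        self.ops.push(Op::Input);
        self.vals.push(v);
        self.vals.len() - 1
    }
    fn add(&mut self, a: usize, b: usize) -> usize {
        self.ops.push(Op::Add(a, b));
        self.vals.push(self.vals[a] + self.vals[b]);
        self.vals.len() - 1
    }
    fn mul(&mut self, a: usize, b: usize) -> usize {
        self.ops.push(Op::Mul(a, b));
        self.vals.push(self.vals[a] * self.vals[b]);
        self.vals.len() - 1
    }

    // One backward sweep yields d(output)/d(node) for every node.
    fn grad(&self, output: usize) -> Vec<f64> {
        let mut adj = vec![0.0; self.vals.len()];
        adj[output] = 1.0;
        for i in (0..self.ops.len()).rev() {
            match self.ops[i] {
                Op::Input => {}
                Op::Add(a, b) => {
                    adj[a] += adj[i];
                    adj[b] += adj[i];
                }
                Op::Mul(a, b) => {
                    adj[a] += adj[i] * self.vals[b];
                    adj[b] += adj[i] * self.vals[a];
                }
            }
        }
        adj
    }
}

fn main() {
    // f(x, y) = x * y + x  =>  df/dx = y + 1, df/dy = x
    let mut t = Tape::new();
    let x = t.input(3.0);
    let y = t.input(4.0);
    let xy = t.mul(x, y);
    let f = t.add(xy, x);
    let g = t.grad(f);
    println!("df/dx = {}, df/dy = {}", g[x], g[y]); // 5 and 3
}
```

Both partials come out of the single `grad` call, which is exactly the property I'd want a real crate to give me in vectorized form.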

Some more notes:

- Non-vectorized automatic differentiation increases compute time by a factor of roughly 160 (82.1 ms -> 13,200 ms) in my Python tests
- Numerical gradients aren't an option either, as they carry an even larger penalty of roughly 1,500x (82.1 ms -> 122,000 ms) in my Python tests
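For anyone wondering why numerical gradients scale so badly: a central-difference gradient needs 2n cost-function evaluations per gradient, so the penalty grows with dimension. A quick sketch of what I benchmarked against (function names are my own):

```rust
// Central-difference gradient: 2n evaluations of f per gradient call,
// which is where the finite-difference penalty comes from.
fn numerical_grad(f: impl Fn(&[f64]) -> f64, x: &[f64], h: f64) -> Vec<f64> {
    let mut g = vec![0.0; x.len()];
    let mut xp = x.to_vec();
    for i in 0..x.len() {
        xp[i] = x[i] + h;
        let fp = f(&xp); // evaluation 1 for coordinate i
        xp[i] = x[i] - h;
        let fm = f(&xp); // evaluation 2 for coordinate i
        xp[i] = x[i];    // restore before moving on
        g[i] = (fp - fm) / (2.0 * h);
    }
    g
}

fn main() {
    // f(x) = sum of squares; the exact gradient is 2x
    let f = |x: &[f64]| x.iter().map(|v| v * v).sum::<f64>();
    let g = numerical_grad(f, &[1.0, 2.0, 3.0], 1e-6);
    println!("{:?}", g); // approximately [2.0, 4.0, 6.0]
}
```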