Are any vectorized automatic differentiation packages available?

Hi fellow Rustaceans,

I've recently developed a Python gradient-based optimization technique which I want now to translate to Rust. In Python land, I used:

  • numpy for vectorization
  • scipy for the optimization
  • autograd for automatic differentiation

As expected, when moving to Rust I want to implement as few things from scratch as possible. In place of numpy I plan to use the great nalgebra, and in place of scipy I plan to use argmin, which looks like it would do the trick. However, I can't find any package similar to autograd that offers vectorized automatic differentiation. There are a couple of non-vectorized packages available in Rust, but I fear that running my optimization in non-vectorized form will be too computationally expensive, so I want to avoid that.

Do you know of any vectorized automatic differentiation packages that I haven't been able to find (or maybe some extension to nalgebra)? Reverse-mode differentiation would be preferable, but forward mode would also be OK.

Some more notes:

  • Non-vectorized automatic differentiation increases compute time by a factor of roughly 160 (82.1 ms -> 13200 ms) in my Python tests
  • Using numerical gradients isn't an option either, as the penalty there is even larger, roughly 1500x (82.1 ms -> 122000 ms) in my Python tests
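For anyone curious what the non-vectorized AD I'm talking about looks like under the hood, here's a minimal forward-mode sketch using dual numbers, in plain Rust with no dependencies. All names here are my own illustration, not any particular crate's API:

```rust
// Forward-mode AD via dual numbers: carry (value, derivative) through
// every arithmetic operation. Illustrative sketch only.

#[derive(Clone, Copy)]
struct Dual {
    val: f64, // function value
    der: f64, // derivative w.r.t. the chosen input variable
}

impl Dual {
    // The variable we differentiate with respect to (derivative seed = 1).
    fn var(x: f64) -> Self { Dual { val: x, der: 1.0 } }
    // A constant (derivative = 0).
    fn con(c: f64) -> Self { Dual { val: c, der: 0.0 } }
}

impl std::ops::Add for Dual {
    type Output = Dual;
    fn add(self, rhs: Dual) -> Dual {
        // sum rule: (u + v)' = u' + v'
        Dual { val: self.val + rhs.val, der: self.der + rhs.der }
    }
}

impl std::ops::Mul for Dual {
    type Output = Dual;
    fn mul(self, rhs: Dual) -> Dual {
        // product rule: (uv)' = u'v + uv'
        Dual { val: self.val * rhs.val, der: self.der * rhs.val + self.val * rhs.der }
    }
}

// f(x) = x^2 + 3x, so f'(x) = 2x + 3.
fn f(x: Dual) -> Dual {
    x * x + Dual::con(3.0) * x
}

fn main() {
    let y = f(Dual::var(2.0));
    println!("f(2) = {}, f'(2) = {}", y.val, y.der);
}
```

The catch, and the reason for my question, is that applying this element by element to large arrays is exactly the non-vectorized pattern that blows up my compute time.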

I don't have any idea, but I saw this the other day, which you may find relevant.

I had the same problem a few years ago.
There are three approaches to differentiation and vectorization:

  1. Compile time (via proc_macro)
  2. Type-system based (traits + generics)
  3. Runtime (using a JIT)

The compile time approach has severe limitations due to the lack of type knowledge at that stage.
The type-system approach has a tendency to explode in generics and usually ends in sadness.
The runtime approach works, but you have to implement a compiler and optimizer and also need runtime code generation.
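To make the runtime approach concrete: at its core it records a "tape" of operations during the forward pass, then sweeps it backwards once to get every partial derivative (reverse mode). A toy sketch, with hypothetical names and no crate's actual API:

```rust
// Minimal reverse-mode "tape": forward pass records ops, backward pass
// propagates adjoints. Illustrative sketch only.

#[derive(Clone, Copy)]
enum Op {
    Input,
    Add(usize, usize), // node = a + b
    Mul(usize, usize), // node = a * b
}

struct Tape {
    ops: Vec<Op>,
    vals: Vec<f64>,
}

impl Tape {
    fn new() -> Self { Tape { ops: vec![], vals: vec![] } }

    fn input(&mut self, v: f64) -> usize { self.push(Op::Input, v) }

    fn add(&mut self, a: usize, b: usize) -> usize {
        let v = self.vals[a] + self.vals[b];
        self.push(Op::Add(a, b), v)
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        let v = self.vals[a] * self.vals[b];
        self.push(Op::Mul(a, b), v)
    }

    fn push(&mut self, op: Op, v: f64) -> usize {
        self.ops.push(op);
        self.vals.push(v);
        self.vals.len() - 1
    }

    // One backward sweep yields the derivative of `out` w.r.t. every node.
    fn grad(&self, out: usize) -> Vec<f64> {
        let mut adj = vec![0.0; self.vals.len()];
        adj[out] = 1.0;
        for i in (0..self.ops.len()).rev() {
            match self.ops[i] {
                Op::Input => {}
                Op::Add(a, b) => {
                    adj[a] += adj[i];
                    adj[b] += adj[i];
                }
                Op::Mul(a, b) => {
                    adj[a] += adj[i] * self.vals[b];
                    adj[b] += adj[i] * self.vals[a];
                }
            }
        }
        adj
    }
}

fn main() {
    // z = x*y + x at (x, y) = (3, 4): dz/dx = y + 1 = 5, dz/dy = x = 3.
    let mut t = Tape::new();
    let x = t.input(3.0);
    let y = t.input(4.0);
    let xy = t.mul(x, y);
    let z = t.add(xy, x);
    let g = t.grad(z);
    println!("dz/dx = {}, dz/dy = {}", g[x], g[y]);
}
```

The "compiler and optimizer" part of the work is everything this sketch leaves out: fusing nodes, avoiding allocation per operation, and generating fast code for the recorded graph at runtime.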

Hey, I recently found something that I think should do the trick:

https://github.com/raskr/rust-autograd

Looks really neat. It says it supports reverse-mode automatic differentiation.
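For context on why the vectorized part is the thing that matters: in a vectorized reverse-mode system the tape records one node per array operation, so the graph (and its bookkeeping overhead) scales with the number of operations, not the number of elements. A toy sketch of that idea, with hypothetical types and definitely not rust-autograd's actual API:

```rust
// Vectorized reverse-mode sketch: each tape node holds a whole array,
// and each backward rule handles a whole array op at once.

#[derive(Clone, Copy)]
enum Op {
    Input,
    Scale(usize, f64),   // y = c * x (elementwise)
    SumOfSquares(usize), // y = sum(x_i^2), a scalar stored as a 1-vector
}

struct Tape {
    ops: Vec<Op>,
    vals: Vec<Vec<f64>>,
}

impl Tape {
    fn new() -> Self { Tape { ops: vec![], vals: vec![] } }

    fn input(&mut self, v: Vec<f64>) -> usize { self.push(Op::Input, v) }

    fn scale(&mut self, x: usize, c: f64) -> usize {
        let v = self.vals[x].iter().map(|e| c * e).collect();
        self.push(Op::Scale(x, c), v)
    }

    fn sum_of_squares(&mut self, x: usize) -> usize {
        let s = self.vals[x].iter().map(|e| e * e).sum();
        self.push(Op::SumOfSquares(x), vec![s])
    }

    fn push(&mut self, op: Op, v: Vec<f64>) -> usize {
        self.ops.push(op);
        self.vals.push(v);
        self.vals.len() - 1
    }

    // Backward sweep: one adjoint array per node, one rule per array op.
    fn grad(&self, out: usize) -> Vec<Vec<f64>> {
        let mut adj: Vec<Vec<f64>> =
            self.vals.iter().map(|v| vec![0.0; v.len()]).collect();
        adj[out][0] = 1.0;
        for i in (0..self.ops.len()).rev() {
            match self.ops[i] {
                Op::Input => {}
                Op::Scale(x, c) => {
                    // d(c*x)/dx = c, applied elementwise
                    for j in 0..adj[i].len() {
                        let a = adj[i][j];
                        adj[x][j] += c * a;
                    }
                }
                Op::SumOfSquares(x) => {
                    // d(sum(x^2))/dx_j = 2 * x_j
                    let a = adj[i][0];
                    for j in 0..self.vals[x].len() {
                        adj[x][j] += a * 2.0 * self.vals[x][j];
                    }
                }
            }
        }
        adj
    }
}

fn main() {
    // f(x) = sum((2x)_i^2) = 4 * sum(x_i^2), so the gradient is 8x.
    let mut t = Tape::new();
    let x = t.input(vec![1.0, 2.0, 3.0]);
    let y = t.scale(x, 2.0);
    let z = t.sum_of_squares(y);
    let g = t.grad(z);
    println!("grad = {:?}", g[x]);
}
```

Whatever the vector length, this tape has only three nodes, which is what keeps the AD overhead away from the 1000x-and-up penalties mentioned above.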

Also useful: a potentially really awesome combo I am just trying out is combining it with evcxr, which is a Jupyter kernel for Rust.


This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.