I've been working with ndarray for the past month while I was learning Rust, because it was the main result on Google when searching for a numpy equivalent. But now that my project is becoming more and more complex, I'm wondering whether nalgebra would be better, but I can't find a full comparison.
My main need is to do things like finding peaks in vectors, interpolating between points to make graphs, or simply fitting data, etc.
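For what it's worth, the first two of those don't strictly need either crate. Here's a minimal, dependency-free sketch of peak finding and linear interpolation on plain slices; the function names are made up for illustration:

```rust
/// Indices of local maxima: points strictly greater than both neighbours.
fn find_peaks(v: &[f64]) -> Vec<usize> {
    (1..v.len().saturating_sub(1))
        .filter(|&i| v[i] > v[i - 1] && v[i] > v[i + 1])
        .collect()
}

/// Linear interpolation of `x` against sorted sample points (xs, ys),
/// clamping outside the sampled range.
fn lerp(xs: &[f64], ys: &[f64], x: f64) -> f64 {
    match xs.iter().position(|&p| p >= x) {
        Some(0) => ys[0],
        None => *ys.last().unwrap(),
        Some(i) => {
            let t = (x - xs[i - 1]) / (xs[i] - xs[i - 1]);
            ys[i - 1] + t * (ys[i] - ys[i - 1])
        }
    }
}

fn main() {
    let v = [0.0, 2.0, 1.0, 3.0, 0.5];
    println!("{:?}", find_peaks(&v)); // peaks at indices 1 and 3
    println!("{}", lerp(&[0.0, 1.0], &[0.0, 10.0], 0.5)); // 5
}
```

Both ndarray's `Array1` and nalgebra's `DVector` can hand you a slice (`as_slice`), so helpers like these work with either crate.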
I know of polars, but it is more generic and does not give me the matrix abilities I'm looking for.
My workflow in Python was to store everything in a pandas DataFrame (so I can call each column as needed), turn a column into a numpy array, do the calculation, and make a new DataFrame.
I had the same question not long ago. My conclusion so far is that neither is best; they are just different. Some things are easier or more ergonomic in one, and vice versa.
If your application relies on arrays with more than 2 dimensions, then ndarray may be your only option.
I have been using both in different applications: nalgebra to solve systems of linear equations (it is quite easy for linear-algebra-related stuff), and ndarray when working with arrays of higher dimensions (it has powerful slicing methods).
I guess at the moment you have to get comfortable using both (or the many others that exist out there).
I feel you. I'm trying to translate my current work from ndarray to nalgebra, and it seems easier in terms of calculations. However, I'm facing a simple problem: ndarray-csv is a thing, but I can't find an equivalent for nalgebra.
nshare should be a useful tool, but it currently doesn't work since it isn't using the latest version of nalgebra, which prevents the conversion. Maybe I'll just fork it for my project. A lot of unsafe code, though...
I started a project for which nalgebra was a perfect fit. Only later did I find out that the plotting library with the features I needed only supports ndarray. So I ended up using both in the same project.