I wanted a BLAS library that is easy and fast to compile, and I also wanted to learn how to implement math functions on a computer, so I started this library.
This library is very new; I just finished implementing sgemm. It currently has only the single-precision level 1 routines, sgemv, and sgemm. Right now sgemm is slightly faster than numpy (OpenBLAS) on my CPU (Ryzen 2200G). All functions use AVX and FMA, so only CPUs with those features are supported. I would like to support more CPUs, and eventually to have implementations of all BLAS functions.
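One way I could support more CPUs is runtime feature detection with a portable fallback. Here is a minimal, self-contained sketch (this is not code from blasoxide, and the kernel names in the messages are just placeholders):

```rust
// Sketch: detect at runtime whether the CPU has the AVX and FMA features the
// current kernels rely on, so a portable scalar path could be chosen instead
// of simply requiring those features.
fn main() {
    #[cfg(any(target_arch = "x86", target_arch = "x86_64"))]
    {
        let has_avx = std::is_x86_feature_detected!("avx");
        let has_fma = std::is_x86_feature_detected!("fma");
        if has_avx && has_fma {
            println!("would dispatch to the AVX+FMA kernels");
        } else {
            println!("would fall back to a portable scalar kernel");
        }
    }
    #[cfg(not(any(target_arch = "x86", target_arch = "x86_64")))]
    {
        println!("non-x86 target: portable scalar kernel only");
    }
}
```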
I am a third-year bachelor's student and new to BLAS, so I would like knowledgeable people to criticize the code and help me build this library. Adding more tests and benchmarks, or even just running the benchmarks on your CPU, would also help a lot.
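As a starting point for tests, something like a naive reference sgemm could be used to check the optimized kernels. A rough sketch (row-major, no transpose options, and not blasoxide's actual function signature):

```rust
// Naive reference sgemm: C = alpha * A * B + beta * C, row-major layout.
fn sgemm_ref(
    m: usize, n: usize, k: usize,
    alpha: f32, a: &[f32], b: &[f32],
    beta: f32, c: &mut [f32],
) {
    for i in 0..m {
        for j in 0..n {
            let mut acc = 0.0f32;
            for p in 0..k {
                acc += a[i * k + p] * b[p * n + j];
            }
            c[i * n + j] = alpha * acc + beta * c[i * n + j];
        }
    }
}

fn main() {
    // 2x2 * 2x2 sanity check against a hand-computed result.
    let a: [f32; 4] = [1.0, 2.0, 3.0, 4.0];
    let b: [f32; 4] = [5.0, 6.0, 7.0, 8.0];
    let mut c = [0.0f32; 4];
    sgemm_ref(2, 2, 2, 1.0, &a, &b, 0.0, &mut c);
    assert_eq!(c, [19.0f32, 22.0, 43.0, 50.0]);
    println!("ok: {:?}", c);
}
```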
Looks very nice! Current BLAS alternatives in Rust (I'm thinking of https://crates.io/crates/blas) can be quite cumbersome if you don't have the required libraries installed on your system.
This could be a nice alternative, especially if you manage to get performance on par with OpenBLAS. Looking forward to seeing how this evolves. Any plans for doing LAPACK as well?
I would like to do LAPACK as well, but my linear algebra knowledge isn't that good (I don't even know what LAPACK is used for). I will develop blasoxide all summer and might do LAPACK after that.
I also want to make a high-level linear algebra library with matrix and vector types, because I think current libraries are hard to understand, with many types and complicated APIs built around generics and traits.
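To give an idea of what I mean, here is a very rough sketch (hypothetical API, nothing is implemented yet) of a single concrete Matrix type with no generics or traits to learn up front:

```rust
// One concrete matrix type backed by a Vec<f32>, row-major storage.
struct Matrix {
    rows: usize,
    cols: usize,
    data: Vec<f32>,
}

impl Matrix {
    fn zeros(rows: usize, cols: usize) -> Self {
        Matrix { rows, cols, data: vec![0.0; rows * cols] }
    }

    fn get(&self, i: usize, j: usize) -> f32 {
        self.data[i * self.cols + j]
    }

    fn set(&mut self, i: usize, j: usize, v: f32) {
        self.data[i * self.cols + j] = v;
    }

    // Naive matrix multiply; a real version would call the BLAS sgemm kernel.
    fn matmul(&self, other: &Matrix) -> Matrix {
        assert_eq!(self.cols, other.rows);
        let mut out = Matrix::zeros(self.rows, other.cols);
        for i in 0..self.rows {
            for j in 0..other.cols {
                let mut acc = 0.0;
                for p in 0..self.cols {
                    acc += self.get(i, p) * other.get(p, j);
                }
                out.set(i, j, acc);
            }
        }
        out
    }
}

fn main() {
    let mut a = Matrix::zeros(2, 2);
    a.set(0, 0, 1.0); a.set(0, 1, 2.0);
    a.set(1, 0, 3.0); a.set(1, 1, 4.0);
    let b = a.matmul(&a);
    println!("{:?}", b.data); // [7.0, 10.0, 15.0, 22.0]
}
```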
Note for people who want to contribute:
I found these books at http://www.ulaff.net/. If you want to learn how to implement linear algebra on a computer and contribute to blasoxide, I think Linear Algebra: Foundations to Frontiers is a great book; I am reading it myself right now to learn. I find implementing fast BLAS functions very satisfying.