Do you have any recommended Rust libraries that support multiple LLMs as a hub?

So far I've found mistral.rs and candle, but they run models on-premise. I'm not sure if there are any good ones for calling different online models (like OpenAI's).

Also, may I ask for your feedback on mistral.rs / candle / etc. in general?

Thanks!

For making REST API calls I like reqwest for async, and ureq for blocking.
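As a rough sketch of what the request side looks like with either crate, here's a std-only helper that builds an OpenAI-style chat completions body by hand (field names assume OpenAI's `/v1/chat/completions` API; in real code you'd use serde_json for serialization and proper escaping):

```rust
/// Naive JSON string escape -- enough for a demo, not production.
fn escape(s: &str) -> String {
    s.replace('\\', "\\\\").replace('"', "\\\"")
}

/// Build the JSON body for a single-turn chat request.
fn chat_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        escape(model),
        escape(prompt)
    )
}

fn main() {
    let body = chat_body("gpt-4o", "Say \"hi\"");
    println!("{}", body);
    // With ureq (blocking) the send would look roughly like:
    // ureq::post("https://api.openai.com/v1/chat/completions")
    //     .set("Authorization", &format!("Bearer {api_key}"))
    //     .send_string(&body)?;
}
```

The same body works with reqwest's async client; only the transport changes.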

Not sure about specialized libraries, but IMHO isn't writing your own API calls the move? It's more customizable and gives faster compile times.
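If you go the roll-your-own route, the "hub" part is mostly a small trait over providers. A std-only sketch, assuming OpenAI's chat completions and Anthropic's messages endpoints (verify the exact field names against each vendor's docs; escaping and the actual HTTP call via reqwest/ureq are left out):

```rust
// A tiny "hub" abstraction: each provider knows its endpoint and how to
// shape a request body; the caller treats them uniformly.
trait Provider {
    fn endpoint(&self) -> &'static str;
    fn request_body(&self, model: &str, prompt: &str) -> String;
}

struct OpenAi;
struct Anthropic;

impl Provider for OpenAi {
    fn endpoint(&self) -> &'static str {
        "https://api.openai.com/v1/chat/completions"
    }
    fn request_body(&self, model: &str, prompt: &str) -> String {
        format!(r#"{{"model":"{model}","messages":[{{"role":"user","content":"{prompt}"}}]}}"#)
    }
}

impl Provider for Anthropic {
    fn endpoint(&self) -> &'static str {
        "https://api.anthropic.com/v1/messages"
    }
    fn request_body(&self, model: &str, prompt: &str) -> String {
        // Anthropic requires max_tokens; 1024 is an arbitrary demo value.
        format!(r#"{{"model":"{model}","max_tokens":1024,"messages":[{{"role":"user","content":"{prompt}"}}]}}"#)
    }
}

fn main() {
    let providers: Vec<Box<dyn Provider>> = vec![Box::new(OpenAi), Box::new(Anthropic)];
    for p in providers {
        println!("POST {} -> {}", p.endpoint(), p.request_body("some-model", "hello"));
    }
}
```

A local mistral.rs server exposing an OpenAI-compatible endpoint would slot in as just another `Provider` impl.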