First post here, and apologies in advance for what may be a noob question. I'd like to better understand how shared libraries can be used to reduce overall binary size.
For context, I'm developing for a constrained environment with a couple of MBs of available storage, so keeping binary size in check is a priority. I'd like to run a few concurrent processes that share a common set of interfaces, with each performing some logic. In this particular case, all processes connect to an MQTT broker, and do work on different topics.
My original vision was to abstract the MQTT interface and the connect/publish/subscribe mechanics into a shared library, and then have separate binaries that use that library for MQTT and themselves contain only their specific (and quite minimal) logic.
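Concretely, the crate layout I'm aiming for is roughly this (package, library, and binary names are illustrative, not my actual ones):

```toml
# Cargo.toml - one library crate plus several thin binaries
[package]
name = "mqtt-workers"
version = "0.1.0"
edition = "2021"

[lib]
name = "mqtt_core"
path = "src/lib.rs"
# "dylib" asks rustc to emit a Rust dynamic library that the binaries can
# link against at runtime; without it (and -C prefer-dynamic) the library
# code gets statically linked into every binary.
crate-type = ["dylib", "rlib"]

[[bin]]
name = "worker-temp"
path = "src/bin/worker_temp.rs"

[[bin]]
name = "worker-valve"
path = "src/bin/worker_valve.rs"
```

Each `src/bin/*.rs` is then just a `fn main()` that calls one function from `mqtt_core`.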
I've had a go at implementing this, following a structure along the lines of "Rust package with both a library and a binary?" on Stack Overflow. I've taken things to the extreme, putting virtually all the code in the library: each binary then imports and calls a single function from it. I've also applied the main points from johnthagen/min-sized-rust on GitHub to reduce binary size (up to and including "Abort on Panic").
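For reference, the release-profile settings I applied from min-sized-rust (up to and including `panic = "abort"`) are essentially these:

```toml
# Cargo.toml release profile, per min-sized-rust
[profile.release]
strip = true       # strip symbols from the binary
opt-level = "z"    # optimize for size rather than speed
lto = true         # link-time optimization across crates
codegen-units = 1  # fewer codegen units, better optimization
panic = "abort"    # abort on panic, dropping the unwinding machinery
```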
The challenge: the binaries are still 400-500KB each, not much less than before I moved the logic into the library, while the library itself is only ~70KB. This doesn't make a lot of sense to me, but maybe I'm missing something? Or maybe there's a lower bound on binary size that I'm just not going to get below (at least for code that uses std)? Happy to share my code if helpful.
If I can't get the binaries below ~100KB each via this shared-library approach, I'll probably need to tackle this a different way, e.g. by putting everything into a single multi-threaded executable. Any pointers?
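For the single-executable fallback, I'm picturing something like this std-only sketch, where one process spawns a thread per topic (`run_topic_worker` is a placeholder for the real per-topic MQTT logic, which would hold an actual client connection):

```rust
use std::thread;

// Hypothetical stand-in for per-topic work; the real version would
// subscribe to the topic on the shared broker and process messages.
fn run_topic_worker(topic: &str) -> usize {
    // Pretend to process messages; return how many were "handled".
    topic.len()
}

fn main() {
    let topics = ["sensors/temp", "sensors/humidity", "actuators/valve"];

    // One thread per topic instead of one binary per topic.
    let handles: Vec<_> = topics
        .iter()
        .map(|t| {
            let t = t.to_string();
            thread::spawn(move || run_topic_worker(&t))
        })
        .collect();

    // Wait for all workers and sum their results.
    let total: usize = handles.into_iter().map(|h| h.join().unwrap()).sum();
    println!("handled {total} units of work across {} workers", topics.len());
}
```

The trade-off, as I understand it, is that std and any shared dependencies are then paid for once instead of per binary, at the cost of losing process isolation between the workers.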