Suppose I want to create a workspace that has two crates:
- a CLI crate that handles the whole interface between the user and the data (things like fuzzy search or `get` methods)
- a storage crate that periodically fetches JSON (or other external data) and supplies the CLI crate with it
How would I implement that? What are the alternatives? Is my logic heading in the right direction? I'm thinking the storage crate could be built once per week or month and released as a new version, while the CLI crate's versioning would not depend on the storage crate, since we often want to update the underlying data but not the CLI functionality.
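For what it's worth, the two-crate layout could be wired up with a plain Cargo workspace, something like this (the crate names `cli` and `storage` are just placeholders):

```toml
# Workspace root Cargo.toml
[workspace]
members = ["cli", "storage"]
resolver = "2"

# cli/Cargo.toml would then depend on the storage crate by path
# (or by version, if you publish it separately):
#
# [dependencies]
# storage = { path = "../storage" }
```

If the storage crate is published on its own release cadence, the CLI can depend on it by version instead of by path, which matches the "update the data, not the CLI" idea.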
Putting it in an extra crate seems fine to me.
Then how would the data crate send the data to the CLI crate? Just a regular `pub` data structure in the data crate and a `use data_crate::data_structure` in the CLI crate?
I would either define a static global containing the data, or a function that returns it. The exact shape would depend quite a lot on the nature of your data.
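As a rough sketch of the "function that returns it" option, the data crate could parse its payload once and cache it in a static. The `Entry` type and the newline-separated payload here are made up for illustration; in the real crate the raw data would come from the bundled file.

```rust
use std::sync::OnceLock;

// Hypothetical data type the data crate would expose.
#[derive(Debug, PartialEq)]
pub struct Entry {
    pub name: String,
}

// Stand-in for the scraped payload (illustrative format:
// one entry per line).
const RAW: &str = "ferris\ncrab\n";

/// Parse the data once and hand out a shared reference afterwards.
pub fn entries() -> &'static Vec<Entry> {
    static CACHE: OnceLock<Vec<Entry>> = OnceLock::new();
    CACHE.get_or_init(|| {
        RAW.lines()
            .map(|line| Entry { name: line.to_owned() })
            .collect()
    })
}

fn main() {
    // In the CLI crate this would be `use data_crate::entries;`
    println!("{:?}", entries());
}
```

The CLI crate then only needs `use data_crate::{Entry, entries};` and never cares how the data got there.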
It would be scraped JSON or MessagePack.
I don't want to scrape the website every time a user installs the CLI crate. I'm sort of thinking of a cache that I can update from time to time (by scraping the website for new entries).
Generally there are two approaches:
- Store the scraped file in the `src/` directory and include it with the `include_bytes!` macro. Provide a method that parses it at runtime.
- Store the scraped file next to the `Cargo.toml` file and use a build script to parse it at compile time and generate a constant with the appropriate value.
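A minimal sketch of the first approach might look like the following. In the real crate the bytes would come from `include_bytes!`; here a byte-string literal stands in so the example is self-contained, and the newline-separated format is just an assumption:

```rust
// Real code would embed the scraped file at compile time:
//   static RAW: &[u8] = include_bytes!("entries.txt");
// A literal stands in here so the sketch compiles on its own.
static RAW: &[u8] = b"alpha\nbeta\ngamma\n";

/// Parse the embedded bytes at runtime (hypothetical format:
/// one entry per line, UTF-8 encoded).
pub fn entries() -> Vec<String> {
    std::str::from_utf8(RAW)
        .expect("embedded data is valid UTF-8")
        .lines()
        .map(str::to_owned)
        .collect()
}

fn main() {
    println!("{:?}", entries());
}
```

Re-scraping then just means replacing the file and publishing a new version of the data crate; the second approach trades this runtime parse for a build script that does the work at compile time.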
Thanks, I will explore those alternatives!