I'm building a Rust app that needs to download thousands of files from S3, process them, and publish a report. I want a local on-disk cache with a size cap that evicts old files once the limit is reached. How can I accomplish this?
A solution that runs outside the Rust app itself (e.g. an external tool) would also be acceptable.
I tried rclone with a local mount and S3 as the remote, but the main problem is picking up new data: when the S3 remote is updated, the local mount stays outdated until the poll/cache interval expires.