We're building a web application with a relatively vanilla rust/actix/diesel/juniper stack, and just rounded the corner to trying to do continuous deployment. We're using Google Cloud, so the main option is Cloud Build.
But while the application compiles locally in 3-4 minutes, remotely it times out well past the 10-minute limit in GCP, and some of the early tricks for making that deploy faster - reducing optimization, trying to cache dependencies, for example - aren't working out.
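For what it's worth, the 10-minute ceiling is Cloud Build's *default* timeout rather than a hard limit; it can be raised per build with the top-level `timeout` field in `cloudbuild.yaml`. A minimal sketch (the step, image name, and timeout value here are illustrative placeholders, not your actual config):

```yaml
# cloudbuild.yaml - raise the build timeout above the 10-minute default.
steps:
  - name: 'gcr.io/cloud-builders/docker'
    args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app:$COMMIT_SHA', '.']
timeout: '1800s'  # 30 minutes, expressed in seconds
```

That doesn't make the compile faster, of course, but it at least stops the timeout from killing otherwise-working builds.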
Is anyone successfully doing continuous deployment of Rust with GCP (or something else) and getting automated compiles to finish in a reasonable, non-timing-out timeframe?
I'm doing them with GitLab and deploying Docker images to Heroku. It's not fast, but it works well enough.
When I say "not fast": if the Cargo.toml files are unchanged, it takes 23 minutes from start to end. That includes:

- Build stage (in parallel):
  - Build/test Rust code
  - Build/test UI code
  - Build Docker container for Rust code
  - Build Docker container for UI code
  - Build Docker container for E2E tests
- Test stage:
  - Run E2E tests. This uses the three Docker containers from above, plus Postgres and Selenium.
- Deploy stage:
  - Deploy Rust Docker container to Heroku
  - Deploy UI code to Netlify
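Roughly, that pipeline looks like this in `.gitlab-ci.yml` form (a sketch only; the job names, images, and scripts are illustrative placeholders, not my actual config):

```yaml
# .gitlab-ci.yml - sketch of the three-stage pipeline described above.
# Jobs in the same stage run in parallel by default.
stages:
  - build
  - test
  - deploy

build-rust:
  stage: build
  image: rust:latest
  script:
    - cargo test
    - cargo build --release

build-ui:
  stage: build
  image: node:latest
  script:
    - npm ci
    - npm test
    - npm run build

e2e-tests:
  stage: test
  services:                              # extra containers for the tests
    - postgres:latest
    - selenium/standalone-chrome:latest
  script:
    - ./run-e2e-tests.sh                 # placeholder for the real entry point

deploy-heroku:
  stage: deploy
  script:
    - ./deploy-heroku.sh                 # placeholder; pushes the Rust image
```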
Out of this, the majority of the time is building the Rust code. I was caching the compiled dependencies, but the cache file got so big that it's actually faster to recompile them every time than to upload and download the cache!
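One alternative to shipping a cache file around is to lean on Docker's own layer cache for the dependency build. A common pattern (sketched here; the crate layout and the binary name `my-app` are placeholders) is to compile the dependencies against a dummy `main.rs` first, so that expensive layer is only invalidated when `Cargo.toml`/`Cargo.lock` change:

```dockerfile
# Sketch of Rust dependency caching via Docker layers.
FROM rust:latest AS builder
WORKDIR /app

# 1. Copy only the manifests and build a dummy crate, so this layer
#    (all compiled dependencies) stays cached until the manifests change.
COPY Cargo.toml Cargo.lock ./
RUN mkdir src && echo "fn main() {}" > src/main.rs && \
    cargo build --release && \
    rm -rf src

# 2. Now copy the real sources; only the application itself recompiles.
COPY src ./src
RUN touch src/main.rs && cargo build --release

# 3. Slim runtime image with just the binary.
FROM debian:bookworm-slim
COPY --from=builder /app/target/release/my-app /usr/local/bin/my-app
CMD ["my-app"]
```

The caveat is that this only helps where the builder actually keeps its layer cache between runs; hosted builders that start from a clean machine each time need something like `docker build --cache-from` against a previously pushed image to get the same effect.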