What's the best way to upload files of up to several GB to an Actix-driven web server?
Background:
I've got a POST route that just reads bytes from the body.
use actix_web::{post, web, Responder};

#[post("/{id}")]
async fn post_bytes(
    web::Path(id): web::Path<String>,
    bytes: web::Bytes,
) -> impl Responder { /* snip; use the bytes */ }
I raised the payload size limit with
App::new()
    .data(web::PayloadConfig::new(4 * 1024 * 1024 * 1024)) // 4 GB
but when posting more than ~1 GB, the application crashes with the following output:
memory allocation of 1073741824 bytes failed
error: process didn't exit successfully: `target\debug\a.exe` (exit code: 0xc0000409, STATUS_STACK_BUFFER_OVERRUN)
This seems to be a bug in Actix, but either way I still need to upload the files, so I suspect I'm on the wrong path and there is a better way to do this overall.
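Is streaming the body instead of buffering it the intended approach? Here is a rough sketch of what I have in mind, assuming actix-web 3 with futures::StreamExt; the uploads/{id} path and the blocking file I/O are only placeholders:

use std::io::Write;

use actix_web::{post, web, Error, HttpResponse};
use futures::StreamExt;

#[post("/{id}")]
async fn post_stream(
    web::Path(id): web::Path<String>,
    mut payload: web::Payload,
) -> Result<HttpResponse, Error> {
    // Placeholder destination; in the real code the id decides where the file goes.
    // Blocking file I/O here is just for the sketch; web::block would likely be better.
    let mut file = std::fs::File::create(format!("uploads/{}", id))?;

    // Pull the body chunk by chunk so only one chunk is held in memory at a time.
    while let Some(chunk) = payload.next().await {
        file.write_all(&chunk?)?;
    }

    Ok(HttpResponse::Ok().finish())
}

Would something like that avoid the allocation failure, or is there a better-supported way to handle uploads of this size?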