Memory leaks on Alpine

I was running a small project on hyper.sh and noticed my containers kept getting killed. This was happening because memory usage kept growing with use.

I've put my observations and a potential reproduction in the repo here.

The benchmarks might not be very accurate, but they are accurate enough to reproduce the memory growth on hyper.sh and the resulting container kills.

Has anyone seen anything similar? Any suggestions or pointers?

Thanks :pray:


I would recommend trying to minimize it further. The repro currently depends on 6 crates. Minimizing it down to 1 or 0 would make it clearer where to look for the leak.

The following is sufficient to reproduce. Memory usage slowly grows to 50 MB.

extern crate actix_web;
use actix_web::{server, App};

// Takes the request body as a String and throws it away.
fn take_data(_data: String) -> &'static str {
    "3000000bytes"
}

fn main() {
    server::new(|| App::new().resource("api/take_data", |r| {
        r.post().with_config(take_data, |cfg| {
            // Raise the String extractor's payload limit to 4 MiB.
            cfg.0.limit(4 * 1024 * 1024);
        })
    }))
    .bind("0.0.0.0:7000")
    .unwrap()
    .run();
}
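
To drive the endpoint, any load generator will do; here is a hypothetical std-only one (the address, port, and path come from the snippet above; the body size and iteration count are arbitrary):

use std::io::{Read, Write};
use std::net::TcpStream;

fn main() -> std::io::Result<()> {
    // ~3 MB payload, just under the 4 MiB limit configured above.
    let body = vec![b'x'; 3_000_000];
    for _ in 0..1_000 {
        let mut stream = TcpStream::connect("127.0.0.1:7000")?;
        // Minimal HTTP/1.1 request; "Connection: close" lets
        // read_to_string return once the server is done responding.
        write!(
            stream,
            "POST /api/take_data HTTP/1.1\r\nHost: localhost\r\nContent-Length: {}\r\nConnection: close\r\n\r\n",
            body.len()
        )?;
        stream.write_all(&body)?;
        let mut response = String::new();
        stream.read_to_string(&mut response)?;
    }
    Ok(())
}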

This seems to be the same issue as https://github.com/actix/actix-web/issues/426 and https://github.com/actix/actix-web/issues/439.

Thanks peeps, I've mentioned my example in that actix issue.

Does not leak with the system allocator.
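
Something like the following is enough to opt out of jemalloc on stable Rust 1.28+ (a minimal sketch; the repro may have selected the allocator differently):

use std::alloc::System;

// Route all heap allocations through the platform's malloc
// instead of the jemalloc that rustc links by default.
#[global_allocator]
static GLOBAL: System = System;

With that change, docker stats over the benchmark iterations: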

================= Iteration 1
CONTAINER ID        NAME                  CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
8f0a4a2620d3        naughty_stonebraker   0.06%               2.043MiB / 1.952GiB   0.10%               3.01MB / 7.88kB     0B / 0B             28
================= Iteration 2
CONTAINER ID        NAME                  CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
8f0a4a2620d3        naughty_stonebraker   0.04%               2.105MiB / 1.952GiB   0.11%               6.02MB / 15.6kB     0B / 0B             28

...

================= Iteration 27
CONTAINER ID        NAME                  CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
8f0a4a2620d3        naughty_stonebraker   0.05%               1.949MiB / 1.952GiB   0.10%               81.3MB / 240kB      0B / 0B             28
================= Iteration 28
CONTAINER ID        NAME                  CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
8f0a4a2620d3        naughty_stonebraker   0.05%               1.934MiB / 1.952GiB   0.10%               84.3MB / 250kB      0B / 0B             28

Leaks with the default allocator (jemalloc):

================= Iteration 1
CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
ee83ae7ec901        angry_mahavira      0.04%               14.27MiB / 1.952GiB   0.71%               3.01MB / 7.81kB     0B / 0B             28
================= Iteration 2
CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
ee83ae7ec901        angry_mahavira      0.05%               20.53MiB / 1.952GiB   1.03%               6.02MB / 17.2kB     0B / 0B             28

...

================= Iteration 19
CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT    MEM %               NET I/O             BLOCK I/O           PIDS
ee83ae7ec901        angry_mahavira      0.07%               56.8MiB / 1.952GiB   2.84%               57.2MB / 167kB      0B / 0B             28
================= Iteration 20
CONTAINER ID        NAME                CPU %               MEM USAGE / LIMIT     MEM %               NET I/O             BLOCK I/O           PIDS
ee83ae7ec901        angry_mahavira      0.04%               57.43MiB / 1.952GiB   2.87%               60.2MB / 175kB      0B / 0B             28

I don't know how to debug this issue.

So that means it's a problem with Rust, not actix, right?

I don't have enough experience to answer this question.

Does the leak reach a plateau? This looks like it's probably just jemalloc being less aggressive about releasing memory: it keeps freed memory in its arenas for reuse rather than returning it to the OS right away, so RSS can climb for a while before leveling off.
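
One quick way to check that theory is to watch whether RSS falls back after a big allocation is dropped, under each allocator. A minimal Linux-only sketch (reads VmRSS from /proc/self/status; the sizes are arbitrary):

use std::fs;

// Current resident set size in kB, parsed from /proc/self/status.
fn rss_kb() -> u64 {
    fs::read_to_string("/proc/self/status")
        .unwrap()
        .lines()
        .find(|l| l.starts_with("VmRSS:"))
        .and_then(|l| l.split_whitespace().nth(1))
        .and_then(|v| v.parse().ok())
        .unwrap()
}

fn main() {
    let before = rss_kb();
    // Allocate ~100 MiB in 1 MiB chunks, then free all of it.
    let bufs: Vec<Vec<u8>> = (0..100).map(|_| vec![1u8; 1 << 20]).collect();
    let peak = rss_kb();
    drop(bufs);
    let after = rss_kb();
    println!("before={} kB, peak={} kB, after drop={} kB", before, peak, after);
}

If jemalloc is holding on to freed pages, "after drop" stays near the peak; with the system allocator it should fall back toward the baseline.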


Yes, it seems to stabilize around 82 MB.