Memory leak when continuously creating tokio runtimes in a loop

Hi all!

I have an issue in a work project where I am calling an external crate's sync API inside a loop. This sync API wraps an async API by instantiating a tokio runtime. I believe reqwest (now) does the same for its blocking API.
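In other words, every iteration does the moral equivalent of the sketch below. This is written against the current tokio API for brevity (the crates involved actually use tokio 0.1 / tokio-core), so take it as an illustration of the pattern rather than the real ldap3 or reqwest internals:

    // Illustrative only: a blocking wrapper that builds a fresh tokio
    // runtime (and its reactor/worker threads) on every call.
    use tokio::runtime::Runtime;

    fn blocking_call() {
        let rt = Runtime::new().expect("failed to build runtime");
        rt.block_on(async {
            // ... drive the wrapped async API to completion here ...
        });
        // `rt` is dropped here; ideally every thread and allocation it
        // owns should be released with it.
    }

    fn main() {
        loop {
            blocking_call(); // one runtime created and torn down per iteration
        }
    }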

It turns out there is a leak somewhere; 2.1 MB are lost on every iteration. In my main project this is not acceptable, as the long-running daemon eventually gets killed.

I have reported the issue upstream (Leak when establishing connection · Issue #47 · inejge/ldap3 · GitHub), but since I was able to isolate the problem in a minimal working example (GitHub - nbigaouette/ldap3-issue47: Reproducing `ldap3` issue #47), I thought more eyes might help out, especially since the problem might be related to tokio, for which I don't have much experience.
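For reference, the MWE boils down to roughly this (a sketch from memory of the linked repo, not its exact code; the URL and error handling are assumptions):

    use ldap3::LdapConn;

    fn main() {
        loop {
            // Each construction goes through LdapConn::with_settings (see
            // frame 9 of the trace below), which creates a new
            // tokio-core/tokio 0.1 runtime under the hood.
            let conn = LdapConn::new("ldap://127.0.0.1:389").expect("connect");
            drop(conn); // ~2.1 MB is not reclaimed after each iteration
        }
    }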

Using Xcode's Instruments I was able to identify "something" that is probably the actual leak. Instruments reports the following trace for the leak:

   0 libsystem_pthread.dylib _pthread_create
   1 ldap3-issue47 std::sys::unix::thread::Thread::new::h79e748abea128147 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/sys/unix/thread.rs:68
   2 ldap3-issue47 std::thread::Builder::spawn_unchecked::h95189f616cf10fe0 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/thread/mod.rs:492
   3 ldap3-issue47 std::thread::Builder::spawn::hf68b1678e299a624 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/thread/mod.rs:386
   4 ldap3-issue47 tokio_reactor::background::Background::new::h5885f389600cb91f /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-reactor-0.1.12/src/background.rs:75
   5 ldap3-issue47 tokio_reactor::Reactor::background::h5d76ddcc46db80a8 /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-reactor-0.1.12/src/lib.rs:364
   6 ldap3-issue47 tokio::runtime::threadpool::Runtime::reactor::hdaead26ebbdfa985 /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-0.1.22/src/runtime/threadpool/mod.rs:177
   7 ldap3-issue47 tokio_core::reactor::Core::remote::h8826eae730572eeb /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.17/src/reactor/mod.rs:186
   8 ldap3-issue47 tokio_core::reactor::Core::handle::hbd9908d05d2bbd80 /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-core-0.1.17/src/reactor/mod.rs:167
   9 ldap3-issue47 ldap3::conn::LdapConn::with_settings::h1730c6883a363cea /Users/nbigaouette/.cargo/registry/src/github.com-1ecc6299db9ec823/ldap3-0.6.1/src/conn.rs:210
  10 ldap3-issue47 ldap3_issue47::main::h0a190d59a5364acc /Users/nbigaouette/codes/rust/ldap3-issue47/src/main.rs:21
  11 ldap3-issue47 std::rt::lang_start::_$u7b$$u7b$closure$u7d$$u7d$::h69ac2f4255696cee /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/rt.rs:67
  12 ldap3-issue47 std::rt::lang_start_internal::_$u7b$$u7b$closure$u7d$$u7d$::he1d982e0f3b81e28 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/rt.rs:52
  13 ldap3-issue47 std::panicking::try::do_call::h7c88c220bfff6b21 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/panicking.rs:303
  14 ldap3-issue47 __rust_maybe_catch_panic /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libpanic_unwind/lib.rs:86
  15 ldap3-issue47 std::rt::lang_start_internal::h94930f81f540a165 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/rt.rs:51
  16 ldap3-issue47 std::rt::lang_start::h681f7852517999f7 /rustc/4fb7144ed159f94491249e86d5bbd033b5d60550/src/libstd/rt.rs:67
  17 ldap3-issue47 main
  18 libdyld.dylib start

Could anyone more knowledgeable in tokio help me out with this?

Thanks!!

This might be related to this issue in tokio 0.1.22: tokio::sync::Lock leaks memory · Issue #2237 · tokio-rs/tokio · GitHub
It seems ldap3 has released 0.7.0-alpha7, which is based on tokio 0.2, and the above issue does not appear to affect tokio 0.2.

Thanks for the suggestion!

I've re-run the instrumentation with 0.7.0-alpha7. The results are in this branch: GitHub - nbigaouette/ldap3-issue47 at version070alpha7

It does solve a large part of the leak, but there still seems to be one, albeit a smaller one. Instead of leaking ~170 MB over one minute, it leaked a bit less than 500 kB over the same period. The largest stack trace is different, too.

If the leak is in Tokio, we would like to hear about it on our issue tracker too.

I've reported the smaller leak in Possible leak with TLS · Issue #2554 · tokio-rs/tokio · GitHub
