A nice pattern for caching async results in a hot code path?

(CW: morning post driven by idle shower musings before coffee has really had a chance to kick in)

There are lots of cases where a value might not yet be known and will take some time to produce, but once it is, it can be cached and returned immediately.

So in async code, the obvious thing is to have a future that resolves to the value, and implement an internal cache that can avoid some of the time-consuming work and resolve sooner. Consumers can .and_then()-chain this future (or await! the value) and let the hidden cache take care of the details. Simple and elegant.
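
To make that concrete, here's roughly the shape I have in mind, sketched against the futures 0.1 API (Cache and fetch_slowly are made-up names, and I'm ignoring invalidation and thread safety):

```rust
extern crate futures;

use futures::future;
use futures::Future;
use std::cell::RefCell;
use std::rc::Rc;

struct Cache {
    value: Rc<RefCell<Option<String>>>,
}

impl Cache {
    /// Either resolve immediately from the cache, or take the slow async
    /// path and fill the cache as a side effect.
    fn get(&self) -> Box<dyn Future<Item = String, Error = ()>> {
        if let Some(v) = self.value.borrow().clone() {
            // Fast path: the value is known, hand back an already-resolved future.
            return Box::new(future::ok(v));
        }
        let slot = Rc::clone(&self.value);
        Box::new(fetch_slowly().map(move |v| {
            *slot.borrow_mut() = Some(v.clone());
            v
        }))
    }
}

// Stand-in for the time-consuming work; real code would do I/O here.
fn fetch_slowly() -> Box<dyn Future<Item = String, Error = ()>> {
    Box::new(future::ok("expensive result".to_string()))
}

fn main() {
    let cache = Cache { value: Rc::new(RefCell::new(None)) };
    println!("first:  {:?}", cache.get().wait()); // slow path, fills the cache
    println!("second: {:?}", cache.get().wait()); // fast path, already cached
}
```

The second call resolves straight from the cache, but the consumer only ever sees a future either way.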

However, futures do no work unless polled, and thus at least as far as I understand it, this pattern (probably?) involves a bunch of deferral and another trip around the event loop, possible preemption, and other overheads just to have a future that resolves immediately.

The alternative that I can see is to have the caller cache the value, and check to see whether the result is available or the async code path needs to be taken. That can work fine, but implies a certain code structure that might not always fit. Extra layers of encapsulation can hide detail, but can't really hide whether or not there's an async result being fetched, at least not without blocking.
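
Sketching that alternative (again futures 0.1, with made-up names; the .wait() at the end is only there to keep the snippet runnable, which is exactly the blocking I'd rather avoid):

```rust
extern crate futures;

use futures::future;
use futures::Future;

// Stand-in for a real cache lookup (e.g. a HashMap behind a RefCell).
fn lookup_cached(key: &str) -> Option<String> {
    if key == "known" {
        Some("cached result".to_string())
    } else {
        None
    }
}

// Stand-in for the real time-consuming async work.
fn fetch_slowly(key: &str) -> impl Future<Item = String, Error = ()> {
    future::ok(format!("fetched result for {}", key))
}

fn handle_request(key: &str) {
    match lookup_cached(key) {
        Some(v) => {
            // Fast path: fully synchronous, no future is ever created.
            println!("hit:  {}", v);
        }
        None => {
            // Slow path: the caller now has to be shaped to hand this future
            // to an executor, or chain it into the surrounding future.
            let work = fetch_slowly(key).map(|v| println!("miss: {}", v));
            work.wait().unwrap(); // blocking only to keep the sketch runnable
        }
    }
}

fn main() {
    handle_request("known");
    handle_request("unknown");
}
```

Here the fast path never creates a future at all, which is the structural constraint I was getting at.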

Have I got this wrong? Is there a way for a future / await! block to return in an immediately-resolved state, or some other way to optimise a pattern like this?

> However, futures do no work unless polled, and thus at least as far as I understand it, this pattern (probably?) involves a bunch of deferral and another trip around the event loop, possible preemption, and other overheads just to have a future that resolves immediately.

A tree of futures is combined into a task, and this task is the thing the executor polls. When you poll the task's top-level future, the poll is propagated down to the leaf futures, so one event-loop turn may poll many futures. In fact, futures will be polled until one is found that is not ready.
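
For example, here's a toy sketch of that (futures 0.1); polling by hand is fine here only because every stage is already ready, so nothing ever needs to register for a wakeup:

```rust
extern crate futures;

use futures::{future, Async, Future};

fn main() {
    let mut chained = future::ok::<u32, ()>(1)
        .and_then(|n| {
            println!("first stage ran");
            future::ok(n + 1)
        })
        .and_then(|n| {
            println!("second stage ran");
            future::ok(n * 10)
        });

    // One poll of the combined future drives every ready leaf beneath it.
    match chained.poll() {
        Ok(Async::Ready(v)) => println!("ready after a single poll: {}", v),
        Ok(Async::NotReady) => println!("would need another event-loop turn"),
        Err(()) => unreachable!(),
    }
}
```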

And, for example, the poll for the loop_fn future just polls the underlying future in a loop: loop_fn.rs.html -- source. If the underlying future is ready, the loop just keeps looping, without any interaction with the event loop.
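
As a usage sketch of the same idea (futures 0.1): every iteration below returns an already-ready future, so the whole loop completes inside a single poll, without yielding back to the executor:

```rust
extern crate futures;

use futures::future::{self, loop_fn, Loop};
use futures::Future;

fn main() {
    // Sum 0 + 1 + 2 + 3 + 4; each iteration's future is immediately ready,
    // so LoopFn keeps looping internally instead of returning NotReady.
    let summed = loop_fn((0u32, 0u32), |(i, acc)| {
        future::ok::<_, ()>(if i < 5 {
            Loop::Continue((i + 1, acc + i))
        } else {
            Loop::Break(acc)
        })
    });

    // `wait` drives the future on the current thread; it finishes in one poll.
    println!("sum = {:?}", summed.wait());
}
```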

Hm. So I was thinking mostly about avoiding having to wait for the first poll() in order to return a result I already had cached.

But if the very next thing I do with that value is chain it into another (and another ..) future, then it doesn't really matter because the one turn will poll all the way down until one of them actually needs to defer. That's a really great answer, thanks!