Hey,
I'm working on an algorithm that uses large amounts of memory.
Most of that is consumed by a single Vec of small elements, and there is no way to estimate its size beforehand.
In some cases, the Vec happens to start at a lucky size, so that when it doubles its capacity it ends up using most of the RAM, say 220 GiB out of 256 GiB. Then the algorithm runs successfully.
But sometimes it starts at a size that doubles to 300 GiB on the 256 GiB machine, which cannot be allocated. In that case, the effective limit is only 150 GiB.
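To illustrate what I mean, here is a small sketch that records the capacity jumps of a growing Vec. (The exact growth factor is an implementation detail of std, but growth is geometric, roughly doubling, which is why a single reallocation can jump from "fits in RAM" to "does not".)

```rust
// Record the sequence of capacities a Vec goes through as
// elements are pushed one at a time. Growth is geometric
// (roughly doubling in current std), so one reallocation can
// nearly double the memory footprint.
fn growth_caps(pushes: usize) -> Vec<usize> {
    let mut v: Vec<u8> = Vec::new();
    let mut caps = vec![v.capacity()];
    for _ in 0..pushes {
        v.push(0);
        if v.capacity() != *caps.last().unwrap() {
            caps.push(v.capacity());
        }
    }
    caps
}

fn main() {
    for c in growth_caps(1_000) {
        println!("capacity: {}", c);
    }
}
```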
Is there a way to make the Vec allocate all of the remaining memory before it fails?
Or, on second thought, is there a way to create a Vec like Vec::with_capacity(usize), but with a Result as the return value instead of panicking on a failed allocation?