What bounds do we have on the worst-case sizes of Vec<T> and HashMap<K, V>?
For Vec<T>, assuming only pushes and in-place modification (and no pops), I believe the worst case is O(1) + 2 * n * sizeof(T), because a push may reallocate to double the current capacity.
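A quick way to sanity-check that factor (the growth strategy isn't guaranteed by the docs, so this only shows current behaviour on one toolchain) is to watch capacity() as you push:

```rust
fn main() {
    // Probe Vec's growth: print whenever the capacity changes.
    // Heap usage is capacity * size_of::<T>(), regardless of len.
    let mut v: Vec<u64> = Vec::new();
    let mut last_cap = v.capacity();
    for i in 0..1_000_000u64 {
        v.push(i);
        if v.capacity() != last_cap {
            last_cap = v.capacity();
            println!(
                "len = {:>8}, capacity = {:>8}, heap bytes ~ {}",
                v.len(),
                v.capacity(),
                v.capacity() * std::mem::size_of::<u64>()
            );
        }
    }
}
```

On current std this shows the capacity doubling on each reallocation, so immediately after a growth the heap block can be close to 2 * len * sizeof(T), which matches the bound above.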
For HashMap<K, V>, do we have any bounds?
Context: I'm dealing with data structures where the data itself runs into the GBs, so I want to calculate whether the worst case fits in RAM.
I imagine HashMap has similar bounds, consuming roughly 2x the map's capacity, give or take a key-value pair.
The HashMap in std is based on SwissTable, and its design was covered in a fair amount of detail in a CppCon talk a while back. That talk probably has the technical details you are looking for.
This code is probably key to finding the value. The load factor adds some space overhead, but the bucket count still looks like it's rounded up to powers of two. I didn't look deeply enough to make any strong statements (such as a formula), and I don't know whether there are any documented guarantees you can rely on.
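Going by that, and with the caveat that none of it is documented: the SwissTable-based implementation uses power-of-two bucket counts, grows at roughly 7/8 occupancy, and stores one (K, V) slot plus a one-byte control tag per bucket. A back-of-the-envelope estimator under those assumptions (ignoring small-table special cases and the handful of extra control bytes the real table allocates) might look like this:

```rust
use std::mem::size_of;

// Rough estimate of HashMap's heap usage for n entries, assuming the SwissTable
// details discussed above: power-of-two bucket counts, growth at ~7/8 occupancy,
// and one (K, V) slot plus one control byte per bucket. Not a documented
// guarantee -- the real layout has small-table special cases and a few extra
// control bytes that this ignores.
fn hashmap_estimate_bytes<K, V>(n: usize) -> usize {
    // Smallest bucket count that keeps n entries at or below a 7/8 load factor.
    let min_buckets = (n * 8 + 6) / 7; // ceil(n * 8 / 7)
    let buckets = min_buckets.next_power_of_two();
    buckets * (size_of::<(K, V)>() + 1)
}

fn main() {
    // Example: 100 million (u64, u64) entries.
    let n = 100_000_000;
    let bytes = hashmap_estimate_bytes::<u64, u64>(n);
    println!("~{:.2} GiB for {} entries", bytes as f64 / (1u64 << 30) as f64, n);
}
```

Depending on where n lands relative to a power of two, that works out to somewhere between roughly 8/7 and 16/7 buckets per entry, so in the worst case a bit over 2x n * (sizeof((K, V)) + 1) bytes, which is in the same ballpark as the Vec bound.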