I have a large number of HashMaps (and also HashSets), most of which have very few elements. Occasionally they may have many, which is why I care about scaling, but my memory use looks poor because their minimum size is 32 times the memory I actually need. I'm wondering whether it would be efficient to dynamically switch between a Vec of tuples for small element counts (the very common case) and a real HashMap for large ones; there's a sketch of what I mean below. It seems like a lot of work, but my memory use also seems excessive. It's hard to imagine this hasn't been tried before. Is there something out there I could be using to improve my memory footprint?
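To make the idea concrete, here's a minimal sketch of the kind of container I have in mind (SmallMap and the threshold of 8 are made up for illustration, nothing I've benchmarked):

```rust
use std::collections::HashMap;
use std::hash::Hash;

// Arbitrary cutoff, purely for illustration.
const THRESHOLD: usize = 8;

enum SmallMap<K, V> {
    Small(Vec<(K, V)>),
    Large(HashMap<K, V>),
}

impl<K: Eq + Hash, V> SmallMap<K, V> {
    fn new() -> Self {
        SmallMap::Small(Vec::new())
    }

    fn get(&self, key: &K) -> Option<&V> {
        match self {
            // Linear scan: fine for a handful of entries.
            SmallMap::Small(v) => v.iter().find(|(k, _)| k == key).map(|(_, val)| val),
            SmallMap::Large(m) => m.get(key),
        }
    }

    fn insert(&mut self, key: K, value: V) -> Option<V> {
        match self {
            SmallMap::Small(v) => {
                // Replace in place if the key already exists.
                if let Some(slot) = v.iter_mut().find(|(k, _)| *k == key) {
                    return Some(std::mem::replace(&mut slot.1, value));
                }
                if v.len() < THRESHOLD {
                    v.push((key, value));
                    None
                } else {
                    // Promote to a HashMap once we outgrow the Vec.
                    let mut m: HashMap<K, V> = v.drain(..).collect();
                    let old = m.insert(key, value);
                    *self = SmallMap::Large(m);
                    old
                }
            }
            SmallMap::Large(m) => m.insert(key, value),
        }
    }
}
```

The promotion here is one-way; shrinking back to the Vec on removal would be possible, but probably isn't worth the churn.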
For clarity: in my current (small) test case, I've got a couple of thousand HashSets with one element each, and a thousand HashMaps with two elements each. These take up 3 MB between them, i.e. roughly a kilobyte per collection.
A related question: many of my HashSets are sets of integers (wrapped in a Copy/Clone struct), and it'd be lovely not to pay the overhead (which it looks like HashSet has?) of storing pointers to heap-allocated integers. In general, it seems the optimal collections for small Copy types would be different from those for large non-Copy types. Any suggestions?
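For instance, for a small set of Copy keys I'd imagine something like the following, which keeps the elements inline and skips hashing entirely (again just a sketch; Id stands in for my wrapper struct):

```rust
// Id stands in for my newtype wrapper around an integer.
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
struct Id(u32);

#[derive(Default)]
struct SmallSet {
    items: Vec<Id>, // elements stored inline, one allocation for the whole set
}

impl SmallSet {
    fn contains(&self, id: Id) -> bool {
        // Linear scan; cheap for small sets of Copy values.
        self.items.contains(&id)
    }

    fn insert(&mut self, id: Id) -> bool {
        if self.contains(id) {
            false
        } else {
            self.items.push(id);
            true
        }
    }
}

fn main() {
    let mut s = SmallSet::default();
    assert!(s.insert(Id(7)));
    assert!(!s.insert(Id(7)));
    assert!(s.contains(Id(7)));
}
```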