If one’s goal is to have a “canonical” concurrent hash map, then it is unlikely that a C++ library will ever satisfy that; many developers will want their data structure libraries to be written entirely in Rust, for portability, a higher likelihood of soundness, and seamless integration with Rust generics and allocators.
My question may have conflated two problems. My goal is not to create a canonical concurrent hash map in Rust at this time. Realizing that a pure Rust solution does not exist today, I am searching out a Plan B that will allow me to gain the capability while a pure Rust solution progresses separately.
One possible approach is to use the C++ interop to work with Intel oneAPI concurrent_hash_map, and I am curious if that is a reasonable approach, or if the community has generally settled on a different Plan B for the moment.
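For concreteness, the interop approach might look something like the following sketch using the `cxx` crate. Note that `cxx` cannot bind templated types such as `tbb::concurrent_hash_map<K, V>` directly, so a thin C++ shim exposing a concrete instantiation is needed; the header name, opaque type, and function names below are all hypothetical placeholders, not a real API.

```rust
// A sketch only: requires the `cxx` crate, a cxx-build step, and a
// hand-written C++ shim (tbb_map_shim.h / .cpp) that wraps a concrete
// instantiation such as tbb::concurrent_hash_map<uint64_t, uint64_t>.
#[cxx::bridge]
mod ffi {
    unsafe extern "C++" {
        include!("tbb_map_shim.h"); // hypothetical wrapper header

        // Opaque handle to the wrapped concurrent_hash_map instantiation.
        type TbbMap;

        fn new_tbb_map() -> UniquePtr<TbbMap>;
        // Returns true if the key was newly inserted.
        fn insert(map: Pin<&mut TbbMap>, key: u64, value: u64) -> bool;
        // Returns true and writes to `out` if the key was found.
        fn find(map: &TbbMap, key: u64, out: &mut u64) -> bool;
    }
}
```

The main costs of this route are the extra build complexity (a C++ toolchain and TBB as system dependencies) and the unsafe boundary, which is part of why the replies below push toward crates.io instead.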
I think this comes down to a difference in philosophy between Rust and C++.
Traditionally, the C++ committee has added more and more stuff to the standard library because the dependency story in the C & C++ world is... lacking. Most of the time people find it easier to rely on the C++ standard library than to link against a 3rd-party library, so you end up including everything and the kitchen sink. This is how you get situations like std::regex, which has massive performance problems that can't be fixed for ABI-stability reasons.
On the other hand, Rust's standard library tends to be minimalist and only include:

- abstractions over the underlying OS (std::fs, std::path, std::thread, std::process, etc.),
- common data structures that will be used in essentially every program (Vec, HashMap, String, etc.), and
- core abstractions that are needed to make the language work (Iterator, Send and Sync, Future, etc.).
For anything that might only be applicable to a specialised domain (matrices, GUIs, etc.), or where there is no one obvious implementation (regex, JSON, HTTP servers/clients, etc.), we prefer to use crates from crates.io.
A concurrent hash map is actually a pretty specialised data structure, and there are multiple ways to implement one, each with different trade-offs (how many concurrent writers are allowed? when do modifications become visible to other readers? etc.), so it falls into that latter category.
Adding native libraries to your dependency tree is actually a big pain, because their build systems tend to be special snowflakes. Many C libraries aren't truly portable, or have weird system dependencies that mean you can't use them outside of a Linux environment in practice.
For example, I can cross-compile most pure Rust projects to WebAssembly and everything will Just Work, even if they were never intended to be compiled to WebAssembly, but I've yet to find a native library that will work out of the box.