Does the hdf5 crate support partial writes?
If the data to be stored fit comfortably in memory, they can be written to an HDF5 file roughly like this:
```rust
hdf5::File::create(file_name)?
    .create_group(group_name)?
    .new_dataset_builder()
    .with_data(&data)
    .create(dataset_name)?;
```
But what can be done if the data are too large to fit in memory all at once?
I can't figure this out from the API docs alone.
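From skimming the docs, my best guess is that one should pre-allocate a dataset with fixed extents and then write it block by block, so only one block ever lives in memory. The sketch below is what I have tried; the chunk/shape values are arbitrary, and I am not sure `write_slice` with an ndarray `s![...]` selection is the intended way to do a partial write:

```rust
// Sketch, assuming the `hdf5` crate with its `ndarray` feature enabled.
use ndarray::{s, Array2};

fn main() -> hdf5::Result<()> {
    let file = hdf5::File::create("big.h5")?;
    let group = file.create_group("measurements")?;

    // Pre-allocate the full extents up front; chunking (sizes are a guess)
    // should mean each partial write only touches the chunks it covers.
    let ds = group
        .new_dataset::<f64>()
        .chunk((100, 1_000))
        .shape((10_000, 1_000))
        .create("signal")?;

    // Write one 100-row block at a time instead of the whole array.
    for i in 0..100 {
        let block: Array2<f64> = Array2::from_elem((100, 1_000), i as f64);
        ds.write_slice(&block, s![i * 100..(i + 1) * 100, ..])?;
    }
    Ok(())
}
```

This compiles for me, but I don't know whether it is idiomatic, or whether there is a builder-level way to do incremental writes that I am missing.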