Adopt the new HDF5 library API for chunk info in the DMR++ handler
As part of this API adoption, create and maintain a test server running the new code against an existing (NASA?) S3 bucket of data files.
Sort out `dataset_rank` - the rank should be obtainable from the dataset's dataspace via `H5Sget_simple_extent_ndims()`
Figure out the "copy" loop (or a similar way to copy coordinate values)
Sort out filter information - find out what `filter_mask` means (per `H5Dget_chunk_info()`, it is a per-chunk bitmask in which a set bit indicates the corresponding filter in the dataset's filter pipeline was skipped when that chunk was written)
Make sure COMPACT storage works - compact datasets keep their data in the object header (`H5D_COMPACT` layout), so there are no chunks to query and the values must be read directly with `H5Dread()`