
Read an HDF5 one-dimensional compound dataset into a C structure in parallel

I have a very simple compound dataset, about 1 million rows of a compound datatype:

1 long, 3 doubles.

I would like to read it in parallel with MPI using a collective call, with the rows distributed equally across all processors.

The HDF5 low-level interface is very complex to use, and for such a simple dataset I wonder if I could use one of the high-level APIs, like Lite or Table. What is not clear from the documentation is whether these APIs support MPI collective reads.

Could somebody write the simplest code snippet that can read this simple dataset into a C structure using HDF5 1.8?

I believe this should be trivial for an expert, but for a beginner the complexity of HDF5 (and its documentation) makes it a really daunting task.

thanks.


Have you found the HDF5 parallel I/O tutorial?

http://www.hdfgroup.org/HDF5/Tutor/parallel.html

Sounds like you will also need to learn how to construct HDF5 datatypes:

http://www.hdfgroup.org/HDF5/Tutor/datatypes.html
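
For the compound type in your question (1 long, 3 doubles), building the in-memory datatype looks roughly like this. The struct and member names here are just placeholders; use whatever member names the dataset actually stores:

#include <hdf5.h>

typedef struct {
    long   id;      /* placeholder member names */
    double x, y, z;
} record_t;

/* Build an in-memory compound type that mirrors record_t. */
hid_t make_memtype(void)
{
    hid_t memtype = H5Tcreate(H5T_COMPOUND, sizeof(record_t));
    H5Tinsert(memtype, "id", HOFFSET(record_t, id), H5T_NATIVE_LONG);
    H5Tinsert(memtype, "x",  HOFFSET(record_t, x),  H5T_NATIVE_DOUBLE);
    H5Tinsert(memtype, "y",  HOFFSET(record_t, y),  H5T_NATIVE_DOUBLE);
    H5Tinsert(memtype, "z",  HOFFSET(record_t, z),  H5T_NATIVE_DOUBLE);
    return memtype;
}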

In order to do a parallel collective call you've got to do a few things. First, you need to decompose your dataset across the processors. A little arithmetic gives you the start and count parameters you'll need (there's a small helper sketched below). You will also need to enable parallel I/O with an HDF5 property list, which is well documented in the parallel I/O tutorial. Less well documented is the flag for enabling collective I/O:

xfer_plist = H5Pcreate(H5P_DATASET_XFER);
ret = H5Pset_dxpl_mpio(xfer_plist, H5FD_MPIO_COLLECTIVE);
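
For the decomposition arithmetic, a helper along these lines is enough: an even split, with the last rank picking up the remainder (names are illustrative):

#include <hdf5.h>

/* Split nrows rows evenly across mpi_size ranks; the last rank
 * absorbs the remainder. */
void decompose(hsize_t nrows, int mpi_rank, int mpi_size,
               hsize_t *start, hsize_t *count)
{
    *count = nrows / mpi_size;
    *start = *count * (hsize_t)mpi_rank;
    if (mpi_rank == mpi_size - 1)
        *count += nrows % mpi_size;
}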

There's a great example at the end of this page:

http://www.hdfgroup.org/Parallel_HDF/PHDF5/ph5design.html
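
Putting it together, a minimal sketch for HDF5 1.8 could look like the following. It reuses record_t, make_memtype() and decompose() from the sketches above; the file name "data.h5" and dataset name "/records" are made up for illustration, and error checking is omitted for brevity:

#include <stdlib.h>
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Open the file through the MPI-IO driver. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fopen("data.h5", H5F_ACC_RDONLY, fapl);

    hid_t dset   = H5Dopen(file, "/records", H5P_DEFAULT);   /* 1.8 API */
    hid_t fspace = H5Dget_space(dset);
    hsize_t nrows;
    H5Sget_simple_extent_dims(fspace, &nrows, NULL);          /* 1-D dataset */

    hsize_t start, count;
    decompose(nrows, rank, size, &start, &count);

    /* Each rank selects its own slab of the file dataspace. */
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &start, NULL, &count, NULL);
    hid_t mspace  = H5Screate_simple(1, &count, NULL);
    hid_t memtype = make_memtype();

    /* Collective read: every rank must participate in this call. */
    hid_t xfer = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(xfer, H5FD_MPIO_COLLECTIVE);
    record_t *buf = malloc((size_t)count * sizeof(record_t));
    H5Dread(dset, memtype, mspace, fspace, xfer, buf);

    /* ... use buf[0 .. count-1] here ... */

    free(buf);
    H5Pclose(xfer); H5Tclose(memtype); H5Sclose(mspace);
    H5Sclose(fspace); H5Dclose(dset); H5Fclose(file); H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}

Compile it with the parallel HDF5 wrapper (typically h5pcc, or mpicc plus the HDF5 flags) and launch it with mpirun; each rank ends up with its own contiguous slice of the rows in buf.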

