Re: question about HDF5 parallel use of unlimited dimensions...

In general, any update to the file in parallel needs to be done
thoughtfully, and should be done in batches rather than in small
increments.

Extending by single records is inefficient in any case, but it is
especially costly in parallel, since each extension updates global
metadata (the dataset's current size) that every process must agree on.
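
For example, the writer can grow the dataset by a whole batch with one
collective call and then let each process fill in its own slice of the
new space independently. A minimal sketch of that pattern follows; the
file name, dataset name, and BATCH size are illustrative only, and
H5Dset_extent supersedes H5Dextend in HDF5 1.8 (both are collective
under MPI-IO).

/* Sketch: grow an unlimited 1-D dataset one batch at a time in
 * parallel HDF5.  Build with the parallel wrapper, e.g.
 *   h5pcc batch_extend.c -o batch_extend && mpiexec -n 4 ./batch_extend
 */
#include <mpi.h>
#include <hdf5.h>

#define BATCH 1024   /* records each process contributes per extension */

int main(int argc, char *argv[])
{
    int rank, nprocs;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    /* Open one shared file through the MPI-IO driver. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);
    hid_t file = H5Fcreate("batch.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* An initially empty 1-D dataset with one unlimited dimension;
     * chunking is required for unlimited dimensions. */
    hsize_t dims = 0, maxdims = H5S_UNLIMITED, chunk = BATCH;
    hid_t space = H5Screate_simple(1, &dims, &maxdims);
    hid_t dcpl  = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, &chunk);
    hid_t dset  = H5Dcreate(file, "records", H5T_NATIVE_DOUBLE, space,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT); /* 1.8+ form */

    /* Extend once per batch -- the collective call -- not once per
     * record.  Every process passes the same new size. */
    hsize_t newsize = (hsize_t)nprocs * BATCH;
    H5Dextend(dset, &newsize);          /* H5Dset_extent in HDF5 >= 1.8 */

    /* Each process writes its own disjoint slice of the new space. */
    double buf[BATCH];
    for (int i = 0; i < BATCH; i++)
        buf[i] = rank;                  /* placeholder data */

    hsize_t start = (hsize_t)rank * BATCH, count = BATCH;
    hid_t filespace = H5Dget_space(dset);
    H5Sselect_hyperslab(filespace, H5S_SELECT_SET, &start, NULL,
                        &count, NULL);
    hid_t memspace = H5Screate_simple(1, &count, NULL);
    hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
    H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);
    H5Dwrite(dset, H5T_NATIVE_DOUBLE, memspace, filespace, dxpl, buf);

    H5Pclose(dxpl); H5Sclose(memspace); H5Sclose(filespace);
    H5Dclose(dset); H5Pclose(dcpl); H5Sclose(space);
    H5Fclose(file); H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}

The global metadata update then happens once per nprocs*BATCH records
rather than once per record.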

On Wed, 29 Jun 2005, Ed Hartnett wrote:

> Howdy all!
> 
> I have a question about parallel programming with HDF5. Do parallel
> programmers just not use unlimited dimensions?
> 
> I noticed that H5Dextend is a collective operation. This would make it
> difficult for programs to add a record at a time.
> 
> Alternatively I could imagine a scheme where extending the dataset
> would be buffered - some large number of records could be added
> whenever the current batch is used up.
> 
> Any thoughts or comments about the use or misuse of unlimited
> dimensions with HDF5 parallel would be most welcome.
> 
> Thanks!
> 
> Ed
> 
> 
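The buffered scheme Ed describes above amounts to amortizing the
collective call: preallocate room for a large batch of records, count
appends, and extend again only when the batch is used up. A
hypothetical helper along those lines (note that every process must
still reach the H5Dextend together, so this only works when all ranks
append in lockstep, or when a single process owns the file):

#include <hdf5.h>

#define CAPACITY 4096   /* records preallocated per collective extension */

typedef struct {
    hid_t   dset;       /* open dataset with one unlimited dimension */
    hsize_t nused;      /* records logically appended so far */
    hsize_t nalloc;     /* records currently allocated on disk */
} append_state;

/* Append one record; only extends the dataset (the collective,
 * expensive step) once every CAPACITY appends. */
static herr_t append_record(append_state *s, const double *record)
{
    if (s->nused == s->nalloc) {
        s->nalloc += CAPACITY;
        if (H5Dextend(s->dset, &s->nalloc) < 0)   /* collective */
            return -1;
    }
    hsize_t start = s->nused, count = 1;
    hid_t filespace = H5Dget_space(s->dset);
    H5Sselect_hyperslab(filespace, H5S_SELECT_SET, &start, NULL,
                        &count, NULL);
    hid_t memspace = H5Screate_simple(1, &count, NULL);
    herr_t status = H5Dwrite(s->dset, H5T_NATIVE_DOUBLE, memspace,
                             filespace, H5P_DEFAULT, record);
    H5Sclose(memspace);
    H5Sclose(filespace);
    if (status >= 0)
        s->nused++;
    return status;
}

The number of records actually in use (nused) would have to be recorded
somewhere, e.g. in an attribute, since the dataset's on-disk size now
overstates the logical length.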

-- 
Robert E. McGrath
National Center for Supercomputing Applications
University of Illinois, Urbana-Champaign
Champaign, Illinois 61820
(217)-333-6549

mcgrath@xxxxxxxxxxxxx

