Let me make sure I understand. That counter is an HDF5 limitation,
correct?
=Dennis Heimbigner
Unidata
On 1/26/2017 3:46 PM, Dave Allured - NOAA Affiliate wrote:
All,
We traced this problem to a known bug. In NetCDF-4, the number of times
an attribute can be modified over the life of a file is currently
limited by the per-variable HDF5 attribute creation index. This is a
16-bit counter with maximum value 65535.
This becomes a problem for data sets with attributes that are updated
frequently. At the NetCDF user level, the problem shows as return
status -107, or "NetCDF: Can't open HDF5 attribute", as reported below
by Cathy Smith.
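Here is a minimal C sketch of how the limit can be hit (an illustration only, not code from the report or from the issue below; the file, dimension, variable, and attribute names are arbitrary). It rewrites one attribute over and over, closing the file between rewrites the way a cron-driven update would, and prints the iteration at which the library starts failing:

#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

#define FNAME "attr_counter_test.nc"   /* arbitrary example file name */

#define CHECK(e) do { int s_ = (e); if (s_ != NC_NOERR) { \
    fprintf(stderr, "netCDF error: %s\n", nc_strerror(s_)); exit(1); } } while (0)

int main(void)
{
    int ncid, dimid, varid;

    /* Create a netCDF-4 file with one variable to carry the attribute. */
    CHECK(nc_create(FNAME, NC_NETCDF4 | NC_CLOBBER, &ncid));
    CHECK(nc_def_dim(ncid, "x", 1, &dimid));
    CHECK(nc_def_var(ncid, "precip", NC_FLOAT, 1, &dimid, &varid));
    CHECK(nc_close(ncid));

    /* Reopen, overwrite the attribute, and close on every pass.  Each
       rewrite is expected to use up one slot of the 16-bit attribute
       creation index, so failures should begin near rewrite 65535. */
    for (unsigned long i = 0; i < 70000UL; i++) {
        float range[2] = { (float)i, (float)i + 1.0f };
        int status;

        CHECK(nc_open(FNAME, NC_WRITE, &ncid));
        CHECK(nc_inq_varid(ncid, "precip", &varid));
        status = nc_put_att_float(ncid, varid, "actual_range",
                                  NC_FLOAT, 2, range);
        if (status != NC_NOERR) {
            printf("attribute rewrite failed at iteration %lu: %s\n",
                   i, nc_strerror(status));
            nc_close(ncid);
            break;
        }
        CHECK(nc_close(ncid));
    }
    return 0;
}

Compiled against the netCDF-C library, it should start failing somewhere near the 65535th rewrite, consistent with the 16-bit limit described above.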
Please see this Github issue for more details. Thanks to Constantine
Khroulev for a great analysis, as well as previous reporters Heiko Klein
and Andrey Paramonov (HDF forum).
https://github.com/Unidata/netcdf-c/issues/350
We look forward to some kind of fix at the NetCDF or HDF5 level.
--Dave A.
NOAA/ESRL/PSD/CIRES
On Tue, Nov 15, 2016 at 8:45 AM, Cathy Smith (NOAA Affiliate)
<cathy.smith@xxxxxxxx> wrote:
Hi all
We are trying to figure out why we have been getting a particular
HDF5 error when trying to change an attribute. I run update code for
datasets via cron. At random times over the last year, for at least 5
separate files in different datasets with different update code, we
discovered we suddenly couldn't update the actual_range attribute
for a variable because the netCDF API no longer saw the attribute.
There was no obvious cause. Once a file was "broken" it stayed
that way. Trying to update or fix the attribute gives the errors
below.
To fix a broken file, we either restore it from backup or regenerate
it by running nccopy on the broken file.
Someone here has done extensive testing. Fortran, NCO, and NCL
cannot operate on the attribute, and neither can h5edit. However,
some HDF5 applications that view the file report no error, and
ncdump reads the file without error, though it does not show the
actual_range attribute.
Has anyone seen this error before? Any ideas on the cause? We are
debating whether NFS issues or the library is to blame, but are
leaning towards the former, at least partially.
I have a broken file and a working file available via FTP:
ncftp ftp.cdc.noaa.gov
cd Public/csmith/netcdf
Thanks for any insight
Cathy Smith
ESRL/PSD
Commands we ran to show error:
ncdump -h precip.V1.0.2016.nc | grep act
lat:actual_range = 20.125f, 49.875f ;
lon:actual_range = 230.125f, 304.875f ;
time:actual_range = 1893408., 1900824. ;
When I try to add it back, I get
ncatted -h -O -a actual_range,precip,c,f,"100000.,-100000." precip.V1.0.2016.nc
nco_err_exit(): ERROR Short NCO-generated message (usually name of function that triggered error): nco_enddef()
nco_err_exit(): ERROR Error code is -107. Translation into English with nc_strerror(-107) is "NetCDF: Can't open HDF5 attribute"
nco_err_exit(): ERROR NCO will now exit with system call exit(EXIT_FAILURE)
and
ncatted -h -O -a actual_range,precip,o,f,"100000.,-100000." precip.V1.0.2016.nc
nco_err_exit(): ERROR Short NCO-generated message (usually name of function that triggered error): nco_enddef()
nco_err_exit(): ERROR Error code is -107. Translation into English with nc_strerror(-107) is "NetCDF: Can't open HDF5 attribute"
nco_err_exit(): ERROR NCO will now exit with system call exit(EXIT_FAILURE)