So it does work with fewer variables? Perhaps this is a netcdf3 issue
with record size. It's not the total size of the file that is the
issue, but the size of one record (how much data per increment of the
unlimited dimension). So if you have fewer times but more data per
time step, you could still be hitting the record-size limit. What are
your variable array sizes?
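For reference, the record size is just the sum, over all record variables, of the bytes needed per step of the unlimited dimension. Here's a quick back-of-the-envelope estimate -- the variable names, shapes, and types below are made-up placeholders; substitute your own:

```python
# Estimate the netCDF3 record size: the bytes written per increment of
# the unlimited dimension, summed over all record variables.
# The shapes and element sizes here are hypothetical examples.
import math

# (name, per-record shape excluding the unlimited dim, bytes per element)
record_vars = [
    ("temperature", (100, 200, 300), 4),   # 32-bit float
    ("pressure",    (100, 200, 300), 4),   # 32-bit float
    ("humidity",    (100, 200, 300), 8),   # 64-bit float
]

record_size = sum(math.prod(shape) * elem_size
                  for _name, shape, elem_size in record_vars)

print(f"record size: {record_size / 2**20:.1f} MiB per time step")
```

If that number gets anywhere near 2 GiB, the classic netCDF3 format is a likely suspect, even when the total file size is well under what you've written before.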
I have the same task (combining files from a parallel application),
and have used netcdf4 successfully to create combined files. If you
can use netcdf4, you might try converting the code to use it and turn
on compression for variables -- it can save a LOT of disk space. I
recently worked with a student to produce a large weather simulation,
and with netcdf4 the entire domain could be combined, but only a
fraction would fit into a netcdf3 file. (He needed netcdf3 since his
analysis software does not yet support netcdf4.) So you might see if
combining only part of the data works.
Best,
-- Ted
On Jul 31, 2009, at 3:21 AM, Giuseppe Grieco wrote:
It does not seem to be a problem with the size of the final assembled
NetCDF file, because I have been able to assemble files larger than
10 GB, and in this case the file should be smaller than 8 GB.
In particular, when I try to assemble the file with a smaller number
of variables (for example 8 instead of 23), it works correctly. It
therefore seems that my NetCDF library has some kind of combined
limit on the size and the number of variables. Is that possible? Can
anyone help me?