
[netCDF #CPE-163785]: file size does not change



Hi Jailin,

> I created a large 4D netCDF file and am appending data along the time
> dimension (unlimited), but the file size (as reported by "ls -al -h")
> doesn't change as more data are appended; it stays at 155G.
> Does anyone know the reason?

Is this a netCDF-4 file or a netCDF-3 classic format file?  To determine
the format, just look at the output from running

  $ ncdump -k filename.nc
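
For reference, "ncdump -k" prints a short format name; depending on your
netCDF version, the possibilities include

  classic
  64-bit offset
  netCDF-4
  netCDF-4 classic model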

If it's a netCDF-4 file (or a netCDF-4 classic model file), then using an
unlimited dimension requires chunking, and it's possible to specify chunk
shapes such that a single chunk holds many records along the time dimension.
In that case the file only grows when a new chunk is allocated, so appends
that merely fill in the current chunk don't change the file size.  To see
whether this is the case, it would be useful to see the chunk sizes and
shapes, by providing the output from running

  $ ncdump -h -s filename.nc

and looking at the "_ChunkSizes" attributes.  _ChunkSizes is a list of chunk 
sizes for each dimension of a variable.
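
For example (the variable name and chunk sizes below are invented, just to
illustrate), a header excerpt like

  float temp(time, level, lat, lon) ;
          temp:_ChunkSizes = 4096, 1, 64, 128 ;

would explain what you're seeing: with a chunk length of 4096 along the time
dimension, the file would only grow once per 4096 appended records.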

If it's a netCDF-3 file, it's possible to write the data values out of order:
writing data for a large value of the time dimension first extends the file
to hold all records up to that time, and subsequently writing values for
earlier times just fills in records that were already allocated.  In that
case the file size would start out large, then stay the same as the earlier
records are filled in.
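
As a sketch of how that can happen (untested, with made-up file and variable
names, and error handling abbreviated), a classic-format writer like the
following reaches its final file size on the very first write:

  #include <stdio.h>
  #include <stdlib.h>
  #include <netcdf.h>

  #define CHECK(e) do { int r = (e); if (r != NC_NOERR) { \
      fprintf(stderr, "%s\n", nc_strerror(r)); exit(1); } } while (0)

  int main(void) {
      int ncid, dimids[2], varid;
      float data[1000] = {0};
      size_t start[2] = {999, 0};    /* write record 999 first ... */
      size_t count[2] = {1, 1000};

      /* classic-format file with an unlimited time dimension */
      CHECK(nc_create("demo.nc", NC_CLOBBER, &ncid));
      CHECK(nc_def_dim(ncid, "time", NC_UNLIMITED, &dimids[0]));
      CHECK(nc_def_dim(ncid, "x", 1000, &dimids[1]));
      CHECK(nc_def_var(ncid, "var", NC_FLOAT, 2, dimids, &varid));
      CHECK(nc_enddef(ncid));

      CHECK(nc_put_vara_float(ncid, varid, start, count, data));
      /* ... the file now holds space for 1000 records, so writing
         records 0..998 afterward fills in existing space and
         "ls -l" reports the same size the whole time */
      for (start[0] = 0; start[0] < 999; start[0]++)
          CHECK(nc_put_vara_float(ncid, varid, start, count, data));
      CHECK(nc_close(ncid));
      return 0;
  }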

Do you have a small program that demonstrates the behavior?  It would be easier
to reproduce if you could demonstrate it with a file smaller than 155 GBytes 
:-).

--Russ



Russ Rew                                         UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu



Ticket Details
===================
Ticket ID: CPE-163785
Department: Support netCDF
Priority: Normal
Status: Closed