
[netCDF #PRD-617628]: Large sized netcdf-4 file



Hi Yucheng,

> I would like to read a variable out of a netCDF file, apply compression to
> it, and then write everything back. Do you have sample code to do that? It
> is a bit messy to read the other variables out, so I would prefer not to
> read them if possible.

We don't have sample code for exactly that, but you can use the "nccopy"
utility that comes with netCDF version 4.2 to compress all the variables in a
netCDF file and write the result to another file, like this:

  nccopy -d 1 input.nc output.nc

which will apply compression level 1 to the variables in input.nc and write to 
output.nc.
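
If you do want to do it yourself in Fortran, here is a rough, untested
sketch.  It assumes a single 2-D variable of type real named "temperature"
(a made-up name; substitute your own) and copies it from input.nc into a
compressed netCDF-4 classic model file:

  program compress_one_var
    use netcdf
    implicit none
    integer :: ncid_in, ncid_out, varid_in, varid_out, i
    integer :: dimids(2), dimids_out(2), lens(2)
    character(len=nf90_max_name) :: dimnames(2)
    real, allocatable :: buf(:,:)

    ! Read the variable (and the shape of its dimensions) from the input file.
    call check( nf90_open("input.nc", nf90_nowrite, ncid_in) )
    call check( nf90_inq_varid(ncid_in, "temperature", varid_in) )
    call check( nf90_inquire_variable(ncid_in, varid_in, dimids=dimids) )
    do i = 1, 2
      call check( nf90_inquire_dimension(ncid_in, dimids(i), dimnames(i), lens(i)) )
    end do
    allocate(buf(lens(1), lens(2)))
    call check( nf90_get_var(ncid_in, varid_in, buf) )

    ! Write it to a new netCDF-4 classic model file with deflation turned on.
    call check( nf90_create("output.nc", IOR(nf90_netcdf4, nf90_classic_model), ncid_out) )
    do i = 1, 2
      call check( nf90_def_dim(ncid_out, trim(dimnames(i)), lens(i), dimids_out(i)) )
    end do
    call check( nf90_def_var(ncid_out, "temperature", nf90_real, dimids_out, varid_out) )
    ! shuffle=1, deflate=1, deflate_level=1 roughly corresponds to "nccopy -d 1"
    call check( nf90_def_var_deflate(ncid_out, varid_out, 1, 1, 1) )
    call check( nf90_enddef(ncid_out) )
    call check( nf90_put_var(ncid_out, varid_out, buf) )
    call check( nf90_close(ncid_out) )
    call check( nf90_close(ncid_in) )
  contains
    subroutine check(status)
      integer, intent(in) :: status
      if (status /= nf90_noerr) then
        print *, trim(nf90_strerror(status))
        stop 2
      end if
    end subroutine check
  end program compress_one_var

You can check that the deflation actually took effect with
"ncdump -s -h output.nc", which shows the special _DeflateLevel attribute
on each compressed variable.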

--Russ

> -----Original Message-----
> From: Unidata netCDF Support [mailto:address@hidden]
> Sent: Wednesday, February 15, 2012 5:47 PM
> To: Song, Yucheng
> Cc: address@hidden
> Subject: [netCDF #PRD-617628]: Large sized netcdf-4 file
> 
> Hi,
> 
> > Also, I couldn't find an example showing how to OR classic_format and
> > hdf5 together to create the netCDF-4 classic format.
> > nf90_create(FILE_NAME, (nf90_classic_model|nf90_hdf5)) doesn't work.
> 
> Sorry, the documentation should be clearer.  The "|" operator is C syntax;
> in Fortran, you combine the flags with the IOR intrinsic:
> 
> nf90_create(FILE_NAME, IOR(nf90_netcdf4, nf90_classic_model), ncid)
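> 
> For context, a complete minimal program (it just creates an empty
> netCDF-4 classic model file; "example.nc" is a made-up name) would be:
> 
>   program create_nc4_classic
>     use netcdf
>     implicit none
>     character(len=*), parameter :: FILE_NAME = "example.nc"
>     integer :: ncid
>     ! IOR combines the two creation-mode flags, like "|" in C
>     call check( nf90_create(FILE_NAME, IOR(nf90_netcdf4, nf90_classic_model), ncid) )
>     call check( nf90_close(ncid) )
>   contains
>     subroutine check(status)
>       integer, intent(in) :: status
>       if (status /= nf90_noerr) then
>         print *, trim(nf90_strerror(status))
>         stop 2
>       end if
>     end subroutine check
>   end program create_nc4_classic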
> 
> > The confusing thing about netCDF is that the online documentation is
> > vague about how to generate a netCDF-4 classic file - when compiling the
> > library for this, does one need the hdf5 lib at all?
> 
> Yes, the hdf5 library is needed, because a netCDF-4 classic model file is
> really an HDF5 file with an artifact to make sure it is readable by old
> programs linked to the netCDF-4 library.
> For an explanation, see this FAQ, and maybe some of the subsequent questions 
> and answers:
> 
> http://www.unidata.ucar.edu/netcdf/docs/faq.html#fv1
> 
> > Using your online example pres_temp_4D_wr.f90, I tried modifying the
> > create line to call check( nf90_create(FILE_NAME, nf90_hdf5, ncid) ).
> >
> > Now when I look at the file sizes, the new file is significantly larger
> > (7 times); I thought hdf5 should be smaller.  Is anything wrong?
> >
> > -rw-rw-r--  16867 Feb 10 10:55 pres_temp_4D.nc
> > -rw-rw-r--   2784 Feb 10 10:38 pres_temp_4D.nc_nocompress
> 
> netCDF-4 files are only smaller if compression is used when writing the data.
> 
> Here's an answer to an FAQ on that:
> 
> http://www.unidata.ucar.edu/netcdf/docs/faq.html#fv8
> 
> Compression is applied on a variable-by-variable basis, so it's possible to
> use different compression levels for different variables.  But if you just
> want to compress all the variables in the file at the same compression
> level, the easiest way is to use the nccopy utility, as in
> 
> nccopy -d1 foo.nc foo-compressed.nc
> 
> to use "deflation level" 1.
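> 
> If you later want per-variable control in your own Fortran code, the
> relevant call is nf90_def_var_deflate, used in define mode right after
> nf90_def_var.  A fragment (varid_t and varid_p are hypothetical variable
> IDs, and "check" is an error-checking helper as in our examples):
> 
>   ! arguments are (ncid, varid, shuffle, deflate, deflate_level)
>   call check( nf90_def_var_deflate(ncid, varid_t, 1, 1, 1) )  ! level 1
>   call check( nf90_def_var_deflate(ncid, varid_p, 1, 1, 5) )  ! level 5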
> 
> Note that for small files, the compressed netCDF-4 files will actually be 
> larger than uncompressed netCDF-3 (classic format) files, because the 
> underlying HDF5 format has some fixed-size overhead. But for large datasets, 
> compression will save space, at the expense of taking more time to read and 
> write the data.
> 
> I just tested the 4.1.3 nccopy on an example file for compression and got 
> this:
> 
> $ nccopy -d1 llnl.nc llnl-compressed.nc && ls -l llnl*.nc
> -rw-rw-r-- 1 russ ustaff 13802 Feb 15 14:58 llnl-compressed.nc
> -rw-rw-r-- 1 russ ustaff 32864 Oct 17 11:05 llnl.nc
> 
> --Russ
> 
> 
> Russ Rew                                         UCAR Unidata Program
> address@hidden                      http://www.unidata.ucar.edu

Russ Rew                                         UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu



Ticket Details
===================
Ticket ID: PRD-617628
Department: Support netCDF
Priority: High
Status: Closed