[netcdfgroup] variable compression (deflate) failing in 4.3.2

Hi,

Something seems to have changed between netCDF 4.3.1 and 4.3.2: compression has 
stopped working for my application, which creates a number of 3- and 
4-dimensional variables. One of the dimensions is unlimited. I'm using the 
4.4.0 Fortran (f90) interface. ncdump -s shows the same attributes for both 
4.3.1 and 4.3.2, for example:

                QC:_ChunkSizes = 1, 10, 40, 40 ;
                QC:_DeflateLevel = 2 ;
                QC:_Shuffle = "true" ;

But the file size from 4.3.2 is the same as when deflate is turned off. Very 
strange. I tested one of the check programs in nf_test with a larger array, and 
it works as expected with both 4.3.1 and 4.3.2 (i.e., it compresses 
successfully with both). I also changed f90tst_vars4.f90 to write out a 4-d 
variable with an unlimited time dimension, but it does not reproduce the 
problem. We are also seeing a similar problem with WRF model history files 
(even with just one time level), where it appears that some of the variables 
are getting compressed but some are not: the files are a little smaller (about 
30%) than without deflate, but still substantially larger than with an earlier 
version of netCDF, which produces files about 1/3 the non-deflated size.
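For reference, the variables are defined through the f90 interface roughly as 
sketched below (variable and dimension names are illustrative, not my actual 
code; note the chunk sizes are reversed relative to the C-order values that 
ncdump -s reports):

```fortran
use netcdf
integer :: ncid, varid, status
integer :: dimids(4)   ! (x, y, z, time), time is the unlimited dimension

! Define a 4-d float variable with explicit chunking.
! ncdump -s shows _ChunkSizes = 1, 10, 40, 40 in C (slowest-first) order,
! which corresponds to (/40, 40, 10, 1/) in Fortran dimension order.
status = nf90_def_var(ncid, "QC", NF90_FLOAT, dimids, varid, &
                      chunksizes=(/40, 40, 10, 1/))

! Request shuffle plus deflate level 2 -- the same settings that
! ncdump -s reports as _Shuffle = "true" and _DeflateLevel = 2.
status = nf90_def_var_deflate(ncid, varid, shuffle=1, deflate=1, &
                              deflate_level=2)
```

Both calls return NF90_NOERR, so as far as the API is concerned the deflate 
filter is being applied.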

Has anybody else noticed this? The data are written out fine and no errors are 
generated (or at least none are detected).

-- Ted

__________________________________________________________
| Edward Mansell <ted.mansell@xxxxxxxx>
| National Severe Storms Laboratory, Norman, OK.
|--------------------------------------------------------------
| "The contents of this message are mine personally and
| do not reflect any position of the U.S. Government or NOAA."
|--------------------------------------------------------------


