Two things come to mind.
1. I assume you are writing a netCDF-4 (enhanced model) file. There is
overhead associated with that format, so even if you wrote no data the
file would have a noticeable size. However, it should not be 73 MB!
2. Do you have a fill value set, and are there other, large variables
that you did not write? If so, they will implicitly be filled with the
fill value and written out; see the sketch below.
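If filling turns out to be the culprit, it can be disabled, either for
the whole dataset or per variable. A minimal sketch with the netCDF4
python module (untested here; the file and variable names are just
placeholders):

from netCDF4 import Dataset

f = Dataset('example.nc', 'w')
# Turn off pre-filling for variables defined from here on.
f.set_fill_off()
dim = f.createDimension('d', 1000000)
# Or turn off filling for a single variable as it is created.
v = f.createVariable('big', 'f8', ('d',), fill_value=False)
f.close()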
=Dennis Heimbigner
Unidata
On 4/5/2016 12:44 PM, Val Schmidt wrote:
Hello netcdf folks,
I’m testing some Python code that writes sets of timestamps and
variable-length binary blobs to a netCDF file, and the resulting file
size is perplexing to me.
The following segment of python code creates a file with just two
variables, “timestamp” and “data”, populates the first entry of the
timestamp variable with a float and the corresponding first entry of
the data variable with an array of 100 unsigned 8-bit integers. The
total amount of data is 108 bytes.
But the resulting file is over 73 MB in size. Does anyone know why
this might be so large and what I might be doing to cause it?
Thanks,
Val
from netCDF4 import Dataset
import numpy
import time

f = Dataset('scratch/text3.nc', 'w')

# Two unlimited dimensions, one per variable.
dim = f.createDimension('timestamp_dim', None)
data_dim = f.createDimension('data_dim', None)

# Variable-length type of unsigned 8-bit integers.
data_t = f.createVLType('u1', 'variable_data_t')

timestamp = f.createVariable('timestamp', 'd', 'timestamp_dim')
data = f.createVariable('data', data_t, 'data_dim')

# One 8-byte timestamp plus 100 one-byte values: 108 bytes in total.
timestamp[0] = time.time()
data[0] = numpy.ones(100, dtype='u1')
f.close()
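For reference, the storage layout of the resulting file can be listed
by reopening it; a quick sketch along those lines:

# Reopen the file and print each variable's type, dimensions and
# chunking, to see where the space is going.
g = Dataset('scratch/text3.nc', 'r')
for name, var in g.variables.items():
    print(name, var.dtype, var.dimensions, var.chunking())
g.close()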
------------------------------------------------------
Val Schmidt
CCOM/JHC
University of New Hampshire
Chase Ocean Engineering Lab
24 Colovos Road
Durham, NH 03824
e: vschmidt [AT] ccom.unh.edu
m: 614.286.3726