The test is compiled as a 32-bit binary, so size_t and int are both 32 bits.
That should not affect the test.
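For what it's worth, here is a trivial standalone check (not part of the
attached test, just a quick sanity check) to confirm the type sizes on
whatever build is being used:

    #include <stdio.h>

    int main(void)
    {
        /* On a 32-bit build both should print 4; on an LP64 build
         * size_t is 8 bytes while int stays at 4. */
        printf("sizeof(int)    = %u\n", (unsigned)sizeof(int));
        printf("sizeof(size_t) = %u\n", (unsigned)sizeof(size_t));
        return 0;
    }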
I'd recommend that someone there compile this under 32-bit Linux and
check whether the results are the same.
Kari
On 4/14/2010 3:05 AM, Cedric Roux wrote:
John Caron wrote:
Kari, I'm forwarding this to netcdfgroup@xxxxxxxxxxxxxxxx, which deals
with the C library.
-------- Original Message --------
Subject: Looks like a bug in the netCDF NC_64BIT_OFFSET variables
Date: Tue, 13 Apr 2010 17:48:33 -0500
From: Kari Hoijarvi <hoijarvi@xxxxxxxxxxxxxx>
To: support-netcdf@xxxxxxxxxxxxxxxx
CC: caron@xxxxxxxx, "Rudolf B. Husar (E-mail)"
<rhusar@xxxxxxxxxxxx>, "Stefan Falke (E-mail)" <stefan@xxxxxxxxxxxx>,
Michael Decker <m.decker@xxxxxxxxxxxxx>, Ben Domenico
<Ben@xxxxxxxxxxxxxxxx>
Hello,
This looks like a bug in the NC_64BIT_OFFSET big variable handling.
I have a test here that creates a 3000 * 1000 * 2000 float variable, 24 GB
in total.
After nc_enddef, the file size is suspiciously 4 GB.
After writing the data, the size is 16.0 GB (17,184,000,120 bytes)
instead of 24 GB.
Reading fails to produce the expected results at the strange offset 0, 516, 704.
I attached my version of netcdf/nc_test/nc_test.c; it has a new function,
test_big_cube_without_unlimited_dim().
It should be easy to copy that over and run it.
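For anyone reading along without the attachment, here is a minimal sketch
of the kind of test described above (the dimension and variable names and
the output path are invented for illustration; they are not taken from
nc_test.c):

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    #define NX 3000
    #define NY 1000
    #define NZ 2000

    static void check(int stat)
    {
        if (stat != NC_NOERR) {
            fprintf(stderr, "%s\n", nc_strerror(stat));
            exit(1);
        }
    }

    int main(void)
    {
        int ncid, varid, dimids[3];
        size_t start[3] = {0, 0, 0};
        size_t count[3] = {1, NY, NZ};
        size_t i, n = (size_t)NY * NZ;
        float *slab = malloc(n * sizeof *slab);  /* one 1000 x 2000 slab, 8 MB */

        if (slab == NULL) return 1;
        for (i = 0; i < n; i++)
            slab[i] = (float)i;

        /* 3000 * 1000 * 2000 floats = 24 GB, hence NC_64BIT_OFFSET. */
        check(nc_create("big_cube.nc", NC_CLOBBER | NC_64BIT_OFFSET, &ncid));
        check(nc_def_dim(ncid, "x", NX, &dimids[0]));
        check(nc_def_dim(ncid, "y", NY, &dimids[1]));
        check(nc_def_dim(ncid, "z", NZ, &dimids[2]));
        check(nc_def_var(ncid, "cube", NC_FLOAT, 3, dimids, &varid));
        check(nc_enddef(ncid));          /* file size can be inspected here */

        for (i = 0; i < NX; i++) {       /* write one x-slab at a time */
            start[0] = i;
            check(nc_put_vara_float(ncid, varid, start, count, slab));
        }
        check(nc_close(ncid));           /* final size should be about 24 GB */
        free(slab);
        return 0;
    }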
If you are on a 64-bit computer, your 'start' and 'count' arrays are wrong.
You should define them as 'size_t start[3];' and 'size_t count[3];'
instead of int, as you do now.
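Roughly like this, with made-up names for the handles and the data pointer:

    #include <stddef.h>
    #include <netcdf.h>

    /* The nc_put_vara_* / nc_get_vara_* functions take const size_t *
     * for start and count.  On a 64-bit host size_t is 64 bits while
     * int stays 32 bits, so int arrays passed to these calls are misread. */
    static int write_slab(int ncid, int varid, const float *data)
    {
        /* Wrong on a 64-bit host:
         *     int start[3] = {0, 0, 0};
         *     int count[3] = {1, 1000, 2000};
         */
        size_t start[3] = {0, 0, 0};          /* correct */
        size_t count[3] = {1, 1000, 2000};

        return nc_put_vara_float(ncid, varid, start, count, data);
    }

The count values here just echo the slab shape from the report; the
important part is the declared type of start and count.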
If you use the gcc compiler, you should compile with the '-Wall' option,
which emits some useful warnings.
If you are not on a 64-bit computer, I don't know. Your program works for
me on a 64-bit host.