[netcdfgroup] Problem with netCDF "classic" files larger than 2GB on Windows

Folks,

I am having a problem writing netCDF "classic" files larger than 2GB
when building with the Microsoft Visual Studio 2008 compiler.  The same
code works fine when built on Linux and on Cygwin.

A bit of background.  This is a project to build generic file writers
for cameras and detectors in the areaDetector package
(http://cars9.uchicago.edu/software/epics/areaDetectorDoc.html) for the
EPICS (http://www.aps.anl.gov/epics/) real-time control system.  This is
a large project, with its own build system based on gnumake.  I am
building the basic netCDF library from the same source code on all
supported platforms (Linux 32 and 64-bit, Windows 32 and 64-bit with
Microsoft compiler, Windows with Cygwin gcc compiler, vxWorks, Darwin,
Solaris, etc.).  Because I have another file writer that handles HDF5, I
only need netCDF to create "classic" files, so I am using netCDF 3.6.3;
it is less complex than 4.x and does not require any HDF5 support.

The application is typically streaming uncompressed images, using the
UNLIMITED dimension as the streaming dimension. Thus, each record is
small, only a few MB, and the file size limitations of the classic
format are not a problem.  Here is an ncdump of a file header created
with this file writer on Linux:

corvette:ADApp/op/adl>ncdump -h /home/epics/scratch/netcdf_test_1.nc
netcdf netcdf_test_1 {
dimensions:
        numArrays = UNLIMITED ; // (4100 currently)
        dim0 = 1024 ;
        dim1 = 1024 ;
        attrStringSize = 256 ;
variables:
        int uniqueId(numArrays) ;
        double timeStamp(numArrays) ;
        byte array_data(numArrays, dim0, dim1) ;
        int Attr_ColorMode(numArrays) ;
        double Attr_AcquireTime(numArrays) ;
        double Attr_RingCurrent(numArrays) ;
        char Attr_RingCurrent_EGU(numArrays, attrStringSize) ;
        double Attr_ID_Energy(numArrays) ;
        char Attr_ID_Energy_EGU(numArrays, attrStringSize) ;
        int Attr_ImageCounter(numArrays) ;
        int Attr_MaxSizeX(numArrays) ;
        int Attr_MaxSizeY(numArrays) ;
        char Attr_CameraModel(numArrays, attrStringSize) ;
        char Attr_CameraManufacturer(numArrays, attrStringSize) ;

// global attributes:
                :dataType = 1 ;
                :NDNetCDFFileVersion = 3. ;
                :numArrayDims = 2 ;
                :dimSize = 1024, 1024 ;
...

So the only large array is called "array_data", and in this case it is
[4100, 1024, 1024], where 4100 is the unlimited dimension.  Thus, this
file is over 4GB, and it can be written and read with no problems on
Linux and Cygwin.  It also works fine when writing files on Windows with
the Microsoft compiler, up to file sizes of 2GB.
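
For reference, the write pattern is essentially the following
stripped-down sketch using the netCDF-3 C API (the dimension and
variable names match the ncdump above, but the per-frame attribute
variables and most error handling are omitted):

#include <stdio.h>
#include <netcdf.h>

#define NX 1024
#define NY 1024

int main(void)
{
    int ncid, dim_rec, dim0, dim1, var_data, dims[3], rec, status;
    static signed char frame[NX * NY];   /* one 1024x1024 8-bit image */
    size_t start[3] = {0, 0, 0};
    size_t count[3] = {1, NX, NY};

    /* classic-format file; the record (UNLIMITED) dimension is the
       streaming dimension */
    status = nc_create("netcdf_test_1.nc", NC_CLOBBER, &ncid);
    if (status != NC_NOERR) {
        fprintf(stderr, "%s\n", nc_strerror(status));
        return 1;
    }
    nc_def_dim(ncid, "numArrays", NC_UNLIMITED, &dim_rec);
    nc_def_dim(ncid, "dim0", NX, &dim0);
    nc_def_dim(ncid, "dim1", NY, &dim1);
    dims[0] = dim_rec; dims[1] = dim0; dims[2] = dim1;
    nc_def_var(ncid, "array_data", NC_BYTE, 3, dims, &var_data);
    nc_enddef(ncid);

    /* append one ~1 MB record per image; 4100 records makes the file
       a little over 4GB */
    for (rec = 0; rec < 4100; rec++) {
        start[0] = rec;
        status = nc_put_vara_schar(ncid, var_data, start, count, frame);
        if (status != NC_NOERR) {
            fprintf(stderr, "record %d: %s\n", rec, nc_strerror(status));
            break;
        }
    }
    return nc_close(ncid) == NC_NOERR ? 0 : 1;
}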

However, when I try to write a file on Windows larger than 2GB using the
program built with the Visual Studio compiler I get the following error:

Assertion failed: pxp->bf_offset <= offset && offset < pxp->bf_offset +
(off_t) pxp->bf_extent, file ..\posixio.c, line 325

When I look at the code at line 325 in posixio.c, I see that offset and
pxp->bf_offset are of type off_t.  I added a printf() in that code to
print sizeof(offset) and sizeof(off_t), and both come out as 4, not 8.
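
The diagnostic was essentially this, inserted just above the assertion:

/* temporary diagnostic added to posixio.c just before the assert */
printf("sizeof(off_t)=%d sizeof(offset)=%d\n",
       (int) sizeof(off_t), (int) sizeof(offset));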


But when I look at the config.h file that comes in the win32/NET
directory in netCDF 3.6.3 it has the following:
corvette:areaDetector/ADApp/netCDFSrc>grep -C3 off_t
/usr/local/netcdf/netcdf-3.6.3/win32/NET/config.h
/* #undef HAVE_ST_BLKSIZE */

/* Define to `long' if <sys/types.h> doesn't define.  */
/* #undef off_t */

/* Define to `unsigned' if <sys/types.h> doesn't define.  */
/* #undef size_t */
--
/* The number of bytes in a size_t */
#define SIZEOF_SIZE_T 4

/* The number of bytes in a off_t */
#define SIZEOF_OFF_T 8

/* Define to `int' if system doesn't define.  */


So it defines SIZEOF_OFF_T to be 8, not 4.

I generated the netCDF config.h file on Linux and then edited it to
(I hope) correctly define things on other platforms, such as _WIN32 and
vxWorks.
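
Conceptually, what has to happen on Windows is something like the
following (nc_off_t and NC_LSEEK are just names I made up for
illustration, not netCDF names):

#ifdef _WIN32
#include <io.h>                    /* _lseeki64 */
typedef __int64 nc_off_t;          /* MSVC's native off_t is a 32-bit
                                      long, and _FILE_OFFSET_BITS is
                                      ignored by this compiler         */
#define NC_LSEEK(fd, off, whence)  _lseeki64((fd), (off), (whence))
#else
#include <unistd.h>                /* lseek */
typedef off_t nc_off_t;            /* already 64 bits on the platforms
                                      where this works                 */
#define NC_LSEEK(fd, off, whence)  lseek((fd), (off), (whence))
#endif

Since posixio.c uses off_t and lseek directly, I assume the win32 port
is supposed to arrange the equivalent of this somewhere, but
sizeof(off_t) coming out as 4 suggests that it is not happening in my
build.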

The compiler flags being used on Windows are illustrated in the
following output when I build:

cl -c               /nologo /D__STDC__=0 /D_CRT_SECURE_NO_DEPRECATE
/D_CRT_NONSTDC_NO_DEPRECATE   /Ox /GL   /W3 /w44355
/D_WIN32_WINNT=0x0503 -D_FILE_OFFSET_BITS=64   /MT -DEPICS_DLL_NO    -I.
-I..\\O.Common -I. -I.. -I..\\..\\..\\include\\os\\WIN32
-I..\\..\\..\\include  -IJ:\\epics\\devel\\asyn-4-17\\include
-IJ:\\epics\\devel\\calc-2-8\\include
-IJ:\\epics\\devel\\busy-1-3\\include
-IJ:\\epics\\devel\\sscan-2-6-6\\include
-IJ:\\epics\\devel\\mca-6-12-4\\include
-IJ:\\epics\\devel\\autosave-4-7\\include\\os\\WIN32
-IJ:\\epics\\devel\\autosave-4-7\\include
-IJ:\\epics\\devel\\areaDetector-1-7beta1\\include\\os\\WIN32
-IJ:\\epics\\devel\\areaDetector-1-7beta1\\include
-IH:\\epics\\base-3.14.12.1\\include\\os\\WIN32
-IH:\\epics\\base-3.14.12.1\\include       ..\\var.c

There is something I don't understand here.
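
A quick way to catch this at build time would be a compile-time check
along these lines (the typedef name is just for illustration):

#include <sys/types.h>
/* fails to compile if off_t is narrower than 8 bytes */
typedef char off_t_must_be_64_bits[sizeof(off_t) >= 8 ? 1 : -1];

That would at least flag the mismatch when the library is compiled,
though it does not explain why config.h says SIZEOF_OFF_T is 8 while the
compiled code sees 4.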

Has netCDF 3.6.3 been tested to correctly write classic files > 2GB with
the Microsoft compiler?  Why am I getting the assert error?

Thanks very much,
Mark Rivers

Attachment: config.h
