
[netCDF #UPK-483521]: NetCDF NC_MAX_VARS



Hi Scott,

> I'm trying to modify the maximum variable number in NetCDF, but I'm having 
> trouble. From a clean tarball, I have modified ./libsrc/netcdf.h as follows:
> 
> #define NC_MAX_DIMS     65536    /* max dimensions per file */
> #define NC_MAX_ATTRS    8192     /* max global or per variable attributes */
> #define NC_MAX_VARS     524288   /* max variables per file */
> #define NC_MAX_NAME     256      /* max length of a name */
> #define NC_MAX_VAR_DIMS 1024     /* max per variable dimensions */
> 
> The author of https://bugtracking.unidata.ucar.edu/browse/NCF-193 seemed to 
> have good results with these numbers. But when I run configure and then make 
> check, I get the following error:
> 
> make  check-TESTS
> make[2]: Entering directory 
> `/nas01/depts/ie/cempd/FAA-DDM/cmaq/ddm_v2/lib/netCDF_2/netcdf-3.6.3/nctest'
> Testing V2 API with 2 different netCDF formats.
> 
> Switching to netCDF classic format.
> *** Testing nccreate ...        ok ***
> *** Testing ncopen ...          ok ***
> *** Testing ncredef ...         ok ***
> *** Testing ncendef ...         ok ***
> *** Testing ncclose ...         ok ***
> *** Testing ncinquire ...       ok ***
> *** Testing ncsync ...          ok ***
> *** Testing ncabort ...         ok ***
> *** Testing ncdimdef ...        ok ***
> *** Testing ncdimid ...         ok ***
> *** Testing ncdiminq ...        ok ***
> *** Testing ncdimrename ...     ok ***
> *** Testing ncvardef ...        ok ***
> *** Testing ncvarid ...         ok ***
> *** Testing ncvarinq ...        ok ***
> *** Testing ncvarput1 ...       ok ***
> *** Testing ncvarget1 ...       ok ***
> *** Testing ncvarput ...        ok ***
> *** Testing ncvarget ...        ok ***
> *** Testing ncvarputg ...       ok ***
> *** Testing ncvargetg ...       ok ***
> /bin/sh: line 4: 30132 Segmentation fault      ${dir}$tst
> FAIL: nctest
> nctest_classic.nc ./ref_nctest_classic.nc differ: byte 292, line 3
> FAIL: compare_test_files.sh
> =========================================
> 2 of 2 tests failed
> Please report to address@hidden
> =========================================
> 
> This error does not occur with an unmodified netcdf.h. Should I be changing 
> something else?
> 
> I am working on a Linux cluster, and using portland group compilers. The 
> intended use of NetCDF is for storing CMAQ atmospheric simulation data.

I've reproduced the problem you're seeing with the 5-year-old
netcdf-3.6.3 release, and I've also verified that it apparently still
exists in the latest 4.3.0 release when testing access to
classic-format files, where it shows up as memory access errors when
run under valgrind.

It appears that there may be a stack-frame overflow or something
similar when NC_MAX_VARS is increased to that large a value.
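
To make that hypothesis concrete, here's a small illustrative C sketch
(the struct and names below are assumptions for illustration, not the
actual nctest source) of why a per-file limit of 524288 variables is
hard to fit into a single stack frame:

    #include <stdio.h>

    #define NC_MAX_VARS 524288   /* enlarged value from the modified netcdf.h */
    #define NC_MAX_NAME 256

    /* Hypothetical per-variable record, similar in spirit to what a
     * test program might keep in an automatic (stack-allocated) table. */
    struct var_info {
        char name[NC_MAX_NAME];
        int  ndims;
    };

    int main(void)
    {
        /* A local "struct var_info table[NC_MAX_VARS]" would need
         * roughly 524288 * 260 bytes, i.e. over 100 MiB, in one frame,
         * versus a common default stack limit of 8 MiB ("ulimit -s"). */
        size_t frame = (size_t)NC_MAX_VARS * sizeof(struct var_info);
        printf("hypothetical frame size: %zu MiB\n", frame / (1024 * 1024));
        return 0;
    }

The sketch only prints the arithmetic; a test program that actually
declared a table that size as a local variable would likely overrun a
typical default stack before it ever touched the netCDF library.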

I'm creating a bug-tracking ticket for this, in case you want to
follow progress on fixing it:

   https://bugtracking.unidata.ucar.edu/browse/NCF-253

--Russ

> Thanks,
> 
> Scott Boone
> 
> --
> 
> scott boone
> 
> university of north carolina
> | environmental sciences and engineering
> | city and regional planning
> | institute for the environment
> 
> 

Russ Rew                                         UCAR Unidata Program
address@hidden                      http://www.unidata.ucar.edu



Ticket Details
===================
Ticket ID: UPK-483521
Department: Support netCDF
Priority: Normal
Status: Closed