2008 Unidata NetCDF Workshop for Developers and Data Providers > The "Classic" NetCDF Data Model
3.14 Classic NetCDF Model Limitations
The classic netCDF data model used for netCDF-3 has some limitations.
Its simplicity makes it easy to understand, but its limitations include:
- No real data structures, just multidimensional arrays and lists
- No nested structures, variable-length types, or ragged arrays
- Only one shared unlimited dimension for appending new data
- A flat name space for dimensions and variables
- Character arrays rather than strings
- A small set of numeric types
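For example, because the classic model lacks a string type, a list of station names must be stored as a two-dimensional char array with an extra string-length dimension. A minimal sketch in Python with NumPy (the names and the `str_len` size are illustrative assumptions, not from the workshop):

```python
import numpy as np

# Sketch: the classic model has no string type, so variable-length text
# must be stored as a fixed-width char array, as a netCDF-3 variable of
# type char with shape (station, str_len) would hold it.
names = ["Boulder", "Anchorage", "Key West"]
str_len = 16  # a fixed "str_len" dimension chosen up front

# Pad each name to the fixed length and store it as raw bytes.
char_array = np.array(
    [list(n.ljust(str_len).encode("ascii")) for n in names], dtype=np.uint8
)

# Recovering the strings requires trimming the padding again.
recovered = [bytes(row).decode("ascii").rstrip() for row in char_array]
```

The cost of this workaround is that the maximum string length must be fixed when the file is defined, and every reader must know to strip the padding.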
In addition, the classic netCDF format has performance limitations for high-performance computing with very large datasets:
- Large variables must be less than 4 GB (per record)
- No real compression supported, just scale/offset packing
- Changing a file schema (the logical structure of the file) may be very inefficient
- Efficient access sometimes requires data to be read in the same order as it was written
- Big-endian bias may hamper performance on little-endian platforms
- I/O is serial in Unidata netCDF-3 (but see the Argonne/Northwestern Parallel netCDF project)
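The scale/offset packing mentioned above can be sketched with NumPy: floating-point values are mapped onto 16-bit integers using per-variable `scale_factor` and `add_offset` attributes, following the common packing convention. The data values and parameter choices here are illustrative, not from the workshop:

```python
import numpy as np

# Sketch of scale/offset packing, used with classic netCDF in lieu of
# real compression: floats are stored as signed 16-bit integers plus
# scale_factor and add_offset attributes that readers use to unpack.
data = np.array([273.15, 280.5, 295.25])  # e.g. temperatures in kelvin

# Map the data range onto the signed shorts a netCDF-3 "short" variable
# can hold (2**16 - 2 steps, leaving one code point free for a fill value).
nbits = 16
dmin, dmax = data.min(), data.max()
scale_factor = (dmax - dmin) / (2**nbits - 2)
add_offset = (dmax + dmin) / 2

packed = np.round((data - add_offset) / scale_factor).astype(np.int16)
unpacked = packed * scale_factor + add_offset  # what a reader recovers
```

Unlike true compression, packing always halves (or quarters) the storage regardless of the data's content, and it is lossy: the quantization error is bounded by half of `scale_factor`.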
Despite these limitations, netCDF-3 is very widely used in climate modeling, ocean science, and atmospheric science, and has been used to represent some very complex data, e.g. the grids described in Balaji's GridSpec.