5.12 Classic NetCDF Model Limitations
The classic netCDF data model used by netCDF-3 is simple and easy to understand, but it has some limitations (see the C sketch after this list):
- No real data structures, just multidimensional arrays and lists
- No nested structures, variable-length types, or ragged arrays
- Only one shared unlimited dimension for appending new data
- A flat name space for dimensions and variables
- Character arrays rather than "real" strings
- A small set of numeric types
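As a concrete illustration, here is a minimal sketch using the netCDF-3 C API; the file name, dimension names, and variable names are hypothetical. It shows the single NC_UNLIMITED dimension, the flat name space, and a character array standing in for a string:

    #include <netcdf.h>

    int main(void) {
        int ncid, time_dim, strlen_dim, temp_var, name_var;
        int dimids[2];

        /* Create a file in classic (netCDF-3) format. */
        nc_create("classic.nc", NC_CLOBBER, &ncid);

        /* Only one dimension in the whole file may be NC_UNLIMITED. */
        nc_def_dim(ncid, "time", NC_UNLIMITED, &time_dim);
        nc_def_dim(ncid, "name_strlen", 32, &strlen_dim);

        /* Flat name space: all dimensions and variables share one scope,
           and values are plain multidimensional arrays of numeric types. */
        nc_def_var(ncid, "temperature", NC_FLOAT, 1, &time_dim, &temp_var);

        /* No string type: text is stored as a 2-D array of characters. */
        dimids[0] = time_dim;
        dimids[1] = strlen_dim;
        nc_def_var(ncid, "station_name", NC_CHAR, 2, dimids, &name_var);

        nc_enddef(ncid);
        nc_close(ncid);
        return 0;
    }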
The corresponding classic netCDF file format also has performance limitations:
- Large variables must be less than 4 GB (per record)
- No real compression supported, just scale/offset packing (see the second sketch, after this list)
- Changing a file schema (the logical structure of the file) may be very inefficient
- I/O is serial in Unidata netCDF-3 (but see the Argonne/Northwestern Parallel netCDF project)
- Efficient access requires data for a variable to be written and read contiguously
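Scale/offset packing is not compression in the file format itself; it is just a pair of attributes interpreted by convention, with data stored as small integers and readers reconstructing value = packed * scale_factor + add_offset. A minimal sketch with the C API (file, variable, and attribute values are hypothetical):

    #include <netcdf.h>

    int main(void) {
        int ncid, time_dim, t_packed;
        float scale = 0.01f, offset = 273.15f;

        nc_create("packed.nc", NC_CLOBBER, &ncid);
        nc_def_dim(ncid, "time", NC_UNLIMITED, &time_dim);

        /* Store 2-byte shorts instead of 4-byte floats: roughly half the
           size, at the cost of fixed precision and range. */
        nc_def_var(ncid, "temperature", NC_SHORT, 1, &time_dim, &t_packed);

        /* Readers unpack as: value = packed * scale_factor + add_offset. */
        nc_put_att_float(ncid, t_packed, "scale_factor", NC_FLOAT, 1, &scale);
        nc_put_att_float(ncid, t_packed, "add_offset", NC_FLOAT, 1, &offset);

        nc_enddef(ncid);
        nc_close(ncid);
        return 0;
    }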
Despite these limitations, netCDF-3 is very widely used in climate modeling, ocean science, and atmospheric science, and has been used to represent some very complex data, e.g. the grids described in the Gridspec specification.