netCDF analysis and display

Jim Mansbridge writes:
> 
> I have just started using netcdf and was wondering if there were any user 
> friendly interactive packages for producing plots of data in netcdf files. 
> I am thinking of something where the user could simply put in the name of 
> the netcdf file and then be asked some questions about what variables 
> should be plotted, in what domain, etc.  It sounds fairly straight-forward
> to interface with something like the ncar graphics, but rather tedious to 
> write.  

> One way of doing it, at least for fairly small data sets, would be to 
> produce some matlab files from the netcdf files and then use matlab to 
> analyse the data.  I am also interested in finding out if anyone has 
> already written the software to do this.

There are several groups in the oceanographic and hydrologic community at
Woods Hole and at the U.S. Geological Survey in Reston who are developing
software for analyzing and visualizing data in netCDF files.  Here's what
we have done so far (and what we are working on...)

1. Here at the USGS in Woods Hole, we have developed a netCDF/MATLAB interface
   that allows people working in MATLAB to access netCDF files through
   MATLAB function calls.  Basically, we constructed MATLAB .m files 
   (MATLAB functions) that mimic each of the netCDF subroutines.   Thus a user
   in MATLAB can type

     mcinq('foo.cdf')

   to inquire about a netCDF file, or 

     u=mcvgt('foo.cdf','temperature',[10 10],[20 20])

   to get a 20x20 hyperslab of temperature from the file foo.cdf.
   Once you've got it in MATLAB, of course, you can contour it, do a 2D FFT
   on it, and basically analyze it till the cows come home.

   Contact me for information on how to obtain this software.

2. At the USGS in Reston, they are developing a Model Output Plotting
   Program (MOPS) which will take 1D or 2D data from netCDF files and
   plot it using NCAR Graphics and GKS.  MOPS will automatically determine
   what kinds of plots it can make (contour, x-y, etc.) depending on the
   dimensionality of the data and certain netCDF attributes that must be
   defined for each variable.  These attributes tell MOPS whether the data
   is uniformly spaced along each dimension, what the independent variables
   are, etc.  (A rough sketch of this kind of rank-and-attribute check
   appears after this list.)  Contact hjenter@xxxxxxxxxxxxxxxxxx for more
   information on this project.

3. A few months ago, I realized that it would be easy to modify XDataSlice
   to work with netCDF data.  XDataSlice is a public-domain orthogonal
   slicing and isosurface rendering program that was developed at NCSA to
   work on HDF files.  I grabbed the source code from ftp.ncsa.uiuc.edu;
   all the HDF calls are contained in a single C routine.  All it basically
   does is find out how many dimensions the HDF variable has (to make sure
   the variable is 3D), then grab all the data!  It would not take more
   than a few days (maybe a few hours, even) for a C programmer to convert
   the code to read netCDF files instead (or in addition); a sketch of what
   the netCDF version of that routine might look like follows this list.
   NCSA gives you permission to do anything you want to the source code as
   long as you don't try to make money off it.  I would have done it
   myself, but: (1) I don't know C, and (2) I don't work with 3D Cartesian
   data (my 3D data is on an irregular grid).
   
4. Another tack for visualizing 3D data in netCDF is to get ahold of
   PolyPaint, a 3D visualization package from NCAR.  It has a rich suite
   of 3D visualization tools that operate on data in netCDF form. 
   You can get more info from boyd@xxxxxxxxxxxx.    This costs some money,
   but not much ($300 when I bought it a few months ago).

5. Yet another tack is to use a visualization environment like AVS.  We have
   written an AVS module that brings netCDF data into AVS (although at the
   moment it is somewhat hard-wired for my netCDF modeling output).  Once
   in AVS, there are hundreds of analysis and display modules available:
   image processing stuff, isosurface rendering, arbitrary slicing, alpha
   blending, streamline and vorticity calculation, particle advection, etc.
   The reason we like AVS is that it runs on many different platforms
   (Stardent, DEC, Cray, Convex, E and S, SET, Sun, IBM, SGI, HP, FPS 
   and WaveTracer), and that it has a flexible data model capable of handling
   multidimensional data on non-Cartesian grids.   Of course, this costs more
   money, probably at least a couple of thousand, depending on the platform.
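
As promised under item 2, here is a rough sketch of the kind of
rank-and-attribute check a program like MOPS presumably makes before
choosing a plot type.  I have not seen the MOPS source, so the file name
"foo.cdf", the variable "temperature", the "independent_variables"
attribute, and the rank-to-plot-type mapping are all made up for
illustration; the calls themselves are just the standard netCDF C routines.

   #include <stdio.h>
   #include "netcdf.h"

   int main(void)
   {
       int     ncid, varid, ndims, natts, alen;
       int     dimids[MAX_VAR_DIMS];
       nc_type vtype, atype;
       char    varname[MAX_NC_NAME], indep[MAX_NC_NAME + 1];

       ncid  = ncopen("foo.cdf", NC_NOWRITE);   /* placeholder file name */
       varid = ncvarid(ncid, "temperature");    /* placeholder variable  */
       ncvarinq(ncid, varid, varname, &vtype, &ndims, dimids, &natts);

       /* pick a plot type from the variable's dimensionality */
       if (ndims == 1)
           printf("%s is 1D -> x-y plot\n", varname);
       else if (ndims == 2)
           printf("%s is 2D -> contour plot\n", varname);
       else
           printf("%s is %dD -> must be sliced first\n", varname, ndims);

       /* "independent_variables" is a made-up attribute name; MOPS'
          real attribute conventions are not spelled out in this note. */
       if (ncattinq(ncid, varid, "independent_variables", &atype, &alen) != -1
           && atype == NC_CHAR && alen <= MAX_NC_NAME) {
           ncattget(ncid, varid, "independent_variables", indep);
           indep[alen] = '\0';
           printf("independent variables: %s\n", indep);
       }

       ncclose(ncid);
       return 0;
   }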

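And for item 3, here is roughly what the netCDF replacement for
XDataSlice's single HDF-reading routine could look like.  This is only a
sketch, not taken from XDataSlice itself: the function name is mine, it
assumes the variable is stored as floats, and a real conversion would
still have to hand the result to XDataSlice's own data structures.

   #include <stdio.h>
   #include <stdlib.h>
   #include "netcdf.h"

   /* Read an entire 3D netCDF variable into memory, returning a pointer
      to the data and filling dim[] with the three dimension sizes.
      Returns NULL if the variable is not 3D.  Assumes float storage. */
   float *read_3d_variable(const char *path, const char *name, long dim[3])
   {
       int     ncid, varid, ndims, natts, i;
       int     dimids[MAX_VAR_DIMS];
       nc_type vtype;
       char    varname[MAX_NC_NAME], dimname[MAX_NC_NAME];
       long    start[3] = {0, 0, 0};
       float  *data;

       ncid  = ncopen(path, NC_NOWRITE);
       varid = ncvarid(ncid, name);
       ncvarinq(ncid, varid, varname, &vtype, &ndims, dimids, &natts);

       if (ndims != 3) {               /* XDataSlice only handles 3D data */
           fprintf(stderr, "%s is %dD, not 3D\n", name, ndims);
           ncclose(ncid);
           return NULL;
       }

       for (i = 0; i < 3; i++)         /* size of each dimension */
           ncdiminq(ncid, dimids[i], dimname, &dim[i]);

       data = (float *) malloc(dim[0] * dim[1] * dim[2] * sizeof(float));
       ncvarget(ncid, varid, start, dim, data);    /* grab all the data */
       ncclose(ncid);
       return data;
   }
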
Hope this helps,

Rich Signell               |   rsignell@xxxxxxxxxxxxxxxxxx
U.S. Geological Survey     |   (508) 548-8700
Quissett Campus            |   "You need a license to dig clams...
Woods Hole, MA  02543      |      ... but anybody can have kids."

