John,
I see you're on ncdigest rather than the netcdfgroup mailing list, so
you wouldn't see the appended posting until next Monday morning, since
it doesn't get digested until later tonight. I'm taking the liberty of
forwarding it to you today, since I think it might be of more immediate
interest ...
--Russ
========================================================================
Cc: Jonathan Gregory <jmgregory@xxxxxxxxxxx>
Reply-to: russ@xxxxxxxxxxxxxxxx
Hi,
Independently of our recent discussion of netCDF coordinate conventions
and the merits of "multidimensional coordinate variables" vs. "referential
attributes", Jonathan Gregory, Bob Drach and Simon Tett have just
published a draft of "Proposed netCDF conventions for climate data",
available for review at
http://www-pcmdi.llnl.gov/drach/netCDF.html
or in PostScript form from
http://www-pcmdi.llnl.gov/drach/netCDF.ps.Z
This document extends the COARDS conventions, and in particular provides
a good specification for the meaning and use of multidimensional
coordinate variables.
I've created a link to the document from the netCDF Conventions page at
http://www.unidata.ucar.edu/packages/netcdf/conventions.html
The authors have asked for feedback:
Our principal interest in proposing this convention is to facilitate
the exchange of data among climate centres. NetCDF offers an
appropriate format for this purpose, and these conventions aim to
standardise the representation of metadata sufficiently that data
from different sources can be easily compared. We recognise that
there are limits to what a standard can practically cover; we
restrict ourselves to issues which we believe to be of common and
frequent concern in the design of climate metadata. Our convention
is mostly compatible with the existing COARDS convention, but we
have extended the scope and detail.
We are aware that some climate centres are already using netCDF as
their archive format. We would be very interested to have comments
from them, for instance on how our suggestions differ from what they
do, and on what lessons they have learned from experience. We would
welcome feedback from anyone on these conventions, such as on what
has been omitted, what could be improved, and how we should carry
this proposal forward. The exercise will only be useful if it has
the support of a number of climate centres, of course. One
application which will adopt this standard is the LATS software
distributed by the Program for Climate Model Diagnosis and
Intercomparison (PCMDI), sponsor of AMIP II. LATS will have an
option to generate netCDF files which conform to this standard.
Our second interest in developing this standard is its relation to a
logical model of the data and metadata. Describing how the data
should be stored in netCDF inevitably involves considering how it is
organised logically. We hope to make a proposal for a
language-independent data model, which could be implemented in
various programming languages as a method of handling data either in
memory or in files. If this were done, it would offer a way of
making analysis programs more easily portable.
Please send any comments you may have on the proposed standard or
any of the above to any of us. Feel free to circulate it further if
you know others who would be interested. Thank you.
Jonathan Gregory jmgregory@xxxxxxxxxxx
Bob Drach drach@xxxxxxxx
Simon Tett sfbtett@xxxxxxxxxxx
--Russ
_____________________________________________________________________
Russ Rew UCAR Unidata Program
russ@xxxxxxxxxxxxxxxx http://www.unidata.ucar.edu
========================================================================
Hi,
Among other interesting points, John Caron wrote:
> The point is, until we can embed functions (methods) in our netcdf
> files, we can't really represent the above formula in the way it is
> written. What we can do now, however, is to compute the field
> Pressure(x,y,z) and store it in the netcdf file, and it becomes a
> perfectly good coordinate function for the "altitude" coordinate of a
> georeferencing coordinate system. So the cost is that we have to store a
> 3D field, when all the info is really available in 2 1D fields (a and b)
> and 1 2D field (SurfacePressure).
>
> Which is just a long example to say that we currently have only arrays
> to represent functions.
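John's example above can be sketched concretely. The exact formula he refers to isn't reproduced in this message, but the usual hybrid-coordinate form is p(k,j,i) = a(k)*p0 + b(k)*ps(j,i), where a and b are the 1D coefficient fields and ps is the 2D surface pressure. The sketch below (with illustrative names, not taken from any particular file) shows the expansion into the full 3D array that would have to be stored as the coordinate function:

```java
// Sketch: expand two 1-D coefficient arrays and one 2-D surface-pressure
// field into the full 3-D pressure array that would be stored in the
// netCDF file as the "altitude" coordinate function.
// Assumes the common hybrid form p(k,j,i) = a(k)*p0 + b(k)*ps(j,i);
// the names a, b, ps, and p0 are illustrative.
public class HybridPressure {
    public static double[][][] pressure(double[] a, double[] b,
                                        double p0, double[][] ps) {
        int nz = a.length, ny = ps.length, nx = ps[0].length;
        double[][][] p = new double[nz][ny][nx];
        for (int k = 0; k < nz; k++)
            for (int j = 0; j < ny; j++)
                for (int i = 0; i < nx; i++)
                    p[k][j][i] = a[k] * p0 + b[k] * ps[j][i];
        return p;
    }
}
```

The nz*ny*nx output array is exactly the storage cost John describes: all of its information is already present in the two 1D arrays and the one 2D array, but without embedded functions the expanded field is what must be written to the file.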
This is one of the great promises of "executable content", using Java,
for example. Data that is stored as an object can map itself to a
canonical coordinate system using a data-specific method, or interpolate
to return the data at a particular location and time.
The necessary functions may be stored with the data as portable byte
codes that could be loaded into a convenient Java Virtual Machine
running where the data is used, or could reference a remote method that
is transparently downloaded and executed when needed to transform the
data.
There is then no need for applications to support elaborate conventions
for parameterizing dozens of coordinate systems. Instead, applications
would assume a simple Interface (in the Java sense) for georeferencing
the data, and the data Class would implement the Interface.
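As a rough sketch of that division of responsibilities (all names here are hypothetical, not part of any existing Unidata interface): the application codes against a small georeferencing Interface, and each data Class implements it however its own coordinate system requires.

```java
// Hypothetical sketch of the Interface/Class split described above.
// The application only knows about GeoReferenced; the data class
// hides the details of its coordinate system behind it.
interface GeoReferenced {
    /** Return the (lon, lat, altitude) of grid point (i, j, k). */
    double[] location(int i, int j, int k);
}

class RegularGrid implements GeoReferenced {
    private final double lon0, lat0, dlon, dlat, dz;

    RegularGrid(double lon0, double lat0,
                double dlon, double dlat, double dz) {
        this.lon0 = lon0; this.lat0 = lat0;
        this.dlon = dlon; this.dlat = dlat; this.dz = dz;
    }

    // Evenly spaced lon/lat/height grid; a hybrid-coordinate class
    // could implement the same method with a different computation.
    public double[] location(int i, int j, int k) {
        return new double[] { lon0 + i * dlon, lat0 + j * dlat, k * dz };
    }
}
```

A class wrapping hybrid vertical coordinates would implement the same one-method Interface, so the application never needs to know which of the dozens of possible coordinate parameterizations is in use.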
But this requires something beyond netCDF. Unidata is planning to
make available, with help from external developers, the interfaces
necessary to make such a division of responsibilities between data
and applications practical.
In the meantime, well-designed conventions can limit the number of
representations for coordinate systems that applications must
understand. So, we have to get by with arrays until we have functions
...
--Russ