NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.
Hi all,

As the person who chose to forward John Caron's message to the GALEON email list, I want to take a run at putting this discussion into a GALEON context, recalling that the N in GALEON stands for netCDF.

To go back to GALEON 1, our first attempt to serve netCDF datasets via a standards-based protocol, we discovered problems due to limitations in the protocol and other difficulties due to the need for more carefully defined conventions and encoding formats in the netCDF world. These problems are exacerbated in GALEON 2 as we attempt to define a path for serving a wider range of "Fluid Earth Science" data via standards-based protocols.

What I see in John's documents is an effort to work toward a solution starting from the netCDF side while keeping the standard protocol specifications clearly in mind. It is a DRAFT first effort to specify how the most common dataset types in the FES community can be encoded in netCDF, with the appropriate required extensions to the CF conventions. The attempt to define the specifications in the language of the OGC standards world has brought to light the fact that FES datasets span the realms of OGC Coverages, Features, and Observations and Measurements, and there is not yet complete agreement among those realms. (smile face here) One hopes this "vigorous" discussion will lead to more clarity and coherence among those areas. But, in the meantime and in parallel, it is important to continue John's crucial work in laying out the specifications that will form the basis for one binary encoding of the datasets so they can be conveyed via a standard protocol.

My own preferred approach to grasping these issues is to focus on the mechanisms by which location information is encoded. These range from geometric algorithms, used for the output of a few forecast models that conform to the classic regular gridded coverage where the coordinates are regularly spaced along a set of orthogonal axes, to the opposite extreme of lightning data, where the coordinates are random in space and hence have to be included explicitly with each data point. In between are the station observations, where the locations of the observations remain fixed in space and can be encoded in an indexed table of observing station information. I believe the location information in the main FES data types can all be encoded by specifying some combination of geometric algorithms, table lookup, and explicit coordinates (a sketch of these three mechanisms appears after the quoted message below).

That's the perspective from which I am viewing John's draft documents. But that goes beyond the scope of an email conversation, so I'll see if I can make the ideas hang together in a white paper of some sort.

-- Ben

On Fri, Mar 14, 2008 at 4:09 AM, Luis Bermudez <bermudez@xxxxxxxxx> wrote:
Hi Simon,

So my sense is that we need best-practice documents regarding how to interpret an observation in different ways, and how the realization of these observations could allow communities to interoperate. I don't think what you said is very clear from the public specification. But it is getting clearer every day :)

One more comment: isn't the sampling strategy part of the observing procedure?

- Luis
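Below is a minimal sketch (using the Python netCDF4 package) of the three location-encoding mechanisms described in Ben's message: a geometric rule for a regular grid, an indexed table for fixed stations, and explicit coordinates for scattered points such as lightning strikes. The file, variable, and attribute names are illustrative assumptions only and are not taken from any CF or GALEON specification.

    # Illustrative sketch only; names and values are made up for this example.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("location_encodings.nc", "w") as nc:
        # 1. Regular grid: locations follow a geometric rule (evenly spaced
        #    values along orthogonal axes), so only the axis values are stored.
        nc.createDimension("lat", 3)
        nc.createDimension("lon", 4)
        lat = nc.createVariable("lat", "f4", ("lat",))
        lon = nc.createVariable("lon", "f4", ("lon",))
        lat.units = "degrees_north"
        lon.units = "degrees_east"
        lat[:] = np.linspace(-10.0, 10.0, 3)
        lon[:] = np.linspace(100.0, 103.0, 4)
        temp = nc.createVariable("temperature", "f4", ("lat", "lon"))
        temp.units = "K"
        temp[:] = 280.0 + np.random.rand(3, 4)

        # 2. Station observations: locations are fixed in space, so they live
        #    in an indexed table keyed by a station dimension.
        nc.createDimension("station", 2)
        nc.createDimension("time", 5)
        stn_lat = nc.createVariable("station_lat", "f4", ("station",))
        stn_lon = nc.createVariable("station_lon", "f4", ("station",))
        stn_lat[:] = [40.0, 41.5]
        stn_lon[:] = [-105.0, -104.2]
        obs = nc.createVariable("station_temperature", "f4", ("station", "time"))
        obs.units = "K"
        obs[:] = 285.0 + np.random.rand(2, 5)

        # 3. Scattered points (e.g. lightning strikes): every observation
        #    carries its own explicit coordinates.
        nc.createDimension("strike", 6)
        strike_lat = nc.createVariable("strike_lat", "f4", ("strike",))
        strike_lon = nc.createVariable("strike_lon", "f4", ("strike",))
        strike_time = nc.createVariable("strike_time", "f8", ("strike",))
        strike_time.units = "seconds since 2008-03-14 00:00:00"
        strike_lat[:] = np.random.uniform(30.0, 45.0, 6)
        strike_lon[:] = np.random.uniform(-110.0, -90.0, 6)
        strike_time[:] = np.sort(np.random.uniform(0.0, 3600.0, 6))

The point of the sketch is only to show how the amount of stored coordinate information grows from case 1 (axis values plus a spacing rule) to case 2 (a lookup table of fixed stations) to case 3 (one explicit coordinate per data point).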