NOTE: The galeon mailing list is no longer active. The list archives are made available for historical reasons.
Which brings me to Steve's email, with which I agree in broad terms. One thing that CF has that is not explicit/required in the netCDF API definition is at least the possibility of providing one standard name for each variable. (More would be better, but one step at a time....) I am sure this information makes it across the API when it is provided, but to be honest, in this day and age spending a lot of time standardizing the API, while remaining quiet about the semantics of the transported information, does not seem cost-effective. I think there might be some easy strategies for bridging that gap (mostly by insisting on CF-compliant data on the far side of the interface).
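As a concrete illustration of the per-variable standard name mentioned above -- a minimal sketch assuming the netCDF4-python library, with made-up file, dimension, and variable names; only the standard_name attributes carry the CF semantics:

    from netCDF4 import Dataset

    ds = Dataset("example_sst.nc", "w", format="NETCDF4")
    ds.Conventions = "CF-1.6"            # declare the conventions in use

    ds.createDimension("time", None)     # unlimited time dimension
    time = ds.createVariable("time", "f8", ("time",))
    time.units = "hours since 2009-07-15 00:00:00"
    time.standard_name = "time"

    sst = ds.createVariable("sst", "f4", ("time",))
    sst.units = "K"
    sst.standard_name = "sea_surface_temperature"   # the CF semantic hook

    ds.close()

Any netCDF API or service that moves attributes across will carry that standard_name along with the data.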
John

On Jul 15, 2009, at 12:30 PM, Ron Lake wrote:
Hi,

I think one needs to standardize BOTH -- an access API and an encoding -- AND to do this in a way that they work with one another. It is for this reason (as an example) that GML exposes the source data model (as well as acting as the data encoding for transport) so that WFS can define requests in a neutral manner. It should NOT be a matter of ONE or the OTHER. You might also look at the work of the XQuery Data Model group.

R

From: galeon-bounces@xxxxxxxxxxxxxxxx On Behalf Of Steve Hankin
Sent: July 15, 2009 12:18 PM
To: Ben Domenico
Cc: Unidata Techies; Unidata GALEON; Mohan Ramamurthy; Meg McClellan
Subject: Re: [galeon] plan for establishing CF-netCDF as an OGC standard

Hi Ben,

Firstly -- applause, applause! This is an important step. Thanks so much for leading it.

If it is not too late, however, I'd like to open a discussion on a rather significant change in the approach. As outlined at the URL you provided, the approach focuses on "CF-netCDF as an OGC binary encoding standard". Wouldn't our outcomes be more powerful and visionary if instead we focused on the netCDF API as an OGC standard? Already today we see great volumes of GRIB-formatted data served as if they were netCDF through OPeNDAP -- an illustration of how the API as a remote service becomes a bridge for interoperability. The vital functionalities of aggregation and augmentation via NcML are about exposing *virtual* files -- again, exposing the API rather than the binary encoding.

It is the ability to access remote subsets of a large netCDF virtual dataset where we see the greatest power of netCDF as a web service. While this can be implemented as a "fileout" service (the binary encoding standard approach) -- and that has been done successfully in WCS and elsewhere -- it does not seem like the optimal strategy. It is the direct connection between data and applications (or intermediate services) -- i.e. the disappearance of the "physical" (binary) file -- which seems like the service-oriented vision. This would not eliminate the ability of the standard to deliver binary netCDF files in the many cases where that is the desired result. Simple REST fileout services are desirable and should perhaps be included in this standards package as well.

David Artur (OGC representative) indicated at the meeting where we met with him in May that there were other examples of standardizing APIs within OGC. He also mentioned that with a community-proven interoperability standard the OGC process can be relatively forgiving and streamlined (fingers crossed ... let's hope). As I understand it, the most recent documents from GALEON allow for an OPeNDAP URL as the payload of WCS. So the concept of the API standard -- the reference to the file, rather than the binary file itself -- has already made its way into the GALEON work, too. I imagine there have already been discussions about this point. Very interested to hear your and others' thoughts.

- Steve

==========================

Ben Domenico wrote:

Hello,

At the galeon team wiki site:

http://sites.google.com/site/galeonteam/Home/plan-for-cf-netcdf-encoding-standard

I put together a rough draft outline of a plan for establishing CF-netCDF as an OGC binary encoding standard. Please note that this is a strawman. Comments, suggestions, complaints, etc. are very welcome and very much encouraged.
It would be good to have the plan and a draft candidate standard for the core in pretty solid shape by early September -- 3 weeks before the next OGC TC meeting, which starts on September 28.

One issue that requires airing early on is the copyright for any resulting OGC specification documents. Carl Reed, the OGC TC chair, indicates that the wording normally used in such documents is:

Copyright © 2009, <name(s) of organizations here>

The companies listed above have granted the Open Geospatial Consortium, Inc. (OGC) a nonexclusive, royalty-free, paid up, worldwide license to copy and distribute this document and to modify this document and distribute copies of the modified version.

I'm sending a copy of this to our UCAR legal counsel to make sure we are not turning over ownership and control of CF-netCDF itself.

-- Ben
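To illustrate the API-as-service idea in Steve's message above -- a minimal sketch assuming the netCDF4-python library built with OPeNDAP (DAP) support; the URL and variable name are placeholders, not a real endpoint:

    from netCDF4 import Dataset

    # Placeholder OPeNDAP URL -- not a real server.
    url = "http://example.org/thredds/dodsC/some/virtual/aggregation"
    ds = Dataset(url)                    # opened over OPeNDAP; no file is downloaded

    temp = ds.variables["temperature"]   # e.g. dimensions (time, lat, lon)
    subset = temp[0, 0:10, 0:10]         # only this slice crosses the wire
    print(subset.shape)

    ds.close()

Only the requested subset is transferred; the client never handles a complete binary file, which is the "disappearance of the physical file" described above.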
John

--------------
John Graybeal <graybeal@xxxxxxxxx> -- 831-775-1956
Monterey Bay Aquarium Research Institute
Marine Metadata Interoperability Project: http://marinemetadata.org